I have been suffering from a massive memory leak in one of my applications: my computer slows to a crawl whenever the leak occurs. Before I fix the leak, I would like to understand why that happens.
Take, for example, the following small C++ code with a leak:
#include <algorithm>
#include <iostream>
#include <string>

int main() {
    size_t size = 1024 * 1024 * 1024;
    char* buf = new char[size];     // never deleted: the leak
    std::fill_n(buf, size, 'o');    // touch every byte so the pages are committed
    std::string pause;
    std::getline(std::cin, pause);  // keep the process (and the leak) alive
}
From my understanding of virtual memory, disk caches, etc., I would expect that while the above code waits for user input on the last line, its 1-gigabyte buffer is no longer being used, so the operating system should gradually swap it out to disk and "forget" about it. I (the user) would suffer a slowdown for a while, but things would return to normal after some time.
This is not what happens on my system (Windows XP, 32-bit, with 2 GB of RAM). When I run the above code twice, in two separate cmd windows, to exhaust all available memory, I feel a great slowdown of the whole system. It improves after a few minutes but never gets close to full performance, and the system only returns to normal after I terminate the leaking "applications".
Just to put some numbers on it, I used compilation of some source code as a performance test, compiling it several times in a row to collect several measurements (in seconds).
- Before the leak: 14, 2, 2, 3, 2, ...
- After the leak: 183, 40, 9, 7, 9, ...
- After closing the leaking "applications": 12, 2, 2, ...
A steady slowdown of roughly 3x where I would expect none. How can this be explained?
The Windows swap file is of finite size. If you fill most of it up with your 1 GB buffers, then the system has to work harder, swapping everything else in and out of what little space remains.
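For completeness, the pressure on the swap file disappears once the buffer is released when it is no longer needed. A minimal sketch of the fixed version, using std::vector so the allocation is freed deterministically (a smaller 64 MiB buffer is used here purely for illustration, not the original 1 GiB):

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

int main() {
    {
        // 64 MiB instead of 1 GiB, just to keep the sketch lightweight
        const std::size_t size = 64u * 1024u * 1024u;
        std::vector<char> buf(size, 'o');  // vector owns the allocation
    }  // buf's destructor runs here and the memory goes back to the allocator
    std::cout << "buffer released\n";
    std::string pause;
    std::getline(std::cin, pause);  // the program still waits, but without the leak
    return 0;
}

With the buffer scoped like this, the gigabyte is no longer committed while the program sits at std::getline, so the rest of the system is not forced to compete for what remains of the swap file.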