When my application crashes, it generates a minidump to allow post-mortem debugging. I use the options MiniDumpWithIndirectlyReferencedMemory and MiniDumpWithPrivateReadWriteMemory.
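For reference, the dump is written roughly like this; a minimal sketch, not my exact code (the file name, error handling, and exception-filter registration are simplified):

```cpp
#include <windows.h>
#include <dbghelp.h>

#pragma comment(lib, "dbghelp.lib")

// Unhandled-exception filter that writes the minidump described above.
LONG WINAPI WriteCrashDump(EXCEPTION_POINTERS* exceptionPointers)
{
    HANDLE dumpFile = CreateFileW(L"crash.dmp", GENERIC_WRITE, 0, NULL,
                                  CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (dumpFile != INVALID_HANDLE_VALUE)
    {
        MINIDUMP_EXCEPTION_INFORMATION exceptionInfo = { 0 };
        exceptionInfo.ThreadId = GetCurrentThreadId();
        exceptionInfo.ExceptionPointers = exceptionPointers;
        exceptionInfo.ClientPointers = FALSE;

        // These two flags pull in private heap pages and memory reachable
        // from stack/register values, which is why the dumps grow along
        // with the application's working set.
        MINIDUMP_TYPE dumpType = (MINIDUMP_TYPE)(
            MiniDumpWithIndirectlyReferencedMemory |
            MiniDumpWithPrivateReadWriteMemory);

        MiniDumpWriteDump(GetCurrentProcess(), GetCurrentProcessId(),
                          dumpFile, dumpType, &exceptionInfo, NULL, NULL);
        CloseHandle(dumpFile);
    }
    return EXCEPTION_EXECUTE_HANDLER;
}

// Installed at startup with:
// SetUnhandledExceptionFilter(WriteCrashDump);
```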
It all worked great until recently, when the dump files started growing beyond 500 MB. Visual Studio throws the following error at me the moment I try to start debugging a dump that size:
"Not enough storage is available to process this command"
I have WinXP 32-bit with 4 GB of RAM, of which I'm using less than 1 GB. Depending on how economical VS2008 is with its memory in this procedure, it should have plenty of addressable space.
What I do not want to do:
- Hack WinXP to get more memory: the app only keeps growing, so this would only work temporarily. Here's a list of possible actions I found: http://www.msfn.org/board/topic/62001-not-enough-storage-is-available-to-process-this-command/
- Switch to 64-bit OS
- Omit the MiniDumpWithPrivateReadWriteMemory option
So how should I solve this?
- Omit some DLLs from the dump? Split the included memory for groups of DLLs across different dumps? Any idea on how to do this, if it's even possible? (See the callback sketch after this list.)
- ...?
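On the DLL question above: MiniDumpWriteDump accepts an optional callback through which individual modules and memory regions can be filtered, so I'm wondering whether something along these lines would work. This is only a sketch; the rule that strips the data segments of DLLs under the Windows directory is just a hypothetical example of a filter:

```cpp
#include <windows.h>
#include <dbghelp.h>
#include <wchar.h>

// Hypothetical filter: drop the writable data segments of modules loaded
// from the Windows directory to shrink the dump; everything else is kept
// exactly as the dump type requests.
BOOL CALLBACK DumpFilterCallback(PVOID /*callbackParam*/,
                                 const PMINIDUMP_CALLBACK_INPUT input,
                                 PMINIDUMP_CALLBACK_OUTPUT output)
{
    if (input == NULL || output == NULL)
        return FALSE;

    if (input->CallbackType == ModuleCallback &&
        input->Module.FullPath != NULL &&
        wcsstr(input->Module.FullPath, L"\\WINDOWS\\") != NULL)
    {
        output->ModuleWriteFlags &= ~ModuleWriteDataSeg;
    }
    return TRUE;
}

// Hooked up through the last parameter of MiniDumpWriteDump:
// MINIDUMP_CALLBACK_INFORMATION callbackInfo = { DumpFilterCallback, NULL };
// MiniDumpWriteDump(..., dumpType, &exceptionInfo, NULL, &callbackInfo);
```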
You could try using WinDbg from the Debugging Tools for Windows package to see whether it manages memory better than Visual Studio. However, I would suggest trying a 64-bit OS, even though you do not want to. These days you should also provide your application as a native 64-bit Windows application, and for that you need 64-bit Windows anyway.
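If you do try WinDbg, opening the dump is just the standard

```
windbg -z crash.dmp
```

followed by `!analyze -v` (or `.ecxr` and then `k`) inside the debugger to get to the faulting stack.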
I'm running into the same issue on VS2010 under Windows 7 64-bit. Trying to load any dump created with MiniDumpWithPrivateReadWriteMemory set fails with "Not enough storage is available to process this command".
I don't think it's actually a memory issue, because opening a dump created with MiniDumpWithFullMemory works fine, and that actually produces larger dumps.
This appears to be a bug in Visual Studio, and it's disappointing that it still exists several versions later.