I started to write a large application in C# for the first time in my life. I wrote a sample module to test the idea behind my software. This module contained several dozen C# dictionaries and lists of objects, each of which had several members and properties.
I was shocked that after initializing the core objects it ended up using about 40 MB of RAM.
I tested further and found that more than 30 MB is allocated during object initialization, but I was under the impression that, given the size of my objects, no more than a few hundred kilobytes should have been consumed.
Have I done something wrong or is .NET naturally memory intensive compared to native code applications?
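For reference, the test I ran looked roughly like the sketch below - the `Item` type and the counts are simplified placeholders, not my real module:

```csharp
using System;
using System.Collections.Generic;

class FootprintCheck
{
    // Hypothetical stand-in for one of my real object types.
    class Item
    {
        public int Id;
        public string Name;
        public double Value;
    }

    static void Main()
    {
        long before = GC.GetTotalMemory(forceFullCollection: true);

        // Rough stand-in for the "several dozen dictionaries of objects".
        var tables = new List<Dictionary<int, Item>>();
        for (int t = 0; t < 40; t++)
        {
            var dict = new Dictionary<int, Item>();
            for (int i = 0; i < 1_000; i++)
                dict[i] = new Item { Id = i, Name = "item " + i, Value = i };
            tables.Add(dict);
        }

        long after = GC.GetTotalMemory(forceFullCollection: true);
        Console.WriteLine($"Managed heap grew by ~{(after - before) / (1024 * 1024.0):F1} MB");
        GC.KeepAlive(tables);
    }
}
```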
How did you determine how much memory was used? .NET applications are known to eagerly reserve more memory than they need as long as there is plenty of memory available, and to release memory back to the system if it starts to run short.
I think you may get some pointers in this MSDN article.
Using Task Manager to look at your memory usage is likely to be horrendously inaccurate.
Instead, grab a proper memory profiling tool (dotTrace is very good, and has a 10 day trial) and take the time to see your actual memory consumption.
The sorts of things I've seen in my own code include
- Underestimating how much memory I'm actually using (not counting separate objects properly, not allowing for lists to have "spare" capacity - see the sketch after this list)
- Keeping references to transient objects that I don't need
- Not allowing for transient objects created during operation that haven't yet been garbage collected
- Not allowing for unallocated memory - memory that has been claimed from the OS (and therefore shows as part of the process in Task Manager) but which hasn't been allocated to any one object yet
- Not allowing for thread stacks (each thread gets a stack of its own; .NET applications are always multi-threaded as the framework may create several threads of its own).
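To illustrate the "spare capacity" point above, here's a small sketch showing how List&lt;T&gt; grows its backing array by doubling, so it usually holds more slots than items (the exact numbers are typical, not guaranteed):

```csharp
using System;
using System.Collections.Generic;

class ListCapacityDemo
{
    static void Main()
    {
        var list = new List<int>();
        for (int i = 0; i < 100_000; i++) list.Add(i);

        // The backing array doubles as it grows, so Capacity is usually
        // larger than the number of items actually stored.
        Console.WriteLine($"Count:    {list.Count}");     // 100000
        Console.WriteLine($"Capacity: {list.Capacity}");  // typically 131072

        // TrimExcess releases the spare slots once the list has stopped growing.
        list.TrimExcess();
        Console.WriteLine($"Capacity after TrimExcess: {list.Capacity}");
    }
}
```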
In an era when machines have several GB of RAM, the idea that a user-facing app of the kind you would generally build in C# should use only a few hundred KB of RAM is backwards. Any unused RAM on your system is wasted.
With that in mind, C# allocates memory using a very different strategy from C++: larger chunks are allocated and freed less often.
That said, 40 MB seems a little high. I'm used to something closer to 10-14 MB for very simple console apps. Maybe there's a form taking up RAM, or maybe I'm on a different version of the framework (2.0).
The .NET runtime has a fairly large overhead - we've found that even simple applications tend to use much more memory than similar applications written in C++. Fortunately, this overhead quickly disappears into the noise as the size of the overall code increases. The second factor is garbage collection: the garbage collector runs "whenever", so compared to C++, memory allocations are typically not freed right away, but rather when the GC feels the need to do so.
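As a rough illustration of that timing (a sketch only - exact numbers depend on the GC mode and framework version), dropping a reference does not shrink the managed heap until a collection actually runs:

```csharp
using System;

class GcTimingDemo
{
    static void Main()
    {
        // Allocate a few hundred thousand small objects, then drop the reference.
        var temp = new object[300_000];
        for (int i = 0; i < temp.Length; i++) temp[i] = new byte[64];
        temp = null;

        // The memory is not returned the moment the references die...
        Console.WriteLine($"Before collection: {GC.GetTotalMemory(false) / 1024} KB on the managed heap");

        // ...it is reclaimed whenever the GC decides to run (forced here for the demo).
        Console.WriteLine($"After  collection: {GC.GetTotalMemory(true) / 1024} KB on the managed heap");
    }
}
```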
I'd say that unless you have a very good reason to worry about RAM usage, don't. As with any optimization, you should only optimize where there is a customer/end-user need to do so. I don't believe users are so constrained by memory these days (unless you're talking about an embedded system) that they're going to notice or care much about a missing 38 MB.
It's not always about the language, it's usually about how it is used.
Good code can use memory efficiently in any language.
Bad code will use memory inefficiently in every language.
If you are using Task Manager to look at the memory, it can be misleading. Working set is the amount of virtual memory mapped to the process, not necessarily how much your application is actually using - this is especially true in .NET's garbage-collected environment. As your program allocates memory, the .NET CLR/GC will usually request more memory than it actually needs from the OS so that it can efficiently allocate that memory to managed objects in your program in the future.
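As a quick sketch of that difference (the APIs here are standard BCL calls, but treat the exact numbers as illustrative), you can compare the working set with the managed heap size from inside the process:

```csharp
using System;
using System.Diagnostics;

class WorkingSetVsHeap
{
    static void Main()
    {
        // What Task Manager shows: pages of virtual memory currently mapped to the process.
        long workingSet = Process.GetCurrentProcess().WorkingSet64;

        // What your objects actually occupy on the managed heap right now.
        long managedHeap = GC.GetTotalMemory(forceFullCollection: true);

        Console.WriteLine($"Working set:  {workingSet / (1024 * 1024)} MB");
        Console.WriteLine($"Managed heap: {managedHeap / (1024 * 1024)} MB");
    }
}
```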
A quick and dirty way (very dirty) to see if this is affecting you is to force the working set to be trimmed, e.g. via the Process.MaxWorkingSet property or by using SetProcessWorkingSetSize in Win32 to trim the number of pages mapped to the process. If you immediately see a drop in the reported memory usage, you know what is going on. However, as soon as you start allocating memory again via the GC/CLR it will go back up - and usually that is a good thing. Really, you shouldn't worry about it; give the GC a chance to do things the right way.
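If you want to try that trim experiment, here is a rough sketch of the Win32 route mentioned above; passing -1 for both sizes is the conventional "trim as much as you can" request (diagnostics only, not something to ship):

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class TrimWorkingSet
{
    // Win32 call referred to above; -1 for both sizes asks the OS
    // to remove as many pages as possible from the process working set.
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess, IntPtr minSize, IntPtr maxSize);

    static void Main()
    {
        Console.WriteLine($"Before trim: {Process.GetCurrentProcess().WorkingSet64 / (1024 * 1024)} MB");

        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle, (IntPtr)(-1), (IntPtr)(-1));

        Console.WriteLine($"After trim:  {Process.GetCurrentProcess().WorkingSet64 / (1024 * 1024)} MB");
        // The number will climb again as soon as the GC/CLR touches memory - and that's expected.
    }
}
```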
To both optimize your program's memory usage and get a better idea of how memory allocation works in the CLR, I suggest you start playing with dotTrace (my preference) or ANTS Profiler (whose maker, incidentally, has published a good video on this topic). CLRProfiler is interesting too; it's a bit dated these days, but it's free.
Doing a quick calculation (and subtracting the usual ~12 MB hit for a simple console app on 32-bit - yikes, Steve!), you are probably allocating around 340K objects, each reference type paying roughly 8 bytes of runtime/System.Object overhead plus some more for the other usual suspects - the price of bloated runtime tech of the kind copyrighted at Sun.com.
Try to design around value types / structs if at all possible. If you cannot - tough; tell your users not to run more than 10 .NET apps or their machine will feel slower than a C64. If it makes you feel better, try WPF or Silverlight and feel the ~100 MB penalty for a few buttons and flashy animations.
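For a feel of the difference (a toy measurement, not a rigorous benchmark - the type names here are made up for the example), compare the managed heap cost of the same data held as class instances versus a plain struct array:

```csharp
using System;

class ClassVsStruct
{
    class PointClass   { public double X, Y; }
    struct PointStruct { public double X, Y; }

    static void Main()
    {
        const int N = 340_000;

        long before = GC.GetTotalMemory(true);

        // Reference types: one header per object, plus an array of references.
        var classes = new PointClass[N];
        for (int i = 0; i < N; i++) classes[i] = new PointClass();
        long afterClasses = GC.GetTotalMemory(true);

        // Value types: one contiguous block, no per-object header.
        var structs = new PointStruct[N];
        long afterStructs = GC.GetTotalMemory(true);

        Console.WriteLine($"{N} class instances: ~{(afterClasses - before) / (1024 * 1024.0):F1} MB");
        Console.WriteLine($"{N} struct values:   ~{(afterStructs - afterClasses) / (1024 * 1024.0):F1} MB");
        GC.KeepAlive(classes);
        GC.KeepAlive(structs);
    }
}
```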
(I can already feel the downvotes.)