
Determine limit for large object heap programmatically

https://www.devze.com 2023-02-07 04:17 Source: web

Since it is recommended to use the IDisposable pattern for large objects, I am wondering why there seems to be no reliable way to determine the limit above which an object is to be considered "large"?

Internally such a distinction exists: the lower limit for objects being allocated on the LOH. But whenever it is communicated publicly as 85k, one is at the same time warned against relying on that number.

Especially for applications handling a lot of "larger" arrays, that limit is needed in order to implement proper memory management and prevent LOH fragmentation. For "smaller" arrays, on the other hand, IDisposable does not make sense from a memory-consumption point of view; here the compacting GC does a lot better.

Why is there no such thing as

GC.GetLOHLimit() 

or even better:

bool GC.ArrayTargetForManualDisposal(Type type, int length); 

Edit: I know the IDisposable pattern is just a recommendation for proper handling of special objects (e.g. "large" or unmanaged objects). My question does not assume that the runtime gives those objects any special handling. Rather, I am asking for runtime support so that implementors of the pattern (and perhaps others) can know when an object should receive special memory management and when not.
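For illustration, here is a minimal sketch of the kind of helper the question asks for, assuming the commonly cited (but non-contractual) 85,000-byte threshold. Both the method name and the constant are assumptions, not runtime API:

```csharp
using System;
using System.Runtime.CompilerServices;

static class LohHeuristics
{
    // Assumption: the commonly cited LOH threshold. This is an
    // implementation detail of the CLR, not a documented contract.
    private const long AssumedLohThresholdBytes = 85_000;

    // Hypothetical stand-in for the GC.ArrayTargetForManualDisposal
    // API proposed above: estimates whether an array of `length`
    // elements of unmanaged type T would be allocated on the LOH.
    // It ignores the object header, so it is only approximate right
    // at the boundary, and it ignores the special double[] rule on
    // 32-bit runtimes.
    public static bool LikelyOnLoh<T>(int length) where T : unmanaged
        => (long)length * Unsafe.SizeOf<T>() >= AssumedLohThresholdBytes;
}
```

With this sketch, `LohHeuristics.LikelyOnLoh<byte>(90_000)` yields true while `LohHeuristics.LikelyOnLoh<byte>(1_000)` yields false; the point of the question is precisely that the constant inside it cannot be obtained from the runtime.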


IDisposable has no relation to managed memory management. Disposing of an object is a user convention/pattern; it is not used internally by the runtime and has no effect on memory management. The runtime has no knowledge of IDisposable. The only special treatment/recognition of IDisposable is when it is used with the using keyword, but this is recognized by the compiler (at least the C# compiler), not the runtime.

As for the LOH, there are no public guarantees about its algorithms - that's why there is no API to get at parameters such as the maximum object size in the first place. It is quite feasible that in a future version of the CLR the object size for LOH consideration could indeed become dynamic. The CLR designers do not want users to couple themselves to internal details of memory management, as this would make it harder or impossible to change those details without breaking lots of existing programs.
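This caution was later borne out: in .NET Core 2.1 and newer the LOH threshold can be raised per application via the documented System.GC.LOHThreshold setting in runtimeconfig.json (the 120,000 value below is purely illustrative):

```json
{
  "runtimeOptions": {
    "configProperties": {
      "System.GC.LOHThreshold": 120000
    }
  }
}
```

Which is one more reason not to bake 85,000 into program logic as if it were a guaranteed constant.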

If you are concerned with CLR memory management, I would first suggest getting to grips with the fact that IDisposable is not involved in it.


As chibacity said, IDisposable is not related to LOH management. Here is a good article on the LOH: Large Object Heap Uncovered.

That being said, there is no public API to determine the LOH size limit, as far as I know. You can find a reference to the 85,000-byte limit in the SSCLI "Rotor" source (available here: Shared Source Common Language Infrastructure 2.0 Release Download):

in clr/src/vm/gc.h:

#define LARGE_OBJECT_SIZE   85000

Although this is CLR 2.0-equivalent source, not CLR 4, I don't think they have changed this value, as doing so would certainly have deep implications for existing code.
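If you would rather measure than trust the constant, one (equally undocumented) observation helps: objects allocated on the LOH report generation 2 immediately after allocation, while small objects start in generation 0. A small probe, assuming that behavior:

```csharp
using System;

class LohProbe
{
    static void Main()
    {
        // Assumption: freshly allocated LOH objects report generation 2,
        // small-object-heap allocations start in generation 0. This is
        // observable behavior of current CLRs, not a documented contract.
        byte[] small = new byte[1_000];   // small object heap
        byte[] large = new byte[85_000];  // large object heap

        Console.WriteLine(GC.GetGeneration(small));
        Console.WriteLine(GC.GetGeneration(large));
    }
}
```

By bisecting on the array length with such a probe, you can locate the effective threshold on the runtime you actually ship against, instead of hard-coding it.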

So if you want to do something smart with this value, you can probably safely put it in a constant, or make it configurable; it will certainly not change dynamically during the course of a running process.
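One way to act on such a constant is to route would-be LOH allocations through a pool, so the same large arrays are reused instead of repeatedly allocated; reuse is the usual mitigation for LOH fragmentation. A sketch using ArrayPool&lt;T&gt; (System.Buffers, available in later .NET versions; the threshold constant is again an assumption):

```csharp
using System;
using System.Buffers;

static class PooledBuffers
{
    // Assumed, not queried from the runtime (see discussion above).
    private const int AssumedLohThresholdBytes = 85_000;

    public static void Process(int size)
    {
        if (size < AssumedLohThresholdBytes)
        {
            // Small: a short-lived allocation the compacting GC handles well.
            byte[] buffer = new byte[size];
            Use(buffer, size);
        }
        else
        {
            // Large: rent from the shared pool. Rent may return a larger
            // array than requested, so track the logical length separately.
            byte[] buffer = ArrayPool<byte>.Shared.Rent(size);
            try { Use(buffer, size); }
            finally { ArrayPool<byte>.Shared.Return(buffer); }
        }
    }

    private static void Use(byte[] buffer, int count)
    {
        // Placeholder for real work on buffer[0..count].
    }
}
```

Note that this addresses the memory-management concern from the question without any IDisposable involvement at all, which fits the point made in the first answer.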

