Imagine you have some memory containing a bunch of bytes:
++++ ++-- ---+ +++-
-++- ++++ ++++ ----
---- ++++ +
Let us say + means allocated and - means free.
I'm searching for a formula to calculate the percentage of fragmentation.
Background
I'm implementing a tiny dynamic memory manager for an embedded device with static memory. My goal is to have something I can use for storing small amounts of data, mostly incoming packets over a wireless connection, at about 128 bytes each.
As R. says, it depends on exactly what you mean by "percentage of fragmentation" - but one simple formula you could use would be:
(free - freemax)
---------------- x 100% (or 100% for free=0)
free
where
free = total number of bytes free
freemax = size of largest free block
That way, if all memory is in one big block, the fragmentation is 0%, and if memory is all carved up into hundreds of tiny blocks, it will be close to 100%.
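Here is a minimal sketch of that formula in C. It assumes (hypothetically) that the allocator can hand you the sizes of all current free blocks as an array; the names are illustrative, not from any particular allocator:

    #include <stddef.h>

    /* (free - freemax) / free x 100, as defined above.
     * free_sizes[0..count-1] holds the size of every free block. */
    unsigned fragmentation_percent(const size_t *free_sizes, size_t count)
    {
        size_t free_total = 0, free_max = 0;

        for (size_t i = 0; i < count; i++) {
            free_total += free_sizes[i];
            if (free_sizes[i] > free_max)
                free_max = free_sizes[i];
        }
        if (free_total == 0)
            return 100;   /* the free = 0 convention from the formula above */
        return (unsigned)((free_total - free_max) * 100 / free_total);
    }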
Calculate how many 128-byte packets you could fit in the current memory layout. Call that number n.
Calculate how many 128-byte packets you could fit in a memory layout with the same number of bytes allocated as the current one, but with no holes (that is, move all the + to the left, for example). Call that number N.
Your "fragmentation ratio" would be alpha = n/N.
If your allocations are all roughly the same size, just split your memory up into TOTAL/MAXSIZE pieces, each consisting of MAXSIZE bytes. Then fragmentation is irrelevant.
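Since the question is about ~128-byte packets, that advice fits well. A minimal sketch of such a fixed-size pool, assuming a static arena whose block count (POOL_BLOCKS here) you would pick for your device:

    #include <stddef.h>

    #define MAXSIZE     128   /* largest allocation we need      */
    #define POOL_BLOCKS 32    /* TOTAL / MAXSIZE pieces, assumed */

    static unsigned char pool[POOL_BLOCKS][MAXSIZE];  /* static memory */
    static unsigned char used[POOL_BLOCKS];           /* 1 = allocated */

    void *pool_alloc(void)
    {
        for (size_t i = 0; i < POOL_BLOCKS; i++) {
            if (!used[i]) {
                used[i] = 1;
                return pool[i];
            }
        }
        return NULL;          /* pool exhausted */
    }

    void pool_free(void *p)
    {
        ptrdiff_t i = (unsigned char (*)[MAXSIZE])p - pool;

        if (i >= 0 && i < POOL_BLOCKS)
            used[i] = 0;
    }

Because every block is the same size, any free block can satisfy the next allocation, so external fragmentation cannot occur by construction.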
To answer your question in general, there is no magic number for "fragmentation". You have to evaluate the merits of different functions in reflecting how fragmented memory is. Here is one I would recommend, as a function of a size n:
fragmentation(n) = -log(n * number_of_free_slots_of_size_n / total_bytes_free)
Note that the log is just there to map things to a "0 to infinity" scale; you should not actually evaluate that in practice. Instead you might simply evaluate:
freespace_quality(n) = n * number_of_free_slots_of_size_n / total_bytes_free
with 1.0 being ideal (able to allocate the maximum possible number of objects of size n) and 0.0 being very bad (unable to allocate any).
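A sketch of freespace_quality in C, again assuming the '+'/'-' map representation from the question:

    #include <stddef.h>

    /* number_of_free_slots_of_size_n: non-overlapping n-byte slots
     * that fit into the contiguous free runs of the map. */
    static size_t free_slots_of_size(const char *map, size_t len, size_t n)
    {
        size_t slots = 0, run = 0;

        for (size_t i = 0; i < len; i++) {
            if (map[i] == '-') {
                run++;
            } else {
                slots += run / n;
                run = 0;
            }
        }
        return slots + run / n;
    }

    /* freespace_quality(n) = n * number_of_free_slots_of_size_n / total_bytes_free */
    double freespace_quality(const char *map, size_t len, size_t n)
    {
        size_t total_free = 0;

        for (size_t i = 0; i < len; i++)
            if (map[i] == '-')
                total_free++;
        if (total_free == 0)
            return 0.0;
        return (double)(n * free_slots_of_size(map, len, n)) / (double)total_free;
    }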
If you had [++++++-----++++--++-++++++++--------+++++] and you wanted to measure the fragmentation of the free space (or of any other allocation), you could measure the average contiguous block size: total free bytes / count of contiguous free blocks.
In this case it would be (5 + 2 + 1 + 8) / 4 = 4.
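A sketch of that measurement in C, under the same assumed '+'/'-' map representation:

    #include <stddef.h>

    /* Average contiguous free block size:
     * total free bytes / count of contiguous free runs. */
    double avg_free_run(const char *map, size_t len)
    {
        size_t free_bytes = 0, runs = 0;

        for (size_t i = 0; i < len; i++) {
            if (map[i] == '-') {
                free_bytes++;
                if (i == 0 || map[i - 1] != '-')  /* a new run starts */
                    runs++;
            }
        }
        return runs ? (double)free_bytes / (double)runs : 0.0;
    }

For the map above this returns (5 + 2 + 1 + 8) / 4 = 4.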
Based on R.. GitHub STOP HELPING ICE's answer, I came up with the following way of computing fragmentation as a single percentage number:

fragmentation = 1 - (sum over i = 1..n of FreeSlots(i) / IdealFreeSlots(i)) / n

Where:

n is the total number of free blocks
FreeSlots(i) means how many i-sized slots you can fit in the available free memory space
IdealFreeSlots(i) means how many i-sized slots would fit in a perfectly unfragmented memory of size n. This is a simple calculation: IdealFreeSlots(i) = floor(n / i).
How I came up with this formula:
I was thinking about how I could combine all the freespace_quality(i) values to get a single fragmentation percentage, but I wasn't very happy with the result of this function. Even in an ideal scenario, you could have freespace_quality(i) != 1 if the free space size n is not divisible by i. For example, if n=10 and i=3, freespace_quality(3) = 9/10 = 0.9.
So, I created a derived function freespace_relative_quality(i) which looks like this:

freespace_relative_quality(i) = freespace_quality(i) / ideal_freespace_quality(i)

where ideal_freespace_quality(i) is the quality in the perfectly unfragmented case. This would always have the output 1 in the ideal "perfectly unfragmented" scenario.
After doing the math (the factor i and total_bytes_free cancel out):

freespace_relative_quality(i) = FreeSlots(i) / IdealFreeSlots(i)
All that's left to do now to get to the final fragmentation formula is to calculate the average freespace quality for all values of i (from 1 to n), and then invert the range by computing 1 - the average quality, so that 0 means completely unfragmented (maximum quality) and 1 means most fragmented (minimum quality).
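Putting it all together, a sketch of the final formula in C, assuming once more that the memory map is an array of '+'/'-' characters and that all names are illustrative:

    #include <stddef.h>

    /* FreeSlots(i): non-overlapping i-sized slots fitting in the free runs. */
    static size_t free_slots(const char *map, size_t len, size_t i)
    {
        size_t slots = 0, run = 0;

        for (size_t k = 0; k < len; k++) {
            if (map[k] == '-') {
                run++;
            } else {
                slots += run / i;
                run = 0;
            }
        }
        return slots + run / i;
    }

    /* fragmentation = 1 - (sum over i = 1..n of
     *                      FreeSlots(i) / IdealFreeSlots(i)) / n,
     * with n = total free blocks and IdealFreeSlots(i) = floor(n / i). */
    double fragmentation(const char *map, size_t len)
    {
        size_t n = 0;
        double quality_sum = 0.0;

        for (size_t k = 0; k < len; k++)
            if (map[k] == '-')
                n++;
        if (n == 0)
            return 0.0;   /* no free memory, nothing to fragment */

        for (size_t i = 1; i <= n; i++)
            quality_sum += (double)free_slots(map, len, i) / (double)(n / i);

        return 1.0 - quality_sum / (double)n;
    }

For a single contiguous free run this returns 0; for free space shattered into many one-byte holes it approaches 1.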