I'm developing a game whose activity draws a Bitmap decoded from a 2000x2000 px image.
On my HTC Hero, the activity uses about 12 MB when it runs.
However, if I run the app in an emulator with a 16 MB heap, the VM crashes with an OutOfMemoryError while trying to allocate 16.4 MB. How is that possible?
I also tried another emulator and measured 20 MB used.
To measure the amount of memory I'm using this:
int usedMegs = (int) (Debug.getNativeHeapAllocatedSize() / 1048576L);
String usedMegsString = String.format(" - Memory Used: %d MB", usedMegs);
getWindow().setTitle(usedMegsString);
Why does the same bitmap need 12 MB on an HTC Hero and 20 MB on other devices?
Edit: I figured out it's caused by the density. Density 1 = 12 MB, density 0.75 = 8 MB, and density 1.5 = 20 MB (not exactly; a few MB come from other activities).
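Those numbers are roughly what you'd expect if the decoder pre-scales both dimensions of the bitmap by the density factor before allocating it. A back-of-the-envelope sketch of that arithmetic (assuming a simple bytes-per-pixel model; the exact pixel format each device picks may differ, which is why the estimates don't match the observed values exactly):

```java
import java.util.Locale;

// Rough estimate of the memory a density-scaled copy of a bitmap needs.
public class BitmapMemory {
    // Estimated bytes for a w x h image scaled by densityScale,
    // at the given bytes per pixel (4 for ARGB_8888, 2 for RGB_565).
    static long estimateBytes(int w, int h, double densityScale, int bytesPerPixel) {
        long sw = Math.round(w * densityScale);
        long sh = Math.round(h * densityScale);
        return sw * sh * bytesPerPixel;
    }

    public static void main(String[] args) {
        for (double d : new double[] {0.75, 1.0, 1.5}) {
            long bytes = estimateBytes(2000, 2000, d, 4);
            System.out.printf(Locale.US, "density %.2f -> %.1f MB%n",
                    d, bytes / 1048576.0);
        }
    }
}
```

For example, at density 0.75 the 2000x2000 source becomes 1500x1500, which at 4 bytes per pixel is about 8.6 MB, close to the 8 MB you measured.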
Can I tell a 1.5-density device to decode the Bitmap at density 1?
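One way to get that effect is to disable density scaling when decoding. A sketch (untested here, since it needs the Android SDK; R.drawable.big_map is a placeholder for your resource id):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Decode the resource at its raw pixel size, skipping the density pre-scale.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inScaled = false;  // don't scale the bitmap for the device density
Bitmap bmp = BitmapFactory.decodeResource(getResources(), R.drawable.big_map, opts);
```

Alternatively, resources placed in /drawable-nodpi are never density-scaled by the platform.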
Edit 2: I had the image in /drawable, so on a 1.5-density device it was scaled up when loaded. If I put the image in /drawable-hdpi, it needs less memory (12 MB) because it isn't scaled.
How are you loading your bitmap into memory? This is just speculation on my part, but perhaps the HTC device is loading the bitmap with a 24-bit color depth while other devices are using 32 bits. A 2000x2000 bitmap at 24 bits per pixel would use roughly 12 MB of memory, while the same bitmap at 32 bits per pixel would need closer to 16 MB of memory.
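That arithmetic roughly checks out (a quick sketch; the "16 MB" figure rounds 15.3 MB up):

```java
import java.util.Locale;

// Verify the bits-per-pixel arithmetic for a 2000x2000 bitmap.
public class BppCheck {
    // Megabytes needed for the given pixel count and bytes per pixel.
    static double megabytes(long pixels, int bytesPerPixel) {
        return pixels * (double) bytesPerPixel / (1024 * 1024);
    }

    public static void main(String[] args) {
        long pixels = 2000L * 2000L;
        System.out.printf(Locale.US, "24-bit: %.1f MB%n", megabytes(pixels, 3)); // ~11.4 MB
        System.out.printf(Locale.US, "32-bit: %.1f MB%n", megabytes(pixels, 4)); // ~15.3 MB
    }
}
```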