I have a window in my app (containing a webview with some elements sized via JavaScript based on the size of the window). When I run the app without a target SDK, everything is sized perfectly. The results from:
DisplayMetrics dm = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(dm);
int x = dm.widthPixels;
int y = dm.heightPixels;
... are very different from when I run the application with a target SDK of 4 or higher.
The sizes reported when the app has no target SDK make the resizing of the elements in my webview work correctly, but with a target SDK set, getMetrics() reports a window significantly smaller than it actually is. (I end up with a webview that fills the window, but with all my elements squished into the top-left corner.)
How can I get the full view size when a target SDK is specified, and why are the numbers so different?
To answer my own question:
When no target SDK is specified, Android renders the window in compatibility mode and scales everything as if the screen were medium density (160 dpi). When a target SDK of 4 or higher is specified, the window manager reports the actual device pixels, so the measurements are no longer adjusted for the screen density. To recover the density-independent sizes that compatibility mode was reporting, divide by the density:
DisplayMetrics dm = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(dm);
int density = dm.densityDpi;
int x = dm.widthPixels * 160 / density;
int y = dm.heightPixels * 160 / density;
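The conversion above is plain integer arithmetic, so it can be checked without an Android device. A minimal sketch (plain Java, no Android dependencies; the class and method names here are mine, and 160 stands in for the framework constant `DisplayMetrics.DENSITY_DEFAULT`):

```java
// Sketch of the px -> density-independent-pixel conversion used above.
// Not Android API; just the arithmetic, for illustration.
public class DipConversion {
    // 160 dpi is the baseline "medium" density on Android.
    static final int DENSITY_DEFAULT = 160;

    // Convert a physical pixel measurement to density-independent pixels.
    static int pxToDip(int px, int densityDpi) {
        return px * DENSITY_DEFAULT / densityDpi;
    }

    public static void main(String[] args) {
        // Example: an 800x480 high-density (240 dpi) screen
        // reports 533x320 in density-independent pixels.
        System.out.println(pxToDip(800, 240)); // prints 533
        System.out.println(pxToDip(480, 240)); // prints 320
    }
}
```

This matches why the numbers differ: on a 240 dpi device, compatibility mode reports the scaled-down 533x320 figures, while a target SDK of 4+ reports the raw 800x480.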