I don't know if it is possible using Objective-C, but I would like to find all the pixels/points that are activated in a touch event.
For example, if I use my thumb to touch the screen, then I will 'activate' certain pixels. If I use a pen to do the same thing, then I should get fewer pixels activated.
Is there a way to retrieve the coordinates of these pixels?
This is currently not supported in the SDK (I actually submitted a request for the very same feature).
Ten One Design created a framework and demoed it: http://www.macrumors.com/2010/07/01/pressure-sensitive-sketching-on-ipad-demoed/
Unfortunately, they used private APIs and have yet to release it to the public (and since it relies on private APIs, it could not be used in a distribution build anyway).
We plan to release this capability as a free software library so it can be included in any application. However, this may not be possible for a while as the library now uses a private function call to access the required information.
It's important to remember how touch detection works in iOS. The hardware and abstraction layer must detect the exact size of the touch in order to work out what was actually touched, but internally iOS just checks which of the responders highest in the stack contains the touch point within its bounds, and hands the touch to the first one that does.
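The "first responder whose bounds contain the point wins" idea can be sketched in plain C (with hypothetical `Rect`/`View` structs — this is not UIKit's actual implementation, whose real entry point is `-[UIView hitTest:withEvent:]`):

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical stand-ins for a view and its frame. */
typedef struct { float x, y, w, h; } Rect;
typedef struct { const char *name; Rect frame; } View;

static bool rect_contains(Rect r, float px, float py) {
    return px >= r.x && px < r.x + r.w &&
           py >= r.y && py < r.y + r.h;
}

/* Views are ordered topmost-first: the first view whose bounds
 * contain the touch point is chosen to handle the touch. */
static const View *hit_test(const View *views, size_t count,
                            float px, float py) {
    for (size_t i = 0; i < count; i++)
        if (rect_contains(views[i].frame, px, py))
            return &views[i];
    return NULL; /* touch landed outside every view */
}
```

Note that the touch is reduced to a single point before this check ever runs, which is why the per-pixel contact area never reaches your code.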
This is total speculation, but maybe you could set up a transparent pseudo-grid view and detect which of its cells fall within the touch? Just an idea, possibly fraught with performance issues, but maybe a start.
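If the touch could be approximated as a disc (a centre point plus a radius — an assumption, since the SDK only exposes a single point), the grid idea might be sketched in plain C like this: mark every cell whose centre falls inside the disc.

```c
#define COLS 8
#define ROWS 8
#define CELL 4.0f   /* hypothetical cell size in points */

/* Set grid[r][c] = 1 for every cell whose centre lies inside the
 * touch disc (cx, cy, radius); return the number of covered cells. */
static int cells_in_touch(int grid[ROWS][COLS],
                          float cx, float cy, float radius) {
    int covered = 0;
    for (int r = 0; r < ROWS; r++) {
        for (int c = 0; c < COLS; c++) {
            float mx = (c + 0.5f) * CELL;  /* cell centre x */
            float my = (r + 0.5f) * CELL;  /* cell centre y */
            float dx = mx - cx, dy = my - cy;
            grid[r][c] = (dx * dx + dy * dy <= radius * radius);
            covered += grid[r][c];
        }
    }
    return covered;
}
```

A thumb (large radius) would cover more cells than a pen tip (small radius), which is exactly the distinction the question is after — but the hard part remains obtaining a radius at all, since the public SDK does not provide one.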