At a high level (or low level if you'd like), what's a good way to implement a smudge effect for a drawing program on the iPad using Quartz2D (Core Graphics)? Has anyone tried this?
Thanks so much in advance for your wisdom!
UPDATE: I found a great article for those interested, check it out!
Link now at: http://losingfight.com/blog/2007/09/05/how-to-implement-smudge-and-stamp-tools/
I would suggest implementing an algorithm similar to the one detailed in that article, using OpenGL ES 2.0 to get the best performance (a rough sketch of the setup follows the steps):
1. Get the starting image as a texture.
2. Set up a render-to-texture framebuffer.
3. Render the initial image in a quad.
4. Render another quad the size of your brush with a slightly shifted view of the image, multiplied by an alpha mask stored in a texture or defined by, for example, a Gaussian function. Use alpha blending with the background quad.
5. Render this texture into the framebuffer associated with your CAEAGLLayer-backed view.
6. Go back to step 1 on the next -touchesMoved event, with the result of the previous rendering as the input. Keep in mind you'll want two texture objects to "ping-pong" between, since you can't read from and write to the same texture at once.
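Here's a minimal sketch of the ping-pong texture/framebuffer setup and one per-touch step in OpenGL ES 2.0. It assumes an EAGL context is already current and the starting image has been uploaded as a texture; the quad-drawing helpers and brush parameters are illustrative placeholders, not something from the article:

    // Sketch only: ping-pong render-to-texture setup for the smudge loop.
    #include <OpenGLES/ES2/gl.h>

    static GLuint pingPongTex[2];   // two textures to alternate between
    static GLuint pingPongFBO[2];   // one framebuffer per texture

    static void setupPingPong(GLsizei width, GLsizei height) {
        glGenTextures(2, pingPongTex);
        glGenFramebuffers(2, pingPongFBO);
        for (int i = 0; i < 2; i++) {
            glBindTexture(GL_TEXTURE_2D, pingPongTex[i]);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            glBindFramebuffer(GL_FRAMEBUFFER, pingPongFBO[i]);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, pingPongTex[i], 0);
        }
    }

    // One smudge step: read from pingPongTex[src], render the background quad
    // plus the shifted, alpha-masked brush quad into pingPongFBO[dst], then
    // swap src and dst for the next -touchesMoved event.
    static void smudgeStep(int src, int dst) {
        glBindFramebuffer(GL_FRAMEBUFFER, pingPongFBO[dst]);
        glBindTexture(GL_TEXTURE_2D, pingPongTex[src]);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        // drawFullScreenQuad();            // hypothetical helper: background pass
        // drawBrushQuad(touchPoint, shift); // hypothetical helper: brush pass
    }

After each step you'd also draw pingPongTex[dst] into the framebuffer backing your CAEAGLLayer view so the user sees the result.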
I think it's unlikely you're going to get great performance on the CPU, but it's definitely easier to set up that way. With the OpenGL approach, though, you can have an essentially unlimited brush size, and you're not looping over image data in drawing code.
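If you do start with the simpler CPU route, one smudge step can be sketched with Core Graphics roughly like this. All names are illustrative, and the flip between image-space and context-space coordinates is glossed over for brevity:

    // Sketch only: one CPU smudge step on a bitmap context. "canvas" is the
    // CGBitmapContext being painted into; brushMask is a grayscale CGImage
    // used as the brush's alpha falloff.
    #include <CoreGraphics/CoreGraphics.h>

    static void smudgeStepCPU(CGContextRef canvas, CGImageRef brushMask,
                              CGPoint from, CGPoint to, CGFloat brushSize) {
        // Grab the pixels currently under the brush at the previous touch point.
        CGImageRef snapshot = CGBitmapContextCreateImage(canvas);
        CGRect srcRect = CGRectMake(from.x - brushSize / 2, from.y - brushSize / 2,
                                    brushSize, brushSize);
        // (Image coordinates are flipped relative to the context; ignored here.)
        CGImageRef patch = CGImageCreateWithImageInRect(snapshot, srcRect);

        // Re-draw that patch at the new touch point, clipped through the brush
        // mask so the edges fade; partial alpha makes repeated strokes blend.
        CGRect dstRect = CGRectMake(to.x - brushSize / 2, to.y - brushSize / 2,
                                    brushSize, brushSize);
        CGContextSaveGState(canvas);
        CGContextClipToMask(canvas, dstRect, brushMask);
        CGContextSetAlpha(canvas, 0.5);
        CGContextDrawImage(canvas, dstRect, patch);
        CGContextRestoreGState(canvas);

        CGImageRelease(patch);
        CGImageRelease(snapshot);
    }

You'd call something like this from -touchesMoved with the previous and current touch points; the per-step snapshot is exactly the kind of per-pixel work that makes the CPU version slow for large brushes.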
Curious about what sort of performance you do get on the CPU, though. Take care :)