I'm currently involved in a hardware project where I'm mapping triangular LEDs to traditional bitmap images. I'd like to overlay a triangle vector onto an image and get the average pixel data within the bounds of that vector. However, I'm unfamiliar with the math needed to calculate this. Does anyone have an algorithm or a link that could send me in the right direction? (I tagged this as Python, which is preferred, but I'd be happy with the general algorithm!)
I've created a basic image of what I'm trying to capture here: http://imgur.com/Isjip.gif
Will this work: http://www.blackpawn.com/texts/pointinpoly/default.html ?
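Here's a minimal sketch of the barycentric point-in-triangle test that page describes, in plain Python. The function name and argument names are just illustrative; the triangle is given as three (x, y) tuples.

```python
def point_in_triangle(p, a, b, c):
    """Return True if point p lies inside (or on an edge of) triangle abc."""
    # Vectors from vertex a to the other two vertices and to the point.
    v0 = (c[0] - a[0], c[1] - a[1])
    v1 = (b[0] - a[0], b[1] - a[1])
    v2 = (p[0] - a[0], p[1] - a[1])

    # Dot products needed for the barycentric coordinates.
    dot00 = v0[0] * v0[0] + v0[1] * v0[1]
    dot01 = v0[0] * v1[0] + v0[1] * v1[1]
    dot02 = v0[0] * v2[0] + v0[1] * v2[1]
    dot11 = v1[0] * v1[0] + v1[1] * v1[1]
    dot12 = v1[0] * v2[0] + v1[1] * v2[1]

    # Barycentric coordinates (u, v); the point is inside when both are
    # non-negative and their sum does not exceed 1.
    denom = dot00 * dot11 - dot01 * dot01
    if denom == 0:
        return False  # degenerate (zero-area) triangle
    inv = 1.0 / denom
    u = (dot11 * dot02 - dot01 * dot12) * inv
    v = (dot00 * dot12 - dot01 * dot02) * inv
    return u >= 0 and v >= 0 and (u + v) <= 1
```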
You can rasterize the triangle's edges to determine, for each horizontal scanline, which pixels lie within your triangle. Sum their RGB values and divide by the pixel count to get the average.
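A sketch of that averaging step, assuming Pillow (PIL) is available and reusing the `point_in_triangle()` helper above: it walks the triangle's bounding box one horizontal line at a time and averages the RGB values of the pixels that test inside. The filename and triangle coordinates are placeholders.

```python
from PIL import Image

def average_triangle_color(image_path, a, b, c):
    """Average the RGB values of all image pixels inside triangle abc."""
    img = Image.open(image_path).convert("RGB")
    pixels = img.load()
    width, height = img.size

    # Bounding box of the triangle, clamped to the image dimensions.
    xs = [a[0], b[0], c[0]]
    ys = [a[1], b[1], c[1]]
    x_min, x_max = max(min(xs), 0), min(max(xs), width - 1)
    y_min, y_max = max(min(ys), 0), min(max(ys), height - 1)

    total = [0, 0, 0]
    count = 0
    for y in range(y_min, y_max + 1):          # each horizontal scanline
        for x in range(x_min, x_max + 1):
            if point_in_triangle((x, y), a, b, c):
                r, g, b_val = pixels[x, y]
                total[0] += r
                total[1] += g
                total[2] += b_val
                count += 1

    if count == 0:
        return None  # triangle covered no pixels
    return tuple(t // count for t in total)

# Example usage (placeholder coordinates):
# avg = average_triangle_color("panel.png", (10, 10), (60, 15), (30, 55))
```

Testing every pixel in the bounding box is a bit slower than true edge-walking scanline rasterization, but for the small triangles an LED panel implies the difference should be negligible and the code stays simple.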