
Getting pixel averages of a vector sitting atop a bitmap


I'm currently involved in a hardware project where I am mapping triangular-shaped LEDs to traditional bitmap images. I'd like to overlay a triangle vector onto an image and get the average pixel data within the bounds of that vector. However, I'm unfamiliar with the math needed to calculate this. Does anyone have an algorithm or a link that could send me in the right direction? (I tagged this as Python, which is preferred, but I'd be happy with the general algorithm!)

I've created a basic image of what I'm trying to capture here: http://imgur.com/Isjip.gif


Will this work: http://www.blackpawn.com/texts/pointinpoly/default.html ?
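
That page describes the same-side and barycentric point-in-triangle tests. As a rough sketch of how the barycentric version could be applied here, you could test every pixel in the triangle's bounding box and average the ones that fall inside. This assumes Pillow for pixel access; the function names and the (x, y) vertex tuples are my own choices, not anything from the question.

from PIL import Image

def point_in_triangle(p, a, b, c):
    # Barycentric technique from the linked article: p is inside when
    # both barycentric weights are non-negative and their sum is <= 1.
    v0 = (c[0] - a[0], c[1] - a[1])
    v1 = (b[0] - a[0], b[1] - a[1])
    v2 = (p[0] - a[0], p[1] - a[1])
    dot00 = v0[0] * v0[0] + v0[1] * v0[1]
    dot01 = v0[0] * v1[0] + v0[1] * v1[1]
    dot02 = v0[0] * v2[0] + v0[1] * v2[1]
    dot11 = v1[0] * v1[0] + v1[1] * v1[1]
    dot12 = v1[0] * v2[0] + v1[1] * v2[1]
    denom = dot00 * dot11 - dot01 * dot01
    if denom == 0:
        return False  # degenerate (zero-area) triangle
    u = (dot11 * dot02 - dot01 * dot12) / denom
    v = (dot00 * dot12 - dot01 * dot02) / denom
    return u >= 0 and v >= 0 and u + v <= 1

def average_rgb_in_triangle(image, a, b, c):
    # Brute force: walk the triangle's bounding box and average every
    # pixel whose centre passes the point-in-triangle test.
    pixels = image.convert("RGB").load()
    w, h = image.size
    xs = (a[0], b[0], c[0])
    ys = (a[1], b[1], c[1])
    total = [0, 0, 0]
    count = 0
    for y in range(max(0, int(min(ys))), min(h, int(max(ys)) + 1)):
        for x in range(max(0, int(min(xs))), min(w, int(max(xs)) + 1)):
            if point_in_triangle((x + 0.5, y + 0.5), a, b, c):
                r, g, b_ = pixels[x, y]
                total[0] += r
                total[1] += g
                total[2] += b_
                count += 1
    if count == 0:
        return None
    return tuple(t / count for t in total)

For example, average_rgb_in_triangle(Image.open("frame.png"), (10, 10), (60, 15), (30, 70)) would return an (R, G, B) tuple of floats, or None if the triangle covers no pixels.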


You can rasterize the triangle's edges to determine, for each horizontal scanline, which pixels lie within your triangle. Sum their RGB values and divide by the pixel count to get the average.
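
A rough sketch of that scanline idea, again assuming Pillow for pixel access (the function name and the three-vertex tuple format are mine): for each scanline, intersect it with the triangle's three edges, then sum the pixels between the crossings.

from PIL import Image

def scanline_average(image, tri):
    # tri is a sequence of three (x, y) vertices.
    pixels = image.convert("RGB").load()
    w, h = image.size
    ys = [p[1] for p in tri]
    edges = ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0]))
    total = [0, 0, 0]
    count = 0
    for y in range(max(0, int(min(ys))), min(h, int(max(ys)) + 1)):
        yc = y + 0.5  # sample the scanline at the pixel-row centre
        crossings = []
        for (x0, y0), (x1, y1) in edges:
            # Half-open test so a vertex shared by two edges is counted once.
            if (y0 <= yc < y1) or (y1 <= yc < y0):
                crossings.append(x0 + (yc - y0) * (x1 - x0) / (y1 - y0))
        if len(crossings) < 2:
            continue
        for x in range(max(0, int(min(crossings))), min(w, int(max(crossings)) + 1)):
            r, g, b = pixels[x, y]
            total[0] += r
            total[1] += g
            total[2] += b
            count += 1
    if count == 0:
        return None
    return tuple(t / count for t in total)

This avoids testing every pixel in the bounding box, since each scanline only touches the pixels between its left and right edge crossings.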

