Okay, so I have been all over the net trying to find ways to correctly render using normals and directional light (originally found in one of the learningwebgl.com tutorials). In the learningwebgl tutorials the normals are all set up in an array. In my program, I need to be able to load in Wavefront OBJ files and then generate normals, and I am wondering whether the problem is likely to be in my normal generation code or in my shaders. The code is a little confusing (as the vertex/normal/index data each live in a single flat array), but here is my normal generation code:
for(var i = 0; i < d["vertices"].length; i++) d["normals"][i] = 0;

for(var i = 0; i < d["indices"].length/3; i++){
    var a = [d["vertices"][d["indices"][(i*3)]],   d["vertices"][d["indices"][(i*3)]+1],   d["vertices"][d["indices"][(i*3)]+2]];
    var b = [d["vertices"][d["indices"][(i*3)+1]], d["vertices"][d["indices"][(i*3)+1]+1], d["vertices"][d["indices"][(i*3)+1]+2]];
    var c = [d["vertices"][d["indices"][(i*3)+2]], d["vertices"][d["indices"][(i*3)+2]+1], d["vertices"][d["indices"][(i*3)+2]+2]];
    var e = vec3.cross(vec3.subtract(b, a), vec3.subtract(c, a));

    d["normals"][d["indices"][(i*3)]]     += -e[0];
    d["normals"][d["indices"][(i*3)]+1]   += -e[1];
    d["normals"][d["indices"][(i*3)]+2]   += -e[2];
    d["normals"][d["indices"][(i*3)+1]]   += -e[0];
    d["normals"][d["indices"][(i*3)+1]+1] += -e[1];
    d["normals"][d["indices"][(i*3)+1]+2] += -e[2];
    d["normals"][d["indices"][(i*3)+2]]   += -e[0];
    d["normals"][d["indices"][(i*3)+2]+1] += -e[1];
    d["normals"][d["indices"][(i*3)+2]+2] += -e[2];
}

for(var i = 0; i < d["normals"].length/3; i++){
    var old = vec3.normalize([d["normals"][(i*3)], d["normals"][(i*3)+1], d["normals"][(i*3)+2]]);
    d["normals"][(i*3)]   = old[0];
    d["normals"][(i*3)+1] = old[1];
    d["normals"][(i*3)+2] = old[2];
}
The important part of the (vertex) shader:

// where uNMatrix = inverse of model view matrix
vec3 transformedNormal = uNMatrix * aVertexNormal;
// uLightingDirection: vec3 - light direction
float directionalLightWeighting = max(dot(transformedNormal, uLightingDirection), 0.0);
// uDirectionalColor: vec3 - light color
vLightWeighting = uAmbientColor + uDirectionalColor * directionalLightWeighting;
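(For reference, the learningwebgl lessons don't use the plain inverse: they build the normal matrix as the transpose of the inverse of the model-view matrix's upper 3x3, using the glMatrix 0.x API. Roughly the following sketch; the uniform name here is illustrative, not necessarily what my code uses:)

var normalMatrix = mat3.create();
// upper 3x3 of the inverse of the model-view matrix...
mat4.toInverseMat3(mvMatrix, normalMatrix);
// ...then transposed, giving the classic inverse-transpose normal matrix
mat3.transpose(normalMatrix);
gl.uniformMatrix3fv(shaderProgram.nMatrixUniform, false, normalMatrix);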
I have tried many normal algorithms to no avail. I have also found that if I don't normalize the normals at the very end, the colors/shades do in fact change; it is just obviously incorrect shading.
For an example of what it currently looks like (with the bottom loop commented out), follow this link, select teddy from the dropdown, click load, then click "(re)generate normals"; you can then rotate around the teddy by dragging the mouse:
http://webdesignscript.net/assignment/graphics_a3/
For a look at the shaders they are here:
http://webdesignscript.net/assignment/graphics_a3/scripts/shaders.js
I have been stuck on this for many hours and am starting to wonder if it might be something shader related. However, I am still new to graphics programming and would greatly appreciate any help :)
*the matrix library used is glMatrix
Cheers, Josh
I can't load your demo (allocation size overflow on model_engine.js line 107), but I threw your shader code into my engine (shameless plug: check out Jax, it's really cool!) and it worked fine.
Then I took a close look at your JS code, and... well, I believe it's pretty much entirely wrong. It looks like the first thing you do is take the normal of each face -- a good start, but I don't understand why you are negating the value of e. You should also normalize e at this point, because right now it's just an arbitrary-length vector (the cross product's length is proportional to the triangle's area, so leaving it unnormalized weights each face's contribution by its size). Don't know if that really matters, though.

The next thing you're doing is taking the normal of the sum of all the e's for a given vertex. Not quite right: you need to normalize the average of all the e's, rather than the sum.
In the end, here's what I came up with. It works great in my own engine, and it seems to run considerably faster than the original version to boot. (Disclaimer: there may still be some optimizations to be made. I wrote it for clarity, not speed.)
var i, j, normals = {};

// Calculate face normals. Note that each vertex will have a number of faces
// adjacent to it, so we accumulate their normals into an array, keyed by the
// vertex's coordinates (the array coerces to a string key, so vertices that
// share a position share an entry). We'll take the average of them all when done.
var tmp1 = vec3.create(), tmp2 = vec3.create();
var a, b, c;

function pushNormal(index, normal) {
    normals[index] = normals[index] || [];
    normals[index].push(normal);
}

for (i = 0; i < d["indices"].length; i += 3) {
    // get points a, b, c
    var aIndex = d["indices"][i], bIndex = d["indices"][i+1], cIndex = d["indices"][i+2];
    var aOffsetX = aIndex * 3, aOffsetY = aIndex * 3 + 1, aOffsetZ = aIndex * 3 + 2;
    var bOffsetX = bIndex * 3, bOffsetY = bIndex * 3 + 1, bOffsetZ = bIndex * 3 + 2;
    var cOffsetX = cIndex * 3, cOffsetY = cIndex * 3 + 1, cOffsetZ = cIndex * 3 + 2;
    a = [d["vertices"][aOffsetX], d["vertices"][aOffsetY], d["vertices"][aOffsetZ]];
    b = [d["vertices"][bOffsetX], d["vertices"][bOffsetY], d["vertices"][bOffsetZ]];
    c = [d["vertices"][cOffsetX], d["vertices"][cOffsetY], d["vertices"][cOffsetZ]];

    // calculate the face normal
    vec3.subtract(b, a, tmp1);
    vec3.subtract(c, a, tmp2);
    var e = vec3.normalize(vec3.cross(tmp1, tmp2, vec3.create()));

    // accumulate a copy of the face normal for each of a, b, c
    pushNormal(a, vec3.create(e));
    pushNormal(b, vec3.create(e));
    pushNormal(c, vec3.create(e));
}

// now calculate the normalized average for each vertex, and store the result
for (i = 0; i < d["vertices"].length; i += 3) {
    a = [d["vertices"][i], d["vertices"][i+1], d["vertices"][i+2]];
    if (normals[a]) {
        var avg = vec3.create();
        for (j = 0; j < normals[a].length; j++) {
            vec3.add(normals[a][j], avg, avg);
        }
        vec3.scale(avg, 1/normals[a].length);
        vec3.normalize(avg);
        d["normals"][i]   = avg[0];
        d["normals"][i+1] = avg[1];
        d["normals"][i+2] = avg[2];
    }
}

// sanity check
if (d["normals"].length != d["vertices"].length)
    alert("ERROR "+d["normals"].length+" != "+d["vertices"].length);
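Once the normals are generated, don't forget to re-upload them to your normal buffer, or the GPU will keep drawing with the old data. Something along these lines, using the standard WebGL buffer calls (normalBuffer is an illustrative name, not from your code):

// push the freshly generated normals to the GPU
gl.bindBuffer(gl.ARRAY_BUFFER, normalBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(d["normals"]), gl.STATIC_DRAW);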
Hope this helps!