
Why isn't this orthographic vertex shader producing the correct answer?

My issue is that I have a (working) orthographic vertex and fragment shader pair that allow me to specify the center X and Y of a sprite via 'translateX' and 'translateY' uniforms being passed in. I multiply by a projectionMatrix that is hardcoded and works great. Everything works as far as the orthographic operation goes. My incoming geometry to this shader is based around 0, 0, 0 as its center point.

I want to now figure out what that center point (0, 0, 0 in local coordinate space) becomes after the translations. I need to know this information in the fragment shader. I assumed that I can create a vector at 0, 0, 0 and then apply the same translations. However, I'm NOT getting the correct answer.

My question: what am I doing wrong, and how can I even debug what's going on? I know that the value being computed must be wrong, but I have no insight into what it is. (My platform is Xcode 4.2 on OS X, developing for OpenGL ES 2.0 on iOS.)

Here's my vertex shader:

// Vertex Shader for pixel-accurate rendering
attribute vec4 a_position;
attribute vec2 a_texCoord;

varying vec2 v_texCoord;
varying vec4 v_translatedOrigin;   // written below and read in the fragment shader

uniform float translateX;
uniform float translateY;

// Set up orthographic projection 
// this is for 640 x 960
mat4 projectionMatrix = mat4( 2.0/960.0, 0.0, 0.0, -1.0,
                             0.0, 2.0/640.0, 0.0, -1.0,
                             0.0, 0.0, -1.0, 0.0,
                             0.0, 0.0, 0.0, 1.0);                        

void main()
{
    // Set position
    gl_Position = a_position;

    // Translate by the uniforms for offsetting
    gl_Position.x += translateX;
    gl_Position.y += translateY;

    // Translate
    gl_Position *= projectionMatrix;


    // Do all the same translations to a vector with origin at 0,0,0
    vec4 toPass = vec4(0, 0, 0, 1); // initialize.  doesn't matter if w is 1 or 0
    toPass.x += translateX;
    toPass.y += translateY;
    toPass *= projectionMatrix;

    // this SHOULD pass the computed value to my fragment shader.
    // unfortunately,  whatever value is sent, isn't right.
    //v_translatedOrigin = toPass;

    // instead, I use this as a workaround, since I do know the correct values for my
    // situation.  of course this is hardcoded and is terrible.
    v_translatedOrigin = vec4(500.0, 200.0, 0.0, 0.0);
}
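For the "how can I even debug what's going on" part: one simple trick is to temporarily write the varying out as a colour in the fragment shader. Below is a minimal sketch of that idea, assuming v_translatedOrigin is expected to hold pixel coordinates (as the hardcoded 500/200 workaround suggests); the divisors just map those pixels into the 0..1 colour range.

// Debug-only fragment shader: show v_translatedOrigin as a colour instead of texturing.
precision mediump float;

varying vec4 v_translatedOrigin;

void main()
{
    // red = x / 960, green = y / 640; an unexpected colour means an unexpected value
    gl_FragColor = vec4(v_translatedOrigin.x / 960.0,
                        v_translatedOrigin.y / 640.0,
                        0.0,
                        1.0);
}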

EDIT: In response to the suggestion that my orthographic matrix is wrong: going by Wikipedia's description of orthographic projections, my -1's look right. In my case, for example, the fourth element of the first row should be -((right + left) / (right - left)); with right = 960 and left = 0, that is -(960 / 960), which is -1.

EDIT: I've possibly uncovered the root issue here - what do you think?



Why does your ortho matrix have -1's in the bottom of each column? Those should be zeros. Granted, that should not affect anything.

I'm more concerned about this:

gl_Position *= projectionMatrix;

What does that mean? Matrix multiplication is not commutative; M * a is not the same as a * M. So which side do you expect gl_Position to be multiplied on?

Oddly, the GLSL spec does not say (I filed a bug report on this). So you should go with what is guaranteed to work:

gl_Position = projectionMatrix * gl_Position;

Also, you should use proper vectorized code. You should have one translate uniform, which is a vec2. Then you can just do gl_Position.xy = a_position.xy + translate;. You'll have to fill in the Z and W with constants (gl_Position.zw = vec2(0, 1);).
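Put together, a vectorized version of the vertex shader might look roughly like this. This is a sketch: the single vec2 'translate' uniform stands in for the original translateX/translateY, and the projection matrix is assumed to be supplied (or written) in proper column-major order.

attribute vec4 a_position;
attribute vec2 a_texCoord;

varying vec2 v_texCoord;

uniform vec2 translate;         // replaces translateX / translateY
uniform mat4 projectionMatrix;  // column-major orthographic projection

void main()
{
    vec4 pos;
    pos.xy = a_position.xy + translate;
    pos.zw = vec2(0.0, 1.0);

    // matrix on the left, vector on the right
    gl_Position = projectionMatrix * pos;

    v_texCoord = a_texCoord;
}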


Matrices in GLSL are column major. The first four values are the first column of the matrix, not the first row. You are multiplying with a transposed ortho matrix.
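In other words, the mat4 constructor consumes its arguments one column at a time, so the matrix from the question needs its rows and columns swapped. A sketch of what that looks like for the same 960 x 640 values (the transpose of the matrix in the question):

// Same values as the question's matrix, but laid out column by column,
// which is what the GLSL mat4 constructor expects:
mat4 projectionMatrix = mat4( 2.0/960.0, 0.0,       0.0,  0.0,   // column 0
                              0.0,       2.0/640.0, 0.0,  0.0,   // column 1
                              0.0,       0.0,      -1.0,  0.0,   // column 2
                             -1.0,      -1.0,       0.0,  1.0);  // column 3 (translation)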


I have to echo Nicol Bolas's sentiment. Two wrongs happening to make things work is frustrating, but doesn't make them any less wrong. The fact that things are showing up where you expect is likely because the translation portion of your matrix is 0, 0, 0.

The equation you posted is correct, but the notation is row major, and OpenGL is column major:

[ 2/(right-left)   0                 0               -(right+left)/(right-left) ]
[ 0                2/(top-bottom)    0               -(top+bottom)/(top-bottom) ]
[ 0                0                -2/(far-near)    -(far+near)/(far-near)     ]
[ 0                0                 0                1                         ]

I run afoul of this stuff every new project I start. This site is a really good resource that helped me keep these things straight. They've got another page on projection matrices.

If you're not sure if your orthographic projection is correct (right now it isn't), try plugging the same values into glOrtho, and reading the values back out of GL_PROJECTION.
