Help to rectify line scaling in Android OpenGL 2.0 w/ QCAR

I'm working with the QCAR AR SDK on Android, which uses OpenGL ES 2.0, and I'm new to 2.0. The QCAR SDK is for computer-vision-based AR apps and uses OpenGL for rendering on top of camera images.

I'd simply like to draw a small X at the center of the screen, and am using the following code. But rather than drawing an X at the correct coordinates, the X extends to the edges of the screen. This occurs regardless of what values I assign to the vertices. I can't make out if this is a scaling issue or some confusion in the coordinate system that I'm using.

Any ideas as to why these lines aren't being drawn properly? I know that this would be easier in 1.1, but I've got to use 2.0.

Thanks.

// Clear color and depth buffer 
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);


GLfloat diagVertices[12];

// first diagonal of the X
diagVertices[0] = -10;
diagVertices[1] = -10;
diagVertices[2] = 0.0f;

diagVertices[3] = 10;
diagVertices[4] = 10;
diagVertices[5] = 0.0f;

// second diagonal of the X
diagVertices[6] = -10;
diagVertices[7] = 10;
diagVertices[8] = 0.0f;

diagVertices[9] = 10;
diagVertices[10] = -10;
diagVertices[11] = 0.0f;

glUseProgram(diagonalShaderProgramID);

// map the line vertices
glVertexAttribPointer(diagVertexHandle, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid*) &diagVertices[0]);
glEnableVertexAttribArray(diagVertexHandle);

// draw it
glLineWidth(3.0f);
glDrawArrays(GL_LINES, 0, 4);
glDisableVertexAttribArray(diagVertexHandle);

Here are the shaders that I'm using:

static const char* diagLineMeshVertexShader = " \
  \
attribute vec4 vertexPosition; \
 \
void main() \
{ \
   gl_Position = vertexPosition; \
} \
";

static const char* diagLineFragmentShader = " \
 \
precision mediump float; \
 \
void main() \
{ \
   gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); \
} \
";

Update:

So I've set up a build environment on Windows 7 (64-bit) using Eclipse and Cygwin, and have tested the same approach of drawing vertex attribute arrays. The codebase is derived from a simple Lighthouse3D sample demonstrating GLSL. I compiled and ran the sample to confirm that it renders as expected, then implemented the vertex arrays as above. I'm seeing exactly the same problem: the lines extend to the edges of the window regardless of their vertex values.

This is with GL_VERSION 2.1.2. The implementation of the vertex attribute arrays, and the method for rendering them, appear to be identical to other examples I've found in reference resources.

Here's the code; I've commented out the sections of the Lighthouse3D code that I've altered.

#define WIN32

#include <stdio.h>
#include <stdlib.h>

#include <GL/Glee.h>
#include <GL/glut.h>
#include "textfile.h"


GLuint v,f,f2,p;
float lpos[4] = {1,0.5,1,0};
GLfloat crossVertices[12];
GLint lineVertexHandle = 0;

void changeSize(int w, int h) {

// Prevent a divide by zero when the window is too short
// (you can't make a window of zero width).
if(h == 0)
    h = 1;

float ratio = 1.0* w / h;

// Reset the coordinate system before modifying
glMatrixMode(GL_PROJECTION);
glLoadIdentity();

// Set the viewport to be the entire window
glViewport(0, 0, w, h);

// Set the correct perspective.
//gluPerspective(45,ratio,1,1000);
gluPerspective(45,ratio,1,10);
glMatrixMode(GL_MODELVIEW);


}


void renderScene(void) {

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glLoadIdentity();
gluLookAt(0.0,0.0,5.0,
          0.0,0.0,0.0,
          0.0f,1.0f,0.0f);

glLightfv(GL_LIGHT0, GL_POSITION, lpos);
//glutSolidTeapot(1);

// map the border vertices
glVertexAttribPointer(lineVertexHandle, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid*) &crossVertices[0]);
glEnableVertexAttribArray(lineVertexHandle);


glLineWidth(1.0f);
glDrawArrays(GL_LINES, 0, 4);
glDisableVertexAttribArray(lineVertexHandle);
glutSwapBuffers();


}

void processNormalKeys(unsigned char key, int x, int y) {

if (key == 27) 
    exit(0);
}


void setShaders() {

char *vs = NULL,*fs = NULL,*fs2 = NULL;

v = glCreateShader(GL_VERTEX_SHADER);
f = glCreateShader(GL_FRAGMENT_SHADER);
f2 = glCreateShader(GL_FRAGMENT_SHADER);


vs = textFileRead("toon.vert")开发者_StackOverflow社区;
fs = textFileRead("toon.frag");
fs2 = textFileRead("toon2.frag");

const char * ff = fs;
const char * ff2 = fs2;
const char * vv = vs;

glShaderSource(v, 1, &vv,NULL);
glShaderSource(f, 1, &ff,NULL);
glShaderSource(f2, 1, &ff2,NULL);

free(vs);free(fs);

glCompileShader(v);
glCompileShader(f);
glCompileShader(f2);

p = glCreateProgram();
glAttachShader(p,f);
glAttachShader(p,f2);
glAttachShader(p,v);

glLinkProgram(p);
glUseProgram(p);
}

void defineVertices(){
// horizontal segment of the cross
crossVertices[0]  =  10.0f;
crossVertices[1]  =   0.0f;
crossVertices[2]  =   0.0f;
crossVertices[3]  = -10.0f;
crossVertices[4]  =   0.0f;
crossVertices[5]  =   0.0f;
// vertical segment of the cross
crossVertices[6]  =   0.0f;
crossVertices[7]  =  10.0f;
crossVertices[8]  =   0.0f;
crossVertices[9]  =   0.0f;
crossVertices[10] = -10.0f;
crossVertices[11] =   0.0f;
}


int main(int argc, char **argv) {
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
glutInitWindowPosition(100,100);
glutInitWindowSize(320,320);
glutCreateWindow("MM 2004-05");

glutDisplayFunc(renderScene);
glutIdleFunc(renderScene);
glutReshapeFunc(changeSize);
glutKeyboardFunc(processNormalKeys);

glEnable(GL_DEPTH_TEST);
glClearColor(1.0,1.0,1.0,1.0);
glEnable(GL_CULL_FACE);
    /*
glewInit();
if (glewIsSupported("GL_VERSION_2_0"))
    printf("Ready for OpenGL 2.0\n");
else {
    printf("OpenGL 2.0 not supported\n");
    exit(1);
}
*/
setShaders();
defineVertices();

glutMainLoop();

// just for compatibility purposes
return 0;
}

And here is the vertex shader, which is from the Lighthouse3D example:

varying vec3 normal, lightDir;

void main()
{   
lightDir = normalize(vec3(gl_LightSource[0].position));
normal = normalize(gl_NormalMatrix * gl_Normal);

gl_Position = ftransform();
}

Any ideas on what's likely to be causing this?


In your vertex shader you just pass the vertex positions through to the rasterizer, without transforming them by a modelview or a projection matrix. While this is perfectly valid, you still have to pay attention to the range your coordinates end up in.

After the vertex processing stage your coordinates have to lie in the [-1,1] cube; everything outside it is clipped away, and this cube is then mapped by the viewport transform into screen space, i.e. [0,w]x[0,h]x[0,1]. Your coordinates range from -10 to 10, so your line is actually 10x the screen size. If you mean those values as pixels, you have to scale your x,y values from [-w/2,w/2]x[-h/2,h/2] down to [-1,1] in the vertex shader.
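To put numbers on it (taking the 320x320 GLUT window from your desktop test as an example): a pixel offset of x = 10 should end up at 2*10/320 = 0.0625 in the [-1,1] range, whereas the raw value 10 sits ten units outside the clip cube, so the part of the line that survives clipping spans the entire screen.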

It's the same problem in the desktop GL project you provided: you call ftransform in the shader, but your projection matrix is a simple perspective matrix that doesn't scale your coordinates down that much. So in this project, substitute the call to gluPerspective with glOrtho(-0.5*w, 0.5*w, -0.5*h, 0.5*h, -1.0, 1.0) if you want the line coordinates to be pixels.
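For the desktop test, a minimal sketch of what that substitution in changeSize could look like (this assumes the cross coordinates are meant as pixels; note that the gluLookAt(0,0,5,...) in renderScene places the geometry at eye-space z = -5, so the far value is widened here so the line isn't clipped away):

void changeSize(int w, int h) {

// prevent a divide by zero when the window is too short
if (h == 0)
    h = 1;

// reset the projection and set the viewport to the entire window
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glViewport(0, 0, w, h);

// pixel-sized orthographic projection instead of gluPerspective;
// far plane at 10 keeps the geometry at eye-space z = -5 inside the volume
glOrtho(-0.5 * w, 0.5 * w, -0.5 * h, 0.5 * h, -1.0, 10.0);

glMatrixMode(GL_MODELVIEW);
}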

Also keep in mind that the y-axis in OpenGL goes from bottom to top by default. If you want this to behave differently (as many image-processing frameworks do), you have to negate the y-coordinate in the vertex shader, too (and likewise swap the 3rd and 4th arguments of the glOrtho call in the other project). But be aware that this reverses the winding orientation of any triangles you render, if any.

So for example in your vertex shader just do something like:

uniform vec2 screenSize;           //contains the screen size in pixels
attribute vec2 vertexPosition;     //why take 4 if you only need 2?

void main()
{
    gl_Position = vec4(2.0*vertexPosition/screenSize, 0.0, 1.0);
}

This gives you a coordinate system in pixels, with the origin in the center and the y-axis going from bottom to top. If this is not what you want, feel free to play around with the transformation (it can also be optimized a bit by precomputing 2.0/screenSize on the CPU). But always keep in mind that after the vertex shader, clip space is the [-1,1] cube, and this is then transformed into actual screen space in pixels by the viewport transform (whatever values you gave to glViewport).
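On the application side, a minimal sketch of feeding that shader; the handle names (vertexHandle, screenSizeHandle) and the screenWidth/screenHeight variables are hypothetical, but the lookup and upload calls are the standard GL 2.0 API:

// once, after linking diagonalShaderProgramID
GLint vertexHandle     = glGetAttribLocation(diagonalShaderProgramID, "vertexPosition");
GLint screenSizeHandle = glGetUniformLocation(diagonalShaderProgramID, "screenSize");

// per frame, with screenWidth/screenHeight holding the viewport size in pixels
glUseProgram(diagonalShaderProgramID);
glUniform2f(screenSizeHandle, (GLfloat) screenWidth, (GLfloat) screenHeight);

// two 20-pixel diagonals centered on the screen, 2 floats per vertex now
GLfloat diagVertices[8] = {
    -10.0f, -10.0f,   10.0f,  10.0f,
    -10.0f,  10.0f,   10.0f, -10.0f
};

glVertexAttribPointer(vertexHandle, 2, GL_FLOAT, GL_FALSE, 0, diagVertices);
glEnableVertexAttribArray(vertexHandle);
glLineWidth(3.0f);
glDrawArrays(GL_LINES, 0, 4);
glDisableVertexAttribArray(vertexHandle);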

