
glGetIntegerv returning garbage value

#include<iostream>
#include"Glew\glew.h"
#include"freeGlut\freeglut.h"
using namespace std;

int main(int argc, char* argv[])
{
    GLint ExtensionCount;
    glGetIntegerv(GL_NUM_EXTENSIONS, &ExtensionCount);
    cout << ExtensionCount << endl;

    return 0;
}
1. The output of this program is -858993460. Why? It should return the number of supported extensions.

2. If I remove the freeglut.h header file, the program doesn't build and throws an error message:


error LNK2019: unresolved external symbol __imp__glGetIntegerv@8 referenced in function _main

But glGetIntegerv is declared in glew.h. Why would removing freeglut.h cause an unresolved external error?

EDIT: I have OpenGL 3.3 support. Using a Radeon 4670 with Catalyst 11.6.


@mario & @Banthar yes, thanks. I have to create a context first to use any OpenGL functionality (yes, even for OpenGL 1.1, which comes with Windows by default).

glGetIntegerv is not returning garbage. It either returns a good value or does not touch the pointed-to address at all. The reason you see garbage is that the variable is uninitialized (-858993460 is 0xCCCCCCCC, the pattern MSVC debug builds use to fill uninitialized stack memory). This may seem like a pedantic comment, but it is important to know that glGetIntegerv does not touch the variable if it fails. Thanks @Damon
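A minimal sketch of that point, assuming the same GLEW/freeglut headers as in the question: initializing the variable to a sentinel makes the failed query visible instead of printing stack garbage.

#include <iostream>
#include "Glew\glew.h"
#include "freeGlut\freeglut.h"
using namespace std;

int main(int argc, char* argv[])
{
    // Sentinel value: -1 is never a valid extension count.
    GLint ExtensionCount = -1;

    // No context has been created yet, so this query fails and leaves
    // ExtensionCount untouched.
    glGetIntegerv(GL_NUM_EXTENSIONS, &ExtensionCount);

    if (ExtensionCount == -1)
        cout << "query failed; the variable was left untouched" << endl;
    else
        cout << ExtensionCount << endl;

    return 0;
}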

This bare-bones version works fine.

#include <iostream>
#include "Glew\glew.h"
#include "freeGlut\freeglut.h"
using namespace std;

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);

    // Request an OpenGL 3.3 core-profile, forward-compatible context.
    glutInitContextVersion(3, 3);
    glutInitContextFlags(GLUT_FORWARD_COMPATIBLE);
    glutInitContextProfile(GLUT_CORE_PROFILE);

    // Creating the window makes the context current, so GL queries work from here on.
    glutCreateWindow("Test");

    GLint ExtensionCount;
    glGetIntegerv(GL_NUM_EXTENSIONS, &ExtensionCount);
    cout << ExtensionCount << endl;

    return 0;
}


Are you sure you have OpenGL 3.0? AFAIK, GL_NUM_EXTENSIONS was added in OpenGL 3.0.


I guess your rendering context is using an OpenGL version prior to 3.0 (from what I've read, GL_NUM_EXTENSIONS was introduced in OpenGL 3.0; just because your card supports it doesn't mean you're actually using it). You could retrieve the GL_EXTENSIONS string and split/count the elements yourself, but I don't think that's available everywhere either (2.0+?).

What are you trying to do (besides returning the number of extensions)?
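A minimal sketch of that GL_EXTENSIONS fallback, assuming the same GLEW/freeglut setup as in the question and a hypothetical CountExtensions helper: the GL_EXTENSIONS string is a space-separated list of names, so counting its whitespace-separated tokens gives the extension count on contexts older than 3.0.

#include <iostream>
#include <sstream>
#include <string>
#include "Glew\glew.h"
#include "freeGlut\freeglut.h"
using namespace std;

// Hypothetical helper: count the space-separated names in the GL_EXTENSIONS string.
int CountExtensions()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!ext)
        return 0;                      // no current context, or the query failed

    istringstream stream(ext);
    string name;
    int count = 0;
    while (stream >> name)             // extension names are separated by spaces
        ++count;
    return count;
}

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutCreateWindow("Test");          // default (legacy/compatibility) context
    cout << CountExtensions() << endl;
    return 0;
}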


Maybe your library headers expect you to include <GL/gl.h>


In my Windows SDK (7.1) the included GL/GL.h defines the symbol GL_VERSION_1_1. I suspect this is the version that really matters when using glGetIntegerv with arguments such as GL_MAJOR_VERSION, GL_MINOR_VERSION or GL_NUM_EXTENSIONS.

Indeed, none of these is defined in GL/GL.h, while, for instance, GL_VERSION and GL_EXTENSIONS are. When including GL/glew.h, however, all of these constants are available.

With respect to GL_VERSION_1_1, the three constants GL_MAJOR_VERSION, GL_MINOR_VERSION and GL_NUM_EXTENSIONS are not valid enumeration values, and if you call glGetError after trying to use one of them with glGetIntegerv, you get error 0x500 (1280 in decimal), which is GL_INVALID_ENUM.
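A minimal sketch of that check, assuming the same GLEW/freeglut setup as in the question: after querying GL_NUM_EXTENSIONS, glGetError tells you whether the enum was accepted by the context you actually got (on a pre-3.0 context the query fails with GL_INVALID_ENUM and the variable is left untouched).

#include <iostream>
#include "Glew\glew.h"
#include "freeGlut\freeglut.h"
using namespace std;

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutCreateWindow("Test");          // default context; the version depends on the driver

    GLint ExtensionCount = -1;         // sentinel
    glGetIntegerv(GL_NUM_EXTENSIONS, &ExtensionCount);

    GLenum err = glGetError();
    if (err == GL_INVALID_ENUM)        // 0x500: GL_NUM_EXTENSIONS unknown to this context
        cout << "GL_NUM_EXTENSIONS is not a valid enum for this context" << endl;
    else
        cout << ExtensionCount << " extensions (error 0x" << hex << err << ")" << endl;

    return 0;
}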
