OpenGL white textures on other PC


I've made this small game using SDL + OpenGL. The game runs fine on my PC, but on a friend's PC he just gets white boxes and a blank screen.

I thought it might be an issue with my textures having non-power-of-two dimensions. I cannot change the texture dimensions, so after some searching I found that GL_ARB_texture_non_power_of_two is supposed to allow NPOT textures. But to my surprise, now the white boxes appear on my PC too, and they still aren't gone on my friend's. I'm unable to understand what the problem is. Any help would be greatly appreciated.

Code:

   numColors = images[i]->format->BytesPerPixel;
   if ( numColors == 4 )
   {
       if (images[i]->format->Rmask == 0x000000FF)
           textureFormat = GL_RGBA;
       else
           textureFormat = GL_BGRA;
   }
    else if ( numColors == 3 )
   {
       if (images[i]->format->Rmask == 0x000000FF)
           textureFormat = GL_RGBA;
       else
           textureFormat = GL_BGRA;
   }
   glPixelStorei(GL_UNPACK_ALIGNMENT,4);
   glGenTextures( 1, &textures[i] );
   glBindTexture( GL_ARB_texture_non_power_of_two, textures[i] );
   glTexParameteri(GL_ARB_texture_non_power_of_two,GL_TEXTURE_MIN_FILTER, GL_LINEAR);
   glTexParameteri(GL_ARB_texture_non_power_of_two,GL_TEXTURE_MAG_FILTER, GL_LINEAR);
   glTexImage2D(GL_ARB_texture_non_power_of_two, 0, numColors, images[i]->w, images[i]->h, 0, textureFormat, GL_UNSIGNED_BYTE, images[i]->pixels);


Your friend's video card may not support non-power-of-two textures, so the output is still wrong despite the use of the GL_ARB_texture_non_power_of_two extension.

If your game relies on specific OpenGL extensions to display correctly, you should check for those extensions at start-up and tell the user the game can't run if the hardware lacks the required features.
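
For example, in a legacy GL context (which SDL 1.x creates) you can search the extension string for the exact token. This is a minimal sketch; hasExtension and checkRequiredExtensions are made-up helper names, not part of SDL or OpenGL:

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Whole-token search of the extension string; only valid in legacy
   (pre-3.0) contexts where glGetString(GL_EXTENSIONS) still works. */
static int hasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);
    const char *pos;

    if (ext == NULL)
        return 0;
    for (pos = strstr(ext, name); pos != NULL; pos = strstr(pos + 1, name))
    {
        /* Guard against substring hits, e.g. a longer extension name
           that merely contains the one we're looking for. */
        int startOk = (pos == ext) || (pos[-1] == ' ');
        int endOk   = (pos[len] == ' ') || (pos[len] == '\0');
        if (startOk && endOk)
            return 1;
    }
    return 0;
}

/* Call once after the GL context exists, i.e. after SDL_SetVideoMode(). */
static int checkRequiredExtensions(void)
{
    if (!hasExtension("GL_ARB_texture_non_power_of_two"))
    {
        fprintf(stderr, "This game requires GL_ARB_texture_non_power_of_two.\n");
        return 0; /* caller should fall back to padded power-of-two textures or quit */
    }
    return 1;
}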


Don't use GL_ARB_texture_non_power_of_two in place of GL_TEXTURE_2D. Just check whether the extension is supported, then upload the NPOT textures with a normal glTexImage2D(GL_TEXTURE_2D, ...) call, as sketched below.

Call glGetError() to see if you're getting errors. You should be, since GL_ARB_texture_non_power_of_two is not a valid value the way you're using it.

The ARB_texture_non_power_of_two extension also applies to 1D and 3D textures.
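
Putting those points together, the upload from the question could look like this once the extension check has passed. This is a sketch reusing the question's variables (textures, images, numColors, textureFormat); note that the 3-bytes-per-pixel branch in the question should also select GL_RGB/GL_BGR rather than GL_RGBA/GL_BGRA:

/* GL_TEXTURE_2D stays the target; the NPOT extension only lifts the
   power-of-two size restriction, it does not define a new target. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); /* safe for tightly packed 3-byte rows */
glGenTextures(1, &textures[i]);
glBindTexture(GL_TEXTURE_2D, textures[i]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0,
             numColors == 4 ? GL_RGBA8 : GL_RGB8, /* sized formats instead of raw numColors */
             images[i]->w, images[i]->h, 0,
             textureFormat, GL_UNSIGNED_BYTE, images[i]->pixels);

/* The glGetError() check suggested above: */
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        fprintf(stderr, "texture upload failed: 0x%04X\n", err);
}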


In addition to ARB_texture_non_power_of_two there is also another extension, GL_ARB_texture_rectangle; it's quite old and has been supported by GPUs for ages. Using it, your code would look like this:

glPixelStorei(GL_UNPACK_ALIGNMENT,4);
glGenTextures( 1, &textures[i] );
glBindTexture( GL_TEXTURE_RECTANGLE_ARB, textures[i] );
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, numColors, images[i]->w, images[i]->h, 0, textureFormat, GL_UNSIGNED_BYTE, images[i]->pixels);
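
One caveat: rectangle textures are addressed with unnormalized texture coordinates running from (0,0) to (w,h) rather than (0,0) to (1,1), and they support neither mipmaps nor the GL_REPEAT wrap mode, so the drawing code has to be adapted accordingly.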

BTW: GL_ARB_texture_non_power_of_two is an extension name, not a valid token to be used as a texture target; OpenGL should have issued a GL_INVALID_ENUM error.
