
SDL_image/C++ OpenGL Program: IMG_Load() produces fuzzy images


I'm trying to load an image file and use it as a texture for a cube. I'm using SDL_image to do that.

[image: the original test image (Lena)]

I used this image because I've found it in various file formats (tga, tif, jpg, png, bmp)

The code:

SDL_Surface * texture;

//load an image to an SDL surface (i.e. a buffer)

texture = IMG_Load("/Users/Foo/Code/xcode/test/lena.bmp");

if(texture == NULL){
    printf("bad image\n");
    exit(1);
}

//create an OpenGL texture object 
glGenTextures(1, &textureObjOpenGLlogo);

//select the texture object you need
glBindTexture(GL_TEXTURE_2D, textureObjOpenGLlogo);

//define the parameters of that texture object
//how the texture should wrap in s direction
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
//how the texture should wrap in t direction
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
//how the texture lookup should be interpolated when the face is smaller than the texture
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
//how the texture lookup should be interpolated when the face is bigger than the texture
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

//send the texture image to the graphic card
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->w, texture->h, 0, GL_RGB, GL_UNSIGNED_BYTE, texture->pixels);

//clean the SDL surface
SDL_FreeSurface(texture);

The code compiles without errors or warnings!

I've tried all the file formats but this always produces that ugly result:

[image: the distorted, fuzzy rendered result]

I'm using: SDL_image 1.2.9 & SDL 1.2.14 with Xcode 3.2 under 10.6.2

Does anyone know how to fix this?


The image is distorted because it's not in the RGBA format you've specified. Check texture->format to find out what format it's actually in, and select the appropriate GL_ constant for that format. (Or convert it yourself to a format of your choice.)
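
For example, here is a minimal sketch of that check, assuming an SDL 1.2 surface (texture->format) and an OpenGL header that defines GL_BGR/GL_BGRA; the variable names are illustrative, not from the asker's code:

//pick a GL pixel format that matches the surface instead of assuming RGBA
GLenum pixelFormat;
switch (texture->format->BytesPerPixel) {
    case 4:
        //32-bit surface: use the red mask to tell RGBA byte order from BGRA
        pixelFormat = (texture->format->Rmask == 0x000000ff) ? GL_RGBA : GL_BGRA;
        break;
    case 3:
        //24-bit surface: RGB or BGR
        pixelFormat = (texture->format->Rmask == 0x000000ff) ? GL_RGB : GL_BGR;
        break;
    default:
        printf("unsupported pixel format\n");
        exit(1);
}

GLint internalFormat = (texture->format->BytesPerPixel == 4) ? GL_RGBA : GL_RGB;
glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, texture->w, texture->h, 0,
             pixelFormat, GL_UNSIGNED_BYTE, texture->pixels);

If the masks don't match any of these, converting the surface yourself (as suggested above) is the safer route.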


I think greyfade has the right answer, but another thing to be aware of is the need to lock surfaces. It's probably not an issue here, since you're working with an in-memory surface, but normally you need to lock a surface with SDL_LockSurface() before accessing its pixel data. For example:

bool lock = SDL_MUSTLOCK(texture);
if(lock)
    SDL_LockSurface(texture);  // returns 0 on success; a real program should check this
// access pixel data, e.g. call glTexImage2D
if(lock)
    SDL_UnlockSurface(texture);


If you have an alpha channel, every pixel is 4 unsigned bytes; if you don't, it's 3 unsigned bytes. This image has no transparency, and when I try to save it, it's a .jpg.

change

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->w, texture->h, 0, GL_RGB, GL_UNSIGNED_BYTE, texture->pixels);

to

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture->w, texture->h, 0, GL_RGB, GL_UNSIGNED_BYTE, texture->pixels);

That should fix it.

For a .png with an alpha channel use

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->w, texture->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, texture->pixels);
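
To cover both cases in one place, here is a small sketch that picks the format from the surface itself (again assuming the SDL 1.2 surface from the question; BGR-ordered surfaces are ignored here, see the first answer):

//choose GL_RGB or GL_RGBA based on how many bytes SDL reports per pixel
GLenum glFormat = (texture->format->BytesPerPixel == 4) ? GL_RGBA : GL_RGB;
glTexImage2D(GL_TEXTURE_2D, 0, glFormat, texture->w, texture->h, 0,
             glFormat, GL_UNSIGNED_BYTE, texture->pixels);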
