GLubyte vs GLshort for Indices

Looking through the documentation for vertex arrays in OpenGL, two of the most common types I found used for indices were GLubyte (GL_UNSIGNED_BYTE) and GLshort (GL_SHORT). I was wondering whether there is any actual difference between using the two for indices.

Thanks, Dragonwrenn


GL_UNSIGNED_BYTE is OK for models that have at most 256 vertices - which really isn't many.

GL_UNSIGNED_SHORT, at 2 bytes per index, limits you to 65536 vertices - still not very many.

I'd say the most common choice is GL_UNSIGNED_INT, since even 2 bytes may not be enough for mid-poly and high-poly models.
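To make those limits concrete, here is a hypothetical helper (just an illustration, not part of the answer above) that picks the smallest index type able to address a given vertex count:

#include <stddef.h>
#include <GL/gl.h>

/* Hypothetical helper: return the smallest index type that can address
 * `vertex_count` vertices.  The thresholds follow the limits quoted above:
 * 256 for GL_UNSIGNED_BYTE and 65536 for GL_UNSIGNED_SHORT. */
static GLenum index_type_for(size_t vertex_count)
{
    if (vertex_count <= 256)
        return GL_UNSIGNED_BYTE;   /* each index fits in one byte */
    if (vertex_count <= 65536)
        return GL_UNSIGNED_SHORT;  /* each index fits in two bytes */
    return GL_UNSIGNED_INT;        /* anything larger needs four bytes */
}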


GL_UNSIGNED_BYTE is 1 byte, GL_SHORT is 2 bytes. The only advantage of bytes is that they are smaller, so they take less memory to store and less time to transfer to graphics memory (assuming vertex arrays or VBOs).

Beware that not every type is available for every use: you can't have GL_UNSIGNED_BYTE vertices, for example.
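For context, a minimal sketch of drawing from a 16-bit index buffer might look like the following. This is not from the answer above; it assumes an existing GL context with vertex attributes already set up, and a loader such as GLEW for the GL 1.5 buffer-object functions. The index data is a placeholder.

#include <GL/glew.h>  /* assumption: GLEW (or a similar loader) provides glGenBuffers etc. */

static void draw_indexed_quad(void)
{
    /* Placeholder index data: two triangles forming a quad. */
    static const GLushort indices[] = { 0, 1, 2, 2, 3, 0 };

    GLuint ibo;
    glGenBuffers(1, &ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);

    /* The type argument must match what is stored in the buffer:
     * GL_UNSIGNED_SHORT costs 2 bytes per index instead of 4 for GLuint. */
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, (const void *)0);
}

The same call with GL_UNSIGNED_INT and GLuint data would work identically, just with twice the index-buffer size.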


GPUs can only handle 16- or 32-bit indices, so there is additional overhead when using GL_UNSIGNED_BYTE and no memory saving.
