I have many OpenGL shaders. We try to run on as much different hardware as possible to evaluate the portability of our product. One of our customers recently ran into rendering issues: it seems the target machine only provides shader model 2.0, while all our development/build/test machines (even the oldest ones) report version 4.0. Everything else (OpenGL version, GLSL version, ...) appears identical.
I didn't find a way to downgrade the shader model version, since it is provided automatically by the graphics card driver.
Is there a way to manually install or select the OpenGL/GLSL/shader model version in use, for development/test purposes?
NOTE: the main targets are Windows XP SP2/7 (32 & 64-bit), with both ATI and NVIDIA cards.
OpenGL does not have the concept of "shader models"; that's a Direct3D thing. It only has versions of GLSL: 1.10, 1.20, etc.
Every OpenGL version matches a specific GLSL version. GL 2.1 supports GLSL 1.20. GL 3.0 supports GLSL 1.30. From GL 3.3 onward, they stopped fooling around and just used the same version number, so GL 3.3 supports GLSL 3.30. This leaves an odd version-number gap between GLSL 1.50 (which maps to GL 3.2) and GLSL 3.30.
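For example, a shader declares which GLSL version it targets with a `#version` directive. A minimal sketch (assuming an extension loader such as GLEW has populated the GL 2.0 entry points) of a shader pinned to GLSL 1.20, so it stays compilable on an implementation that only reaches GL 2.1:

```c
#include <GL/glew.h>   /* assumption: GLEW provides glCreateShader etc. */

/* "#version 120" <-> OpenGL 2.1; raising this number is what would
   lock the shader out of older implementations. */
static const char *vs_source =
    "#version 120\n"
    "attribute vec4 position;\n"
    "void main() {\n"
    "    gl_Position = position;\n"
    "}\n";

GLuint compile_vs(void)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vs_source, NULL); /* NULL: string is NUL-terminated */
    glCompileShader(vs);
    return vs; /* check GL_COMPILE_STATUS in real code */
}
```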
Technically, OpenGL implementations are allowed to refuse to compile shader versions older than the ones matching the current GL version. As a practical matter, however, you can pretty much shove any GLSL shader into any OpenGL implementation, as long as the shader's version is less than or equal to the version that the OpenGL implementation supports (this hasn't been tested on MacOSX Lion's implementation of GL 3.2 core).
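A quick way to see what a given machine actually supports is to query the version strings at runtime (a sketch assuming a current GL context; on Windows the stock headers stop at GL 1.1, hence the fallback `#define`, whose value comes from the GL 2.0 spec):

```c
#include <stdio.h>
#ifdef _WIN32
#include <windows.h>   /* must precede <GL/gl.h> on Windows */
#endif
#include <GL/gl.h>

#ifndef GL_SHADING_LANGUAGE_VERSION
#define GL_SHADING_LANGUAGE_VERSION 0x8B8C
#endif

void print_gl_versions(void)
{
    /* e.g. "2.1.0 ..." on the customer's machine, "4.0 ..." on yours */
    printf("GL_VERSION:   %s\n", glGetString(GL_VERSION));
    printf("GLSL version: %s\n", glGetString(GL_SHADING_LANGUAGE_VERSION));
}
```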
There is one exception: core contexts. If you try to feed a shader that uses functionality removed from core through a core OpenGL context, it will complain.
There is no way to force OpenGL to provide you with a particular OpenGL version. You can ask for one with wglCreateContextAttribsARB/glXCreateContextAttribsARB, but the implementation is allowed to give you any version higher than the one you ask for, so long as that version is backwards compatible with what you asked for.
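For the Windows targets in the question, asking looks roughly like this (a sketch, not a full setup: it assumes a dummy context is already current so wglGetProcAddress can resolve the extension function; create_gl21_context and the token defines are spelled out here since they live in the WGL_ARB_create_context extension, not the stock headers):

```c
#include <windows.h>
#include <GL/gl.h>

/* Tokens from the WGL_ARB_create_context extension spec. */
#define WGL_CONTEXT_MAJOR_VERSION_ARB 0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB 0x2092

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);

HGLRC create_gl21_context(HDC dc)
{
    /* Only reachable while some (dummy) context is current. */
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return NULL; /* driver predates the extension */

    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 2,
        WGL_CONTEXT_MINOR_VERSION_ARB, 1,
        0
    };
    /* The driver may still hand back, say, a 4.0 compatibility context:
       anything backwards compatible with 2.1 is a legal answer. */
    return wglCreateContextAttribsARB(dc, NULL, attribs);
}
```

So even with this request, you cannot use a newer machine to faithfully emulate the customer's 2.0-class hardware; you only get a lower bound on the version, not an upper one.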