What is the PIXELFORMATDESCRIPTOR parameter in SetPixelFormat() used for?

Usually when setting up OpenGL contexts, I've simply filled out a PIXELFORMATDESCRIPTOR structure with the necessary information and called ChoosePixelFormat(), followed by a call to SetPixelFormat() with the matching pixelformat index returned by ChoosePixelFormat(). Then I've simply passed in the same initial descriptor again, without giving much thought to why.
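
For reference, that classic path looks roughly like this (a minimal sketch with error checking omitted; hdc is assumed to be a valid window device context):

/* Describe the desired format, let ChoosePixelFormat() find the
   closest supported match, then set it on the device context. */
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize        = sizeof(pfd);
pfd.nVersion     = 1;
pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType   = PFD_TYPE_RGBA;
pfd.cColorBits   = 32;
pfd.cDepthBits   = 24;
pfd.cStencilBits = 8;

int pixelFormat = ChoosePixelFormat(hdc, &pfd);
SetPixelFormat(hdc, pixelFormat, &pfd);  /* the same pfd is passed again here */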

But now I use wglChoosePixelFormatARB() instead of ChoosePixelFormat(), because I need some extended traits like sRGB and multisampling. It takes an attribute list of integers, just like Xlib/GLX on Linux, not a PIXELFORMATDESCRIPTOR structure. So, do I really have to fill in a descriptor for SetPixelFormat() to use? What does SetPixelFormat() use the descriptor for when it already has the pixelformat index? Why do I have to specify the same pixelformat attributes in two different places? And which one takes precedence: the attribute list passed to wglChoosePixelFormatARB(), or the PIXELFORMATDESCRIPTOR attributes passed to SetPixelFormat()?

Here are the function prototypes, to make the question clearer:

/* Finds a best match based on a PIXELFORMATDESCRIPTOR,
and returns the pixelformat index */
int ChoosePixelFormat(HDC hdc, const PIXELFORMATDESCRIPTOR *ppfd);

/* Finds a best match based on an attribute list of integers and floats,
and returns a list of indices of matches, with the best matches at the head.
Also supports extended pixelformat traits like sRGB color space,
floating-point framebuffers and multisampling. */
BOOL wglChoosePixelFormatARB(HDC hdc, const int *piAttribIList,
    const FLOAT *pfAttribFList, UINT nMaxFormats, int *piFormats,
    UINT *nNumFormats
);

/* Sets the pixelformat based on the pixelformat index */
BOOL SetPixelFormat(HDC hdc, int iPixelFormat, const PIXELFORMATDESCRIPTOR *ppfd);
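
For comparison, a sketch of what the integer attribute list for wglChoosePixelFormatARB() might look like with the traits mentioned above (the WGL_* constants come from wglext.h; the sRGB and multisample attributes require the WGL_ARB_framebuffer_sRGB and WGL_ARB_multisample extensions):

/* Zero-terminated attribute/value pairs requesting a double-buffered,
   32-bit RGBA, sRGB-capable format with 4x multisampling. */
const int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB,           GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB,           GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,            GL_TRUE,
    WGL_PIXEL_TYPE_ARB,               WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,               32,
    WGL_DEPTH_BITS_ARB,               24,
    WGL_STENCIL_BITS_ARB,             8,
    WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB, GL_TRUE,
    WGL_SAMPLE_BUFFERS_ARB,           1,
    WGL_SAMPLES_ARB,                  4,
    0
};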

EDIT: MSDN says this about the SetPixelFormat() parameter:

Pointer to a PIXELFORMATDESCRIPTOR structure that contains the logical pixel format specification. The system's metafile component uses this structure to record the logical pixel format specification. The structure has no other effect upon the behavior of the SetPixelFormat function.

But I have no idea what this means, or how it relates to my question(s).


I believe you won't have to use PIXELFORMATDESCRIPTOR and ChoosePixelFormat at all; just use wglChoosePixelFormatARB and pass the returned int as the pixelformat index. Those functions are only used to query Windows for a matching pixelformat; SetPixelFormat has no idea which function you used to obtain the desired pixelformat.
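
For example, a sketch of querying the index that way (assuming the wglChoosePixelFormatARB function pointer has already been loaded via wglGetProcAddress, and attribs is a zero-terminated attribute list like the one shown earlier):

/* Ask for at most one matching format; the best match is returned first. */
int pixelFormat = 0;
UINT numFormats = 0;
if (wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &pixelFormat, &numFormats)
    && numFormats > 0)
{
    /* pixelFormat is now the index to hand to SetPixelFormat(). */
}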

Conclusion from the discussion that followed:

I have looked into Wine's OpenGL implementation, and it completely ignores that parameter (internal_SetPixelFormat). But who knows what Microsoft does with it; the only safe way I see is to find the pixelformat with wglChoosePixelFormatARB(), and then use DescribePixelFormat() to fill in the structure which you pass back to SetPixelFormat().
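
Put together, that safe sequence would look roughly like this (a sketch, with pixelFormat obtained from wglChoosePixelFormatARB() as above):

/* Ask Windows to describe the chosen format, so a matching
   PIXELFORMATDESCRIPTOR can be handed back to SetPixelFormat(). */
PIXELFORMATDESCRIPTOR pfd;
DescribePixelFormat(hdc, pixelFormat, sizeof(pfd), &pfd);
SetPixelFormat(hdc, pixelFormat, &pfd);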
