API to get the graphics or video memory

I want to get the adapter RAM or graphics RAM that you can see in Display settings or Device Manager, using an API. I am working in a C++ application.

I have tried searching on the net, and from my research I have come to the conclusion that we can get the graphics memory info from:

1. The DirectX SDK structure DXGI_ADAPTER_DESC (see the sketch below). But what if I don't want to use the DirectX API?

2. Win32_VideoController: but this class does not always give you the AdapterRAM info if the video controller is offline. I have checked this on Vista.
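For reference, a minimal sketch of option 1 (the DXGI route) looks roughly like this, assuming dxgi.lib is linked and the OS is Vista or later:

    #include <dxgi.h>
    #include <cstdio>

    int main()
    {
        // Create a DXGI factory and ask the first adapter for its description.
        IDXGIFactory* pFactory = NULL;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&pFactory)))
            return 1;

        IDXGIAdapter* pAdapter = NULL;
        if (pFactory->EnumAdapters(0, &pAdapter) != DXGI_ERROR_NOT_FOUND)
        {
            DXGI_ADAPTER_DESC desc = {};
            pAdapter->GetDesc(&desc);

            // DedicatedVideoMemory is the adapter RAM in bytes.
            printf("Dedicated video memory: %llu MB\n",
                   (unsigned long long)desc.DedicatedVideoMemory / (1024 * 1024));
            pAdapter->Release();
        }

        pFactory->Release();
        return 0;
    }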

Is there any other way to get the graphics RAM?


There is NO way to directly access graphics RAM on Windows; Windows prevents you from doing this, as it maintains control over what is displayed.

You CAN, however, create a DirectX device. Get the back buffer surface and then lock it. After locking you can fill it with whatever you want and then unlock and call present. This is slow, though, as you have to copy the video memory back across the bus into main memory. Some cards also use "swizzled" formats that it has to un-swizzle as it copies. This adds further time to doing it and some cards will even ban you from doing it.
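As a rough sketch of what that looks like under Direct3D 9 (assuming pDevice is an existing IDirect3DDevice9* whose swap chain was created with D3DPRESENTFLAG_LOCKABLE_BACKBUFFER, otherwise the lock will fail):

    // Lock the back buffer, fill it, unlock, present.
    IDirect3DSurface9* pBackBuffer = NULL;
    if (SUCCEEDED(pDevice->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &pBackBuffer)))
    {
        D3DLOCKED_RECT lr;
        if (SUCCEEDED(pBackBuffer->LockRect(&lr, NULL, 0)))
        {
            D3DSURFACE_DESC desc;
            pBackBuffer->GetDesc(&desc);

            // Fill with a solid colour; assumes a 32-bit X8R8G8B8 back buffer.
            for (UINT y = 0; y < desc.Height; ++y)
            {
                DWORD* row = (DWORD*)((BYTE*)lr.pBits + y * lr.Pitch);
                for (UINT x = 0; x < desc.Width; ++x)
                    row[x] = 0x00FF0000; // red
            }
            pBackBuffer->UnlockRect();
        }
        pBackBuffer->Release();
    }
    pDevice->Present(NULL, NULL, NULL, NULL);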

In general you want to avoid directly accessing the video card and let Windows/DirectX do the drawing for you. Under D3D10/11 I'm pretty sure you can do it via an IDXGIOutput, though. It really is something to try to avoid...

You can write to a linear array via standard Win32 (this example assumes C), but it's quite involved.

First you need the linear array.

    unsigned int* pBits = malloc( width * height * sizeof(unsigned int) );  /* one 32-bit pixel per element */

Then you need to create a bitmap and select it to the DC.

    HBITMAP hBitmap = CreateBitmap( width, height, 1, 32, NULL );
    SelectObject( hDC, (HGDIOBJ)hBitmap );  /* hDC should be a memory DC, e.g. from CreateCompatibleDC */

You can then fill the pBits array as you please. When you've finished you can then set the bitmap's bits.

    SetBitmapBits( hBitmap, width * height * 4, (void*)pBits );

When you've finished using your bitmap, don't forget to delete it (using DeleteObject) AND free your linear array!
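Putting the pieces together, a rough end-to-end sketch (assuming hWnd is the window you want to paint into) might look like this:

    int width = 256, height = 256;
    unsigned int* pBits = (unsigned int*)malloc( width * height * sizeof(unsigned int) );

    HDC hWndDC = GetDC( hWnd );
    HDC hMemDC = CreateCompatibleDC( hWndDC );   /* memory DC to hold the bitmap */
    HBITMAP hBitmap = CreateBitmap( width, height, 1, 32, NULL );
    HGDIOBJ hOld = SelectObject( hMemDC, hBitmap );

    for ( int i = 0; i < width * height; ++i )
        pBits[i] = 0x00FF0000;                   /* fill with red */

    SetBitmapBits( hBitmap, width * height * 4, pBits );
    BitBlt( hWndDC, 0, 0, width, height, hMemDC, 0, 0, SRCCOPY );

    /* clean up: restore the old bitmap, delete GDI objects, free the array */
    SelectObject( hMemDC, hOld );
    DeleteObject( hBitmap );
    DeleteDC( hMemDC );
    ReleaseDC( hWnd, hWndDC );
    free( pBits );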

Edit: There is only one way to reliably get the video RAM, and that is to go through the DxDiag interfaces. Have a look at IDxDiagProvider and IDxDiagContainer in the DirectX SDK.
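A bare-bones sketch of that route (untested; the container name "DxDiag_DisplayDevices" and property name "szDisplayMemoryEnglish" are what dxdiag itself reports and should be treated as assumptions):

    // Needs dxdiag.h from the DirectX SDK; link dxguid.lib and ole32.lib.
    #include <windows.h>
    #include <dxdiag.h>
    #include <cstdio>

    int main()
    {
        CoInitialize(NULL);

        IDxDiagProvider* pProvider = NULL;
        if (SUCCEEDED(CoCreateInstance(CLSID_DxDiagProvider, NULL, CLSCTX_INPROC_SERVER,
                                       IID_IDxDiagProvider, (void**)&pProvider)))
        {
            DXDIAG_INIT_PARAMS params = {};
            params.dwSize = sizeof(params);
            params.dwDxDiagHeaderVersion = DXDIAG_DX9_SDK_VERSION;
            params.bAllowWHQLChecks = FALSE;
            pProvider->Initialize(&params);

            IDxDiagContainer* pRoot = NULL;
            IDxDiagContainer* pDisplays = NULL;
            pProvider->GetRootContainer(&pRoot);
            if (pRoot && SUCCEEDED(pRoot->GetChildContainer(L"DxDiag_DisplayDevices", &pDisplays)))
            {
                WCHAR name[256];
                IDxDiagContainer* pDevice = NULL;
                // First display device only; loop over EnumChildContainerNames for all of them.
                if (SUCCEEDED(pDisplays->EnumChildContainerNames(0, name, 256)) &&
                    SUCCEEDED(pDisplays->GetChildContainer(name, &pDevice)))
                {
                    VARIANT var;
                    VariantInit(&var);
                    if (SUCCEEDED(pDevice->GetProp(L"szDisplayMemoryEnglish", &var)))
                        wprintf(L"Display memory: %ls\n", var.bstrVal);
                    VariantClear(&var);
                    pDevice->Release();
                }
                pDisplays->Release();
            }
            if (pRoot) pRoot->Release();
            pProvider->Release();
        }

        CoUninitialize();
        return 0;
    }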


Win32_VideoController is your best course for getting the amount of graphics memory. That's how it's done in the Doom 3 source.

You say "..availability of video controller is offline. I have checked it on vista." Under what circumstances would the video controller be offline?

Incidentally, you can find the Doom3 source here. The function you're looking for is called Sys_GetVideoRam and it's in a file called win_shared.cpp, although if you do a solution wide search it'll turn it up for you.
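For completeness, a minimal WMI query for AdapterRAM from C++ looks roughly like this (a sketch only; error handling and security setup are trimmed, and note that AdapterRAM is a uint32, so it caps out at 4 GB):

    // Link with wbemuuid.lib.
    #define _WIN32_DCOM
    #include <comdef.h>
    #include <Wbemidl.h>
    #include <cstdio>

    int main()
    {
        CoInitializeEx(NULL, COINIT_MULTITHREADED);
        CoInitializeSecurity(NULL, -1, NULL, NULL, RPC_C_AUTHN_LEVEL_DEFAULT,
                             RPC_C_IMP_LEVEL_IMPERSONATE, NULL, EOAC_NONE, NULL);

        IWbemLocator* pLoc = NULL;
        IWbemServices* pSvc = NULL;
        if (FAILED(CoCreateInstance(CLSID_WbemLocator, NULL, CLSCTX_INPROC_SERVER,
                                    IID_IWbemLocator, (LPVOID*)&pLoc)) ||
            FAILED(pLoc->ConnectServer(_bstr_t(L"ROOT\\CIMV2"),
                                       NULL, NULL, NULL, 0, NULL, NULL, &pSvc)))
        {
            CoUninitialize();
            return 1;
        }

        IEnumWbemClassObject* pEnum = NULL;
        pSvc->ExecQuery(_bstr_t("WQL"),
                        _bstr_t("SELECT AdapterRAM FROM Win32_VideoController"),
                        WBEM_FLAG_FORWARD_ONLY | WBEM_FLAG_RETURN_IMMEDIATELY,
                        NULL, &pEnum);

        IWbemClassObject* pObj = NULL;
        ULONG uReturned = 0;
        while (pEnum && SUCCEEDED(pEnum->Next(WBEM_INFINITE, 1, &pObj, &uReturned)) && uReturned)
        {
            VARIANT vt;
            VariantInit(&vt);
            // AdapterRAM is reported in bytes; it may be empty if the driver doesn't expose it.
            if (SUCCEEDED(pObj->Get(L"AdapterRAM", 0, &vt, NULL, NULL)) && vt.vt == VT_I4)
                printf("AdapterRAM: %lu MB\n", (unsigned long)vt.ulVal / (1024 * 1024));
            VariantClear(&vt);
            pObj->Release();
        }

        if (pEnum) pEnum->Release();
        pSvc->Release();
        pLoc->Release();
        CoUninitialize();
        return 0;
    }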


User-mode threads cannot access memory regions and I/O mapped from hardware devices, including the framebuffer. Anyway, why would you want to do that? Suppose you could access the framebuffer directly: now you must handle a LOT of possible pixel formats. You can't simply assume a 32-bit RGBA or ARGB organization; there is the possibility of 15/16/24-bit displays (RGB555, RGBA5551, RGBA4444, RGB565, RGB888...). And that's if you don't also want to support the video-surface formats (overlays) such as YUV-based ones.

So let the display driver and/or the underlying APIs do that work for you.

If you want to write to a display surface (which is not exactly the same as framebuffer memory, although it's conceptually close), there are a lot of options: DirectX, Win32 GDI, or you may try the SDL library (libsdl).
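For example, a minimal SDL 1.2 sketch of writing pixels to a software display surface (assuming SDL 1.2 and that a 32-bit video mode is available):

    #include <SDL.h>

    int main(int argc, char* argv[])
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0)
            return 1;

        // Software surface: SDL hands us a pixel buffer we can write to directly.
        SDL_Surface* screen = SDL_SetVideoMode(640, 480, 32, SDL_SWSURFACE);
        if (!screen) { SDL_Quit(); return 1; }

        if (SDL_MUSTLOCK(screen)) SDL_LockSurface(screen);

        Uint32 red = SDL_MapRGB(screen->format, 255, 0, 0);
        Uint32* pixels = (Uint32*)screen->pixels;
        for (int y = 0; y < screen->h; ++y)
            for (int x = 0; x < screen->w; ++x)
                pixels[y * (screen->pitch / 4) + x] = red;

        if (SDL_MUSTLOCK(screen)) SDL_UnlockSurface(screen);

        SDL_Flip(screen);     // push the surface to the display
        SDL_Delay(2000);      // keep the window up briefly
        SDL_Quit();
        return 0;
    }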
