I am making a 2.5D rendering engine. From what I've read online, I assume that doing the rendering work on the GPU will be faster than doing it on the CPU.
Is this possible with .NET? If so, is this an appropriate situation in which to use the GPU?
I'm also using VB.NET; however, any .NET examples will apply.
I think it might help to define terminology.
Microsoft .NET is a framework, encompassing a suite of developer technologies. It consists of languages, a runtime, and libraries that provide common functionality.
If you're asking whether there are .NET libraries specific to rendering on a GPU, the answer is yes, there are.
If you're asking whether you can write an arbitrary .NET application and have it run on a GPU, the answer is no: the GPU is not a general-purpose processor. You can, however, use libraries to offload specific tasks to the GPU.
If you're asking whether you can write shaders in VB, C#, or other CLR languages, yes, you can do that too.
Your .NET program isn't going to be able to run on the GPU, no.
But using Direct3D or OpenGL (both of which have .NET bindings available), you can program that GPU to do the rendering for you.
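As an illustrative sketch of what that looks like from the .NET side, here is a minimal C# skeleton assuming the OpenTK bindings for OpenGL (SharpDX and SlimDX are comparable options for Direct3D; exact namespaces and signatures vary between OpenTK versions):

```csharp
// Sketch only: assumes the OpenTK NuGet package (4.x) is referenced.
// The C# code runs on the CPU; each GL.* call hands work to the GPU driver.
using OpenTK.Graphics.OpenGL;
using OpenTK.Windowing.Common;
using OpenTK.Windowing.Desktop;

class RenderWindow : GameWindow
{
    public RenderWindow()
        : base(GameWindowSettings.Default, NativeWindowSettings.Default) { }

    protected override void OnRenderFrame(FrameEventArgs args)
    {
        GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);
        // Buffer binds, shader activation, and draw calls go here --
        // the actual rasterization happens on the GPU.
        SwapBuffers();
    }
}
```

The managed code only orchestrates; the per-pixel and per-vertex work is done by the GPU once the draw calls are issued.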
But you won't be writing .NET code that runs on the GPU itself. Both D3D and OpenGL expose API functions you call to pass instructions to the GPU, plus shader programs written in dedicated C-like languages (HLSL for Direct3D, GLSL for OpenGL).
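For a flavor of what those shader languages look like, here is a minimal GLSL vertex/fragment pair (the names `aPosition`, `uColor`, and `fragColor` are arbitrary illustrations, not part of any fixed API):

```glsl
// Vertex shader: runs once per vertex, on the GPU.
#version 330 core
in vec3 aPosition;
void main()
{
    gl_Position = vec4(aPosition, 1.0);
}
```

```glsl
// Fragment shader: runs once per covered pixel, on the GPU.
#version 330 core
uniform vec4 uColor;
out vec4 fragColor;
void main()
{
    fragColor = uColor;
}
```

These source strings are handed to the driver at runtime (via the D3D or OpenGL API) and compiled there for the specific GPU; the HLSL equivalents are structured similarly.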