Multiple textures don't show


I'm a newbie to DirectX 10. I'm developing a Direct3D 10 application that mixes two textures, which are filled manually according to the user's input. The current implementation is:

  1. Create two empty textures with usage D3D10_USAGE_STAGING.
  2. Create two shader resource views to bind to the pixel shader, since the shader needs them.
  3. Copy the textures to GPU memory by calling CopyResource.

Now the problem is that I can only see the first texture, but not the second. It looks to me like the binding doesn't work for the second texture.

I don't know what's wrong with it. Can anyone here shed some light on it?

Thanks, Marshall

The class COverlayTexture is responsible for creating the textures, creating the resource view, filling the texture with the bitmap mapped from another application, and binding the resource view to the pixel shader.

HRESULT COverlayTexture::Initialize(VOID)
{
    // CPU-writable staging texture that receives the bitmap.
    D3D10_TEXTURE2D_DESC texDesStaging;
    texDesStaging.Width = m_width;
    texDesStaging.Height = m_height;
    texDesStaging.Usage = D3D10_USAGE_STAGING;
    texDesStaging.BindFlags = 0;
    texDesStaging.ArraySize = 1;
    texDesStaging.MipLevels = 1;
    texDesStaging.SampleDesc.Count = 1;
    texDesStaging.SampleDesc.Quality = 0;
    texDesStaging.MiscFlags = 0;
    texDesStaging.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    texDesStaging.CPUAccessFlags = D3D10_CPU_ACCESS_WRITE;
    HR( m_Device->CreateTexture2D( &texDesStaging, NULL, &m_pStagingResource ) );

    // GPU-side texture that the pixel shader samples.
    D3D10_TEXTURE2D_DESC texDesShader;
    texDesShader.Width = m_width;
    texDesShader.Height = m_height;
    texDesShader.BindFlags = D3D10_BIND_SHADER_RESOURCE;
    texDesShader.ArraySize = 1;
    texDesShader.MipLevels = 1;
    texDesShader.SampleDesc.Count = 1;
    texDesShader.SampleDesc.Quality = 0;
    texDesShader.MiscFlags = 0;
    texDesShader.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    texDesShader.Usage = D3D10_USAGE_DEFAULT;
    texDesShader.CPUAccessFlags = 0;
    HR( m_Device->CreateTexture2D( &texDesShader, NULL, &m_pShaderResource ) );

    // Shader resource view over the default-usage texture.
    D3D10_SHADER_RESOURCE_VIEW_DESC viewDesc;
    ZeroMemory( &viewDesc, sizeof( viewDesc ) );
    viewDesc.Format = texDesShader.Format;
    viewDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2D;
    viewDesc.Texture2D.MipLevels = texDesShader.MipLevels;
    HR( m_Device->CreateShaderResourceView( m_pShaderResource, &viewDesc, &m_pShaderResourceView ) );

    return S_OK;
}

HRESULT COverlayTexture::Render(VOID)
{
    // Bind this texture's view to slot 0 of the pixel shader.
    m_Device->PSSetShaderResources( 0, 1, &m_pShaderResourceView );

    D3D10_MAPPED_TEXTURE2D lockedRect;
    m_pStagingResource->Map( 0, D3D10_MAP_WRITE, 0, &lockedRect );

    // Fill in the texture with the bitmap mapped from the shared memory view

    m_pStagingResource->Unmap( 0 );

    // Transfer the staging texture to the default-usage texture the shader samples.
    m_Device->CopyResource( m_pShaderResource, m_pStagingResource );

    return S_OK;
}

I use two instances of the class COverlayTexture, each of which fills its own bitmap into its own texture, and I render them in the sequence COverlayTexture[1] then COverlayTexture[0].

COverlayTexture* pOverlayTexture[2];

for( int i = 1; i >= 0; i-- )
{
     pOverlayTexture[i]->Render();
}

The blend state setting in the FX file is defined as below:

BlendState AlphaBlend
{
    AlphaToCoverageEnable = FALSE;
    BlendEnable[0] = TRUE;
    SrcBlend = SRC_ALPHA;
    DestBlend = INV_SRC_ALPHA;
    BlendOp = ADD;
    BlendOpAlpha = ADD;
    SrcBlendAlpha = ONE;
    DestBlendAlpha = ZERO;
    RenderTargetWriteMask[0] = 0x0f;
};
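
A blend state declared in an FX file only takes effect once it is applied, typically via SetBlendState in a technique pass (or OMSetBlendState from the application). For reference, a minimal pass sketch; the technique name and the VS/PS entry points are hypothetical and not taken from the original post:

technique10 RenderOverlay
{
    pass P0
    {
        SetVertexShader( CompileShader( vs_4_0, VS() ) );
        SetGeometryShader( NULL );
        SetPixelShader( CompileShader( ps_4_0, PS() ) );
        // Apply the AlphaBlend state declared above (blend factor, sample mask).
        SetBlendState( AlphaBlend, float4( 0.0f, 0.0f, 0.0f, 0.0f ), 0xFFFFFFFF );
    }
}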

The pixel shader in the FX file is defined as below:

Texture2D txDiffuse;
float4 PS(PS_INPUT input) : SV_Target
{
    float4 ret = txDiffuse.Sample(samLinear, input.Tex);
    return ret;
}

Thanks again.

Edit for Paulo:

Thanks a lot, Paulo. The remaining question is which instance of the object should be bound to the alpha texture and which to the diffuse texture. As a test, I bind COverlayTexture[0] to the alpha texture and COverlayTexture[1] to the diffuse texture.

Texture2D txDiffuse[2];
float4 PS(PS_INPUT input) : SV_Target
{
    float4 ret = txDiffuse[1].Sample(samLinear, input.Tex);
    float alpha = txDiffuse[0].Sample(samLinear, input.Tex).x;

    return float4(ret.xyz, alpha);
}

I called PSSetShaderResources with the two resource views.

g_pShaderResourceViews[0] = overlay[0].m_pShaderResourceView;
g_pShaderResourceViews[1] = overlay[1].m_pShaderResourceView;
m_Device->PSSetShaderResources(0, 2, g_pShaderResourceViews);

The result is that I don't see anything. I also tried the x, y, z, and w channels.


Post some more code.

I'm not sure how you mean to mix these two textures. If you want to mix them in the pixel shader you need to sample both of them and then add them together (or apply whatever operation you require).

How do you add the textures together? By setting an ID3D10BlendState or in the pixel shader?

EDIT:

You don't need two textures in every class: if you want to write to your texture from the CPU, its usage can be D3D10_USAGE_DYNAMIC. When you do this, the same texture can also be your shader resource, so you don't need the m_Device->CopyResource(m_pShaderResource, m_pStagingResource); step.
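
For illustration, a minimal sketch of that single dynamic texture, reusing the member names from COverlayTexture above; the descriptor values other than Usage, BindFlags and CPUAccessFlags simply mirror the original Initialize code:

// One CPU-writable texture that is also bound to the pixel shader.
D3D10_TEXTURE2D_DESC texDesc;
ZeroMemory( &texDesc, sizeof( texDesc ) );
texDesc.Width = m_width;
texDesc.Height = m_height;
texDesc.ArraySize = 1;
texDesc.MipLevels = 1;
texDesc.SampleDesc.Count = 1;
texDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
texDesc.Usage = D3D10_USAGE_DYNAMIC;                // CPU write, GPU read
texDesc.BindFlags = D3D10_BIND_SHADER_RESOURCE;
texDesc.CPUAccessFlags = D3D10_CPU_ACCESS_WRITE;
HR( m_Device->CreateTexture2D( &texDesc, NULL, &m_pShaderResource ) );
HR( m_Device->CreateShaderResourceView( m_pShaderResource, NULL, &m_pShaderResourceView ) );

// Each frame: map with WRITE_DISCARD, copy the bitmap in, unmap. No CopyResource needed.
D3D10_MAPPED_TEXTURE2D mapped;
HR( m_pShaderResource->Map( 0, D3D10_MAP_WRITE_DISCARD, 0, &mapped ) );
// ... copy the bitmap rows into mapped.pData, respecting mapped.RowPitch ...
m_pShaderResource->Unmap( 0 );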

Since you're using alpha blending you must control the alpha value output in the pixel shader (the w component of the float4 that the pixel shader returns).

Bind both textures to your pixel shader and use one texture's value as the alpha component:

Texture2D txDiffuse;
Texture2D txAlpha;
float4 PS(PS_INPUT input) : SV_Target
{
    float4 ret = txDiffuse.Sample(samLinear, input.Tex);
    float alpha=txAlpha.Sample(samLinear,input.Tex).x; // Choose the proper channel
    return float4(ret.xyz,alpha); // Alpha is the 4th component
}
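
For completeness, a minimal sketch of the matching application-side binding for this shader, assuming txDiffuse and txAlpha occupy registers t0 and t1 in declaration order, and that overlay[1] holds the diffuse bitmap and overlay[0] the alpha bitmap as in the edit above:

// Bind the diffuse view to slot 0 (txDiffuse) and the alpha view to slot 1 (txAlpha).
ID3D10ShaderResourceView* views[2] =
{
    overlay[1].m_pShaderResourceView,   // diffuse -> t0 (txDiffuse)
    overlay[0].m_pShaderResourceView    // alpha   -> t1 (txAlpha)
};
m_Device->PSSetShaderResources( 0, 2, views );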
