Status: By Design
Votes: 0
Found in: 2018.4, 2019.4, 2020.3, 2020.3.12f1, 2021.1, 2021.2, 2022.1
Issue ID: 1352733
Regression: No
SetRenderTarget does not correctly display the texture when depthSlice is set to -1
Reproduction steps:
1. Open the attached project "texarraytest" and load Scene "SampleScene"
2. Enter Play Mode
3. Observe the Game View
Expected result: The screen turns red when entering Play Mode
Actual result: The screen turns black when entering Play Mode
Reproducible with: 2018.4.36f1, 2019.4.29f1, 2020.3.15f1, 2021.1.16f1, 2021.2.0b5, 2022.1.0a3
Notes:
- In this reproduction project, the issue can be worked around by making the following two values match:
  - the depthSlice parameter of SetRenderTarget() on line 43 of Tex2DArrayTest.cs
  - the Index variable on line 104 of ArrayTest.shader (setting Index to 0 allows depthSlice to remain -1)
Resolution Note:
Closing this as By Design because you cannot set a TextureArray as the active RenderTexture in this way.
First, a TextureArray is a single resource (with multiple subresources, one per layer), just as a pixel shader output (SV_Target0, for example) is a single resource. Even if binding an array to an SV_Target in this way were allowed, it would still not tell the GPU which layer you want to render to.
Typically you would call SetRenderTarget N times (N == the number of layers in your array), each time specifying the slice index, as sketched below.
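For illustration, here is a minimal C# sketch of that per-slice loop. The class name, the rtArray field, and the 4-layer, 256x256 setup are assumptions for the example, not taken from the attached project:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class PerSliceRenderExample : MonoBehaviour
{
    RenderTexture rtArray; // hypothetical texture-array render target

    void Start()
    {
        // A 2D texture array render target with 4 layers (illustrative values).
        rtArray = new RenderTexture(256, 256, 0)
        {
            dimension = TextureDimension.Tex2DArray,
            volumeDepth = 4
        };
        rtArray.Create();

        // Bind and draw into each slice individually instead of passing -1.
        for (int slice = 0; slice < rtArray.volumeDepth; slice++)
        {
            Graphics.SetRenderTarget(rtArray, 0, CubemapFace.Unknown, slice);
            // Fill the slice; here we just clear it to a per-slice colour.
            GL.Clear(true, true, Color.Lerp(Color.red, Color.blue, slice / 3f));
        }

        Graphics.SetRenderTarget(null); // restore the backbuffer
    }
}
```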
Specifying depthSlice as -1 (== all slices) is only supported when "Layered Rendering" is supported (see https://docs.unity3d.com/Manual/class-Texture2DArray.html). Layered rendering in this context simply means that you can use SV_RenderTargetArrayIndex in your shader to specify which slice of the array you want to render to.
While this was initially supported only from inside geometry shaders, more recent hardware also allows it inside the vertex shader (see SystemInfo.supportsRenderTargetArrayIndexFromVertexShader).
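As a rough illustration, here is a hedged HLSL sketch of a vertex shader selecting the slice via SV_RenderTargetArrayIndex. The struct layout and the one-instance-per-slice mapping are assumptions; it also presumes UnityCG.cginc is included and that SystemInfo.supportsRenderTargetArrayIndexFromVertexShader reports true on the target platform:

```hlsl
// Sketch only: route each instance to a texture-array slice from the
// vertex shader instead of rebinding the render target per slice.
struct v2f
{
    float4 pos   : SV_POSITION;
    uint   slice : SV_RenderTargetArrayIndex; // selects the target slice
};

v2f vert(float4 vertex : POSITION, uint instanceID : SV_InstanceID)
{
    v2f o;
    o.pos = UnityObjectToClipPos(vertex);
    o.slice = instanceID; // e.g. draw one instance per slice
    return o;
}

fixed4 frag(v2f i) : SV_Target
{
    return fixed4(1, 0, 0, 1); // solid red, matching the repro's expected result
}
```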
Here is an excellent blog post about how to achieve this: http://xdpixel.com/how-to-render-to-a-texture-array-in-unity/