Won't Fix
Votes
2
Found in
2021.3.22f1
2022.2.13f1
2023.1.0b9
2023.2.0a8
Issue ID
UUM-31738
Regression
No
Compute Shader GPU Readback is incorrect when using an integrated GPU
How to reproduce:
1. Change the Editor’s used GPU to the integrated GPU
2. Open the "Integrated_GPU" project
3. Open the "Main" scene
4. Enter Play Mode and observe the Console
Expected result: Console prints "GPU Readback at position (1,1) Expected = WOOD (9) Actual: WOOD (9)"
Actual result: Console prints "GPU Readback at position (1,1) Expected = WOOD (9) Actual: NONE (0)"
Reproduced with: 2021.3.22f1, 2022.2.13f1, 2023.1.0b9, 2023.2.0a8
Could not test with: 2020.3.46f1 (could not resolve scripting errors)
Reproduced on: Windows 11
Not reproduced on: macOS Ventura 13.0 (M1)
GPU reproduced with:
Intel(R) Core(TM) i9-11900H @ 2.50GHz, 2496 MHz, 8 Core(s), 16 Logical Processor(s) → Intel(R) UHD Graphics
Intel(R) Core(TM) i5-9400 CPU @ 2.90GHz, 6 cores, 2904 MHz → Intel(R) UHD Graphics 630, Intel Direct3D 11.0 (by reporter)
Intel(R) Core(TM) i5-7267U CPU @ 3.10GHz, 4 cores, 3096 MHz → Intel(R) Iris(R) Plus Graphics 650, Intel Direct3D 11.0 (by reporter)
Intel(R) Core(TM) i3-10110U CPU @ 2.10GHz, 4 cores, 2592 MHz → Intel(R) UHD Graphics, Intel Direct3D 11.0 (by reporter)
GPU not reproduced with:
NVIDIA GeForce RTX 3050 Ti Laptop GPU
Apple M1
Notes:
1. To force the Unity Editor to use an integrated GPU on Windows 11: Start > Settings > Display > Graphics > Unity Editor / “Unity.exe” (make sure that you select the version you will open the project with) > Power Saving > Save
2. When forcing the Unity Editor to use a dedicated GPU, the Console will log the expected result
3. Also reproduced in Player
-
cchute
Jul 18, 2023 20:52
Don't you think this warrants a shader compiler warning?
It's these types of platform-specific traps, combined with poor documentation, that make Unity a nightmare to work with.
-
cchute
May 22, 2023 16:40
Hello Unity, are there any updates on this? Please resolve this; my business is crippled until it gets fixed.
-
cchute
Mar 31, 2023 01:50
Good to see it is under consideration for 2021.3.X!
Please consider making this a priority. This is not a cosmetic bug or something that can be worked around. Any game that relies on Compute Shaders is 100% broken for these users, as they cannot get any compute results back from the GPU.
This is absolutely game-breaking, and it just leads to bad reviews for me and less trust in Unity from the player base.
Please resolve ASAP
-
cchute
Mar 29, 2023 12:48
Hello!
Any idea when this might get fixed? We have a game in early access and this bug is crushing our fanbase. A loose timeline would be much appreciated, so we can plan our release and stop having to tell everyone, "It's a Unity bug; we're waiting on them to fix it."
Thanks
Resolution Note:
In ComputePixelShaderController.cs, the user script writes:
const string outputBufferKey = "outputBuffer";
const int pixelTypeStride = 4;
outputBuffer = new ComputeBuffer(map.gpuOutput.Length, pixelTypeStride);
The Unity documentation for ComputeBuffer states: "On the shader side, ComputeBuffers with default ComputeBufferType map to StructuredBuffer<T> and RWStructuredBuffer<T> in HLSL."
Therefore, the compute shader must declare the buffer as RWStructuredBuffer instead of RWBuffer. This is a user script error.
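To illustrate the resolution note, here is a minimal sketch of the shader-side fix. The buffer name outputBuffer and the 4-byte stride come from the script quoted above; the int element type, kernel name, and kernel body are assumptions for illustration only:

```hlsl
#pragma kernel CSMain

// Incorrect: RWBuffer<T> is a typed buffer; a ComputeBuffer created with
// the default ComputeBufferType does not map to it, and integrated-GPU
// drivers may silently return zeros on readback.
// RWBuffer<int> outputBuffer;

// Correct: a default ComputeBuffer maps to (RW)StructuredBuffer<T> in HLSL.
// The 4-byte stride on the C# side matches a 4-byte int element here.
RWStructuredBuffer<int> outputBuffer;

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    // Hypothetical write; the real kernel computes per-pixel type values.
    outputBuffer[id.y * 8 + id.x] = 9;
}
```

With the structured-buffer declaration, the C# side needs no changes: the existing ComputeBuffer(map.gpuOutput.Length, pixelTypeStride) matches the 4-byte element layout, and readback returns the written values on both integrated and dedicated GPUs.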