Status: Not Reproducible
Votes: 0
Found in: 6000.0.45f1, 6000.1.0b14
Issue ID: UUM-102361
Regression: No
[Vulkan][Quest] FrameBufferFetch sample causes framebuffer corruption artifacts on Meta Quest 2/3 devices when using Vulkan
Steps to reproduce:
1. Open the attached project "Fetch.zip"
2. Open "SampleScene.unity" and enter Play Mode to observe correct rendering (the output is tinted blue)
3. Switch to the Android platform and make sure the Graphics API is set to Vulkan
4. Build to a Quest 2/3 device
5. Observe the corrupted rendering
Expected Results: framebuffer fetch does not produce artifacts with Vulkan on Quest 2/3
Actual Results: framebuffer fetch produces artifacts with Vulkan on Quest 2/3
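For context, a framebuffer-fetch tint effect like the one in the sample typically reads the color already written to the current pixel and modulates it. The following is a minimal GLSL ES sketch of such a shader (not the attached project's actual code; the tint factor is a hypothetical value chosen to illustrate a blue tint) using the GL_EXT_shader_framebuffer_fetch extension, which is what OpenGLES3 framebuffer fetch compiles down to:

```glsl
#version 300 es
#extension GL_EXT_shader_framebuffer_fetch : require
precision mediump float;

// With EXT_shader_framebuffer_fetch, the fragment output may be
// declared "inout": reading it returns the color currently stored
// in the framebuffer for this pixel.
layout(location = 0) inout vec4 fragColor;

void main() {
    // Hypothetical blue tint mirroring the sample's expected output:
    // attenuate red and green, keep blue and alpha.
    fragColor = vec4(fragColor.rgb * vec3(0.4, 0.4, 1.0), fragColor.a);
}
```

On tiled GPUs such as the Adreno parts in Quest 2/3, this read happens in on-chip tile memory, which is why corruption here points at render-pass/attachment handling rather than at the shader math itself.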
Reproducible with: 6000.0.45f1, 6000.1.0b14
Could not effectively test on 6000.2.0a8 because the device output did not match the Editor (no blue tint), although no artifacts appeared
Could not test on 2021.3.50f1, 2022.3.61f1 due to RenderGraph scripting errors
Reproducible with these devices:
VLNQA00609 - Oculus Quest 3 (Quest 3), CPU: Snapdragon XR2 Gen 2 (SM8550), GPU: Adreno 740, OS: 12
VLNQA00417 - Oculus Quest 2 (Quest 2), CPU: Snapdragon XR2, GPU: Adreno 650, OS: 10
Environment tested: Windows 11 24H2
Notes:
- The issue only reproduces with Vulkan; on OpenGLES3 the left eye does not render, but the artifacts are not present
- Discussion thread: https://discussions.unity.com/t/urp-framebuffer-fetch-from-backbuffer/1617802/10
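The Vulkan/OpenGLES3 behavior difference noted above may stem from how framebuffer fetch is expressed on each API: on Vulkan there is no "inout" color output; the previous color is instead bound as a subpass input attachment and read with subpassLoad(). A minimal Vulkan GLSL sketch of the equivalent effect (again a hypothetical illustration, not the project's shader; the binding indices and tint factor are assumptions):

```glsl
#version 450
// Vulkan expresses framebuffer fetch as an input attachment read
// within the same render pass.
layout(input_attachment_index = 0, set = 0, binding = 0)
    uniform subpassInput uPrevColor;

layout(location = 0) out vec4 fragColor;

void main() {
    // Read the color already written for this pixel in the current
    // render pass, then apply the same hypothetical blue tint.
    vec4 prev = subpassLoad(uPrevColor);
    fragColor = vec4(prev.rgb * vec3(0.4, 0.4, 1.0), prev.a);
}
```

If the engine's Vulkan backend mis-declares the subpass dependency or input attachment for the XR multiview targets, corruption of the kind reported here would be consistent with that path while leaving the GLES path unaffected.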
Resolution Note:
We tried to reproduce this on a Quest 2 with firmware 79.1028 (August 26) and Unity 6000.0.56f1 and could not, hence we are closing the issue. If this still reproduces on 6000.0.56f1 or higher, please reopen the bug.