Fixed in: 2018.1.X
Votes: 2
Found in: 5.0.0b22
Issue ID: 674107
Regression: No
[Shadows] Camera.Render() clears _CameraDepthTexture if there's a shadow casting directional light in the scene
To reproduce:
1. Open attached project
2. Open scene "scene"
3. Comment out line 47 "QualitySettings.shadowDistance = 0;" in DeferredParticles.cs (a workaround sketch based on this line follows the repro steps)
4. Notice that the edge detection effect now renders incorrectly
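A minimal workaround sketch, assuming (as the repro project's line 47 suggests) that setting QualitySettings.shadowDistance to 0 prevents the directional shadow pass from clearing _CameraDepthTexture during a manual Camera.Render(). The DepthSafeRender class, RenderToTexture method, and depthCamera field are hypothetical names for illustration; only QualitySettings.shadowDistance, Camera.Render(), and Camera.targetTexture are real Unity APIs.

using UnityEngine;

public class DepthSafeRender : MonoBehaviour
{
    // Hypothetical helper camera used to render data into an offscreen buffer.
    public Camera depthCamera;

    public void RenderToTexture(RenderTexture target)
    {
        // Temporarily disable shadow rendering so the shadow pass does not
        // invalidate _CameraDepthTexture while we render manually.
        float savedShadowDistance = QualitySettings.shadowDistance;
        QualitySettings.shadowDistance = 0f;

        depthCamera.targetTexture = target;
        depthCamera.Render();
        depthCamera.targetTexture = null;

        // Restore the original shadow distance for the normal scene cameras.
        QualitySettings.shadowDistance = savedShadowDistance;
    }
}

Restoring the saved shadow distance afterwards keeps shadows working for the regular scene cameras; only the manual render is affected.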
Comments (1)
jbooth
Dec 02, 2015 16:45
So, this just cost me a day of fooling around trying to figure out why my image processing effect (which renders some data with a camera) wasn't working correctly.
I get the desire to share buffers between things; but as long as Unity supports multiple cameras, it shouldn't be assuming it can share the depth buffer between the shadow pass and the regular pass. It's also incredibly frustrating when you come across these types of optimizations, because figuring out what is going on can be very difficult. I'd much prefer the camera system be more explicit about what you want to do, such as being able to call Camera.Render(CameraPass.ForwardBase) and know that shadows, etc. aren't going to be rendered. I often just use cameras to render things into buffers.