Fixed in 2018.1.X
Votes
2
Found in
5.0.0b22
Issue ID
674107
Regression
No
[Shadows] Camera.Render() clears _CameraDepthTexture if there's a shadow casting directional light in the scene
To reproduce:
1. Open attached project
2. Open scene "scene"
3. Comment out line 47 "QualitySettings.shadowDistance = 0;" in DeferredParticles.cs
4. Notice messed up edge detection
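The repro script makes the bug visible by *removing* the line that sets the shadow distance to zero, which implies the corresponding workaround: disable directional shadows for the duration of the manual render so the shadow pass never runs and `_CameraDepthTexture` is left intact. A minimal sketch of that idea, assuming an `effectCamera` field (the class and field names are illustrative, not from the attached project):

```csharp
using UnityEngine;

// Workaround sketch: temporarily zero out the shadow distance so
// Camera.Render() skips the shadow pass that clears _CameraDepthTexture.
public class DepthSafeRender : MonoBehaviour
{
    public Camera effectCamera; // camera used to render data for the image effect

    void RenderWithoutShadowClear()
    {
        float savedDistance = QualitySettings.shadowDistance;
        QualitySettings.shadowDistance = 0f; // no shadow casters in range -> no shadow pass
        effectCamera.Render();               // _CameraDepthTexture survives this call
        QualitySettings.shadowDistance = savedDistance; // restore for normal rendering
    }
}
```

The trade-off is that anything this camera renders will not receive directional shadows, which is usually acceptable for data/effect cameras.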
Comments (1)
jbooth
Dec 02, 2015 16:45
So, this just cost me a day of fooling around trying to figure out why my image-processing effect (which renders some data with a camera) wasn't working correctly.
I get the desire to share buffers between things, but as long as Unity supports multiple cameras it shouldn't assume it can share the depth buffer between the shadow pass and the regular pass. It's also incredibly frustrating when you come across these types of optimizations, because figuring out what is going on can be very difficult. I'd much prefer the camera system be more explicit about what you want it to do, such as being able to call Camera.Render(CameraPass.ForwardBase) and know that shadows, etc. aren't going to be rendered. I often just use cameras to render things into buffers.
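The camera-into-a-buffer usage the comment describes is the pattern that trips over this bug: a secondary camera renders into a RenderTexture via Camera.Render(), and that call silently runs the shadow pass. A minimal sketch of the pattern, with illustrative names (`dataCamera`, `RenderToBuffer` are not from the reporter's project):

```csharp
using UnityEngine;

// The pattern described in the comment: a camera used purely to render
// data into an off-screen buffer. Under this bug, the Render() call below
// also triggers the shadow pass, which clears _CameraDepthTexture.
public class BufferRender : MonoBehaviour
{
    public Camera dataCamera; // secondary camera, not the main scene camera

    RenderTexture RenderToBuffer(int width, int height)
    {
        var rt = new RenderTexture(width, height, 24, RenderTextureFormat.ARGBFloat);
        dataCamera.targetTexture = rt;
        dataCamera.Render();          // manual render; shadow pass runs here
        dataCamera.targetTexture = null;
        return rt;                    // caller samples rt, e.g. in an image effect
    }
}
```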