Search Issue Tracker
Won't Fix
Votes
8
Found in
5.3.3p2
Issue ID
775291
Regression
No
Graphics.Blit(renderTexture, null) doesn't blit to screen unless main camera target texture is null
The documentation says that blitting a render texture to a null render target will blit it to the screen:
http://docs.unity3d.com/ScriptReference/Graphics.Blit.html
However, if you set Camera.main.targetTexture to a RenderTexture and then blit that render texture to a null render target, Unity logs the error "Scene is missing a fullscreen camera". A workaround is to set Camera.main.targetTexture to null before performing the blit.
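A minimal sketch of the repro and workaround described above (the class name, the Inspector-assigned field, and the use of Update are illustrative assumptions, not from the report):

```csharp
using UnityEngine;

public class BlitToScreen : MonoBehaviour
{
    public RenderTexture renderTexture; // assigned in the Inspector

    void Update()
    {
        // Render the main camera into the render texture.
        Camera.main.targetTexture = renderTexture;
        Camera.main.Render();

        // Workaround: clear the camera's target texture first; without this
        // line the blit below fails with "Scene is missing a fullscreen camera".
        Camera.main.targetTexture = null;

        // Blit the render texture to the screen (null destination).
        // The cast disambiguates between the RenderTexture and Material overloads.
        Graphics.Blit(renderTexture, (RenderTexture)null);
    }
}
```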
Comments (1)
MattRix
Oct 05, 2016 16:19
Note that even the workaround has issues: any image effects on the camera will then be rendered at the full screen resolution rather than the targetTexture resolution, even if you set targetTexture back to the proper texture after the Blit. (This is probably a separate bug, but it's worth noting here.)