Status: Duplicate
Votes: 0
Found in: 5.5.2f1
Issue ID: 892119
Regression: No
Texture2DArray.Apply() fails to free texture's system memory after uploading it to GPU
To reproduce:
1. Download and open the attached "TestTextureArray2.zip (4.8 MB)" project.
2. Open the "main" scene.
3. Enable the "LoadTextureArray" object.
4. Build the project for any player (Desktop/Android/iOS).
5. Profile the player with the Profiler and note how much total memory "Unity" and "GfxDriver" use.
6. Go back to the project and disable the "LoadTextureArray" object.
7. Enable the "LoadTexture2D" object.
8. Repeat steps 4 and 5.
9. Notice the difference in "Unity" memory usage.
Expected result: "Unity" total memory usage should be the same whether a Texture2DArray or a Texture2D is loaded.
Actual result: "Unity" total memory usage is significantly higher when the texture is loaded with a Texture2DArray (see the sketch below).
Reproduced with: 5.5.3p2, 5.6.1p1, 2017.1.0b5
Reproduced on: Standalone player, iPad Mini 2 (iOS 9.2.1), Samsung SM-G903F (Galaxy S5 Neo)
Notes: Could not test on earlier versions because of shader compilation errors.
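The repro project's scripts are not shown here, but the behavior being exercised can be illustrated with a minimal sketch. The class, field, and shader-property names below (LoadTextureArray, sourceSlices, "_MainTexArray") are assumptions for illustration, not taken from the attached project; the call of interest is Texture2DArray.Apply() with makeNoLongerReadable set to true, which uploads the data to the GPU and is expected to then release the system-memory copy.

using UnityEngine;

// Hypothetical stand-in for the "LoadTextureArray" object in the repro project
// (names are assumptions, not copied from the attached scene).
public class LoadTextureArray : MonoBehaviour
{
    public Texture2D[] sourceSlices;   // readable source textures to pack into the array
    Texture2DArray textureArray;

    void Start()
    {
        int width = sourceSlices[0].width;
        int height = sourceSlices[0].height;

        // No mip chain, RGBA32 for simplicity.
        textureArray = new Texture2DArray(width, height, sourceSlices.Length,
                                          TextureFormat.RGBA32, false);

        // Copy each source texture into one slice of the array.
        for (int i = 0; i < sourceSlices.Length; i++)
            textureArray.SetPixels(sourceSlices[i].GetPixels(), i);

        // updateMipmaps = false, makeNoLongerReadable = true.
        // After this call the CPU-side copy should be released, so the "Unity"
        // allocation in the Profiler should drop back down; per this report,
        // it does not for Texture2DArray.
        textureArray.Apply(false, true);

        // "_MainTexArray" is an assumed shader property name.
        GetComponent<Renderer>().material.SetTexture("_MainTexArray", textureArray);
    }
}

For comparison, the "LoadTexture2D" path in step 7 would make the equivalent Texture2D.Apply(false, true) call, after which the CPU copy is released as expected, matching the lower "Unity" memory usage observed in step 9.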
This is a duplicate of issue #919162