Lightmap textures are broken when lightmap quality is set to Normal or Low

Progressive Lightmapper

Status: By Design
Votes: 0
Found in: 2018.4, 2019.1, 2019.2, 2019.3, 2020.1
Issue ID: 1195549
Regression: No

If a user imports a texture with the texture type set to Lightmap and binds it to any built-in (non-lightmapping-aware) shader, e.g. the "Default" shader, the texture will look correct only on platforms whose lightmap texture quality is High; it will look broken on all platforms that use lightmap quality Medium or Low.

Data:

1. Users can import textures into Unity with the texture type set to Lightmap. Such textures are internally and automatically converted to dLDR, RGBM or FP16, based on the target platform's lightmap quality setting.

2. Each platform has a setting "defaultLightmapQuality" (in BuildTargetDiscovery.cpp) that defaults to:
- Low on iOS, AppleTV, Android and Lumin. Low lightmap quality uses dLDR encoding, which allows a [0,2] range compressed down to [0,1].
- Medium on Nintendo Switch and WebGL (although very soon WebGL will be changed to default to High). Medium quality uses RGBM encoding, where a scaling magnitude is encoded in the alpha channel of an RGBA32 UNorm texture.
- High on all other platforms, where an RGB FP16 floating-point format is used. (A sketch of the corresponding decode steps follows after point 4 below.)

3. "Standalone" platforms have an extra option in Player Settings->Player->Other Settings->Lightmap Encoding, that allows overriding defaultLightmapQuality for the current project. Non-"standalone" platforms do not allow the option to be overridden (why?) (PlayerSettingsEditor.cs, ~1604, "bool customLightmapEncodingSupported = (targetGroup == BuildTargetGroup.Standalone);", but always are fixed to use the defaultLightmapQuality setting specified in BuildTargetDiscovery.cpp.

4. Lightmap textures can be attached as inputs to any material/shader texture bind point. That does not make the shader automatically aware of the encoding the texture was given; the shader will still sample it exactly as handwritten. As a result, the lightmap is only read properly if it has the FP16 format, since that format needs no encoding/decoding step. The other formats, dLDR and RGBM, would need decoding, but there is no machinery in the shader to know when to perform such decoding (see the raw-sampling sketch below).
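
As a rough illustration of the decode steps implied by the three quality levels in point 2, here is a sketch in Cg/HLSL terms. The function names are made up, and the constants are the commonly documented gamma-space values, not necessarily the exact ones used by UnityCG.cginc:

// Illustrative decode sketches for the three lightmap encodings.
half3 DecodeDoubleLDR_Sketch(half4 stored)
{
    // dLDR (Low quality): a [0,2] range is packed into [0,1],
    // so decoding is just a constant scale.
    return 2.0 * stored.rgb;
}

half3 DecodeRGBM_Sketch(half4 stored)
{
    // RGBM (Medium quality): a brightness multiplier lives in the alpha
    // channel of the RGBA32 UNorm texture; rgb must be rescaled by it.
    const half kMaxRange = 5.0; // assumed/illustrative range for lightmaps
    return (kMaxRange * stored.a) * stored.rgb;
}

// FP16 (High quality) needs no decode: the stored rgb already holds the HDR
// values, which is why raw sampling only happens to work on High-quality platforms.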
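
To make point 4 concrete, here is a minimal, hypothetical non-lightmapping-aware shader: it binds the user's Lightmap-type texture to an ordinary _MainTex slot and returns the raw sample, which is only the intended color when the platform imported the texture as FP16 (High quality):

Shader "Unlit/RawLightmapSample_Sketch"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f     { float2 uv : TEXCOORD0; float4 vertex : SV_POSITION; };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Raw sample: correct only for FP16 lightmaps (High quality).
                // On dLDR/RGBM platforms this returns the encoded values
                // unmodified, which is what makes the texture look broken.
                fixed4 stored = tex2D(_MainTex, i.uv);
                return fixed4(stored.rgb, 1.0);
            }
            ENDCG
        }
    }
}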

One could argue that it is a user error to use Lightmap textures in shader/material bind points that do not know to read the input texture as a lightmap. However:
- we do not currently prevent such connections from being made (if a shader texture input bind point should not support a Lightmap, we should raise an error when the user attempts to make the connection)
- it works on platforms where the default lightmap quality is High, giving positive reinforcement of "this is how you attach lightmaps to shaders". Case 1163130 suggests people do such things.
- users cannot know what happens with the automatic defaultLightmapQuality conversion scheme under the hood ("it worked on Windows builds, but not when I switched platforms"), since we hide the behavior from the user.

  1. Resolution Note (2020.1.X):

    Please use the DecodeLightmap function from UnityCG.cginc to decode the values of the lightmap after manually reading the lightmap texture in your custom shader.
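
    Applied to the raw-sampling sketch after point 4, the suggested fix amounts to decoding the sampled value before using it. The fragment function below is the only change (the rest of that hypothetical shader stays the same, and UnityCG.cginc, which declares DecodeLightmap, is already included there); it assumes the default decode instructions (unity_Lightmap_HDR) are appropriate for the bound texture:

    fixed4 frag (v2f i) : SV_Target
    {
        // Manually read the lightmap texture, then decode it so the result
        // is correct for dLDR, RGBM and FP16 encodings alike.
        fixed4 stored = tex2D(_MainTex, i.uv);
        half3 hdr = DecodeLightmap(stored);
        return fixed4(hdr, 1.0);
    }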
