Postponed means that the issue was either a feature request or something that requires major refactoring on our side. Since that makes the issue not actionable in the near future, we choose to close it as Postponed and add it to our internal roadmaps and technical-debt pages instead.
Postponed
Votes
0
Found in
2019.4
2019.4.29f1
2020.3
2021.1
2021.2
2022.1
Issue ID
1362637
Regression
No
Crash on __pthread_kill when deserializing a field of a different type with the same name
Reproduction steps:
1. Open the user's attached project
2. Enter play mode, then exit play mode
3. Comment out line 6 in Data.cs
4. Uncomment line 7 in Data.cs (a hypothetical sketch of Data.cs follows the reproduction details below)
5. Enter play mode
Expected result: the Editor throws an error indicating that the serialized field's type has changed
Actual result: the Editor crashes
Reproducible with: 2019.4.30f1, 2020.3.18f1, 2021.1.20f1, 2021.2.0b11, 2022.1.0a8
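
For context, the report does not include the contents of Data.cs, but the resolution note below indicates that the field's type changes from an array to a single element while its name stays the same. A minimal hypothetical sketch of that kind of change, assuming a simple MonoBehaviour and an int field (neither is confirmed by the report), might look like this:

using UnityEngine;

// Hypothetical stand-in for the project's Data.cs; the class base type, field name,
// and element type are assumptions, not taken from the attached project.
public class Data : MonoBehaviour
{
    [SerializeField] private int[] values;   // original declaration: serialized as an array (commented out in step 3)
    //[SerializeField] private int values;   // replacement declaration: same name, single element (uncommented in step 4)
}

After the swap, loading data that was serialized with the array declaration is what triggers the crash described above.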
Stack trace:
Obtained 256 stack frames.
#0 0x007fff2044d92e in __pthread_kill
#1 0x007fff203d1411 in abort
#2 0x0000013a475826 in mono_log_write_logfile
#3 0x0000013a489830 in monoeg_g_logv
#4 0x0000013a4899d5 in monoeg_assertion_message
Resolution Note (2022.1.X):
Unity does not support loading existing assets after changing the type of a field from an array to a single element. Adding specific checks for that case to avoid the crash would slow down the loading path for all data, because the check is not quick to perform. We will continue to track this scenario internally as something to address in our upcoming feature and quality work on data migration support.