Fixed in: 4.5.X
Votes: 0
Found in: 4.3.2f1
Issue ID: 594081
Regression: No
Input.inputString doesn't work properly with Unicode characters
This issue has no description.
Alloc
Jan 15, 2015 13:51
Aside from the documentation being incorrect if Input.inputString is assumed to accept Unicode, this issue is in fact not resolved: it only works on Windows. On macOS and Linux, only ASCII characters are reported.
Also see this Unity Answers entry: http://answers.unity3d.com/questions/232610/non-ascii-characters-in-inputinputstring-in-osx.html
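For context, Input.inputString is typically polled once per frame in Update(). The following is a minimal sketch (a hypothetical probe component, not part of the original report) that logs each typed character with its code point; attached to any GameObject, it makes the reported ASCII-only behavior observable by comparing what is logged on Windows versus macOS/Linux:

```csharp
using UnityEngine;

// Hypothetical probe: attach to any GameObject, then type into the Game view.
// If Unicode input were delivered correctly, characters such as "é" or "日"
// would be logged with their full code points; per this report, on macOS and
// Linux only code points below 128 (ASCII) come through.
public class InputStringProbe : MonoBehaviour
{
    void Update()
    {
        foreach (char c in Input.inputString)
        {
            // Backspace and newline are also delivered via inputString;
            // skip them so the log shows only printable characters.
            if (c == '\b' || c == '\n' || c == '\r')
                continue;

            Debug.Log(string.Format("char: '{0}' (U+{1:X4})", c, (int)c));
        }
    }
}
```

Note that Input.inputString is emptied and refilled each frame, so it must be read every Update; characters not consumed in a given frame are lost.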