Status: Won't Fix (Won't Fix in 1.8.X)

Votes: 0

Found in [Package]: 1.8.0-pre.2

Issue ID: ISXB-792

Regression: No

EventSystem does not receive mouse input when the input is fetched via an MQTT service (for working with a remote mouse setup)

Package: Input System

Reproduction steps:
1. Set up a local MQTT broker, e.g. Mosquitto (you might need to add its installation path to the system environment variables)
2. Run the Windows Command Prompt (cmd) as administrator
3. Start the Mosquitto service by entering: “sc start mosquitto”
4. Unzip the “TouchInterface.zip” archive and launch the “Start_TouchInterface.bat” file
5. In the cmd, subscribe to the MQTT topic by entering: “mosquitto_sub -t "touchinterface/mouse"”
6. Open the reporter-attached “2_UnityRemoteTouchRepo” project
7. Open the “SampleScene”
8. Enter Play mode
9. Try clicking the UI buttons in the Game view

Expected result: The UI buttons respond to the mouse clicks
Actual result: The UI buttons do not respond to the mouse clicks

Reproducible with: 1.7.0 (2021.3.35f1, 2022.3.20f1, 2023.2.12f1), 1.8.0-pre.2 (2021.3.35f1, 2022.3.20f1, 2023.2.12f1)
Could not test with: 2023.3.0b9 (Package import errors)

Reproduced on: Windows 11 Pro (23H2)
Not reproduced on: No other environment tested

Notes:
- Reproducible in the Player
- Input debugger log messages in the Console show the mouse input. The input data is directly translated to the indicator on the screen (the indicator is red while it receives move events and turns green as soon as the left mouse button is pressed)
- The customer is currently working on a "remote mouse" setup that fetches HID mouse data from a mouse and sends it to the Unity Editor via MQTT. The Unity Editor receives the mouse position data and button presses, but the UI elements cannot be interacted with

  1. Resolution Note:

    This appears to be a bug in the customer's project, i.e. user error. The UI Event System does receive pointer input events from MQTT and will execute them on the UI.

    The problem is with this code in VirtualMouse.cs:

    GameObject indicator = new GameObject();
    indicator.name = "HRP_Mouse_Indicator";
    indicator.transform.SetParent(referenceCanvas.transform); // <-- problem
    mouseImage = indicator.AddComponent<Image>();
    mouseImage.color = Color.red;

    Because the indicator GameObject is parented to the Canvas, it is targeted by the Event System for pointer input, and since its position is always set to the current pointer position, it intercepts all pointer events and blocks them from reaching the UI. If the indicator object is removed from the Canvas hierarchy, the indicator visual won't display properly, but you can interact with the UI.
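
    For illustration, a minimal sketch of an alternative workaround (not code from the project, and assuming the same fields and usings as the snippet above, i.e. UnityEngine.UI.Image, referenceCanvas and mouseImage): keep the indicator parented to the Canvas so it still renders, but mark the Image as a non-raycast target so the GraphicRaycaster skips it and clicks reach the UI underneath.

    GameObject indicator = new GameObject("HRP_Mouse_Indicator");
    indicator.transform.SetParent(referenceCanvas.transform, false); // still rendered by the Canvas
    mouseImage = indicator.AddComponent<Image>();
    mouseImage.color = Color.red;
    mouseImage.raycastTarget = false; // EventSystem raycasts ignore the indicator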

    Another (potential) issue with the project is around mouse position: the indicator doesn't match the position of the Windows mouse. This occurs because the received MQTT coordinates are in global Windows screen space and need to be scaled to the rendering viewport. Note that VirtualMouse does invert the y-position (since the Unity screen-space y-axis increases going up), but the mouse position isn't scaled. This probably doesn't matter for remote mouse input, but when running locally it makes it difficult to interact with the UI controls.
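
    A rough sketch of the kind of scaling described above (assumed, not taken from the project): mapping a global Windows desktop coordinate (with y growing downward) into Unity screen space, assuming the rendering viewport covers the full primary display; if it doesn't, an additional offset would also be needed. The names rawX/rawY are hypothetical placeholders for the values received over MQTT.

    Vector2 ToUnityScreen(float rawX, float rawY)
    {
        float scaleX = (float)Screen.width  / Display.main.systemWidth;
        float scaleY = (float)Screen.height / Display.main.systemHeight;
        return new Vector2(rawX * scaleX,                  // scale horizontally into the viewport
                           Screen.height - rawY * scaleY); // scale vertically and flip y (Unity y grows upward)
    }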

  2. Resolution Note (1.8.X):

    This appears to be an issue within the project and not a bug in the InputSystem.

    I verified that the Mouse InputState events created by VirtualMouse and "injected" into the Input System via InputState.Change() are in fact processed and forwarded to the UI Input Module. I also verified that the UI events are properly dispatched and executed.
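
    For reference, a minimal sketch (assumed, not the customer's code) of the kind of injection described above: building a MouseState and applying it to the existing Mouse device with InputState.Change(), after which the UI Input Module picks up the new state on its next update.

    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.LowLevel;

    public static class RemoteMouseInjection
    {
        // unityScreenPosition must already be converted to Unity screen space
        public static void Inject(Vector2 unityScreenPosition, bool leftPressed)
        {
            if (Mouse.current == null)
                return; // no mouse device available

            var state = new MouseState { position = unityScreenPosition }
                .WithButton(MouseButton.Left, leftPressed);

            // Applies the state directly to the device; change monitors
            // (including the UI Input Module's) are notified.
            InputState.Change(Mouse.current, state);
        }
    }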

    The reason the UI doesn't respond is that the VirtualMouse creates an "indicator" GameObject that is parented to the UI Canvas, and all of the UI events are dispatched to it, i.e. it's blocking input from reaching the UI controls.

    The problem is with this code in VirtualMouse.cs:

    GameObject indicator = new GameObject();
    indicator.name = "HRP_Mouse_Indicator";
    indicator.transform.SetParent(referenceCanvas.transform); // <-- problem
    mouseImage = indicator.AddComponent<Image>();
    mouseImage.color = Color.red;

    Because the indicator GameObject is parented to the Canvas, it is targeted by the Event System for pointer input, and since its position is always set to the current pointer position, it intercepts all pointer events and blocks them from reaching the UI. If the indicator object is removed from the Canvas hierarchy, the indicator visual won't display properly, but you can interact with the UI.

    Another (potential) issue with the project is around mouse position: the indicator doesn't match the position of the Windows mouse. This occurs because the received MQTT coordinates are in global Windows screen space and need to be scaled to the rendering viewport. Note that VirtualMouse does invert the y-position (since the Unity screen-space y-axis increases going up), but the mouse position isn't scaled.

    I'm guessing this is expected, and doesn't matter for remote mouse input, but I wanted to call it out just in case.
