
I want to send keyboard and mouse events from a small ARMv7 computer board to a remote one, which has neither keyboard nor mouse. I'm planning to send these events over a locally implemented CAN bus. The controlling board has a touch screen that is detected as a touchpad. Support for a hardware keyboard is planned, but typically there is none and only a virtual keyboard is available.


The context: both boards shall be combined as a master/slave ensemble in a multimedia installation. The one that runs the multimedia platform (typically, but not limited to, Kodi) is the slave and has no keyboard and no mouse; it should receive its input events from the CAN bus. The master (controlling) board is also connected to a small display, which plays the role of a keyboard and touchpad when the slave multimedia board is turned on. The cross-development toolchain is Gentoo Linux.


So I was wondering about the most straightforward way to send local keyboard and mouse (touchpad) events to the remote host, given that I'm no kernel developer.

For instance, I could imagine using netcat to send local keyboard/mouse events (from /dev/input/*) to a remote machine, but as far as I understand there must be a keyboard and mouse plugged into the remote machine... unless there is some kind of dummy driver available that I could use on the remote end. (That said, I tried it and it didn't work.)
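For concreteness, a minimal sketch of that netcat experiment (device path, host name and port are placeholders, not taken from the question):

    # Sender: stream raw struct input_event data from the local device
    cat /dev/input/event0 | nc remote-host 9999

The missing piece is the receiving end: with no physical keyboard or mouse attached, there is no /dev/input/eventX node to feed the stream back into, which is exactly the gap a virtual-device facility such as uinput (mentioned in the comments below) fills.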

I'm not against kernel input module development, for instance, but I'd rather combine user-space applications if possible, or develop one if necessary. I would also prefer sending "raw" hardware events rather than assuming there's a graphical engine such as Xorg or Wayland running on the remote board. Unless that is discouraged, of course.

So in the end I'm wondering if I should either

  • hack lirc and add CAN support, for instance, or
  • hack the Linux input drivers.

If there's a more hassle-free way, I'm open to it.

  • Do/can you have an X server on both boards? If so, x2x might be a solution. – Gilles 'SO- stop being evil' Mar 24 '15 at 23:28
  • It is indeed planned. However, I prefer not to rely on the presence of X, as I'd like the solution to be agnostic of the graphical environment. Further investigation led me to uinput, which looks like exactly what I'm looking for. I can manage to find a way to send input events to a remote host, and uinput allows for injecting them directly into the event queue on the remote host (see the sketch below). –  Mar 25 '15 at 10:12
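For reference, uinput is exposed as /dev/uinput once kernel support is present; a quick availability check on the remote board might look like:

    # Load the uinput module (or build the kernel with CONFIG_INPUT_UINPUT=y)
    sudo modprobe uinput
    ls -l /dev/uinput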

1 Answer


I just did this yesterday with the Interception Tools framework (gitlab.com/interception).

Install is easy:

  • Install the handful of prerequisite packages mentioned there (plus pkgconf)
  • Clone the repo
  • Run cmake with the options given in the README (sketched below).
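A sketch of those steps (the repository path follows the gitlab.com/interception link above; use whatever cmake options the README currently gives):

    # Clone and build the Interception Tools
    git clone https://gitlab.com/interception/linux/tools.git interception-tools
    cd interception-tools
    cmake -B build -DCMAKE_BUILD_TYPE=Release
    cmake --build build
    # build/ then contains intercept, mux, uinput and udevmon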

Quick POC:

On the source computer (the one with the keyboard):

    src/interception-tools/build/intercept -g /dev/input/by-id/usb-Logitech_USB_Receiver-if02-event-mouse | nc des.ti.nat.ion 9876

On the destination machine (the one you want to "beam" events to):

    nc -l -p 9876 | src/interception-tools/build/uinput

If the destination machine has no other input sources, you may need to export the keyboard-specific YAML:

uinput -p -d /dev/input/by-id/my-kbd prints my-kbd's characteristics in YAML, which can then be fed back to uinput as uinput -c my-kbd.yaml:
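Putting that together (the by-id path is a placeholder; the YAML is generated on the machine that has the physical keyboard, then copied over):

    # On the source machine: dump the keyboard's characteristics as YAML
    uinput -p -d /dev/input/by-id/my-kbd > my-kbd.yaml

    # On the destination, after copying my-kbd.yaml over: create the
    # virtual device from it while injecting the streamed events
    nc -l -p 9876 | uinput -c my-kbd.yaml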

This was on Debian bullseye, between an ARM64 machine on kernel 5.13 and an AMD64 machine on kernel 5.10.

If this works for you, you still have to integrate that into your system startup files, and then it will be automatic and seamless.
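As a sketch of that integration on a systemd-based system such as the Debian bullseye setup above (the unit name, install paths, port and YAML location are all assumptions):

    # /etc/systemd/system/input-beam.service (hypothetical name), destination side
    [Unit]
    Description=Inject remote input events via uinput
    After=network.target

    [Service]
    ExecStart=/bin/sh -c 'nc -l -p 9876 | /usr/local/bin/uinput -c /etc/my-kbd.yaml'
    Restart=always

    [Install]
    WantedBy=multi-user.target

Enable it with systemctl enable --now input-beam.service; the source side gets a matching unit running the intercept | nc pipeline.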