I want to send keyboard and mouse events from a small ARMv7 computer board to a remote one, which has neither a keyboard nor a mouse. I'm planning to send these events over a locally implemented CAN bus. The controlling board has a touch screen that is detected as a touchpad. Support for a hardware keyboard is planned, but typically there will be none, only a virtual keyboard.
The context: both boards are to be combined as a master/slave ensemble in a multimedia installation. The one that runs the multimedia platform (typically, but not limited to, Kodi) is the slave and has no keyboard and no mouse; it should receive its input events from the CAN bus. The master (controlling) board is also connected to a small display, which plays the role of a keyboard and touchpad when the slave multimedia board is turned on. The cross-development toolchain is Gentoo Linux.
So I was wondering about the most straightforward way to send local keyboard and mouse (touchpad) events to the remote host, given that I'm no kernel developer.
For instance, I could figure out how to use netcat to send local keyboard/mouse events (from /dev/input/*) to a remote machine, but as far as I understand there must be a keyboard and mouse plugged into the remote machine... unless there is some kind of dummy driver available that I could use on the remote end. (That said, I tried it and it didn't work.)
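For reference, a minimal sketch of what the sending side amounts to is shown below. It assumes a plain TCP socket standing in for the eventual CAN transport, an arbitrary port (5555), and that the touchpad shows up as /dev/input/event0; all of these are placeholders. It simply reads whole `struct input_event` records from the device node and forwards them verbatim, which is essentially what piping the device through netcat does:

```c
/* Sender sketch: forward raw evdev events over a socket.
 * Device path, peer address and port are assumptions, not fixed values. */
#include <arpa/inet.h>
#include <fcntl.h>
#include <linux/input.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    /* Local input device; the actual event node varies per board. */
    int in = open("/dev/input/event0", O_RDONLY);
    if (in < 0) { perror("open input device"); return 1; }

    /* Plain TCP stands in for the CAN transport here. */
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in peer;
    memset(&peer, 0, sizeof peer);
    peer.sin_family = AF_INET;
    peer.sin_port = htons(5555);                        /* arbitrary port */
    inet_pton(AF_INET, "192.168.1.2", &peer.sin_addr);  /* remote board (assumed) */
    if (connect(sock, (struct sockaddr *)&peer, sizeof peer) < 0) {
        perror("connect");
        return 1;
    }

    /* evdev delivers whole struct input_event records; forward them as-is. */
    struct input_event ev;
    while (read(in, &ev, sizeof ev) == sizeof ev)
        if (write(sock, &ev, sizeof ev) != (ssize_t)sizeof ev)
            break;

    close(sock);
    close(in);
    return 0;
}
```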
I'm not against kernel input module development, for instance, but I'd rather combine user-space applications if possible, or develop one if necessary. I would also prefer sending "raw" hardware events rather than assuming there's a graphical engine such as Xorg or Wayland running on the remote board. Unless that's discouraged, of course.
So in the end I'm wondering whether I should either:

- hack lirc and add CAN support, for instance, or
- hack the Linux input drivers.
If there's a more hassle-free way, I'm all open.
uinput allows for injecting input events directly into the event queue on the remote host.
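On the receiving side, a rough sketch of that uinput approach could look like the following. It uses the same placeholder TCP transport and port as the sender sketch above; the virtual device name and the key/relative-axis capabilities it declares are illustrative. It creates a virtual input device via /dev/uinput and replays the forwarded events into it, so no physical keyboard or mouse needs to be plugged in:

```c
/* Receiver sketch: create a virtual input device with uinput and replay
 * forwarded struct input_event records into it. Port and device name are
 * assumptions; error handling is deliberately minimal. */
#include <arpa/inet.h>
#include <fcntl.h>
#include <linux/input.h>
#include <linux/uinput.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    int ui = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (ui < 0) { perror("open /dev/uinput"); return 1; }

    /* Declare the event types and codes the virtual device may emit. */
    ioctl(ui, UI_SET_EVBIT, EV_KEY);
    ioctl(ui, UI_SET_EVBIT, EV_REL);
    ioctl(ui, UI_SET_EVBIT, EV_SYN);
    for (int code = 0; code < KEY_MAX; code++)
        ioctl(ui, UI_SET_KEYBIT, code);      /* all keys and mouse buttons */
    ioctl(ui, UI_SET_RELBIT, REL_X);
    ioctl(ui, UI_SET_RELBIT, REL_Y);

    /* Register the virtual device (legacy uinput_user_dev API). */
    struct uinput_user_dev dev;
    memset(&dev, 0, sizeof dev);
    snprintf(dev.name, UINPUT_MAX_NAME_SIZE, "can-remote-input"); /* arbitrary name */
    dev.id.bustype = BUS_VIRTUAL;
    write(ui, &dev, sizeof dev);
    ioctl(ui, UI_DEV_CREATE);

    /* Accept the forwarded event stream; TCP stands in for the CAN link. */
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(5555);             /* must match the sender */
    bind(srv, (struct sockaddr *)&addr, sizeof addr);
    listen(srv, 1);
    int conn = accept(srv, NULL, NULL);

    /* Replay each received event into the virtual device. */
    struct input_event ev;
    while (read(conn, &ev, sizeof ev) == (ssize_t)sizeof ev)
        write(ui, &ev, sizeof ev);

    ioctl(ui, UI_DEV_DESTROY);
    close(ui);
    return 0;
}
```

Since the forwarded stream already contains the EV_SYN/SYN_REPORT events emitted by the real device, replaying records unchanged is enough for the kernel on the slave board to see them as ordinary keyboard/mouse input.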