Characters like 'é' or 'î' are not resolved by getEvents(). For example,
getEvents("é") returns null.
However, it is possible to decompose them. For example,
getEvents("\u0301e") returns the events generating "é".
Thank you Philippe! ;)
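A minimal sketch of the lookup (assuming the events are resolved
against the virtual keyboard KeyCharacterMap):

    import android.view.KeyCharacterMap;
    import android.view.KeyEvent;

    static KeyEvent[] getEventsFor(String text) {
        KeyCharacterMap charMap =
                KeyCharacterMap.load(KeyCharacterMap.VIRTUAL_KEYBOARD);
        return charMap.getEvents(text.toCharArray());
    }

    // getEventsFor("é")        -> null (precomposed char not resolved)
    // getEventsFor("\u0301e")  -> the events generating "é"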
In handleEvent(), connection.receiveControlEvent() can never return
null: either it returns a valid ControlEvent, or it throws an
Exception.
Therefore, there is no need to propagate a flag to indicate whether it
returned a valid ControlEvent.
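A sketch of the resulting control flow (getType() and the inject*()
helpers are illustrative names, not the actual ones):

    // receiveControlEvent() either returns a valid ControlEvent or
    // throws, so no null check or boolean flag is needed
    private void handleEvent() throws IOException {
        ControlEvent controlEvent = connection.receiveControlEvent();
        switch (controlEvent.getType()) {
            case ControlEvent.TYPE_KEYCODE:
                injectKeycode(controlEvent);
                break;
            case ControlEvent.TYPE_MOUSE:
                injectMouse(controlEvent);
                break;
            default:
                // ignore unknown event types
                break;
        }
    }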
On some devices, we can reuse the same codec and display, but on some
others (e.g. Nexus 5X with Android 7.1.2), it crashes on codec.stop()
with an IllegalStateException.
Therefore, always recreate the codec and display, so that it works on
all devices.
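A sketch of the resulting loop (createCodec(), createDisplay(),
destroyDisplay() and encode() are assumed helpers; encode() is assumed
to return true when the screen rotated and encoding must restart):

    boolean again;
    do {
        MediaCodec codec = createCodec();            // fresh codec
        Surface surface = codec.createInputSurface();
        IBinder display = createDisplay(surface);    // fresh display
        codec.start();
        try {
            again = encode(codec);
        } finally {
            codec.stop();            // this codec is never reused
            destroyDisplay(display);
            codec.release();
            surface.release();
        }
    } while (again);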
Right-click is almost useless on Android, so use it to turn the
screen on.
Add a new control event type (command) to request the server to turn the
screen on.
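As a sketch, the server could react to such a command by injecting a
wake-up key (the constant names and the injectKeyEvent() helper are
illustrative):

    private void executeCommand(ControlEvent controlEvent) {
        if (controlEvent.getAction() == ControlEvent.COMMAND_SCREEN_ON) {
            // turn the screen on by simulating a wake-up key press
            injectKeyEvent(new KeyEvent(KeyEvent.ACTION_DOWN,
                    KeyEvent.KEYCODE_WAKEUP));
            injectKeyEvent(new KeyEvent(KeyEvent.ACTION_UP,
                    KeyEvent.KEYCODE_WAKEUP));
        }
    }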
Replace screenrecord execution by manual screen encoding using the
MediaCodec API.
The "screenrecord" solution had several drawbacks:
- screenrecord output is buffered, so tiny frames may not be accessible
immediately;
- it did not output a frame until the surface changed, leading to a
black screen on start;
- it is limited to 3 minutes of recording, so it needed to be restarted;
- screenrecord added black borders in the video when the requested
  dimensions did not preserve the aspect ratio exactly (sometimes
  unavoidable, since video dimensions must be multiples of 8);
- rotation handling was hacky (killing the process and starting a new
one).
Handling the encoding manually solves all these problems.
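An illustrative sketch of the new approach (mirrorScreenToSurface()
and streamToClient() are assumed helpers, the former standing for the
hidden virtual display calls; error handling omitted):

    void streamScreen(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);

        MediaCodec codec =
                MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface surface = codec.createInputSurface();
        mirrorScreenToSurface(surface); // render the display into the surface
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) == 0) {
            int index = codec.dequeueOutputBuffer(info, -1); // blocking
            if (index >= 0) {
                ByteBuffer buffer = codec.getOutputBuffer(index);
                streamToClient(buffer, info); // raw H.264 over the socket
                codec.releaseOutputBuffer(index, false);
            }
        }

        codec.stop();
        codec.release();
        surface.release();
    }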
Accept a parameter to limit the video size.
For instance, with "-m 960", the greater side of the video will be scaled
down to 960 (if necessary), while the other side will be scaled down so
that the aspect ratio is preserved. Both dimensions must be a multiple
of 8, so black bands might be added, and the mouse positions must be
computed accordingly.
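A sketch of the size computation (rounding each dimension down to a
multiple of 8; the names are illustrative):

    static int[] computeVideoSize(int deviceWidth, int deviceHeight,
                                  int maxSize) {
        int w = deviceWidth & ~7;   // in case the device size itself
        int h = deviceHeight & ~7;  // is not a multiple of 8
        if (maxSize > 0) {
            boolean portrait = h > w;
            int major = portrait ? h : w;
            int minor = portrait ? w : h;
            if (major > maxSize) {
                // scale the minor side, rounded down to a multiple of 8
                minor = (minor * maxSize / major) & ~7;
                major = maxSize;
            }
            w = portrait ? minor : major;
            h = portrait ? major : minor;
        }
        return new int[] {w, h};
    }

For example, a 1440x2560 device with "-m 960" would give a 536x960
video (540 rounded down to 536), hence the possible black bands.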
The video screen size on the client may differ from the real device
screen size (e.g. the video stream may be scaled down). As a
consequence, mouse events must be scaled to match the real device
coordinates.
For this purpose, make the client send the video screen size along with
the absolute pointer location, and the server scale the location to
match the real device size before injecting mouse events.
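A sketch of the scaling on the server side (the parameter names are
illustrative):

    import android.graphics.Point;

    static Point scaleToDevice(int x, int y,
                               int clientWidth, int clientHeight,
                               int deviceWidth, int deviceHeight) {
        // map the absolute position received from the client to the
        // real device coordinates
        return new Point(x * deviceWidth / clientWidth,
                         y * deviceHeight / clientHeight);
    }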
Currently, we only use screen information (width, height, rotation)
once at initialization, to send the device size to the client.
To be able to scale mouse events, make it accessible in memory. For this
purpose, replace the "static" DeviceUtil with a singleton Device, and
update it on every screen rotation.
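A sketch of the singleton (ScreenInfo, computeScreenInfo() and
withRotation() are illustrative names):

    public final class Device {

        private static final Device INSTANCE = new Device();

        private ScreenInfo screenInfo; // width, height, rotation

        private Device() {
            screenInfo = computeScreenInfo(); // initial query
        }

        public static Device getInstance() {
            return INSTANCE;
        }

        public synchronized ScreenInfo getScreenInfo() {
            return screenInfo;
        }

        // to be called on every screen rotation
        public synchronized void setRotation(int rotation) {
            screenInfo = screenInfo.withRotation(rotation);
        }
    }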
To control the device from the computer:
- retrieve mouse and keyboard SDL events;
- convert them to Android events;
- serialize them;
- send them on the same socket used by the video stream (but in the
opposite direction);
- deserialize the events on the Android side;
- inject them using the InputManager.
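As a sketch of the (de)serialization step for a keycode event, with an
illustrative binary layout (shown in Java on both sides for brevity;
ControlEvent.createKeycodeControlEvent() is an assumed factory):

    // client side: serialize the converted event
    static void writeKeycodeEvent(DataOutputStream out, int action,
                                  int keycode, int metaState)
            throws IOException {
        out.writeByte(ControlEvent.TYPE_KEYCODE);
        out.writeByte(action);
        out.writeInt(keycode);
        out.writeInt(metaState);
    }

    // server side: deserialize it back into a ControlEvent (the type
    // byte is assumed to have been read already, to dispatch on it)
    static ControlEvent readKeycodeEvent(DataInputStream in)
            throws IOException {
        int action = in.readUnsignedByte();
        int keycode = in.readInt();
        int metaState = in.readInt();
        return ControlEvent.createKeycodeControlEvent(action, keycode,
                metaState);
    }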
Move the DeviceUtil internal static classes to public classes, in a
separate package (".wrappers").
This paves the way for implementing InputManager properly.