"adb reverse" currently does not work over tcpip (i.e. on a device
connected by "adb connect"):
<https://issuetracker.google.com/issues/37066218>
To work around the problem, if the call to "adb reverse" fails, then
fall back to "adb forward", and reverse the client/server roles.
Keep the "adb reverse" mode as the default because it does not involve
connection retries: when using "adb forward", the client must repeatedly
try to connect until the server listens.
Due to the tunnel, every connect() will succeed even if the server is
not listening yet, so the client must attempt to read() to detect a
connection failure. For this purpose, when using the "adb forward" mode,
the server initially writes a dummy byte, which the client reads.
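
As an illustration only, a minimal Java sketch of the server side of
this handshake, assuming a plain TCP ServerSocket and a placeholder port
(not the actual scrcpy transport or class names):

    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Sketch: accept a connection and immediately write one dummy byte, so
    // that the client can detect, by reading it, whether the tunnel actually
    // reaches a listening server. Socket type and port are placeholders.
    public class ForwardHandshakeSketch {
        public static void main(String[] args) throws IOException {
            try (ServerSocket server = new ServerSocket(27183);
                 Socket socket = server.accept()) {
                OutputStream out = socket.getOutputStream();
                out.write(0); // dummy byte: receiving it proves the connection is valid
                out.flush();
                // ... then stream the video on the same socket
            }
        }
    }
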
Fixes <https://github.com/Genymobile/scrcpy/issues/5>.
The codec only supports dimensions which are multiple of 8.
Thus, when --max-size is specified, the value is always rounded down to
the nearest multiple of 8.
However, it was wrongly assumed that the physical screen size is always
a multiple of 8. To support devices for which it is not, also round down
the physical screen dimensions.
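
Rounding down to a multiple of 8 amounts to clearing the three low-order
bits; a trivial Java sketch (illustrative names):

    public class RoundSketch {
        // Clear the three low-order bits to round down to a multiple of 8
        // (valid because 8 is a power of two).
        static int roundDownTo8(int value) {
            return value & ~7;
        }

        public static void main(String[] args) {
            System.out.println(roundDownTo8(1080)); // 1080 (already a multiple of 8)
            System.out.println(roundDownTo8(1662)); // 1656
        }
    }
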
Fixes <https://github.com/Genymobile/scrcpy/issues/39>.
The text input control_event was initially designed for mapping
SDL_TextInputEvent, limited to 32 characters.
For simplicity, the copy/paste feature was implemented using the same
control_event: it just sends the text to paste.
However, the pasted text might have a length breaking some assumptions:
- on the client, the event max-size was smaller than the text
max-length,
- on the server, the raw buffer storing the events was smaller than the
max event size.
Fix these inconsistencies, and encode the length on 2 bytes, to accept
more than 256 characters.
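
For illustration, a self-contained Java sketch of such an encoding,
assuming a hypothetical layout [type: 1 byte][length: 2 bytes][UTF-8
bytes]; the actual scrcpy wire format may differ:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;

    // Hypothetical layout: [type: 1 byte][length: 2 bytes][UTF-8 bytes].
    // A 2-byte length accepts text up to 65535 bytes instead of 255.
    public class TextEventSketch {
        static final byte TYPE_TEXT = 1; // placeholder value

        static byte[] serialize(String text) throws IOException {
            byte[] utf8 = text.getBytes(StandardCharsets.UTF_8);
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bos);
            out.writeByte(TYPE_TEXT);
            out.writeShort(utf8.length); // 2-byte big-endian length
            out.write(utf8);
            return bos.toByteArray();
        }

        static String deserialize(byte[] packet) throws IOException {
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(packet));
            in.readByte(); // type
            int length = in.readUnsignedShort(); // read the 2-byte length back, unsigned
            byte[] utf8 = new byte[length];
            in.readFully(utf8);
            return new String(utf8, StandardCharsets.UTF_8);
        }

        public static void main(String[] args) throws IOException {
            String text = "some pasted text, possibly longer than 255 bytes...";
            System.out.println(deserialize(serialize(text)));
        }
    }
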
Fixes <https://github.com/Genymobile/scrcpy/issues/10>.
Paste computer clipboard to the device on Ctrl+v.
The other direction (pasting the device clipboard to the computer) is
not implemented. It would require a communication channel from the
device to the computer, other than the socket used by the video stream.
No exception was thrown on EOF, so the event controller did not
terminate. This led to a further InvocationTargetException.
Instead, terminate the event controller on EOF, so that the process
terminates properly.
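
A minimal Java sketch of the idea (placeholder names, not the actual
scrcpy classes): convert the -1 end-of-stream return value into an
EOFException so that the loop ends:

    import java.io.EOFException;
    import java.io.IOException;
    import java.io.InputStream;

    // Sketch: signal end-of-stream with an exception instead of looping on
    // incomplete data, so the event loop terminates cleanly.
    public class ControlChannelSketch {
        private final InputStream input;

        public ControlChannelSketch(InputStream input) {
            this.input = input;
        }

        public int receive(byte[] buffer) throws IOException {
            int r = input.read(buffer);
            if (r == -1) {
                // the client closed the connection; propagate as an exception
                throw new EOFException("Control socket closed");
            }
            return r;
        }

        public void loop(byte[] buffer) throws IOException {
            while (true) {
                receive(buffer); // throws EOFException on EOF, ending the loop
                // ... decode and handle the control event
            }
        }
    }
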
Expose a 'prebuilt_server' option to pass the path of the prebuilt
binary, so that the build does not require the Android SDK.
Usage:
meson builddir -Dprebuilt_server=/tmp/my_prebuilt_server.jar
The custom target used to invoke Gradle from Meson should always be
built; otherwise, the server would not be rebuilt on source changes.
However, when enabling "build_always", gradle is invoked as root on
"sudo ninja install" after "ninja", so it downloads the whole Gradle
world into /root/.gradle.
To avoid the problem, just do not call gradle if the effective user id
is 0.
The client was built with Meson and the server with Gradle, with both
builds driven by a Makefile.
Add a Meson script for the server (which delegates to Gradle), and a
parent script to build and install both the client and the server to the
system, typically with:
meson --buildtype release build
cd build
ninja
sudo ninja install
In addition, use a separate Makefile to build a "portable" version of
the application (where the client expects the server to be in the
current directory). Typically:
make release-portable
cd dist/scrcpy
./scrcpy
This is especially useful for Windows builds, which are not "installed".
Characters like 'é' or 'î' are not resolved by getEvents(). For example,
getEvents("é") returns null.
However, it is possible to decompose them. For example,
getEvents("\u0301e") returns the events generating "é".
Thank you Philippe! ;)
In handleEvent(), connection.receiveControlEvent() may never return
null: either it returns a valid ControlEvent, or it throws an
Exception.
Therefore, there is no need to propagate a flag to indicate whether it
returned a valid ControlEvent.
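
A minimal sketch of the resulting control flow (placeholder types): no
flag and no null check, since receiveControlEvent() either returns a
valid event or throws:

    import java.io.IOException;

    // Sketch: the loop body handles a valid event or lets the exception
    // propagate; no boolean flag is needed.
    public class EventLoopSketch {
        static class ControlEvent {
        }

        interface Connection {
            ControlEvent receiveControlEvent() throws IOException; // valid or throws
        }

        static void control(Connection connection) throws IOException {
            while (true) {
                handleEvent(connection.receiveControlEvent());
            }
        }

        static void handleEvent(ControlEvent event) {
            // ... dispatch on the event type
        }
    }
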
On some devices, we can reuse the same codec and display, but on some
others (e.g. Nexus 5X with Android 7.1.2), it crashes on codec.stop()
with an IllegalStateException.
Therefore, always recreate the codec and display, so that it works on
all devices.
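
A Java sketch of the "always recreate" strategy: one fresh MediaCodec
instance per streaming session, released at the end, rather than
stopping and restarting the same instance (format setup elided):

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import java.io.IOException;

    // Sketch: a fresh encoder per session (e.g. per rotation), never reused.
    public class RecreateCodecSketch {
        static void encodeSession(MediaFormat format) throws IOException {
            MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            try {
                codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
                codec.start();
                // ... encode until rotation or disconnection ...
                codec.stop();
            } finally {
                codec.release(); // discard the instance; the next session creates a new one
            }
        }
    }
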
The right-click is almost useless on Android, so use it to turn the
screen on.
Add a new control event type (command) to request the server to turn the
screen on.
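
A sketch of how the server could handle this command; the constant value
is a placeholder, injectKeyEvent() stands for whatever injection
mechanism the server uses, and waking the device with KEYCODE_WAKEUP is
an assumption for illustration:

    import android.view.KeyEvent;

    // Sketch: handle a "command" control event; COMMAND_SCREEN_ON is a
    // placeholder, and injecting KEYCODE_WAKEUP is only one possible way to
    // turn the screen on.
    public class CommandEventSketch {
        interface Injector {
            boolean injectKeyEvent(KeyEvent event);
        }

        static final int COMMAND_SCREEN_ON = 0; // placeholder value

        static void executeCommand(int command, Injector injector) {
            if (command == COMMAND_SCREEN_ON) {
                injector.injectKeyEvent(new KeyEvent(KeyEvent.ACTION_DOWN, KeyEvent.KEYCODE_WAKEUP));
                injector.injectKeyEvent(new KeyEvent(KeyEvent.ACTION_UP, KeyEvent.KEYCODE_WAKEUP));
            }
        }
    }
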
Replace screenrecord execution by manual screen encoding using the
MediaCodec API.
The "screenrecord" solution had several drawbacks:
- screenrecord output is buffered, so tiny frames may not be accessible
immediately;
- it did not output a frame until the surface changed, leading to a
black screen on start;
- it is limited to 3 minutes of recording, so it needed to be restarted;
- screenrecord added black borders in the video when the requested
dimensions did not preserve aspect-ratio exactly (sometimes
unavoidable since video dimensions must be multiple of 8);
- rotation handling was hacky (killing the process and starting a new
one).
Handling the encoding manually solves all these problems.
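
A condensed Java sketch of the approach: configure an H.264 encoder for
surface input, drain its output buffers and write the encoded packets to
the socket. Bitrate and framerate values are illustrative, and the step
that mirrors the display onto the encoder's input surface (done with
platform-private display APIs) is elided:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.ByteBuffer;

    // Sketch: encode the screen with MediaCodec and stream the raw H.264
    // output; the display-mirroring step is elided.
    public class ScreenEncoderSketch {
        static void encode(int width, int height, OutputStream output) throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);

            MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface surface = codec.createInputSurface();
            // ... mirror the device display onto "surface" (elided) ...
            codec.start();

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            boolean eos = false;
            while (!eos) {
                int index = codec.dequeueOutputBuffer(info, -1);
                if (index >= 0) {
                    eos = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
                    ByteBuffer buffer = codec.getOutputBuffer(index);
                    byte[] chunk = new byte[info.size];
                    buffer.get(chunk);
                    output.write(chunk); // send the encoded packet to the client
                    codec.releaseOutputBuffer(index, false);
                }
            }

            codec.stop();
            codec.release();
            surface.release();
        }
    }
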
Accept a parameter to limit the video size.
For instance, with "-m 960", the greater side of the video will be
scaled down to 960 (if necessary), while the other side will be scaled
down so that the aspect ratio is preserved. Both dimensions must be
multiples of 8, so black bands might be added, and the mouse positions
must be computed accordingly.
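
An illustrative Java sketch of such a computation (not the exact scrcpy
code): bound the greater side by the requested maximum, scale the other
side to keep the aspect ratio, and round both down to multiples of 8:

    // Sketch of the size computation; rounding to multiples of 8 may
    // slightly distort the ratio, hence the possible black bands.
    public class VideoSizeSketch {
        static int[] computeVideoSize(int deviceWidth, int deviceHeight, int maxSize) {
            int w = deviceWidth & ~7;   // round down to multiples of 8
            int h = deviceHeight & ~7;
            if (maxSize > 0) {
                boolean portrait = h > w;
                int major = portrait ? h : w;
                int minor = portrait ? w : h;
                if (major > maxSize) {
                    minor = (minor * maxSize / major) & ~7; // keep the ratio, then round
                    major = maxSize & ~7;
                }
                w = portrait ? minor : major;
                h = portrait ? major : minor;
            }
            return new int[] {w, h};
        }

        public static void main(String[] args) {
            int[] size = computeVideoSize(1080, 1920, 960);
            System.out.println(size[0] + "x" + size[1]); // 536x960 with "-m 960"
        }
    }
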
The video screen size on the client may differ from the real device
screen size (e.g. the video stream may be scaled down). As a
consequence, mouse events must be scaled to match the real device
coordinates.
For this purpose, make the client send the video screen size along with
the absolute pointer location, and the server scale the location to
match the real device size before injecting mouse events.
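
A trivial Java sketch of this scaling on the server side (illustrative
names):

    import android.graphics.Point;

    // Sketch: map a position expressed in the video coordinate space to the
    // real device resolution before injecting the event.
    public class PositionScalingSketch {
        static Point scale(int x, int y, int videoWidth, int videoHeight,
                           int deviceWidth, int deviceHeight) {
            int scaledX = x * deviceWidth / videoWidth;
            int scaledY = y * deviceHeight / videoHeight;
            return new Point(scaledX, scaledY);
        }
    }
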
Currently, we only use screen information (width, height, rotation)
once at initialization, to send the device size to the client.
To be able to scale mouse events, make it accessible in memory. For this
purpose, replace the "static" DeviceUtil with a singleton Device, and
update it on every screen rotation.
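
A simplified sketch of what such a singleton could look like (not the
actual scrcpy class; how the screen info is initially retrieved and
refreshed is elided):

    // Sketch: a process-wide instance caching the screen information,
    // refreshed on rotation, so that event scaling can query it at any time.
    public final class Device {
        public static final class ScreenInfo {
            public final int width;
            public final int height;
            public final int rotation;

            public ScreenInfo(int width, int height, int rotation) {
                this.width = width;
                this.height = height;
                this.rotation = rotation;
            }
        }

        private static final Device INSTANCE = new Device();
        private ScreenInfo screenInfo;

        private Device() {
            // in practice, read the initial size and rotation from the system here
        }

        public static Device getInstance() {
            return INSTANCE;
        }

        public synchronized ScreenInfo getScreenInfo() {
            return screenInfo;
        }

        // called whenever a rotation is detected
        public synchronized void setScreenInfo(ScreenInfo screenInfo) {
            this.screenInfo = screenInfo;
        }
    }
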
To control the device from the computer:
- retrieve mouse and keyboard SDL events;
- convert them to Android events;
- serialize them;
- send them on the same socket used by the video stream (but in the
opposite direction);
- deserialize the events on the Android side;
- inject them using the InputManager.
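
As a sketch of the last step in this list: injecting a deserialized
InputEvent through InputManager. injectInputEvent() is not part of the
public SDK, so it is invoked by reflection here; the injection mode
value and the way the InputManager instance is obtained are
simplifications:

    import android.hardware.input.InputManager;
    import android.view.InputEvent;
    import java.lang.reflect.Method;

    // Sketch: call the hidden InputManager.injectInputEvent(InputEvent, int)
    // by reflection; mode 0 stands for asynchronous injection.
    public class InputInjectorSketch {
        private final InputManager inputManager;
        private final Method injectMethod;

        public InputInjectorSketch(InputManager inputManager) throws NoSuchMethodException {
            this.inputManager = inputManager;
            this.injectMethod = InputManager.class.getMethod("injectInputEvent", InputEvent.class, int.class);
        }

        public boolean inject(InputEvent event) {
            try {
                return (Boolean) injectMethod.invoke(inputManager, event, 0);
            } catch (ReflectiveOperationException e) {
                return false;
            }
        }
    }
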