Scrcpy 2.2 adds Android camera mirroring (github.com/genymobile)
138 points by rom1v on Nov 4, 2023 | hide | past | favorite | 34 comments


Scrcpy is awesome, and one of those rare pieces of software that has a clear, powerful thing to do and does it so well you forget it's even there. Essentially you get the feeling that this is how it should work.

But at the same time I find it a bit of a sad example of how much effort and corner-case fiddling such a conceptually simple task takes.

Caricaturing here, but it is essentially just blitting pixels over a network connection. I don't think it should be that hard a task with a more reasonable underlying stack. Plan 9 (and probably many other sadly forgotten systems) showed how something like scrcpy can fall out more or less automatically from a better-designed system.

Maybe I'm underestimating some fundamental complications, but I can't help but feel something has gone very wrong with how we do things with computers. Maybe something like worse is better on piles of previous worse is better.


Quick math on blitting pixels:

- let's assume 1920x1080: 2,073,600 pixels.

- 4 bytes per pixel: 2,073,600 * 4 = 8,294,400 bytes ≈ 8.3 MB/frame.

- 8.3 MB/frame * 60 fps ≈ 498 MB/s ≈ 4 Gbps, an order of magnitude higher than even your average techie's entire internet pipe. That's for 1080p. The market baseline for resolution, 4K, quadruples that, and 240 fps (market baseline for gamers) adds another 4x: ~64 Gbps.

As long as you're willing to accept intellectually that _sometimes_ there will be video-like content played, where more than 10% of the pixels change in a frame, you now _need_ to assume some form of compression algorithm is applied on every frame, i.e. do video encoding.
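The arithmetic above is easy to sanity-check. A quick sketch in Python (the 8 Mbps comparison point is scrcpy's documented default target video bitrate; treat the exact ratio as illustrative):

```python
# Sanity-checking the back-of-the-envelope numbers above
# (raw RGBA frames, no compression).

def raw_bitrate_gbps(width, height, bytes_per_pixel, fps):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

raw_1080p60 = raw_bitrate_gbps(1920, 1080, 4, 60)   # ~3.98 Gbps
raw_4k60    = raw_bitrate_gbps(3840, 2160, 4, 60)   # ~15.9 Gbps
raw_4k240   = raw_bitrate_gbps(3840, 2160, 4, 240)  # ~63.7 Gbps

# scrcpy's default H.264 target is about 8 Mbps for a 1080p-ish
# stream, so the encoder buys back roughly a factor of 500.
compression_ratio = raw_1080p60 * 1000 / 8  # Gbps -> Mbps, then / 8 Mbps
```

So even at 1080p60, raw blitting is hundreds of times over what a typical uplink can carry, which is why encoding is unavoidable rather than an optimization.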


This is why VNC has been the solution for this for many years, and still remains so.

Scrcpy is different because it does not require you to install anything on the phone. It uses the built-in debugging tools (which you just need to activate in a few clicks).


>Scrcpy is different because it does not require you to install anything on the phone.

When you run scrcpy, it copies over a server app which runs on the device. The debugging tools are not used for the screen capture itself.


Do you know why, when using WiFi, it has to send data over the ADB TCP connection rather than self-managed UDP?


Yep, exactly. Sending video over the network with low latency is not a trivial task. There’s a reason we have video and image formats that do heavy compression and inter-frame analysis to reduce bandwidth.


What share of scrcpy's LoC is the actual encoding and decoding that saves the bits?


I think you're taking for granted just how performance-intensive this is. For example, "just a blit" is a full screen's worth of pixel operations on a hardware platform where pixel draws come at a premium.


Of course there are a lot of implementation details that need to be done for the bits to move and the pixels to light up.

But the scrcpy code is not about this; it's all hidden behind the libraries and in the hardware. What scrcpy has to do is micromanage and hack the APIs on both sides for all of this to happen in the guts of the systems.

I don't see a reason why all this couldn't be done with something like /dev/remote/window < /dev/screen

Of course there would be a lot of magic happening at the implementation level, but it's not like you type in all the HTTP protocol and negotiate the compression algorithm manually when you go to a website.


I don't understand this comment. It's a marvel of course how far we have come, but what about it is so intensive today?

The (old!) Apple iPad cable runs BSD Unix and does real-time video encoding.


Correct, you're violently agreeing, _they embed a whole *NIX computer in the cable to make it happen_.

They were replying to someone asking why everyone makes it so complicated.

Refer to my comment for the math on why it's intensive.


Are you violently agreeing with me now? :-)


It did video encoding because the cable wasn’t capable of streaming full quality video at the required bandwidth. Not because they wanted to make a complicated dongle.


What part of streaming a screen on Android do you find complex? You pretty much just set up MediaCodec and let it do its thing. For the input to the encoder you create a VirtualDisplay that mirrors the default Display and is backed by the encoder's input Surface. For the output of the encoder you can just send the output buffers over the network.

The PC side isn't complex either. You take those packets from the phone and run them through a decoder that draws the frames to a texture, which you can show in a window.
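For a feel of how thin this pipeline can be, here's a hedged sketch of the "encoder on the phone, decoder window on the PC" shape the comment describes, using Android's stock `screenrecord` tool piped over adb into ffplay. This assumes `adb` and `ffplay` are on your PATH and a recent Android build; it's far less capable than scrcpy's own server (time-limited recording, no input injection), so it's an illustration of the data flow, not a replacement:

```python
# Minimal screen-mirror pipeline: the phone-side encoder emits a raw
# H.264 elementary stream over ADB; the PC side just pipes it into a
# decoder window. Assumed tools: adb, ffplay.
import subprocess

PHONE_CMD = [
    "adb", "exec-out",           # run a command on the device, raw binary stdout
    "screenrecord",
    "--output-format=h264",      # elementary H.264 stream, no MP4 container
    "-",                         # write to stdout instead of a file
]

PC_CMD = [
    "ffplay",
    "-f", "h264",                # tell ffplay what format stdin carries
    "-fflags", "nobuffer",       # reduce decoder-side buffering latency
    "-i", "-",                   # read from stdin
]

def mirror():
    """Wire the phone-side encoder output into the PC-side decoder window."""
    phone = subprocess.Popen(PHONE_CMD, stdout=subprocess.PIPE)
    subprocess.run(PC_CMD, stdin=phone.stdout)

# Call mirror() with a device attached to start the stream.
```

Everything scrcpy adds on top (input injection, rotation handling, audio, clipboard, OTG) lives around this core loop, not inside it.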


If it's so simple, why does it take 30k or so lines of code for scrcpy to do it?


Because there are a lot of features beyond just streaming a screen capture, it supports many versions of Android, and it is a polished utility.

The actual screen-recording part is simple, since you are just using existing Android APIs to hook up the display server to a media encoder and the media encoder to the network.

https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786...


That is reasonable-ish as video APIs go. But it's quite a small part of all scrcpy has to do to mirror the screen over the network.

Most of that code would be covered with something like "gst-launch pipewiresrc ! encodebin", and the whole mirroring by adding a few elements.

Well, in theory at least; gstreamer has a habit of getting quite fiddly. But I don't see any fundamental problem why this too couldn't work with simple composable tools.


Bandwidth.


Android 14 was originally supposed to add webcam functionality (IIRC it added an "act as a webcam" option when plugging the phone in via USB in the Betas). The speculation is that although it was removed from the final release, it might be added in a December update.


Does that mean effectively it will only be available for devices that release with Android 15 (ignoring Pixels)?

Edit: I have a OnePlus Nord N30 5G that released with Android 13. Is it eligible for this feature? IIRC OnePlus said it will only give this budget phone one (1) version upgrade, so wondering if a December feature drop counts as a second upgrade… :(

Maybe won’t even matter if this requires specialized hardware :(


Maybe by that time there'll be a LineageOS ROM for your phone so you aren't tied to OnePlus's limited support policy.



It's live and working in the QPR1 beta, so it'll roll out to stable in December.


This software just saved my bacon!

I dropped my phone and the screen broke. The touchscreen worked, but no backlight.

Luckily I had enabled Android debugging. It took a bit of trial and error but I managed to unlock the phone and then accept the adb key from my laptop.

After that, scrcpy let me grab all my files and transfer everything to my new phone.

If you have an Android device, take a few minutes to get scrcpy set up. It just might let you recover from a dead screen some day.


Yes, it is quite good for rescuing data etc. because in a lot of cases it is the screen that fails.

In cases like yours where the backlight dies, it often helps to shine a flashlight on the screen; it will usually become more intelligible.

One thing to keep in mind is that Android will revoke saved debug authorizations after 7 days or so, meaning you have to accept the connection in a popup again. You can turn that off in the developer options, and then you won't need any interaction on the device to start scrcpy (as long as it is a computer you've used before with adb).


I know my device can do HDMI out and USB at the same time over the type-c port, so that's my main plan if I end up with a destroyed screen.

I also have KDE Connect installed on everything, but Android will remove its permissions if I don't use it often enough.


> I also have KDE Connect installed on everything, but Android will remove its permissions if I don't use it often enough.

KDE Connect is also great! You can prevent Android from revoking permissions under the app info menu. It's labeled "Pause app activity if unused."


Such a fine piece of software.


Absolutely, it was a blast to use this in school to control the smartboards where you could enable adb on them. And it worked so well, I could write on the smartboard with my touchscreen laptop.


Which camera app will that use? Can I make it use Vector Camera[1] instead of built-in? So that the picture would be modified by the app, rather than go 'raw' from the camera?

1 - https://f-droid.org/en/packages/com.dozingcatsoftware.vector...


It calls the Android camera2 API directly (without any app).


For that you can use normal screen mirroring and open the camera app of your choice.


Would love to see official Mac binaries along with the Windows ones, that are up to date, etc.




