
I want to be able to install apps from alternative app stores like F-Droid and receive automatic updates, without requiring Google's authorization for app publication.

Manually installing an app via adb must, of course, be permitted. But that is not sufficient.

> Keeping users safe on Android is our top priority.

Google's mandatory verification is not about security, but about control (they want to forbid apps like ReVanced that could reduce their advertising revenue).

When SimpleMobileTools was sold to a shady company (https://news.ycombinator.com/item?id=38505229), the new owner was able to push any user-hostile changes they wanted to all users who had installed the original app through Google Play (that's the very reason why the initial app could be sold in the first place, to exploit a large, preexisting user base that had the initial version installed).

That was not the case on F-Droid, which blocked the new user-hostile version and recommended the open source fork (Fossify Apps). (see also this comment: https://news.ycombinator.com/item?id=45410805)





Yes, it's all about control. Control the platform. Control access to the platform, and the world is your oyster. And the political and legislative systems are their friends. It is the establishment.

The only way to fight is to indoctrinate the next generation, at home and in school, to use FOSS. People tend to stick with whatever they used in childhood. We software engineers should volunteer to give speeches to students about this. It is much easier to sell ideologies to young people while they are rebellious toward institutions.


I agree with you. But you do realize it's been like that for about 20 years now. It started with Microsoft (proprietary software), then Google (proprietary platform), now ChatGPT (proprietary knowledge).

And I tried to tell my kids. It mostly failed.

But in the long run (a decade), what is exceptional and proprietary will become common FOSS. And everybody will benefit.


I envision this as an ideology. We don't need every kid to follow it, and I don't expect the majority to follow. 1-2% is good enough. That's why giving speeches to teenagers might be the best bang for the buck. There are always kids who need to escape into some cool ideas, and it could be the idea of FOSS.

Really, it's probably the dumbass judge who told Google "the Apple App Store isn't anti-competitive because they don't allow any competitors on their platform" when Google asked why the Play Store was ruled a monopoly and the App Store wasn't.

I cannot think of a more detached and idiotic ruling than that.


US antitrust legislation punishes the abuse of monopoly power, not being a monopoly in itself. Google was found guilty of leveraging its dominant position on the platform to do just that.

On the other hand, in the US, Apple's App Store was not found to be a monopoly in the first place. Other cases about abuse of a dominant position also didn't go far.


Hmm, having read that, I am starting to sympathize with Google if they are going to be punished for being open.

No one seems to care that Apple has never allowed freedom on their devices. Even the comments here don't seem to mention it. Google was at least open for a while.

Or maybe no one mentions it just because the closed iPhone is a fait accompli at this point.


Perhaps because Apple never “promised” to be open. Google, by contrast, built itself up by playing the good guy and started to switch when money called, so those who chose them for that reason feel betrayed.

I guess they are going to say whatever proves their case. The legal system is highly... closed and shuts out laymen.

It's the JUDGE that came up with that reasoning.

Because that's the law, like it or not. Apple doesn't have a problem because the rules were the rules from day 1. Google did a bait and switch, legally.

What does antitrust law have to do with "day 1"? So if Ford and GM are both already in all 50 states and then they try to divide up territory between them, that's illegal, but at the point when there were still areas one of them wasn't in, they could publicly announce a contractual agreement to not enter into the other's territory? That seems not just questionable but actively bad policy with an enormous perverse incentive.

And if you're going to say this:

> Because that's the law, like it or not.

I would ask you to point me to the text in the statute requiring the courts to do that.


Yeah it's the judge.

I think you missed the point that judges aren't part of the legislative branch. They're in the judicial branch.

Please allow me to correct my bad English and replace it with "the law circle".

But the ruling is correct. You can't have it both ways: if you invite competition, you're not allowed to be anti-competitive. You can be Nintendo, offer a single store, only allow first party hardware, and exercise total control over your product. Then your anticompetitive behavior can only be evaluated externally. But if you open yourself up to internal competition with other phone vendors and other stores, and then you flex your other business units (GApps) to force those other vendors to favor you, then you're in big trouble.

> But the ruling is correct. You can't have it both ways: if you invite competition, you're not allowed to be anti-competitive

That's just stupid, because being anti-competitive is an emergent outcome, rather than anything specific.

Apple is definitely anti-competitive, but they exploited such a ruling so they can skirt it. Owning a platform that no other entrants are allowed onto is anti-competitive, whether you're small or large. It's only when you're large that you should become a target to be purged via antitrust laws. This allows small players to grow, but they always face the threat of purging; this makes them wary of taking too much advantage, which results in better consumer outcomes.


That's like Karcher opening a megamall to sell all their offerings (vacuums, pressure washers, floor washers, you name it), and then you, Bosch, complaining you can't sell your vacuum in Karcher's megamall where all the people go.

What are you even saying?

Whereas Google was letting Bosch sell vacuums in their megamall, but only if the vacuums use Google dust filters, people buy only Google-made dust filters, and Bosch isn't allowed to sell its own dust filters in the megamall.


It's like a company buying all the land within a 100 mile radius and then nominally "selling" plots to people but with terms of service attached that restrict what you can do with the land you bought and that allow the company to change the terms at any time. And then, after people have moved in, most of them having not even read the terms or realized it wasn't an ordinary sale, they start enforcing the terms against competitors. Which most people don't notice because they aren't competitors, and because the terms also prohibited anyone in the city from telling people what's going on[1]. Then people eventually notice and start to ask whether terms locking out competitors like that are an antitrust violation, and someone says that they're not because the people there agreed to them.

[1] https://som.yale.edu/sites/default/files/2022-01/DTH-Apple-n...

But how is an agreement prohibiting people from patronizing competitors not an antitrust violation? It's not a matter of who agreed to it, it's matter of what they're requiring you to agree to.


> nominally "selling" plots to people but with terms of service attached that restrict what you can do with the land you bought and that allow the company to change the terms at any time.

So, a lease.


That's, to begin with, not even how a lease generally works. A lease isn't where you pay once up front to take permanent possession of something.

Moreover, did people buying iPhones on "day 1" think they were buying them or leasing them? Did Apple call it a sale or a rental agreement?


> Karcher opening a megamall to sell all their offerings

And their mall is monopolistic if it is only for Karcher products. However, because a competitor can easily open a mall next door, it means this Karcher mall is small, and so the enforcers should leave it be. Until the day Karcher buys up all the mall space, in which case, they (regulators) start purging their mall monopoly.

The threat of being purged because you've acquired a large enough monopoly should _always_ be there. It's part of doing business in a fair environment.


> You can be Nintendo, offer a single store, only allow first party hardware, and exercise total control over your product.

How is this not even more anti-competitive?

It's fine to be mad at Google for being duplicitous, but treachery is a matter of false advertising or breach of contract. Antitrust is something else.

"You can monopolize the market as long as you commit to it from the start" seems like the text of the law a supervillain would be trying pass in order to destroy the world.


You can't monopolize a market where there is no market. Nintendo can be anticompetitive in the wider games industry, but there is no market for software that runs on a Switch.

I didn't say I liked the ruling, just that it's correct. The opposite conclusion would be absurd, that you can invent a market where there isn't one and claim a company has a monopoly over it. You would be asking the court to declare that every computing device is a de facto marketplace for software that could run on it and that you can't privilege any specific software vendor. I would love if that were true but you can hopefully agree that such a thing would be a huge stretch legally.


> You can't monopolize a market where there is no market. The opposite conclusion would be absurd, that you can invent a market where there isn't one and claim a company has a monopoly over it.

There is no such thing as "there is no market". There is always a market. The question is, what's in the market? The typical strategy is to do the opposite -- have Nintendo claim that they're competing with Sony and Microsoft in the same market to try to claim that it isn't a monopoly.

But then the question is, are they the same market? So to take some traditional examples, third party software that could run on MS-DOS could also run on non-Microsoft flavors of DOS. OS/2 could run software for Windows. The various POSIX-compliant versions of Unix and Linux could run the same software as one another. Samsung phones can run the same apps as Pixel phones. Which puts these things in the same market as each other, because they're actually substitutes, even though they're made by different companies.

Conversely, you can't run iOS apps on Android or get iOS apps from Google Play or vice versa. It's not because they're different companies -- both of them could support both if they wanted to -- it's that they choose not to and choices have consequences.

If you intentionally avoid competing in the same market as another company then you're not competing in the same market as that company and the absurdity is trying to have it both ways by doing that and then still wanting to claim them as a competitor.


You avoided the important part, there is no market for hardware that can play Nintendo Switch games and there is no market for software providers on Nintendo Switch. And they are legally allowed to do that. You can sell appliances that are bound to a single vendor and you are allowed to not license your hardware or software to 3rd parties.

Since that is a legally permissible action, it would be an odd thing for a court to declare that doing such a thing is anticompetitive. If they did they would be declaring all locked down hardware effectively illegal. And while that might be nice, it's a bit of a pipedream. Where Google fucked up is that they did license their software to 3rd parties (good for them). But then Google had some regrets and didn't like the fact that they didn't have control over those 3rd parties. They did have some leverage, though, in the form of Google Play and GMS, because users expect those to be there on every Android phone. And then they used that leverage. That's the fuckup. They used Google Play and GMS access to make 3rd parties preinstall Chrome and kill 3rd-party Android forks. They used anticompetitive practices on their competitors: other Android device manufacturers.

This situation can't occur for Apple or Nintendo because there aren't other iOS/Switch device manufacturers and they don't have to allow them to exist. They can be anticompetitive for other reasons but not this.


> You avoided the important part, there is no market for hardware that can play Nintendo Switch games and there is no market for software providers on Nintendo Switch.

There is a market for these things. Nintendo sells hardware that can play Nintendo Switch games and people buy it. That's a market.

It seems like you're trying to claim that a monopoly isn't a market, but how can that possibly be how antitrust laws work? Your argument is that they don't apply to something if it is a monopoly?

> And they are legally allowed to do that.

That's just assuming the conclusion. Why should it be legal for them to exclude competitors from selling software to their customers? Something so obviously anti-competitive should be a violation of any sane law prohibiting anti-competitive practices. The insanity is the number of people trying to defend the practice.

Consider what it implies. 20th century GE could have gone around buying houses, installing a GE electrical panel and then selling the houses with a covenant that no one could use a non-GE appliance in that house ever again, or plug in any device that runs on electricity without their permission. They could buy and sell half of all the housing stock in the country and Westinghouse the other half and each add that covenant and you're claiming it wouldn't be an antitrust violation.

Apple wouldn't have been able to get their start because they'd have needed permission from GE or Westinghouse for customers to plug in an Apple II or charge an iPhone and they wouldn't get it because those companies were selling mainframes or flip phones and wouldn't want the competition. If that's not an antitrust violation then we don't have antitrust laws.

> If they did they would be declaring all locked down hardware effectively illegal.

It's fine for hardware to be locked down by and with the specific permission of the person who owns it. But how is it even controversial for the manufacturer locking down hardware for the purpose of excluding competitors to be a violation of the laws against inhibiting competition? It's exactly the thing those laws are supposed to be prohibiting.


So basically you're saying we're fucked. People don't care about FOSS in general, let alone when their phone says it's dangerous.

If people cared about privacy as much as politics pretends they do, we'd have solved so many problems in society.

Fortunately, those fighting, albeit a minority, have done great work in protecting this. No reason to stop now.


Yeah, we are fucked, but as long as a small percentage of us, like 1% of the population, knows, understands, and agrees with the idea, I think we are fine.

We'll have to initiate a solid self-defense protocol though. I think the first thing we should do is get a new logical-fallacy term officially coined.

Stallman/StallManned: abusing the principles of the Slippery Slope to discredit perfectly rational predictions


Really difficult because you need to have two devices.

One mandated by the establishment and one mandated by vision and freedom.

But it would be a great start.

On my work laptop I am mandated to use Windows 11, but I run FOSS (and, when I have time, I develop it).


Imagine needing to agree to a TOS that can lock you out of your phone when they change or add some random new policy

I don't really see how you can both allow developers to update their apps automatically (which is widely promoted as being good security practice) and also defend against good developers turning bad.

How does Google know if someone has sold off their app? In most cases, F-Droid couldn't know either. A developer transferring their accounts and private keys to someone else is not easily detected.


> In most cases, F-Droid couldn't know either.

F-Droid is quite restrictive about what kinds of apps they accept: they build the apps from source code themselves, and the source code must be published under a FLOSS license. They have checks that have to pass for each new version of an app.

Although it's possible for a developer to transfer their accounts and private keys to someone shady, F-Droid's checks and open source requirements limit the damage the new developer can do.

https://f-droid.org/docs/Inclusion_Policy/

https://f-droid.org/docs/Anti-Features/


One thing worth noting: these checks and restrictions only apply if you're using the original F-Droid repository.

Many times I've seen the IzzyOnDroid repository recommended, but that repo explicitly gives you the APKs from the original developers, so you don't get these benefits.


That's true. The whole point of an open ecosystem is that you get to decide who you get your software from. You can decide on the official F-Droid repository and get the benefits and drawbacks of a strict open source rule with the F-Droid organization's curation if that's your preference. You can add other repositories with different curation if you prefer that.

You know what? That's bullshit.

Anybody slightly competent can put horrendous back doors into any code, in such a way that they will pass F-Droid's "checks", Apple's "checks", and Google's "checks". Source code is barely a speed bump. Behavioral tests are a joke.


Anyone determined enough can break into any house. If not through ingenuity, then with a brick through your window. That doesn't mean we shouldn't lock our doors, turn off our lights, and close our curtains anyway.

The fortunate thing is that 99% of people won't bother trying to break your app if it's not dead simple. Advanced security mechanisms to check for backdoors are probably something only billionaire tech companies need to worry about.


You totally misunderstand the threat model. It's not about anybody breaking your app. It's about people making their own apps do things they're not supposed to do.

... and there's always a tradeoff in terms of how much of a deterrent anything is. The app store checks are barely measurable.


The app store checks are barely measurable, yes. Hence being open source is the best check against any undocumented changes. Even if something isn't discovered on F-Droid, reports will come out from those who dig. It's much easier to view source code than to decompile an APK and analyze it.

But at some point there needs to be some level of trust in anything you install. You can't rely on institutions to make sure everything is squeaky clean. They can't even do that on content platforms (or at least, they choose not to afford it).


> In most cases, F-Droid couldn't know either. A developer transferring their accounts and private keys to someone else is not easily detected.

1. The Android OS does not allow installing app updates if the new APK uses a different signing key than the existing one. It will outright refuse, and this works locally on device. There's no need to ask some third-party server to verify anything. It's a fundamental part of how Android security works, and it has been like this since the very first Android phone was released.

2. F-Droid compiles all APKs on its store, and signs them with its own keys. Apps on F-Droid are not signed by the developers of those apps. They're signed by F-Droid, and thus can only be updated through and by F-Droid. F-Droid does not just distribute APKs uploaded by random people, it distributes APKs that F-Droid compiled themselves.

So to answer your question, a developer transferring their accounts/keys to someone else doesn't matter. It won't affect the security of F-Droid users, because those keys/accounts aren't used by F-Droid. The worst that can happen is that the new owner tries injecting malware into the source code, but F-Droid builds apps from source and is thus positioned to catch those types of things (which is more than can be said about Google's ability to police Google Play)
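Tangentially, anyone curious can see point 1 concretely on their own device: the signing identity Android compares on every update is queryable via PackageManager. A minimal Kotlin sketch, assuming API 28+ (the package name is just an example), that prints the SHA-256 digests of an installed app's signing certificates, which any future update must match:

    import android.content.pm.PackageManager
    import java.security.MessageDigest

    // Sketch (API 28+): print the SHA-256 digests of an installed app's signing
    // certificates. Android refuses any update whose certificates don't match
    // these, and the check happens entirely on-device.
    fun printSigningDigests(pm: PackageManager, pkg: String = "org.fdroid.fdroid") {
        val info = pm.getPackageInfo(pkg, PackageManager.GET_SIGNING_CERTIFICATES)
        val signers = info.signingInfo?.apkContentsSigners ?: return
        for (sig in signers) {
            val digest = MessageDigest.getInstance("SHA-256").digest(sig.toByteArray())
            println(digest.joinToString(":") { "%02X".format(it) })
        }
    }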

And finally,

> How does Google know if someone has sold off their app?

Google should not know anything about the business dealings of potential competitors. Google is a monopoly[1], so there is real risk for developers and their businesses if Google is given access to this kind of information.

[1]: https://www.google.com/search?q=is+google+a+monopoly%3F&udm=...


Android also has the feature of warning the user if an update comes from a different source than the one the app was installed from. This will happen even if the APKs have the same key. This reply isn't arguing against anything you've said; I'm just adding to the list of how Android handles updates.
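For completeness, the installer attribution behind that warning can be queried too; a small sketch, assuming API 30+ (exception handling for an unknown package omitted):

    import android.content.pm.PackageManager

    // Sketch (API 30+): ask Android which store or app installed a given package.
    // This attribution backs the "update is coming from a different source" warning.
    fun installerOf(pm: PackageManager, pkg: String): String? =
        pm.getInstallSourceInfo(pkg).installingPackageName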

> F-Droid compiles all APKs on its store, and signs them with its own keys. Apps on F-Droid are not signed by the developers of those apps. They're signed by F-Droid, and thus can only be updated through and by F-Droid. F-Droid does not just distribute APKs uploaded by random people, it distributes APKs that F-Droid compiled themselves.

For most programs I use, they just publish the developer's built (and signed) APK. They do their own build in parallel and ensure that the result is the same as the developer's build (thanks to reproducible builds), but they still end up distributing the developer's APK.
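The comparison idea is roughly: rebuild from source, then check that every entry outside the signature directory is byte-identical to the developer's APK. A toy Kotlin sketch of that check (just the concept, not F-Droid's actual verification tooling):

    import java.io.File
    import java.security.MessageDigest
    import java.util.zip.ZipFile

    // Toy sketch of reproducible-build verification: treat two APKs as equivalent
    // if every entry outside META-INF/ (where the signature lives) has identical
    // contents. Real tooling does more, but this is the core idea.
    fun entryDigests(apk: File): Map<String, String> {
        val digests = mutableMapOf<String, String>()
        ZipFile(apk).use { zip ->
            for (entry in zip.entries()) {
                if (entry.isDirectory || entry.name.startsWith("META-INF/")) continue
                val bytes = zip.getInputStream(entry).use { it.readBytes() }
                val sha = MessageDigest.getInstance("SHA-256").digest(bytes)
                digests[entry.name] = sha.joinToString("") { "%02x".format(it) }
            }
        }
        return digests
    }

    fun sameContents(developerApk: File, rebuiltApk: File): Boolean =
        entryDigests(developerApk) == entryDigests(rebuiltApk)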


Can you give some examples? I've heard that's a thing, but I'm not familiar with any apps that actually pull it off (reproducible builds are difficult to achieve)

Reproducible builds may be hard to achieve, but that doesn't mean you don't have a list of such builds long enough to crash your browser: https://verification.f-droid.org/verified.html

Weird to have a page like that if a human can't use it. Needs some pagination, f-droid!

It's like we're supposed to save the page and grep it or something. Doesn't work in my Firefox.


You have to trust somebody.

Who is F-Droid? Why should I trust them?

How do I know they aren’t infiltrated by TLAs (Three-Letter Agencies) or outright bad actors?

Didn’t F-Droid have 20 or so apps that contained known vulnerabilities back in 2022?

Who are all these people? Why should I trust them, and why do most of them have no link to a bio or repository, or otherwise no way to verify they are who they say they are and are doing what they claim to be doing in my best interests?

https://f-droid.org/en/about/


I trust them, at least a lot more than I do Google, which is a known bad actor and collaborator with "TLAs". F-Droid has been around for a very long time, if you didn't know. They've built and earned the trust people have in them today.

> Didn’t F-Droid have 20 or so apps that contained known vulnerabilities back in 2022?

Idk what specific incident you're referring to, but since they build apks themselves in an automated way, if a security patch to an app breaks the build, that needs to be fixed before the update can go out (by F-Droid volunteers, usually). In that case, F-Droid will warn about the app having known unpatched vulnerabilities.

Again, this is above and beyond what Google does in their store. Google Play probably has more malware apps than F-Droid has lines of code in its entire catalog.



Right, that's literally the team marking 12 apps as having known vulnerabilities (seems like it was because of a WebRTC vulnerability that was discovered). It's the F-Droid system working as intended to inform users about what they're installing.

You're calling it an incident like it was an attack or something, but it just seems like everyday software development. Google Play and the App Store don't let me know when apps have known vulnerabilities. I think F-Droid is coming out way ahead here.


So Google and Apple are already known to work with US government agencies. This was revealed in the Snowden leaks in 2013 and confirmed on multiple occasions since. Neither Google nor Apple tells you when apps you're downloading from their stores contain known vulnerabilities. We know for a fact that both Google Play and the App Store are filled with scams and malware: it's widely documented.

So to my reading F-Droid comes out ahead on every metric you've listed: It has no known associations with US government agencies. They do inform you when your apps have known vulnerabilities. I'm not aware of any cases of scams or malware being distributed through F-Droid.

I highly recommend it. It's the main store I've been using on my phone for probably more than a decade now.


Because you can literally verify every single step of what they do. That's the reason you can trust them.

You cannot apply this logic to almost anyone else. Apple, Google, etc. can only give you empty promises.


I understand your concern, though your suspicion is a little shortsighted. It can be personally dangerous to volunteer for projects that directly circumvent the control of the establishment.

> Who is F-Droid? Why should I trust them?

For the same reason you trust many things. They have a long track record of doing the right thing. As gaining a reputation for doing the wrong thing would more or less destroy them, they have a fair incentive to continue doing the right thing. It's a much better incentive than many random developers of small apps in Google's Play Store have.

However, that's not the only reason to trust them. They also follow a set of processes, starting with a long list of criteria for what apps they will accept: https://f-droid.org/docs/Inclusion_Policy/ That doesn't mean malware won't slip past them on occasion, but if you look at the amount of malware that slips past F-Droid and projects with similar policies, like Debian, and compare it to other app stores like Google's, Apple's, and Microsoft's, there is no comparison. Some malware slips past Debian's defences once every few years. I would not be surprised if new malware were uploaded to Google's app store every few minutes. The others aren't much better.

The net outcome of all that is that open source distribution platforms like F-Droid and Debian, which have procedures in place like tight acceptance policies and reproducible builds, are by a huge margin the most reliable and trustworthy on the planet right now. That isn't to say they are perfect, but rather that if Google's goal is to keep users safe, they should be doing everything in their power to protect and promote F-Droid.

> How do I know they aren’t infiltrated by TLAs (Three-Letter Agencies) or outright bad actors?

You don't know for sure, but F-Droid's policies make it possible to detect if a TLA did something nefarious. The combination of reproducible builds, open source, and open source's tendency to use source code management systems that provide an audit trail showing who changed every line shines a lot of sunlight into the area. Sunlight that the TLAs you're so paranoid about hate.

This is the one thing that puzzles me about F-Droid opposition in particular. Google is taking a small step here towards increasing the accountability of app developers. But a single person signing an app is in reality a very small step. There are likely tens if not hundreds of libraries underpinning it, developed by thousands of people. That single developer can't monitor them all, and consequently libraries with malware inserted from upstream repositories like NPM or PyPI regularly slip through. The transparency the open source movement mostly enforces is far greater. You can't even modify the amount of whitespace in a line without it being picked up by some version control system that records who did it, why they did it, and when. So F-Droid is complaining about a small increase in enforced transparency from Google, when they demand far, far more from their contributors.

I get that Google's change probably creates some paper-cuts for F-Droid, but I doubt it's something that can't be worked around if both sides collaborate. This blog post sounds like Google is moving in that direction. Hear, hear!


> They also follow a set of processes, starting with a long list of criteria for what apps they will accept

How is this an argument in favour of being able to run whatever software you want on hardware you own?


You can run any software you like on Android, if it's open source. You just compile it yourself, and sign it with the limited distribution signature the blog post mentions. Hell, I've never done it, but re-signing any APK with your own signature sounds like it should be feasible. If it is, you can run any APK you want on your own hardware.

Get a grip. Yes, it might be possible the world is out to get you. But it's also possible Google is trying to do exactly what it says on the tin: make the world a safer place for people who don't know shit from clay. In this particular case, if they are trying to restrict what a person with a modicum of skillz can do on their own phone, it's a piss-poor effort, so I'm inclined to think it's the latter. They aren't even removing the adb app upload hole.


>> In most cases, F-Droid couldn't know either. A developer transferring their accounts and private keys to someone else is not easily detected.

> 1. The Android OS does not allow installing app updates if the new APK uses a different signing key than the existing one. It will outright refuse, and this works locally on device

You missed the "and private keys" part of the original claim.


No I didn't. Finish reading the rest of the comment.

If an app updates to require new permissions, or to suddenly require network access, or if the owner's contact details change, Google Play should ideally stop that during the update review process and let users know. But that wouldn't be good for business.

An update can become malicious even without a change in permissions.

E.g. my QR reader, which is perfectly fine right now, already has access to the camera (obviously), media (to read a QR code from an image file or photo), and the network ("enhanced security": checking the URL on demand and showing an OG preview, etc., so I can make a more informed choice about opening the URL).

But it could now start sending all my photos off to train an LLM, secretly take pictures of the inside of my home, or start mining crypto or whatnot. Without me noticing.


See that's what the intent system was originally designed to prevent.

Your QR reader requires no media permission if it uses the standard file dialogs. Then it can only access files you select, during that session.

Similarly for the camera.

And in fact, it should have no network access whatsoever (and network should be a user controllable permission, as it used to be — the only reason that was removed is that people would block network access to block ads)
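A rough Kotlin sketch of that intent-based approach, using the AndroidX Activity Result API (the QR decoding helper is a hypothetical placeholder; the point is that the app declares neither storage nor camera permission):

    import android.net.Uri
    import androidx.activity.ComponentActivity
    import androidx.activity.result.contract.ActivityResultContracts

    // Sketch: let system UIs do the work, so the app needs no storage or camera
    // permission of its own. It only ever sees the file or photo the user picks.
    class ScanActivity : ComponentActivity() {

        // System file picker: access to the selected file only, for this session.
        private val pickImage =
            registerForActivityResult(ActivityResultContracts.GetContent()) { uri: Uri? ->
                uri?.let { decodeQr(it) } // hypothetical decoding helper
            }

        // Ask the camera app for a single photo; no CAMERA permission needed here.
        private val takePhoto =
            registerForActivityResult(ActivityResultContracts.TakePicturePreview()) { bitmap ->
                bitmap?.let { decodeQr(it) } // hypothetical decoding helper
            }

        fun onPickFileClicked() = pickImage.launch("image/*")
        fun onTakePhotoClicked() = takePhoto.launch(null)

        private fun decodeQr(source: Any) { /* hand off to a QR decoding library */ }
    }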


> And in fact, it should have no network access whatsoever (and network should be a user controllable permission, as it used to be — the only reason that was removed is that people would block network access to block ads)

Sure, a QR code scanner can work fine without network access. But it could also use the network to check a scanned URL against the "safe browsing API" or to pre-fetch the URL and show me a nice OG preview. You are correct to say you may not need or want this. But I and others may like such features.

The point is not to discuss whether a QR scanner should have network access, but to say that once a permission is there for obvious or correct reasons, it can in the future easily get abused for other purposes. Without changing the permissions.

My mail app needs network access. Nothing prohibits it from abusing this after an update to pull in ads or send telemetry to third parties. My sound recorder app needs microphone permission. Nothing prohibits it from "secretly" recording my conversations after an update (detectable, since an LED and icon will light up).

If you want to solve "app becoming malicious after an update", permissions aren't the tool. They are a tiny piece of that puzzle, but "better permissions" aren't the solution either. Nor is "better awareness of permissions by users".


> See that's what the intent system was originally designed to prevent.

> Your QR reader requires no media permission if it uses the standard file dialogs. Then it can only access files you select, during that session.

On the one hand, yes, good point, but it runs into the usual problem with strict sandboxing – it works for the simple default use case, but as soon as you want to do more advanced stuff, offer a nicer UI, etc. etc. it breaks down.

E.g. barcode scanners – yes, technically you could send a media capture intent to ask the camera app to capture a single photo without needing the camera permission yourself, but then you run into the problem that maybe the photo isn't suitable enough for successful barcode detection, so you have to ask the user to take another picture, and perhaps another, and another, and…

So much nicer to request the camera permission after all and then capture a live image stream and automatically re-run the detection algorithm until a code has been found.


>...or to suddenly require network access...

That's the most baffling thing to me. There is simply no option to remove network permissions from any app on my Pixel phone.

It's one of the reasons why I avoid using mobile apps whenever I can.


It's weird because GrapheneOS does have this. Networking is a permission on Android, but stock Android doesn't give you the setting.

I believe that permission is currently "leaky". The app can't access the network but it can use Google Play services to display ads.

I believe that would theoretically allow exfiltration of data but I don't understand all of the details behind this behavior and how far it goes.


Google wants 0 friction for apps to display ads.

So does Apple apparently.

What incentive is there for OEMs not to add this option though? Does Google refuse to verify their firmware if they offer this feature?

The network permission was displayed in the first versions of Android, then removed. I heard (hearsay alert) at the time that it was because so many apps needed it, and they wanted to get rid of always-yes questions. IIRC this happened before the rise of in-app advertising.

If people always answer yes, they grow tired and eventually don't notice the question. I've seen it happen with "do you want to overwrite the previous version of the document you're editing, which you saved two minutes ago?" At that point your question is just poisoning the well. Makes sense, but still, hearsay alert.


As far as I'm concerned they can grant this permission by default. I just want the power to disable it.

A while ago I wanted to scan the NFC chip in my passport. Obviously, I didn't want this information to leave my device.

There are many small utility apps and games that have no reason to require network access. So "need" is not quite the right word here. They _want_ network access and they _want_ to be able to bully users into granting it.

That's a weird justification for granting it by default. But I wouldn't care if I could disable it.


Android doesn't grant this by default, strictly speaking. Rather, an application can enable it by listing it in the application manifest. Most permissions require asking the user.

Did you find a suitable app? I don't really remember, but https://play.google.com/store/apps/details?id=com.nxp.taginf... might suit you.


I did find one but it was years ago so I don't remember.

It could have been easily solved by granting it by default, but I doubt that was the original intent.

Well, the original intent was to ask the user for permission at installation time, which turned out to be a poor idea after a while. Perhaps you mean that it would have been simple to change the API in some particular way while retaining compatibility with existing apps? If I remember the timeline correctly, which is far from certain, this happened around the same time Android passed 100k apps, so there was a fairly strong compatibility requirement.

I mean, just make it "granted" by default and give the user the ability to control it. The permissions API has already been broken a few times (e.g. Location for Bluetooth, and granular Files permissions).

> Does Google refuse to verify their firmware if they offer this feature?

If a manufacturer doesn't follow the Android CDD (https://source.android.com/docs/compatibility/cdd), Google will not allow them to bundle Google's closed source apps (which include the Google Play store). It was originally a measure to prevent fragmentation. I don't know whether this particular detail (not exposing this particular permission) is part of the CDD.


It's not explicitly part of the CDD, but it is implicitly. The device must support the Android permissions model and is only allowed to extend the implementation with its OWN permissions (in a different namespace than 'android'); it is not allowed to deviate from it.

INTERNET is a "normal permission", automatically granted at install time if declared in the manifest. OEMs cannot change the grant behavior without breaking compatibility because:

The CDD explicitly states that the Android security model must remain intact. Any deviation would fail CTS (Compatibility Test Suite) and prevent Play certification.
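You can even confirm that classification at runtime; a small Kotlin sketch, assuming API 28+ for the protection getter:

    import android.content.pm.PackageManager
    import android.content.pm.PermissionInfo

    // Sketch (API 28+): INTERNET reports PROTECTION_NORMAL, i.e. it is granted
    // automatically at install time when declared in the manifest, with no
    // user-facing toggle on stock Android.
    fun isInstallTimeGranted(
        pm: PackageManager,
        name: String = "android.permission.INTERNET"
    ): Boolean {
        val info = pm.getPermissionInfo(name, 0)
        return info.protection == PermissionInfo.PROTECTION_NORMAL
    }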


Well, apart from the OEM violating the Android Compatibility Definition Document (CDD), failing the Compatibility Test Suite (CTS), and thus not getting their device Play-certified (so not being able to preload all the Google services), there is an economic impact as well:

As an OEM you want carriers to sell your device above everything else, because they are able to sell large volumes.

Carriers make money on network traffic, and Google pays a revenue share for ads to carriers (and to OEMs of a certain size). Carriers measure this as part of the average revenue per user (ARPU).

--> The device would be designed to create less ARPU for the carrier and Google, and would thus be less attractive to the entire ecosystem.


It is solvable from user space.

E.g. TrackerControl (https://github.com/TrackerControl/tracker-control-android) can do it: it is a local VPN which sees which application is making a request and blocks it.

You can write your own version of it if you don't trust them.
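For reference, the mechanism these apps rely on is Android's VpnService with per-app routing. A bare-bones Kotlin sketch of the blocking idea (the target package is a placeholder; the service also has to be declared in the manifest and the user has to consent via VpnService.prepare(); real firewalls like TrackerControl or NetGuard additionally parse and selectively forward the traffic):

    import android.content.Intent
    import android.net.VpnService
    import android.os.ParcelFileDescriptor

    // Bare-bones sketch of per-app network blocking with VpnService: route all
    // traffic of the listed package into a tunnel that nothing ever reads or
    // forwards, so that app is effectively offline. Real firewall apps do much
    // more (parsing packets, forwarding allowed flows, DNS-based blocking).
    class BlockerVpnService : VpnService() {

        private var tun: ParcelFileDescriptor? = null

        override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
            tun = Builder()
                .setSession("per-app-blocker")
                .addAddress("10.0.0.2", 32)                    // dummy tunnel address
                .addRoute("0.0.0.0", 0)                        // capture all IPv4 traffic
                .addAllowedApplication("com.example.keyboard") // placeholder package
                .establish()
            return START_STICKY
        }

        override fun onDestroy() {
            tun?.close()
            super.onDestroy()
        }
    }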


I've been using a similar VPN solution. It works great for apps that absolutely should not be connected, like my keyboard. But it has an obvious downside: you can't use a real VPN on your phone while you're using it.

Some apps would use this for loopback addresses, which as far as I know still requires the network permission. The problem here is the permission system itself, because ironically Google Play is full of malicious software.

And neither Android nor iOS is safer than modern desktop systems. On the contrary, because leaking data is its own security issue.


Wasn't the loopback address recently used maliciously?

Yes. Facebook/Meta was using a locally hosted proxy to smuggle info back without using routes that are increasingly obstructed by things like ad blockers, if I recall correctly.

https://securityonline.info/androids-secret-tracking-meta-ya...

Search string for DDG: Meta proxy localhost data exfiltration


This is a huge problem in the Chrome Web Store and Google is doing very little about it. If you ever made an extension that is even a little popular, expect to get acquisition offers from people who want to add malicious features, anywhere from click fraud to residential-IP proxying or even password stealers.

Same for Play Store. I have 2 games and I keep getting offers all the time. The last one offered $2000 for the developer account or a $100 monthly rent.

From their email pitch:

> We’re now offering from $500 to $2000 for a one-time purchase of a developer account that includes apps, or a rental deal starting from $100.

> No hidden conditions — quick process, secure agreement, and immediate payment upon verification.

> We’re simply looking for reliable accounts to publish our client apps quickly, and yours could be a perfect match.


Indeed, an update can't be more malicious than the permissions allow it to be. If you have a calculator app with limited permissions, it is "safe" to allow the developer to update it. No danger in that.

But I don't think that is enough, or that it is the right model. In other cases, when the app already has dangerous permissions, auto-update should be a no-go.


> Indeed, an update can't be more malicious than the permissions allow it to be.

...in the absence of sandbox escape bugs.


> F-Droid couldn't know either

F-Droid is not just a repository and an organization providing the relevant services, but a community of like-minded *users* that report on and talk about such issues.


> which is widely promoted as being good security practice

Maybe that's the mistake right there?

It is a good practice only as long as you can trust the remote source for apps. Illustration: it is a good security practice for a Debian distro, not so much for a closed source phone app store.


OPEN SOURCE EVERYTHING is the premier solution.. again.

By using the distributor model, where a trusted 3rd party builds and distributes the apps. Like every Linux distro, or like what F-Droid does.

The point here is that app developers have to identify themselves. Google has no intention to verify the content of sideloaded apps, just that it is signed by a real person, for accountability.

They don't know if the person who signed the app is the developer, but should the app happen to be a scam and there is a police investigation, that is the person who will have to answer questions, like "who did you transfer these private keys to?".

This, according to Google and possibly regulators in countries where this will be implemented, will help combat a certain type of scam.

It shouldn't be a problem for YouTube Vanced, at least in the proposed form. The authors, who are already identified, just need to sign their APK. AFAIK, what they are doing is not illegal or they would have been shut down long ago. It may be a problem for others though, particularly F-Droid: because F-Droid recompiles apps, they can't reasonably be signed by the original author.

The F-Droid situation can resolve itself if F-Droid is allowed to sign the apps it publishes, and in fact, doing that is an improvement in security as it can be a guarantee that the APK you got is indeed the one compiled by F-Droid from publicly available source code.


APKs are already signed. Now Google requires that they be signed by a key which Google itself has verified. Which means they can selectively refuse to verify whichever keys are inconvenient to them.

> Google has no intention to verify the content of sideloaded apps, just that it is signed by a real person, for accountability.

for now


I still believe that signing binaries this way is always bullshit.

I stopped developing for mobile systems ages ago because it just isn't fun anymore and the devices are vastly more useless. As a user, I don't use apps anymore either.

But you can bet I won't ever id myself to Google as a dev.


> I don't really see how you can both allow developers to update their apps automatically (which is widely promoted as being good security practice) and also defend against good developers turning bad.

These are not compatible, but only because the first half is simply false. Allowing a developer to send updates is not "good" but "bad" security practice.


That's true in theory. But what you can see in practice is that Google does very little to protect its users, while F-Droid at least tries.

Which shows that the whole 'security' rigmarole by Google is bullshit.


This is a big problem with Chrome extensions and Google hasn't done anything about it there, so I don't think they actually care about it. I'm not actually sure how you would solve that problem even theoretically.

In many cases the developer's e-mail address changes, IP address changes, billing address changes, tax ID changes...

This exactly. Transferring ownership is a business transaction. Track that. If the new owner is trying to hide it, this is fraud, and should be dealt with in court.

To be fair, on Google Play you have the option to transfer the app to someone else's account. People don't need to trade accounts...

That doesn’t help mitigate the class of attack you responded to.

Quite simple: Actual human review that works with the developers.

But this costs money, and the lack of it is proof that Google doesn't really care about user security. They're just lying.


> without requiring Google's authorization for app publication.

Funnily enough, I am installing Google Drive for computers right now (macOS). I had to download a .pkg and basically sideload the app, which is not published on the App Store.

Why the double standard, dear Google?


>I had to download a .pkg and basically sideload the app, which is not published on the App Store

You mean install the app? The fact that Apple and Google wish to suggest that software from outside their gardens is somehow subnormal doesn't mean other people need to adopt their verbiage.


> You mean install the app?

Correct, I mean install the app.

Sideloading is the corporate jargon for "installing an app".


Probably because they require APIs which cannot be used when publishing to the App Store. The whole Microsoft Office suite is available in the macOS App Store, but Microsoft Teams must be downloaded from their website and cannot be installed via the App Store...

> Probably because they require APIs which cannot be used when publishing to the App Store

That's the funny part.

They do stuff they want to prohibit for other developers, because "safety".

But we all know that Google can do massively more harm than scammers pushing their scammy apps to a few hundred people.

For example, in today's news "Google hit with EU antitrust investigation into its spam policy".

There's a bit of irony in it and a lot of hypocrisy, IMO.


Bad example because that .pkg was probably signed with a developer certificate with approval from Apple - just as would be the case on Android in the future.

> > Keeping users safe on Android is our top priority.

Somebody tell them that I do not want to be kept safe by Big Brother.


Your personal data will be kept safe on our servers, citizen, whether you like it or not.

Enforcer, informing citizen on basic practices undermines citizen's delusion of being free. Please refer to room 22a for re-alignment and training.

> Your personal data will be kept safe on our servers, citizen, whether you like it or not.

... and our business partners. And app developers that grab your clipboard. And their business partners. And a few more levels of data brokers. The spi^H^H^H data-vacuum must flow


The EU did more by mandating 5 years of updates…

And of course, code signing can't protect you from such a thing. When software publishing rights get bought, so (usually) do the signing keys.

Curation (and even patching) by independent, third-party volunteers with strong value commitments does protect users from this (and many other things). Code signing is still helpful for F/OSS distributions of software, but the truth is that most of the security measures related to app installation serve primarily to solve problems with proprietary app markets like Google's Play Store and Apple's App Store. Same thing with app sandboxing.

It's unfortunate but predictable when powerful corporations taint genuine security features (like anti-tampering measures, built-in encryption devices, code signing, sandboxing, malware scanning, etc.) by using them as instruments of control to subdue their competitors and their own users.


The entire SimpleMobileTools situation left such a bad taste in my mouth. No upfront communication; it had to be discovered in a GitHub issue thread after people started asking questions.

It was shady as fuck on Kaputa's part, especially given that ZipoApps is an Israeli adware company, a.k.a. a surveillance company. Given Israel's track record with things like using Pegasus against journalists/activists or blowing up civilian-owned beepers, this should automatically be a major security incident and be treated at least as seriously as the TikTok debacle.

Kaputa should be extremely ashamed of himself and ousted from the industry. I and many others would have gladly paid a yearly subscription for continued updates of the suite instead of a one-time fee, but instead of openly discussing such a model with his user base, he went for the dirtiest money he could find.


If "automatic updates" were optional and off-by-default then users would not be vulnerable to something like SimpleMobileTools

Why not let the user decide

Letting someone else decide has potential consequences

Using F-Droid app ("automatic updates") is optional, as it should be

"Automatic updates" is another way of saying "allow somone else to remotely install software on this computer"

Some computer owners might not want that. It's their decision to make

I disable internet access to all apps by default, including system apps

When source code is provided I can remove internet access before compilation

Anyway, the entire OS is "user-hostile" requiring constant vigilance

It's controlled by an online ad services company

Surveillance as a business


> If "automatic updates" were optional and off-by-default then users would not be vulnerable to something like SimpleMobileTools

The problem is the vast majority of users want this on by default; they don't want to be bothered with looking at every update and deciding if they should update or not.


The vast majority of users want their apps to work. They don't care whether that happens through automatic updates or not.

It's the developers who don't want the headache of not having automatic updates.


"Automatic updates" is "remote code execution (RCE)" by permission

Given the frequent complaints about the former, the notion of "permission" is dubious


> I want to be able to install apps from alternative app stores like F-Droid and receive automatic updates

That's actually possible, though app stores need to implement the modern API, which F-Droid doesn't seem to do quite well (the basic version of F-Droid (https://f-droid.org/eu/packages/org.fdroid.basic/) seems to do better). Updating from different sources (e.g. downloading Signal from Google Play and then updating it from F-Droid, or vice versa) also causes issues. But plain old alternative app stores can auto-update in the background. Could be something added in a relatively recent version of Android, though.
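If it helps anyone, I believe the modern path is PackageInstaller sessions with user-action-free updates for apps the store itself installed. A rough Kotlin sketch under that assumption (API 31+, the store holds the install/update permissions it needs; result handling via the status PendingIntent is omitted and the broadcast action is a placeholder):

    import android.app.PendingIntent
    import android.content.Context
    import android.content.Intent
    import android.content.pm.PackageInstaller
    import java.io.File

    // Rough sketch of a third-party store pushing an app update in the background.
    // Assumes API 31+ and that this store originally installed the app and holds
    // the install/update permissions it needs; error handling is omitted.
    fun pushUpdate(context: Context, apk: File, packageName: String) {
        val installer = context.packageManager.packageInstaller
        val params = PackageInstaller.SessionParams(
            PackageInstaller.SessionParams.MODE_FULL_INSTALL
        ).apply {
            setAppPackageName(packageName)
            setRequireUserAction(PackageInstaller.SessionParams.USER_ACTION_NOT_REQUIRED)
        }
        val sessionId = installer.createSession(params)
        installer.openSession(sessionId).use { session ->
            session.openWrite("base.apk", 0L, apk.length()).use { out ->
                apk.inputStream().use { it.copyTo(out) }
                session.fsync(out)
            }
            val status = PendingIntent.getBroadcast(
                context, sessionId, Intent("org.example.INSTALL_STATUS"),
                PendingIntent.FLAG_MUTABLE
            )
            session.commit(status.intentSender)
        }
    }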

If this Verified bullshit makes it through, I expect open source Android development to slowly die off. Especially for smaller hobbyist-made apps.



