Hacker News | unsungNovelty's comments

It's Flarum for me - https://flarum.org/

Really good forum software.


I'm not even close to OK with Facebook. But none of the other companies do this, and Mark has been open about it. I remember him saying the same very openly in an interview. There's something oddly respectable about NOT sugar-coating it with good PR and marketing. Unlike OpenAI.

I don't think it's open source. It says SAM license. Most likely source-available.

Joey: I used Thesaurus.

Chandler: On every word???

https://www.youtube.com/watch?v=n_ch8GWnJzQ&t=56



And every swear in the world goes to the punk(s) who programmed Google sign-in, which asks for your phone number for 2FA so that they can confirm I'm ME!

Why is there a strange download link with 3 points on the front page of HN? The author has just 3 points as well.

I'm a long-time lurker who has only recently started posting. The archives themselves are just JSON files. I'll post an article next time with what's here, so that it isn't just Dropbox links.

Sure. But you do understand how it looks, right? Nothing against you.

It's my device. Not Google's. Imagine being told which NPM/pip packages you can install from your terminal.

Also, it's not SIDE loading. It's installing an app.


Well... it would be good if this were true, but read the ToS and it looks more like a licence to use than "ownership", sadly :(

"Android" is really a lot of different code but most of it is the Apache license or the GPL. Google Play has its own ToS, but why should that have to do with anything when you're not using it?

Google doesn't own AOSP. We don't need any Google apps on an Android phone for it to function.

I agree, but I don't see why Google gets more critical attention than the iPhone or Xbox.

The iPhone has always been that way (try installing an .ipa file that's not signed with a valid Apple developer certificate). For Google, forced app verification is a major change. Xbox I don't know.

Yeah, let's ask the Debian team about installing packages from third party repos.

I'm not on the side of locking people out, but this is a poor argument.


> Yeah, let's ask the Debian team about installing packages from third party repos.

Debian is already sideloaded at the graciousness of Microsoft's UEFI bootloader keys. Without that key, you could not install anything other than MS Windows.

Hence you don't realize how good an argument it is, because you even bamboozled yourself without realizing it.

The argument gets even worse if we want to discuss Qubes and other distributions that actually focus on security, e.g. via firejail, hardened kernels, or user namespaces to sandbox apps.


"Debian already is sideloaded on the graciousness of Microsoft's UEFI bootloader keys. Without that key, you could not install anything else than MS Windows."

This is only true if you use Secure Boot. It isn't needed anyway, and it's insecure, so it should be turned off. Then any OS can be installed.


> This is only true if you use Secure boot. [...] so should be turned off. Then any OS can be installed.

You can only turn off Secure Boot because Microsoft allows it. In the same way Android has its CDD with rules all OEMs must follow (otherwise they won't get Google's apps), Windows has a set of hardware certification requirements (otherwise the OEM won't be able to get Windows pre-installed), and it's these certification requirements that say "it must be possible to disable Secure Boot". A future version of Windows could easily have in its hardware certification requirements "it must not be possible to disable Secure Boot", and all OEMs would be forced to follow it if they wanted Windows.

And that already happened. Some time ago, Microsoft mandated that it must not be possible to disable Secure Boot on ARM-based devices (while keeping the rule that it must be possible to disable it on x86-based devices). I think this rule was changed later, but for ARM-based Windows laptops of that era, it's AFAIK not possible to disable Secure Boot to install an alternate OS.


I agree with you and run with it disabled myself, but some anti-cheat software will block you if you do this. Battlefield 6 and Valorant both require it.

This is the real malware that people should be protected from.

Now tell me how

turning off UEFI Secure Boot on a PC to install another "insecure distribution"

vs.

unlocking the fastboot bootloader on Android to install another "insecure ROM"

... is not the exact same language, which isn't really about security but about absolute control of the device.

The parallels are astounding, given that Microsoft's signing process for binaries meanwhile also depends on WHQL and the Microsoft Store. Unsigned binaries can't be installed unless you "disable security features".

My point is that it has absolutely nothing to do with actual security improvements.

Google could've invested that money into building an EDR instead and called it Android Defender or something. Everyone worried about security would've installed that antivirus. And on top of it, all the fake antivirus apps in the Google Play Store (which haven't been removed by Google, btw) would have no scamming business model anymore either.


"... is not the exact same language, which isn"t really about security but about absolute control of the device.

The parallels are astounding, given that Microsoft's signing process of binaries also meanwhile depends on WHQL and the Microsoft Store. Unsigned binaries can't be installed unless you "disable security features".

My point is that it has absolutely nothing to do with actual security improvements."

I agree. It is the same type of language.


While it's possible to install and use Windows 11 without Secure Boot enabled, it is not a supported configuration by Microsoft and doesn't meet the minimum system requirements. Thus it could negatively affect the ability to get updates and support.

> It is already not needed and insecure so should be turned off.

You know what's even less secure? Having it off.


The name “Secure Boot” is such an effective way for them to guide well-meaning but naïve people's thought process to their desired outcome. Microsoft's idea of Security is security from me, not security for me. They use this overloaded language because it's so hard to argue against. It's a thought-terminating cliché.

Oh, you don't use <thing literally named ‘Secure [Verb]’>?? You must not care about being secure, huh???

Dear Microsoft: fuck off; I refuse to seek your permission-via-signing-key to run my own software on my own computer.


Agreed.

Also, Secure Boot is vulnerable to many types of exploits. Having it enabled can be a danger in itself, as it can be used to infect the OS that relies on it.


Could you elaborate? This is news to me.

> Dear Microsoft: fuck off; I refuse to seek your permission-via-signing-key to run my own software on my own computer.

No one is stopping you from installing your own keys, though?


I do not want to be in the business of key management. This is not something that needed encryption. More encryption ≠ better.

I also dual-boot Windows and that's a whole additional can of worms; not sure it would even be possible to self-key that. Microsoft's documentation explicitly mentions OEMs and ODMs and not individual end users: https://learn.microsoft.com/en-us/windows-hardware/manufactu...


> This is not something that needed encryption. More encryption ≠ better than.

Securing the boot chain protects against a whole range of attacks, so yes, it is objectively better from a security POV.


Name a single bootkit that was actually prevented, i.e. one that wasn't able to bypass the encryption and signature verification toolchain altogether.

Malware developers know how to avoid this facade of an unlocked door.

Users do not.

That's the problem. It's not about development, it's about user experience. Most users are afraid to open any terminal window, never mind typing a command in there.

If you assume good intent from Microsoft here, think again. It's been 12 years since Stuxnet, and the malware samples still work today. Ask yourself why, if the reason isn't utter incompetence on Microsoft's part. It was never about securing the boot process; otherwise this would've been fixed within a day back in 2013.

Pretty much all other bootkits still work too, btw; it's not a singled-out example. It's the norm of MS not giving a damn about it.


The first thing you can do is actually read the article. The question is not about the security reports but about Google's policy of disclosing the vulnerability after X days. That works for crazy lazy corps, but not for OSS projects.

In practice, it doesn't matter all that much whether the software project containing the vulnerability has the resources to fix it: if a vulnerability is left in the software, undisclosed to the public, the impact to the users is all the same.

I believe, and I think most security researchers do too, that it would be incredibly negligent for someone who has discovered a security vulnerability to let it go unfixed indefinitely without even disclosing its existence. Certainly, ffmpeg developers do not owe security to their users, but security researchers consider that they have a duty to disclose vulnerabilities, even if they go unfixed (and I think most people would prefer to know an unfixed vulnerability exists than to get hit by a 0-day attack). There has to be a point where you disclose a vulnerability; the deadline can never be indefinite, otherwise you're very likely allowing 0-day attacks to occur. (In fact, if this whole thing had never happened and we instead got headlines in a year saying "GOOGLE SAT ON CRITICAL VULNERABILITY INVOLVED IN MASSIVE HACK", people would consider what Google did far worse.)

To be clear, I do in fact think it would be very much best if Google were to use a few millionths of a percent of their revenue to fund ffmpeg, or at least make patches for vulnerabilities. But regardless of how much you criticize the lack of patches accompanying vulnerability reports, I would find it much worse if Google were to instead not report or disclose the vulnerability at all, even if they did so at the request of developers saying they lacked resources to fix vulnerabilities.


> I believe, and I think most security researchers do too, that it would be incredibly negligent for someone who has discovered a security vulnerability to let it go unfixed indefinitely without even disclosing its existence.

Because security researchers want to move on from one thing to another. And nobody said indefinitely. It's about a path that works for OSS projects.

It's also not about security through obscurity. You are LITERALLY telling the world: check this vuln in this software. Oooh, too bad the devs didn't fix it. Anybody in the sec biz would be following Google's security research.

Putting someone in the spotlight and then claiming it doesn't make any difference is silly.


If you weren't aware of it... there is a world of static application security testing (SAST) tools which can help you. Add them to your text editor / CI/CD pipeline to use them.

https://owasp.org/www-community/Source_Code_Analysis_Tools
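At their core, most SAST rules are pattern matches over parsed source code. As a toy illustration (not how any particular tool on that list is implemented), here's a minimal checker using Python's `ast` module that flags calls to `eval` and `exec`; the deny-list and function names are made up for the example:

```python
import ast

# Tiny, illustrative deny-list of builtins we consider dangerous.
DANGEROUS_CALLS = {"eval", "exec"}

def find_dangerous_calls(source: str):
    """Return (line_number, name) pairs for calls to deny-listed builtins."""
    findings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        # A direct call like eval(...) is a Call whose func is a bare Name.
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in DANGEROUS_CALLS:
                findings.append((node.lineno, node.func.id))
    return findings

if __name__ == "__main__":
    sample = "x = eval(input())\nprint(x)\n"
    for line, name in find_dangerous_calls(sample):
        print(f"line {line}: call to {name}()")
```

Real tools like those on the OWASP list do the same kind of thing at scale, with hundreds of rules, data-flow tracking, and editor/CI integrations, which is why wiring one into your pipeline is worth it.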

