
If it runs on Avalonia anyway, why choose MAUI? If you have the option, just use Avalonia. MAUI is unfinished and its docs are awful.

Do you want to build an app using MAUI? Unless your app barely deviates from the template, expect to desperately search through decade-old Xamarin documentation and figure out the details through painful trial and error.

Good luck.


They addressed this.

It is a way to get people choosing the "official" path (i.e. MAUI) to experience Avalonia. They are hoping you come for the MAUI, stay for the Avalonia, and become an Avalonia developer.

As for why choose the Avalonia version of MAUI, there are three reasons:
- Linux support (the big one, I think)
- Drawn framework (same renderer on all platforms)
- WASM support (probably useful sometimes, but not the real draw)

They are making a big deal of WASM here because it is easy to demo. We can all go into it and run it. But do we want to use it for our apps?

As for why not use Avalonia directly: to loop back to the beginning, it is because you do not yet know and trust Avalonia. The Avalonia team is hoping this helps with that.


Thank you for explaining this. I didn't even realize this was an Avalonia project until you pointed it out. I like Avalonia (and love .NET in general), but I think the messaging on this needs a lot of work. Avalonia is creating unnecessary cognitive dissonance by emphasizing MAUI, which is a competing project after all.


This is what mystifies me about this announcement. Avalonia already works fine on Linux, allowing anyone to build a cross-platform .NET GUI application.

MAUI is supposed to be a wrapper around native widgets. The fact that they had to use Avalonia under the covers to get it to work on Linux seems to defeat the point. (Avalonia is a complete UI toolkit, like Qt or Flutter, that owns the entire stack from XAML to pixels.)

https://avaloniaui.net/maui-compare


The large number of Chinese products currently permitted in the U.S. demonstrates that these bans are imposed not because of country of origin, but because confirmed security risks were identified.

The company's issue is not its country of origin, but its history of installing backdoors and its public declaration to abandon fixing security flaws for numerous devices still in use.

Numerous independent tech news outlets and communities started pointing out the issue well over a year ago. Do you have a basis to argue otherwise?


> its history of installing backdoors

If TP-Link is known to have intentionally installed backdoors in its products, that is news to me. Can you provide a source for that claim?

Vulnerabilities have been found, of course, but that is hardly unique to TP-Link, and the existence of a vulnerability does not imply that it was put there intentionally.

> its public declaration to abandon fixing security flaws for numerous devices still in use

I have several machines that are still running Windows 10 and are (according to the Windows software) not eligible to upgrade to Windows 11, let alone for free. The Microsoft software informs me that I will no longer receive security updates on these machines.

When will the US government ban Microsoft products from sale in the US?

---

Still, I have much more context on the DJI ban. The law that will place DJI on the FCC's "covered list" states that if DJI is not audited by an (unspecified) US government agency, DJI products will be placed on the covered list and so become ineligible for FCC certification starting (IIRC) Jan 1 2026. In other words, the law was cleverly written such that nobody actually needs to do an audit to determine what nasty things DJI is actually getting up to; if nobody raises their hand, the ban happens automatically.

---

Do not take me for an enthusiastic supporter of DJI, TP-Link, other Chinese companies, or the way America's political and business leaders have generally pissed away our technological advantage over China in the name of enriching themselves in the present (now past). I am, in fact, livid. But we will not dig ourselves out of this hole by becoming a backwater where Americans' relationship with consumer technology is as if they are living in a sanctioned country.


Yeah but it’s not like every Chinese tech product is being systematically scrutinized by the US government. It’s more like one gains attention and then everyone piles on.


No. The individuals genuinely at fault here are

1) those who inflict harm on others, since being wealthy or disliked does not justify actions such as death threats

2) those who target the wrong person simply due to a shared name.

Any discussion of compensation should be directed at them.


> 2) those who target the wrong person simply due to a shared name.

In this case that includes the other Zuck's company. He should at least do something about that.


Why do companies actively lie in their advertising about being eco-friendly, instead of just keeping a low profile? Is it because we tend to focus only on current events and quickly forget their past track record? Indeed, if people soon forget the lies, the risk is minimal.


One reason is that large institutional investors or lenders enforce certain agendas by only giving money to companies that meet certain criteria. Thus companies will position themselves as meeting those criteria to attract money and investment.

It’s an explanation of why so many companies suddenly appeared to go “woke”, or why they did a complete 180 when the political climate changed. Even powerful companies like Apple must grovel for favor.


Oh, I didn't know it worked that way. Thank you for the information.


`curl -fsSL https://tinyurl.com/2u5ckjyn | bash`

This satire is amusing. Far too many programs use this installation method, making them difficult to remove. Seeing this is an immediate deterrent to installation.


Don't blame the program authors; blame the Linux community for completely failing to agree on a better alternative.


Admittedly, a distro-agnostic equivalent of PKGBUILDs or Nix flakes would have been great. But it's hard to excuse them when so many better alternatives are available. Even apps that could easily be built into a single binary use these odd installation methods.

And while Flatpak gets a lot of criticism, I honestly think it's far better than these `script | bash` methods.


> Even apps that could easily be built into a single binary use these odd installation methods.

Yeah, because the curl-bash script can deal with automatically selecting the right binary (even on Mac) and adding it to your PATH.
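Roughly what such a script does under the hood (a minimal sketch; the tool name and URL are made up, not any specific project's installer):

    #!/usr/bin/env bash
    set -euo pipefail
    # Detect OS and CPU architecture to pick the right prebuilt binary
    os="$(uname -s | tr '[:upper:]' '[:lower:]')"   # linux, darwin, ...
    arch="$(uname -m)"                              # x86_64, arm64, aarch64, ...
    # Drop the matching binary into a user-writable bin directory
    install_dir="${HOME}/.local/bin"
    mkdir -p "${install_dir}"
    curl -fsSL "https://example.com/releases/mytool-${os}-${arch}" -o "${install_dir}/mytool"
    chmod +x "${install_dir}/mytool"
    # Make sure that directory is on PATH for future shells
    case ":${PATH}:" in
      *":${install_dir}:"*) ;;
      *) echo 'export PATH="$HOME/.local/bin:$PATH"' >> "${HOME}/.bashrc" ;;
    esac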

> And while Flatpak gets a lot of criticism, I honestly think it's far better than these `script | bash` methods.

I agree, but does Flatpak actually work for CLI tools (which is where I see most use of curl-bash)? E.g. could I install Rust using Flatpak? My understanding was that it's more suited to sandboxed GUI apps.


> I agree, but does Flatpak actually work for CLI tools (which is where I see most use of curl-bash)? E.g. could I install Rust using Flatpak? My understanding was that it's more suited to sandboxed GUI apps.

Distrobox is basically flatpak for CLI apps. Not exactly, but it accomplishes a similar goal.
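If memory serves, the workflow looks roughly like this (container and package names are just examples):

    # Create a container from some distro image and hop into it
    distrobox create --name devbox --image fedora:latest
    distrobox enter devbox
    # Inside, install CLI tools with that distro's package manager as usual
    sudo dnf install -y ripgrep
    # Optionally export a tool so it is callable from the host shell
    distrobox-export --bin /usr/bin/rg --export-path ~/.local/bin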


Doesn't it use Docker? Fine for trying stuff out on different distros but there's no way I want Docker as a normal installation process.

Docker is what you use if you've failed to do it the right way.


No. It is absolutely ridiculous to suggest that one must read some unknown script just for the simple task of adding a binary to a directory on PATH. Unless, of course, you're the type of person who just runs any script without verification.

Furthermore, for files installed 'automatically' like that, it's nearly impossible to remember what was done and where. This means that to remove it, you have to find and read 'that specific version' of the script you ran, and then delete the files. It's not like the script always lives somewhere with a persistent history, like a git repository. Good luck with that.
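Concretely, undoing one of these installs means re-reading the script and reversing each step by hand; the cleanup ends up looking something like this (paths and names are just examples of the kind of thing they touch):

    # The binary it dropped somewhere on (or added to) PATH
    rm -f ~/.local/bin/mytool
    # Whatever it scattered under your home directory
    rm -rf ~/.mytool ~/.cache/mytool ~/.config/mytool
    # The line it appended to your shell profile
    sed -i '/mytool/d' ~/.bashrc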


Yeah uninstallation is a flaw, but what alternative are you suggesting?

And yes, I am the type of person that uses heuristics to decide which software to trust and run. You aren't magically safer if you audit the install script but not the actual binary.


The smaller the attack surface, the better. There is no need to expand it unnecessarily. By your logic, we shouldn't even use the binaries provided by the official package manager, because they also cannot be trusted.

We are talking about the dangers of the installation method. Not the program itself.


> We are talking about the dangers of the installation method. Not the program itself.

It only makes sense to separate them if you install the program and then never run it, which is obviously ridiculous.

> There is no need to expand it unnecessarily.

I agree, but it is necessary because there aren't any good alternatives.

Also, it is a very minimal additional risk. If it weren't, we'd see it used as an attack method, and as far as I know that has yet to happen even once.


Yeah, honestly, package your thing up as either a .deb or an .rpm. 'alien' [0] will handle converting from one to the other, and that will take care of like 90-98% of the Linux users out there.
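The conversion itself is a one-liner each way (package names are placeholders; I'd still sanity-check the result before shipping it):

    # .deb -> .rpm, and .rpm -> .deb
    sudo alien --to-rpm mytool_1.0.0_amd64.deb
    sudo alien --to-deb mytool-1.0.0-1.x86_64.rpm
    # -k / --keep-version keeps alien from bumping the version during conversion
    sudo alien -k --to-rpm mytool_1.0.0_amd64.deb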

The "OMFG there's no standard way to package things on Linux!" complaint kinda sucks.

[0] <https://wiki.debian.org/Alien>


Actually, I don't use Debian or Ubuntu either, and I don't think it's a good idea to only support those methods.


> ...I don't use Debian or Ubuntu...

Neither do I.

I've been using Gentoo for twenty-three years (since 2002). I stopped looking for equally-well-managed alternatives somewhere between 2012 and 2015. I have enough local compute that the build times for everything other than Chromium aren't a problem. If I didn't, I could use the official prebuilt binary packages that have been around for a few years.

> ...I don't think it's a good idea to only support [.deb or .rpm packages].

If you can do more, you should, yes. However, -as a hobbyist open-source programmer- I recognize that other hobbyists only have so much time and giveashit available.

The absolute best thing they can do is provide a source tarball that builds and installs correctly with './configure && make && make install' [0] or the language-specific equivalent. Any competent distro package management system will make it somewhere between trivial and pretty easy for others to package projects like this up. [1]
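To make that concrete, a well-behaved autotools tarball packages up into (for example) an Arch PKGBUILD that is little more than boilerplate (project name, URL, and checksum below are placeholders):

    pkgname=mytool
    pkgver=1.0.0
    pkgrel=1
    pkgdesc="Example CLI tool that builds with the standard autotools dance"
    arch=('x86_64')
    url="https://example.com/mytool"
    license=('MIT')
    source=("https://example.com/releases/$pkgname-$pkgver.tar.gz")
    sha256sums=('SKIP')  # placeholder; a real package pins the real checksum

    build() {
      cd "$pkgname-$pkgver"
      ./configure --prefix=/usr
      make
    }

    check() {
      cd "$pkgname-$pkgver"
      make test
    }

    package() {
      cd "$pkgname-$pkgver"
      make DESTDIR="$pkgdir" install
    }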

If they have more time and giveashit available, make prebuilt .deb packages so that your software is trivially installable for the most users out there. If you find yourself with more spare resources, then write packages for other OS package managers to get the remaining small fraction of Linux users.

The absolute worst thing to do would be to assume you MUST package your software for every distro out there (lest someone whine at you on the Internet), decide that that's way too much work, and not publish anything. As someone who has many, many unstarted projects because they seem like way too much work, I can tell you that that's a totally real failure mode.

[0] Perhaps with an optional side-trip to 'make test'.

[1] Unless they're using something godawful to package like NodeJS.


fpm (Effing package manager) goes far beyond what alien does, too, in case you need even more flexibility.

https://fpm.readthedocs.io/en/latest/getting-started.html


Neat. That I didn't know about this probably says something about how long it has been since I needed something like alien.


you can use homebrew (https://brew.sh/) or flatpak


Some may lament that 'writing' now includes typing, when it was once limited to pen and paper. Frankly, it's doubtful anyone takes that point seriously.

Similarly, the core of your 'self-farming' analogy is the direct management of the crops. The involvement of others in demolishing existing structures, erecting fences, or managing water resources on the land is of little consequence.

Of course, some might argue that unless the farm is directly managed, it does not constitute self-farming.


LLMs generate code based on statistical patterns found in vast amounts of training data from existing projects, not by reading language specifications. If the tag is rare in the wild, it will be rare in their output.


LLMs also don't know about new MCP tools when they are trained, but they use them perfectly fine when presented with information about them.

AI software development models and agents can be "taught" to go look at the official version documentation for languages via the prompt (just as one example) without needing to modify their weights.

One call to getSpecification('Python','3.14') or some similar tool call and they know exactly what they are working with, even if the language version did not exist when the model was trained.


I mean, they're trained on specs, too. I'll have to play with asking for semantic HTML and see what they come up with.


If he had kept his strange and alarming behavior to himself, he wouldn't be 'getting dogpiled' for it now.

The problem is that he's forcing his ways on others. If we're identifying an aggressor here, it's him. The project maintainers are the victims.


At least in the thread linked here, it seems like his maintainership over the project is legitimate, which makes it wrong to characterize him as "forcing" his ways on anyone.


Even ignoring that examples of his behavior are easily found elsewhere, the link itself shows him completely disregarding feedback from other contributors to force his own way.

Honestly, I can't understand the intent behind such a defensive rebuttal to the criticism of his actions.


I don't care one way or another. I'm not a JS developer. I'm just struck by a reaction that seems quite extreme, and very visible dogpiling.


My point wasn't about javascript. He got pushback because he ignored everyone and just did his own thing. It has nothing to do with javascript and you can see that in the link. That's a weird excuse.


Firefox still lets you do this.

You can add any URL as a custom search engine by providing a string template for the query.

It doesn't have to be a formal "search provider". Any URL that accepts a query string will work.


IMO the problem is that custom search engines in Firefox can't use POST requests, even though it's supported. You may want to check out Mycroft Project [1] for that.

[1]: https://mycroftproject.com/


So, another Chromium-based browser featuring the usual 'privacy' plus a collection of miscellaneous features. Maybe 'AI' will be added. None of this appears to go beyond what a browser extension could do. Personally, I'm not seeing the appeal.

As a side note, the very similar Arc browser was just sold off to Atlassian for a quick exit.

