I've been thinking about how to make a blog simple recently, and I came across XSLT. It looks really cool and seems pretty set in stone, so I thought I'd ask: what are the advantages/drawbacks of building your own tech stack versus using XSLT? At first glance, it seems perfectly able to handle RSS and other simple linking patterns, and pretty much anything can easily be turned into XML; XSLT could then be used to generate the HTML (server-side, or rather writer-side, it's not like the blog is gonna change) that you serve.
XSL is neat, and it is a functional language, but between XSL and XPath, it is quite verbose. Here's a small section of XSL I use to generate my website (not my blog):
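In spirit, something like this minimal sketch (the <site>/<page> structure and attribute names here are illustrative, not the actual markup):

    <?xml version="1.0"?>
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

      <xsl:output method="html"/>

      <!-- Emit the site navigation from the list of pages in the
           source document, leaving the current page unlinked. -->
      <xsl:template name="navigation">
        <xsl:param name="current"/>
        <ul class="nav">
          <xsl:for-each select="/site/page">
            <li>
              <xsl:choose>
                <xsl:when test="@url = $current">
                  <xsl:value-of select="@title"/>
                </xsl:when>
                <xsl:otherwise>
                  <a href="{@url}"><xsl:value-of select="@title"/></a>
                </xsl:otherwise>
              </xsl:choose>
            </li>
          </xsl:for-each>
        </ul>
      </xsl:template>

    </xsl:stylesheet>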
And yes, there is other code I've omitted for brevity. This is used to generate the navigation links for the site. I initially wrote this ... prior to 2009 (that's when I moved it into git). There have been some minor fixes to the XSL over the years, but it's largely unchanged (for a reason I hope is obvious). Yes, I still use it, because it still works, and it's for a static website.
I always do this on F-Droid, and I always find nice gems in there.
Wouldn't dream of doing it on the Play Store; it's all trash, and even for the stuff I go there to download specifically, I wish I didn't have to most of the time.
> Not mentioned: there would be a single gatekeeper for the internet, Cloudflare.
Cloudflare is already a monopoly, though. From what I can tell, what they are saying here, besides proclaiming their continued existence, is that they, the AI companies, and content creators all need the internet to exist.
And what they are building is on top of the HTTP 402 (Payment Required) response, which anyone can use? So you could use that without any CDN at all, and without too much development cost?
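For illustration, a bare-bones Python sketch of what building on 402 could look like with no CDN involved (the price header and the payment check here are made up, not whatever Cloudflare actually ships):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PaywalledHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Pretend an Authorization header proves payment; a real
            # scheme would verify a payment token here.
            if self.headers.get("Authorization"):
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.end_headers()
                self.wfile.write(b"<p>The article.</p>")
            else:
                self.send_response(402)  # Payment Required
                self.send_header("X-Price", "USD 0.001")  # hypothetical header
                self.end_headers()

    HTTPServer(("", 8000), PaywalledHandler).serve_forever()

Any origin server can answer like this; the hard part is agreeing on how the client pays and proves it, which is presumably what Cloudflare wants to standardize and sit in the middle of.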
They are all platforms: news companies, publishing houses, ISPs, record labels, YouTube, Google, Meta, TikTok, Windows, the App Store, the Play Store, Amazon... and as platforms they squeeze both the layer above and the one below, their customers and their providers.
You know what that looks like? Rent. It looks like homeowners setting rent as high as the market can bear, until there is almost no advantage at all left in participating.
It sure looks like Cloudflare wants to be the platform for AI royalty collections. But creators, customers, sellers - they are all hostage to the platforms they can't avoid. That is where the money sink is.
Sorry, not exactly a monopoly. Probably too big to fail, from what I gather, though. I don't know enough to know whether anyone else could realistically compete at their scale.
My point was simply that their size is unrelated to this proposition, beyond the fact that they are big enough to be taken seriously.
Incidentally, I found out Haiku OS went into beta in September 2018, which coincides with the bump in "unknown" in the counter. Not sure how much that's cause and effect for 2018 or for later, though.
I don't know much about materials science, but I had a few classes about it.
Seems like their wood gets ~550 MPa ultimate strength in tension. Their material seems to be brittle (it behaves like a spring until it breaks), so you probably want a safety margin, because at 550 MPa it breaks. Note the unit is force/area, so you're comparing materials at the same cross-section. In compression they report about 160 MPa under axial load; it can be more or less in the other directions (wood has fibers, so it's not the same in all directions, and here they compress it perpendicular to the fibers, so one direction ends up stronger than the axial one and one weaker, but I guess for a beam you mostly care about axial strength).
Torsional and bending strength derive from compression, shear, and tension; I didn't find shear numbers. Though I'm not entirely sure how that works for materials that aren't the same in all directions the way steel is.
For steel it depends on the steel, but a quick search (https://www.steelconstruction.info/Steel_material_properties and https://eurocodeapplied.com/design/en1993/steel-design-prope...) says ~200 to 400 MPa in tension for yield, at which point it starts changing shape instead of behaving like a spring, then 350 to 550 MPa for ultimate strength, at which point it breaks. I believe in some applications they do apply loads where the metal bends a bit and adapts to its application, but I'm not sure. Regardless, that would mean the wood in tension is equivalent to very strong (presumably very expensive) steel.
In compression, steel is apparently 170 to 370 MPa (https://blog.redguard.com/compressive-strength-of-steel; I didn't find much else easily because the numbers were strange in other sources), so I guess steel would win on that one.
But this is comparing raw strength. In reinforced concrete, you add the metal for tensile resistance; the concrete is there to take the compression, so it wouldn't matter much there. For beams, the cross-section is optimized to resist in the direction needed (e.g. the H cross-section resists bending in one direction). But you probably can't do that with their wood (they say they are limited in shapes for now), so you'd need more material, and it would probably end up stronger overall since you have more of it. The question then is how much material (by weight, compared to steel) you need (they say 10 times less, but that probably doesn't account for the shape), and how much it costs; see the rough sketch below.
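As a back-of-the-envelope Python sketch (my own assumptions, not numbers from the paper except the ~550 MPa: densified wood at ~1300 kg/m^3, S355 structural steel at 7850 kg/m^3, and a 2x safety factor on the brittle wood):

    # Cross-section and weight per metre needed to carry a given
    # tensile load, wood vs steel: sigma = F / A, mass = A * density.
    LOAD_N = 500_000            # 500 kN tensile load, arbitrary example
    SAFETY = 2.0                # extra margin for the brittle wood

    materials = {
        # name: (usable strength in Pa, density in kg/m^3) - assumptions
        "densified wood": (550e6 / SAFETY, 1300),
        "S355 steel":     (355e6,          7850),  # design to yield
    }

    for name, (strength, density) in materials.items():
        area = LOAD_N / strength          # m^2 of cross-section needed
        print(f"{name}: {area * 1e4:.1f} cm^2, {area * density:.1f} kg/m")

With those assumptions the wood member needs a somewhat larger cross-section (~18 vs ~14 cm^2) but comes out roughly 5x lighter (~2.4 vs ~11 kg/m), which is at least in the same ballpark as their "10 times less" claim.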
I'm guessing they could also make composite beams at some point, too, with more than just wood in them.
Then for mechanical applications, other things might enter the game. In their paper they needed to coat the wood so it wouldn't swell with humidity; for any application with friction, that's not great. Also, I wouldn't be surprised if it's more sensitive to friction than metals are.
Note that the numbers are from 2018; they may have improved the process since.
Might be a bit naive, but by the time 2029 comes around and AI companies have started 'monetising free users', won't a lot of people/companies have open-source models, tuned and tailored to their needs, running locally, no provider required?
If there is anything I can expect to remain consistent over the next 30-40 years, it's that the majority of people have no interest in or ability to maintain technical systems themselves, as evidenced by the often banal and extremely simplistic tasks asked of me as an IT support technician, by family, by friends, etc. There will absolutely be a move toward models for more specific tasks, but they will be packaged and distributed by channels and entities that need to monetize their services in some way. That is just inevitable. The problem was never access to information; it was the nature of most people. (Just to be clear, I don't think this is technically a bad thing, just something I have noticed.)
I think it'll mostly still be centralized around providers.
I don't see good local models happening on mobile devices anytime soon, and the majority of desktop users will use whatever is best and most convenient. Competitive open-source models running on your average laptop? Seems unlikely.
Remember when kids used to learn video editing or Photoshop? You can now do that in some remarkably capable online tools, and kids don't bother with offline tools anymore. This is the same thing.
No, the fundamental issue is the difficulty of proving that results are correctly rounded (the table-maker's dilemma), which means implementations end up returning different results in practice. Unums do nothing to address that issue, except possibly by virtue of not having multiple implementations in the first place.
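To make that concrete, here's a small Python sketch of the underlying problem: for some inputs the exact value of a function like exp() falls very close to the boundary between two representable doubles, so a library carrying limited extra precision can round the wrong way (boundary_distance is a made-up helper; the random search is illustrative, not a proof):

    import math, random
    from decimal import Decimal, getcontext

    getcontext().prec = 60          # treat 60 digits as "exact"

    def boundary_distance(x: float) -> float:
        """Distance from exp(x) to the nearest double, in ulps;
        values near 0.5 mean the correct rounding is a near-tie."""
        exact = Decimal(x).exp()
        nearest = float(exact)      # correctly rounded double
        return abs(float(exact - Decimal(nearest))) / math.ulp(nearest)

    # Scan random inputs for the one whose exact exp() lands closest
    # to the midpoint between two doubles.
    hardest = max((random.uniform(0.0, 1.0) for _ in range(100_000)),
                  key=boundary_distance)
    print(hardest, boundary_distance(hardest))

Knowing how much working precision guarantees the right answer for every input is the hard part; without that proof, each libm picks its own trade-off, and they disagree on the last bit.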