As the person ultimately responsible for the Minecraft Wiki ending up in the hands of Fandom, it is great to see what Weird Gloop (and similar) are achieving. At the time of selling out, the Minecraft Wiki and Minecraft Forum cost tens of thousands of dollars per month to run and so it didn't feel too much like selling out, because we needed money to survive[1]. 15 years later, the internet is a different place, and with the availability of Cloudflare, running high-traffic websites is much more cost effective.
If I could do things over again, on today's internet, I like to believe Weird Gloop is the type of organisation we would have built rather than ending up inside Fandom's machine. I guess that's all to say: thank you Weird Gloop for achieving what we couldn't (and sorry to all who have suffered Fandom when reading about Minecraft over the years).
[1] That's a bit of a cop-out: we did have options; the decision to sell was mostly driven by me being a dumb kid. In hindsight, we could have achieved independent sustainability, it was just far beyond what my tiny little mind could imagine.
You and your team made (a good portion of) my childhood. I remember spending nights studying all the potion recipes and enchantment odds. Thanks for all you did
I was approached about a decade ago to combine The Infosphere with what was then Wikia's Futurama wiki. I asked if it was possible to do a no-ads version of the wiki, and while they initially suggested that might be possible, they eventually said no, and so we said no. So now there are two Futurama wikis online. I still host The Infosphere; I haven't checked the Fandom one in years.
Fortunately for me, Futurama isn't as popular as Minecraft (for some reason!), so I've been able to pay out of my own pocket.
A bit of a follow up to this; after a bit of thought, I am considering reaching out to Weird Gloop. I do not feel I am able to give The Infosphere the care that it deserves. And with Futurama back on Hulu, we are naturally seeing an uptick in activity. We have a very restrictive sign up in place, because I don't have time to moderate it anymore. It keeps the spam down, yes, but also new users away.
Note: The reason I'm writing that I'm _considering_ reaching out and not just straight up reaching out is because the domain itself has a different owner than me, and I want to make sure they also approve of this decision.
What kind of costs are associated with something like this, and what sort of visitors are you getting? I'm wondering what kind of infrastructure you need.
Importantly, I have since set up Cloudflare in front of the website to help. I am just using their free tier, but looking at their analytics, they say we got about 350k HTTP(S) requests in the last 24 hours.
Had it not been for Cloudflare, I am not sure my server could have handled that. Before I did that, I set up Varnish as a cache provider for users who are not logged in. That is effectively the second line of defence now.
The server itself is a dedicated server at Hetzner. I use the server for a bunch of other things, that see nowhere near the same activity as the Infosphere, and I also use it for my personal screen+irssi setup. But all in all, the server costs me about 50 euros a month.
Though, again, Cloudflare is basically the single most important reason it's not costing me more, and why I have not needed to hand it over.
Ah OK, that's basically exactly the setup I'd use as well. Surprising that the server alone couldn't handle the traffic, as the sibling says, 4 rps isn't that much when you cache (cache hits are basically free).
I imagine 90% of the traffic (or more) is anonymous users, which can be cached, doesn't Varnish handle that without breaking a sweat?
4 requests per second is absolutely something even a cheap VPS should be able to handle, even if you double that for peak load. You just need to put caching in front of everything dynamic.
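For reference, the usual trick for MediaWiki behind Varnish is exactly what's described above: cache everything for anonymous visitors, pass logged-in traffic through. A minimal sketch (Varnish 4+ VCL; the backend address, TTL, and cookie-name regex are illustrative assumptions, adapt to your setup):

```vcl
vcl 4.1;

# Backend running Apache/nginx + PHP for MediaWiki (address is a placeholder)
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Logged-in users get session/user cookies from MediaWiki;
    # let those requests through to the backend uncached.
    if (req.http.Cookie ~ "(session|UserID|UserName|Token)=") {
        return (pass);
    }
    # Everyone else is anonymous: strip cookies so the request is cacheable.
    unset req.http.Cookie;
}

sub vcl_backend_response {
    # Cache anonymous page views; MediaWiki can purge these on edit
    # if the cache is listed in $wgCdnServers.
    if (!beresp.http.Set-Cookie) {
        set beresp.ttl = 1h;
    }
}
```

With something like this, the large anonymous share of the traffic never touches PHP at all, which is why a single modest server can absorb a few requests per second with plenty of headroom.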
Disappointing to see people just carelessly giving Buttflare the keys to the kingdom, effectively excluding users of alternative browsers without considering other options.
An off-hand reference to "350k/day" shouldn't be naively translated to "4 per second"
350k/day likely means sometimes it's 3.5 million, all smashed into a 30-minute period of time, because some nitwit linked to my site. (3.5 million requests in 30 minutes is nearly 2,000 per second, not 4.)
And then, I get paged about "my site being down" and I have to stop hanging out with my friends or family and fiddle around with things I don't want to fuss with. Or maybe it just breaks, doesn't self-heal, and is offline for a week until I notice it and fix it, and by then people all think the site's gone.
Anyhow, sure, maybe people not wanting to devote their lives to devops fanfic is something that can "just be solved with this simple trick cloudflare hates" but maybe not.
It's been a long time since I switched to Cloudflare. Looking through my email archive, it was December 2015. I uncovered an old discussion[0] about the switch, but it only seems to highlight that the server is slow.
But I think it speaks to my lack of skill in this area. I have no professional training in system administration and am entirely self-taught. Though it sounds like Weird Gloop can also provide guidance in these matters rather than simply taking on the hosting. I won't deny that at times I have felt defeated, and that may truly have been my reasoning for switching to Cloudflare.
Though this post and response so far have given me hope.
Their growth people emailed me again and again and tried to do the same with StrategyWiki decades ago.
Here's one of their emails:
> [Redacted] mentioned that your site was very cool - and that you're heading off to college. As you may know, Wikia is founded by Jimmy Wales (of wikipedia fame) and we are trying to become THE resource for all gamers
> I was wondering if you'd consider moving over to wikia now that you're going to might have less time with your studies. As an incentive I can point to a few things that might make the move easier
This is basically an offer to buy your business for $0 and we might hire you as a contractor. It's a bad deal. I mean Jimmy Wales himself wouldn't have accepted this for Wikipedia.
Cripes, that sounds creepy and exploitive. I'm pretty sure it would have raised more than a few red flags in my mind, even as a teenager about to head off to college. (Granted, I was a wee bit uptight at that age.)
> As you may know, Wikia is founded by Jimmy Wales (of wikipedia fame)
And Jimmy should be ashamed about being involved with Fandom/Wikia. Then again, he's also not ashamed about begging from third-world people and others much less well off than himself.
I remember reading the Minecraft wiki back in the early 2010s, back when Fandom was still Wikia. It would have been much more appealing at the time than it is today - not just for the reasons you list, but because Wikia actually kicked ass in the early 2010s. It was sleek, modern, and easy to use. And today, it isn't.
Every time I wind up on some garbage Fandom page I reminisce about the good old days of Wikia. I remember many a fun night trawling through pages while playing Fallout or Skyrim or whatever - all the information you could ever need, right there at your fingertips. It's an ethos you don't see so much on the modern net.
It’s funny that people are now looking back at wikia fondly because at the time most folks thought it was full of ads and shit. To the point where Curse/Gamepedia managed to get serious market share by not screwing with the community in the same way at the time.
I remember thinking that wikia sucked at the time, but at least it didn’t actively hinder me from finding what I was looking for. I just don’t open fandom pages because it locks up my phone.
Wikia is a great example of enshittification - provide great value to users, then take it away from users and hand it to other businesses (e.g. advertisers), then take it away from businesses too.
Will Weird Gloop inevitably suffer the same fate? I hope not.
> Will Weird Gloop inevitably suffer the same fate? I hope not.
Unless explicitly structured to prevent it, my bet is it will. If it's backed by a for-profit entity, it'll eventually need to turn a profit somehow, and users/visitors are the first to lose their experience at that point.
However, if Weird Gloop is a properly registered non-profit with shared ownership between multiple individuals, I'll be much more likely to bet it won't suffer the same fate.
I skimmed around a bit on the website to try to find an answer to whether it is a non-profit, but didn't find anything obvious that says yes/no.
At least it is a private company though, meaning they aren't required to make constant year-over-year gains for shareholders and investors. They have much more control over where the company goes and how it operates.
publicly traded companies are not "required" to make constant year over year gains for shareholders and investors, that is just what the owners usually decide to tell the company to do. The owners of a privately traded company could decide to, and the owners of a publicly traded company could decide not to. For example, zuckerberg controls 53% of the voting stock of facebook, so whatever zuck says goes and if other shareholders don't like it they can kick rocks. This is pretty much the same situation that people imagine is the case with privately traded companies, even though facebook is obviously publicly traded.
"that is just what the owners usually decide to tell the company to do"
Because the entire system encourages it. The market rewards growth FAR more than it rewards a consistent dividend payout. (See: companies growing 40% YoY command a significantly higher earnings multiple than those growing 10% YoY.) So imo this is like saying "people could decide to just invest money and then not seek the best returns possible." Also remember these shareholders are seldom John Smith, principled human retail investor. It's firms whose entire purpose is to seek maximum return.
"The owners of a privately traded company could decide to"
Meanwhile this DOES actually happen sometimes. See: Valve. We all know there's ways Valve could put up really great growth numbers for about 2-3 years while completely destroying all of the things that make Steam so god damn compelling to users that they can command the same cut as Apple, on an OPEN platform (vs Apple fighting utterly tooth and nail to keep iOS 100% airtight locked down). But they don't.
"For example, zuckerberg controls 53% of the voting stock of facebook, so whatever zuck says goes"
TBC most founders/CEOs are NOT majority voters in their companies. They answer to the board. Most company founders lose voting control. The fact that Zuck is still in control is incredibly unusual and is a testament to how fast Facebook has grown that he's been able to keep hold of the reins.
You just explained one reason why Steam is like this. Because they do not control the OSes Steam runs on. (Arguably, even not in the case of SteamOS.)
(Steam does try to do part of the job of the OS though, taking control over updates and even deciding what is acceptable on their platform and what is not.)
Elon Musk is another CEO in total control. Although Tesla is a public company and therefore has a board, it’s stacked with Elon’s allies/appointees and answers to him, not the other way around. Despite Elon not being a majority owner of Tesla stock.
And when he took over Twitter in 2022, he immediately dissolved the board and fired the executives who were on it.
In fact, the relatively new concept of a "public benefit corporation" is (at least in part) an effort to allow for-profit entities to pursue goals other than shareholder enrichment. However, some have criticized public benefit corporations as being entities that simply strengthen executive control at the expense of shareholders. https://en.wikipedia.org/wiki/Benefit_corporation
About Dodge v. Ford Motor Co.:
Dodge v. Ford Motor Co., 204 Mich 459; 170 NW 668 (1919),[1] is a case in which the Michigan Supreme Court held that Henry Ford had to operate the Ford Motor Company in the interests of its shareholders, rather than in a manner for the benefit of his employees or customers. It is often taught as affirming the principle of "shareholder primacy" in corporate America, although that teaching has received some criticism.[2][3] At the same time, the case affirmed the business judgment rule, leaving Ford an extremely wide latitude about how to run the company.[citation needed]
The general legal position today (except in Delaware, the jurisdiction where over half of all U.S. public companies are domiciled and where shareholder primacy is still upheld[4][5]) is that the business judgment that directors may exercise is expansive.[citation needed] Management decisions will not be challenged where one can point to any rational link to benefiting the corporation as a whole.
This doesn't contradict what I said. In fact it supports it. I said that the owners of the company are the ones who determine what it does. The shareholders are the owners. If the owners of the company want it to do a certain thing, and the directors do a certain thing, and it does that thing, no court is going to stop them. There is a rule that says that shareholders aren't allowed to try to screw over other shareholders, but I don't think "The other shareholders decided to pursue the public benefit rather than maximum profit" would qualify.
Actually, you pointed out a true inaccuracy in my comment, because when I said:
> zuckerberg controls 53% of the voting stock of facebook, so whatever zuck says goes and if other shareholders don't like it they can kick rocks
This is only true in cases where zuckerberg's actions are not intended to benefit his interests at the expense of other shareholders'. I think in the Ford case, there was not a majority of shareholders who wanted to expand the business and increase wages at the expense of profit, so it was essentially two minority shareholders fighting.
Shouldn't it be worrying that companies are required to make consistent gains* for shareholders and investors? At some point, a company will naturally reach a market saturation point.
This wasn't exactly the question. The question was about growth. A company could be very profitable without growth (say, they own a mine which produces $40 million worth of ore each year with expenses of $10 million, with no end in sight) or can have growth without profit (OpenAI is a great example, or, historically, the first five years of Facebook).
I know most of stock investing is about capital gains and not dividends, but I think GP was saying it's inherently impossible to have growth forever.
On a financial level I get why people prefer to invest their money in a stock that goes up rather than one that pays them 8% a year consistently in dividends, but it seems unfortunate that somehow it seems like we aren't allowed to just have sustainable companies that don't depend on infinite growth to stay in business.
We have services agreements with the League of Legends and RuneScape developers, and we run 1 ad (below-the-fold, not in EU/UK) on the RuneScape wikis. This covers all expenses (including 5 staff) by a pretty healthy margin
> The company primarily relies on three streams of revenue: user donations, serving ads on select Weird Gloop wikis, and a contract with Jagex that includes a fee to cover hosting and administration costs.
I didn't see anything in the article about setting up incentives to keep the same thing from happening to Weird Gloop that happened to Fandom, which means the blog post is just empty marketing.
The only difference is that Weird Gloop is the little guy. Competition is good! That might be a good enough reason to choose them if you're in the market for wiki hosting!
But the moral posturing won't last if they become dominant, unless they set up incentives fundamentally differently than Fandom did, which doesn't seem to be the case.
As long as advertising is one of their revenue sources, the user experience will get crappy as soon as the network effects make it hard to leave. The cycle continues.
Did you read the post? There's a whole section talking about how they are entering into binding agreements that let communities leave (and take the domain) if they have a better option
Can we flip it? Some companies are explicitly structured to guarantee enshittification.
Venture capital/private equity is what causes this. We've been poisoned to believe that websites should exist purely to achieve hyperscale and extract as much money as possible. When you look at the real physical world there are tons of small "mom and pop" businesses that are content with being self sustainable without some special corporate structure to legally require that.
I work for private equity, and while we have a lot of layoffs, we don’t necessarily pursue short term gains (at least, as far as I can determine not as a factor of being PE anyway)
The article explicitly covers this question. Looks like they're setting up explicit legal(?) agreements. One key point is the domain name: minecraft.wiki, for example, not a subdomain of something owned by Weird Gloop. So the wiki can leave if it wants to.
Does that mean that to the users of these wikis, the switching costs[1] of the backend would basically be zero (one day they might just end up on a different server with the same content), while on the administrators' side the switching costs are at a reasonable minimum?
To my understanding wikis can take all their data, host it themselves, point the domain to their new hosting, and the move would be entirely invisible to end users if done properly and the quality of the hosting infrastructure wasn't considerably worse.
Observant users might notice the removal of any Weird Gloop branding, but otherwise the only way people would know is if the wiki itself announces the move or the wiki's performance becomes noticeably worse.
And Weird Gloop won't do what Fandom does and keep a zombie copy of your wiki online. So you won't be competing with Weird Gloop wiki traffic to reclaim your traffic. In fact, the obligations they agree to forbid it.
Upon termination by either party, Weird Gloop is obligated to:
- Cease operating any version of the Minecraft Wiki
- Transfer ownership of the minecraft.wiki domain to the community members
- Provide dumps of Minecraft Wiki databases and image repositories, and any of Weird Gloop's MediaWiki configuration that is specific to Minecraft Wiki
- Assist in transferring to the community members any domain-adjacent assets or accounts that cannot reasonably be acquired without Weird Gloop's cooperation
- This does not include any of Weird Gloop's core MediaWiki code, Cloudflare configuration, or accounts/relationships related to advertising or sponsorships
This sort of agreement means Weird Gloop is incentivized to not become so shit that wiki would want to leave (and take their ad revenue with them) because they've tried to make leaving Weird Gloop as easy as possible.
This is very reassuring. Usually, I assume agreements between different groups will inordinately benefit one party, but this particular agreement sounds like it creates a more level playing field.
And besides, it's not like non-profits are exempt from restructuring and becoming worse. There is no silver bullet.
Yeah - it would be on the same domain, so the way users access it wouldn't change at all.
If any of the wikis we host want to leave, we'd provide them with a database dump. The admins would have to configure all of their own MediaWiki stuff of course, but I figure that's a pretty reasonable switching cost.
Thanks (seriously). Fandom may not be great, but you could have said "I don't want to foot the bill", turned off the servers, and walked away. Then the community would have lost everything. Leaving it with Fandom gave Weird Gloop something to start with instead of starting from scratch.
I can't imagine that this would have happened, like ever. The wiki was basically essential reading prior to starting to play Minecraft, especially in the early days. I think most of the crafting recipes were documented by the developers themselves back then.
If they killed the wiki, they would have killed their userbase.
citricsquid wasn't a Mojang employee. This whole thing is and always has been community-run [0], so the "they" in "if they killed the wiki" is not the same as the "they" that was selling Minecraft.
Now, one could reasonably ask why Mojang/Microsoft didn't (and I'm assuming don't) foot the bill for the manual that is an essential part of their game.
I hate that MCW ultimately ended up with Fandom in the end. Keeping MCW and the other wikis running smoothly was essentially my one huge passion in my life that I lost after Fandom acquired Curse. No one wanted it to happen that way. Even internally at Curse/Gamepedia we were all devastated when we learned that the company was being bought out by the rival we had been striving to overcome all those years. I am so glad to see after the past few years that the wikis are finally healing and going to places that are better for them.
[1] I'm the tech lead/manager who worked on Gamepedia at Curse and administered MCW for many years before Fandom bought Curse in December 2018. I'm just writing this here since I figure other readers won't have any idea. ヾ(≧▽≦*)o
One thing I find interesting about playing video games in modern day is that with the proliferation of Wikis, there is assumed to be some kind of third party guide for every game. Especially in smaller/newer games it seems like developers sometimes don't bother putting necessary information in the game at all because they don't have the person-hours for it.
For instance, back when I first played Minecraft in Alpha the only ways to find the crafting recipes was through a wiki, or trial and error.
It's nice that it makes development easier, but I wonder if this trend is making it harder for new people to get into video games, since it's hardly obvious if you're not used to it.
> One thing I find interesting about playing video games in modern day is that with the proliferation of Wikis, there is assumed to be some kind of third party guide for every game. Especially in smaller/newer games it seems like developers sometimes don't bother putting necessary information in the game at all because they don't have the person-hours for it.
While this may have become more of a norm in recent years, online communities with community-supported guides have definitely been around since before wikis were common in the gaming community: most notably at gamefaqs.com. To this day you can still find plaintext walkthroughs for thousands of games, written 25 years ago by pseudonymous authors.
Which isn't exactly to dispute your point, just waxing nostalgic about the good ol' days. The RPG Maker 2000 forum was basically my introduction to programming, waaay back in the day.
Video game magazines would regularly publish short walkthroughs and maps, as well as tips on common places to be stuck in popular games, and cheat codes.
Guidebooks were found in stores next to the games, they were typically slim, full-color affairs full of screenshots and production art, with complete lists of all the stuff you could do in the game. Full walkthroughs, item statistic charts, locations of the 52 Secret Gears you need to collect to build the Wind-Up Sword to achieve the secret ending, etc, etc. Here's a photo of someone's collection of a bunch of them: https://www.reddit.com/r/originalxbox/comments/12rsvll/seems...
I don't really know how exploratory most games are compared to old Minecraft. Some games like Stardew Valley have certain things that are much easier to do because of third party wikis but I don't think the same is true of a lot of games in the same way it was for Minecraft.
I picked up Stardew Valley a few months ago for the first time, and consciously chose not to use the wiki. I'm obviously way behind where I would be had I used the wiki, but it's been fun figuring out what works by myself.
One game I recently got which has great exploratory potential is Shapez 2. The in-game help is amazing.
More than a decade has passed since then so I am stretching my memory. At peak we were serving in the region of 10 million page views per day which made us one of the most popular websites on the internet (Minecraft was a phenomenon and every Minecraft player needed the wiki). We were probably the highest traffic Wiki after Wikipedia. Nowadays Cloudflare could absorb most traffic because of the highly cacheable nature of it, but at the time, Cloudflare didn't exist, and every request hit our servers.
Yeah, Wikia in aggregate was in the top 50, maybe a top 20 site at various points. Wikia was built on caching. From my memory, about 99% of page views hit some kind of cache. If that dropped down to 97%, servers started to suffer. It's good to remember that the Fastly CDN company is a spinoff of Wikia, it was developed internally there first. Without that (varnish cache plus lots of memcache) Wikia would not have been able to handle the traffic.

Mediawiki is horribly inefficient and one reason why Wikia was attractive as a host was that we had figured out a bunch of tricks to run it efficiently. The default configuration of mediawiki/wikipedia is real bad. Bigger independent wikis just couldn't handle the scale and many of the best independent wikis moved there for that reason. Just as one example, every link/url on a page hits a hook/callback that can call into an extension literally anywhere in the code base, which was several million lines of PHP code.

I remember the "Batman" page on the DC wiki used to take several minutes to render a new copy if it fell out of the cache. That was one page I used for performance optimization tests. The muppet wiki and the lyrics wiki also had huge performance issues and fixing them was some of the most fun engineering work I've done. Every useful feature had some kind of horrible performance side effect, so it was always a fun puzzle. I also hate landing on a Fandom wiki now but thanks to the actual editors, it's still got some good content.
I bet; it is by some counts the most popular video game ever. But that also makes it kind of a bad example to use when talking about wikis.
By definition, very few wikis will have to deal with becoming one of the most popular websites. (And as you say, at that point one should be able to figure out funding.)
Cloudflare get the best deals on bandwidth. It will usually be cheaper to serve a terabyte from Cloudflare than to do it yourself: you could probably run the wiki on the free plan!
Perhaps, but VPS traffic prices are also already a lot better than "big cloud" traffic prices, especially if you choose your VPS provider with that in mind. And once your traffic is large enough there are also options where you pay for a fixed pipe instead of a transfer amount.
If you want to pay for bandwidth then yeah, CloudFlare is a great option.
Otherwise, if you like the experience of not paying per GB/TB, go for a dedicated server with unmetered connection that has the same price every month, regardless.
Cloudflare don't charge per GB/TB. You get unlimited bandwidth even on their free plan. The problem with paying per GB is that it's in the CDN's interest for you to get a DDOS attack so they can charge you for all the bandwidth. It's in Cloudflare's interest to reduce DDOS attacks and unwanted bot traffic because it costs them bandwidth, not you.
I moved a few of my personal websites to AWS's CloudFront and it cost me like a buck a month, way cheaper than maintaining a virtual server to do it. Except that somebody somewhere decided to try their DDOS tool on one of them for a few hours in the middle of the night, and I got a bill for $2541.69.
The whole point of systemic incentives is that there is no conspiracy. Nobody wants a DDOS and every large provider will have people genuinely working to avoid them. But every time there is an opportunity to allocate resources, the team that gets to frame their return on investment in terms of real dollars will always have an edge over one whose value is realized only in murky customer satisfaction projections. Over the lifetime of a company, the impact of these decisions will add up with no need for any of the individuals involved to even be aware of the dynamic, much less conspire to perpetuate it.
That's sound logic. In this specific case of capitalistic incentives, I haven't noticed it working out in a way that makes one more vulnerable to DDoS when one pays for bandwidth.
It's more like: if you have a website that (sometimes) gets a lot of traffic, do you want Cloudflare to cache it and serve it with very few hits to your cheap server, or do you want your compute costs to expand to cope with the requests?
> do you want Cloudflare to cache it and serve it with very few hits to your cheap server, or do you want your compute costs to expand to cope with the requests?
Usually you have something like a platform/tool/service that is mostly static requests that could be cached, with some dynamic requests that couldn't, as they're CRUD requests or similar.
If you're struggling to serve static content, then do go ahead and slap Cloudflare on top of that bad boy and probably your visitors will be a bit happier, instead of upgrading from a cheap VPS.
If you're struggling to serve the dynamic requests, Cloudflare/CDN won't matter because these things actually need to be processed by your backend.
So instead of trying to shave 50ms off my simple static requests with a CDN, I'd be much happier to optimize for all the requests, including the "dynamic requests" that need to hit the backend anyway.
I'll still go for a dedicated server with proper connection and performance rather than a shitty cheap VPS with a CDN in front of it.
Hetzner is pretty cheap, but only offers Europe location for their dedicated servers last time I checked. For more locations, DataPacket is nice, although a bit more expensive.
If you can run your application on Cloudflare Pages / Workers with Cloudflare's storage/DB things, it really gets dirt cheap (if not free) and very fast. And even without that, Cloudflare's caching CDN is very good, very cheap and very easy.
Ten years ago bandwidth was expensive. Still is, even if not as much. A simple VPS gets overwhelmed, but a simple VPS behind cloudflare can do quite well.
Cloudflare caches pages at many many datacenters, often colocated with large ISPs.
This lets Cloudflare deliver pages from their local cache over local links (which is fast and cheap), instead of fetching the data every time across the world from wherever the VPS is located.
In all fairness, running modest to large MediaWiki instances isn't easy. There's a lot of things that are not immediately obvious:
- For anything complex/large enough you have to set `$wgMiserMode`, otherwise operations will just take way too long and start timing out.
- You have to set `$wgJobRunRate` to 0 or a bunch of requests will just start stalling when they get assigned to calculate an expensive task that takes a lot of memory. Then you need to set up a separate job runner in the background, which can consume a decent amount of memory itself. There is nowadays a Redis-based job queue, but there doesn't seem to be a whole lot of documentation.
- Speaking of Redis, it seems like setting up Redis/Memcached is a pretty good idea too, for caching purposes; this especially helps for really complicated pages.
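Put together, the tuning described above amounts to a few lines in `LocalSettings.php`. A sketch, assuming Memcached as the cache backend; the server address and cron cadence are illustrative, not required:

```php
<?php
// Illustrative LocalSettings.php fragment for a modest-to-large wiki.

// Disable expensive query pages and features that don't scale.
$wgMiserMode = true;

// Never run deferred jobs inside web requests...
$wgJobRunRate = 0;
// ...run them from cron instead, e.g.:
//   * * * * * php /var/www/wiki/maintenance/runJobs.php --maxjobs 100

// Point the object cache at Memcached so parser output and other
// hot data survive between requests (Redis works similarly).
$wgMainCacheType     = CACHE_MEMCACHED;
$wgParserCacheType   = CACHE_MEMCACHED;
$wgMemCachedServers  = [ '127.0.0.1:11211' ];
```

None of this is exotic, but almost all of it is off by default, which is a big part of why a stock MediaWiki install falls over under traffic that a tuned one handles easily.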
Even to this day, running a wiki with any ambient RPS is kind of hard. I actually like MediaWiki because it's very practical and extensible, but on the other hand I know in my heart that it is a messy piece of software that certainly could make better use of the machine it's running on.
The cost of running a wiki has gone down over time in my experience though, especially if you are running things as slim as possible. A modest Digital Ocean machine can handle a fair bit of traffic, and if you wanted to scale up you'd get quite a boost by going to one of the lower end dedicated boxes like one of the OVHcloud Rise SKUs.
If anyone is trying to do this I have a Digital Ocean pro-tip. Don't use the Premium Intel boxes. The Premium AMD boxes are significantly faster for the money.
One trap I also fell into was I thought it might be a good idea to throw this on a hyperscaler, you know, Google Cloud or something. While it does simplify operations, that'll definitely get you right into the "thousands of dollars per month" territory without even having that much traffic...
At one point in history I actually felt like Wikia/Fandom was a good offering, because they could handle all of this for you. It didn't start out as a bad deal...
As I was exploring self-host options that would scale to our org size, it turned out there was already an internal team running a company-wide multi-tenant MediaWiki PLATFORM.
So I hit them up and a week later we had a custom instance and were off to the races.
Almost all the work that team did was making MediaWiki hyper-efficient with caching and cache gen, along with a lot of plumbing to have shared infra (AD auth, semi-trusted code repos, etc) that still allowed all of us "customers" to implement whatever wacky extensions and templates we needed.
I still hope that one day Microsoft will acknowledge that they use Mediawiki internally (and to great effect) and open-source the whole stack, or at least offer it as a hosted platform.
I tried setting up a production instance at my next employer - and we ended up using Confluence; it was like going back to the dark ages. But I couldn't make any reasonable financial argument against it - it would have taken a huge lift to get a vanilla MW instance integrated into the enterprise IT environment.
The rumour I heard is they were making their own custom thing.
There were some rumours that they were unhappy about MediaWiki's response to patches they submitted (they made a bunch around accessibility). However, I looked through their patches at one point when this rumour started flying around, and it looked like most were merged. Those that weren't generally had code review comments with questions or pointing out mistakes which were never replied to. I sort of suspect the patch thing was some sort of internal excuse because the team involved wanted to make their own thing.
Regardless, I'm really happy they decided to open source their extensions, and it was nice to see that they put in effort to upstream core patches.
A lot of things should be solved by having (micro)caching in front of your wiki. Almost all non-logged in requests shouldn't even be hitting PHP at all.
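For illustration, a microcache in front of MediaWiki might look roughly like this in nginx (a sketch, not a tested config; the backend address, cache sizes, and session cookie name are assumptions -- the cookie name depends on your wiki's `$wgCookiePrefix`):

```nginx
# Cache anonymous page views for a few seconds; skip the cache entirely
# for anyone holding a MediaWiki session cookie (i.e. logged-in users).
proxy_cache_path /var/cache/nginx/wiki levels=1:2 keys_zone=wiki:10m max_size=1g;

server {
    listen 80;
    location / {
        proxy_pass http://127.0.0.1:8080;   # your MediaWiki backend
        proxy_cache wiki;
        proxy_cache_valid 200 5s;           # "micro": seconds, not hours
        proxy_cache_use_stale updating;     # absorb stampedes while refreshing
        # Bypass for logged-in users (cookie prefix is wiki-specific):
        proxy_cache_bypass $cookie_wikidb_session;
        proxy_no_cache     $cookie_wikidb_session;
    }
}
```

Even a 5-second TTL means a burst of a thousand anonymous requests hits PHP once instead of a thousand times.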
In my experience this hasn't been necessary yet on anything I've run. I know WMF wikis run Varnish or something, but personally I'm trying to keep costs and complexity minimal. To that end, more caching isn't always desirable, because RAM is at a premium on low-end boxen.

When tuned well, read-only requests on MediaWiki are not a huge problem. The real issue is keeping the FPM worker pool from getting starved, and when it is starved, it's usually not because of read-only requests but because of database contention preventing requests from finishing. (And to that end, enabling application-level caching usually helps a lot here, since it can save hitting the DB at all.)

PHP itself is plenty fast enough to serve a decent number of requests per second on a low-end box. I won't put a number on it since it is obviously workload-dependent, but suffice it to say that my concerns with optimizing PHP software usually tilt towards memory usage and database performance rather than the raw speed of PHP. (Which, in my experience, has also improved quite a lot just by virtue of PHP itself improving. I think the JIT work has great potential to push it further, too.)
The calculus on this probably changes dramatically as the RPS scales up, though. Not doing work will always be better than doing work in the long run. It's just that it's a memory/time trade-off and I wouldn't take it for granted that it always gives you the most cost-effective end result.
Varnish caching really only helps if the majority of your traffic is logged out requests. Its the sort of thing that is really useful at a high scale but matters much less at a low scale.
Application level caching (memcached/redis/apcu) is super important even at a small scale.
Most of the time (unless complex extensions are involved or your wiki pages are very simple), MediaWiki should be bottlenecked on converting wikitext -> html (which is why caching that process is important). Normally, if the db is healthy, db requests shouldn't be the bottleneck (unless you have extensions like SMW or Cargo installed).
Most of MediaWiki seems to avoid too much trouble with contention in the database, but I was seeing it prior to enabling application-level caching. It seemed to be a combination of factors primarily driven by expensive tasks in the background. Particularly complex pages can cause some of those background tasks to become rather explosive.
With Digital Ocean the cpuinfo is obfuscated so figuring out exactly what you're running on requires a bit more trickery. With that said I honestly assume that the comparison is somewhat older AMD against even older Intel, so it's probably not a great representation of how the battlefield has evolved.
That said, Digital Ocean is doing their customers a disservice by making the Premium Intel and Premium AMD SKUs look similar. They are not similar. The performance gap is absolutely massive.
One of the things on my todo list is to spend some solid time thinking about load-shedding, and in particular tools and methods for small or hobbyist projects to practice it. Like what do you turn off on the site when it's the 15th of the month and you're already at 80% of your SaaS budget?
Like maybe if a request for an image doesn't result in a 304, instead of sending a 200 response you redirect to lower res versions, or just 429 out. How much throttling do you do? And do you let bots still run full speed for SEO reasons or do you do something else there?
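The throttling ideas above could be sketched as a simple decision function (entirely illustrative: the thresholds, the bot exemption, and the response names are all made up here, not taken from any real system):

```python
def shed(budget_used: float, is_bot: bool, has_low_res: bool) -> str:
    """Decide how to serve an image request given how much of the
    monthly SaaS budget is already spent. Thresholds are illustrative."""
    if budget_used < 0.80:
        return "200-full"        # plenty of budget left: full resolution
    if is_bot and budget_used < 0.95:
        return "200-full"        # keep crawlers at full speed for SEO, for now
    if has_low_res and budget_used < 0.95:
        return "302-low-res"     # redirect humans to a smaller asset
    return "429"                 # out of budget: shed the load outright

# Early in the month everyone gets the real image:
assert shed(0.50, is_bot=False, has_low_res=True) == "200-full"
# At 85% spend, humans get the low-res redirect, bots still get the original:
assert shed(0.85, is_bot=False, has_low_res=True) == "302-low-res"
assert shed(0.85, is_bot=True, has_low_res=False) == "200-full"
# Past 95%, everything is throttled:
assert shed(0.97, is_bot=True, has_low_res=True) == "429"
```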
To be fair a lot of wikis' and internet cultural places' continuity woes would be mitigated by making it easier to decentralize hosting or at least do a git pull. Wikis especially don't tend to be that large and their S/N is quite high, making them attractive to mirror.
For example, I configured my osdev wiki (MediaWiki-based) so that the history and other special pages get the Cloudflare challenge, but just viewing a page doesn't trigger it. OpenAI and other bots were generating way too much traffic to pages they don't need.
Blame the bots that are DDOS'ing sites for the captchas.
Not a lawyer, I'm guessing here.
I'd assume the intention matters a lot. Scrape bots don't intend to cause trouble, they intend to get your data (for free). Same way as when some famous person tells people on Twitter to visit a website or when some poor blog gets the hug of death from HN. The intention wasn't to bring down the site.
Aside from that: is DDosing actually illegal (under US law)?
Right. Pretty sure it's illegal under EU law(s), and people have already been convicted for it (though yes, in cases where ill intent was proven). Why wouldn't it be illegal under US law? It's basically akin to vandalism.
(In other news, the Internet Archive got DDoSed today :(
At least they moved away from Google Captchas, which really hate it when you disable third-party cookies and other privacy-protection measures.
I haven't had a problem with Cloudflare and their new captcha system since they changed, but I still suffer whenever I see another website using Google Captcha :(
I used to have a lot of bot spam, but then I mostly foiled them with the world's silliest captcha. Looks like a math problem, but the solution isn't what's required to proceed.
The other side of the coin is lizards trying to literally end the internet era with their irresponsible behavior, and hell, making a nice living in the process
Cloudflare dropped captchas back in 2022 [0]; now it's just a checkbox that you check and it lets you in (or does not).
And this means that my ancient Android tablets can no longer visit many Cloudflare-enabled sites. I have very mixed feelings about this:
I hate that my tablets are no longer usable so I want less Cloudflare;
but also when I visit websites (on modern computers) which provide traditional captchas where you click on picture of hydrants, I hate this even more and think: move to Cloudflare already, so I can stop doing this nonsense!
It's a total block on _old_ tablets - Android 4.4 specifically, and I am sure many people on HN would be horrified to see those anywhere near the internet. New tablets are fine.
As for "more user-friendly captchas" - I have seen some of those (like AliExpress' slider) but I doubt they will work as well as hydrants. And with new AI startups (1) slurping all the data on the web and (2) writing realistic-looking spam messages, I am sure anti-bot measures would be more important than ever.
MediaWiki is trivial to cache, though. For all intents and purposes most hits will be cache hits, and thus "static" content.
I'm also shocked at the tens of thousands per month, it can't possibly be hosting alone. It has to be that the maintainer had a generous salary or something.
I could have the numbers wrong, archive.org is down otherwise I would check as we shared information publicly at the time. As far as I recall, we weren't taking money from the websites, we were spending on infrastructure alone with more than $10k in spend in the final month before the sites were acquired. I think it is easy to forget how much more expensive running things on the internet was back then along with the unprecedented popularity of Minecraft. Once archive.org is back online, I'll track down numbers.
Not everyone is a professional web hoster with the requisite knowledge of how to set up caching properly.
MediaWiki involves edits that users expect to propagate instantly to other pages. Sometimes this can easily result in cache stampedes if not set up carefully.
MediaWiki supports extensions. Some of the less well architected extensions add dynamic content that totally destroys cacheability.
Everyone is better off learning how to set up caching properly than continuing to pay tens of thousands of dollars per month. It's not rocket science.
> MediaWiki involves edits that users expect to propagate instantly to other pages. Sometimes this can easily result in cache stampedes if not set up carefully.
Most users should not even be hitting MediaWiki. It's ok to show cache entries that are a couple of seconds or even minutes out of date for logged out users.
> MediaWiki supports extensions. Some of the less well architected extensions add dynamic content that totally destroys cacheability.
Again, nothing reasonably needs to update all that often.
> For all intent and purposes most hits will be cache hits, and thus "static" content
That's not what static means in the context of hosting. Static means you upload files by FTP or WebDav or some other API and that's it. Something like hosting on S3. If users can log in, even if they usually don't, it's nothing like static any more.
Seriously? How does that even make sense to you? The OP had an asset generating 10k+ a month in profit and was so squeezed for cash he had to sell it?
Doesn't it make more sense that a media-heavy site would have been paying through the nose for bandwidth, hence the callout for Cloudflare, which would have made that cost free?
I have no idea how it works, but given that the read:write ratio is probably 100:1 or more, certainly it could just serve static, prerendered pages straight from the filesystem or something like memcached?
[I'm a mediawiki dev]. Typically people use Varnish for that use case. MediaWiki does support serving logged-out views from a filesystem cache, but Varnish is generally a better idea. There are also some caches in memcached: MediaWiki keeps a "parser cache" there, which is the part of the page that stays constant between all users. Typically people put Varnish on top of that to cache the entire page for logged-out users.
Sometimes people add things to their sites that are incompatible with caching, which will make hosting costs go way up.
I sold httpstatuses.com (website and domain) in... 2016 I think, maybe 2017, and the acquirers kept it as-is until 2022. Someone (not me) recently relaunched the site (as it was opensource) under a new domain -- https://httpstatuses.io -- so if you can replace ".com" with ".io" in your muscle memory, you can get the original site!
A few months ago I created a project called Hypertext Town, a simple project where anybody can create "camps" (a collection of HTML, images etc.) and connect them together through "towns". A town lives at a subdomain (e.g: town.hypertext.town) and a camp lives at /~camp (e.g: town.hypertext.town/~camp). I never "launched" it so it's just been languishing in obscurity on the www but if anybody wants to make cute little creative HTML websites without the need for hosting, it's live to use at: https://www.hypertext.town
1. click "Set up camp in www" 2. make an account 3. choose your camp name 4. add your html / images etc.
Although undocumented, there is a way to programmatically get the camps in a town; you can check the www town's source for an example of how I've done it. The endpoint is `/camps`.
Schools should do this, let their kids explore the joys of building their own personal online space, rather than the branded and conformed offerings they'll likely use instead (FB, Google, Pinterest, etc)
Yup. Sounds like less "technical" interface to a web server. I remember in high school we all had shell accounts on the school server, but you actually had to learn SCP and some basic Linux usage to make it work.
Considering how much google search results dictate whether you come across information or not though, I wonder if webrings could make a lot of the web more visible these days?
As I understand it the 250,000,000 number refers to total streams of all tracks on the album[1], it does not refer to plays of the entire album, so the article is wrong. The album is 20 tracks which means we're looking at 12.5 million streams per track. There were (and still are) many offers for a free Tidal trial, which were shared all over the internet at the time the album became available, so I do think it's possible that Tidal could have had those numbers. Paying subscribers? No, but users? You could sign up to Tidal for free, and Kanye albums are very highly anticipated... Is it reasonable to expect ~3 million people to each listen to an album ~4 times through on average? I certainly listened to it at least a dozen times in the first 2 weeks and know many friends who did the same.
For comparison Taylor Swift's album Reputation did ~500,000,000 streams in the first few weeks after it launched, granted that was available on all services and not just Tidal: http://www.bbc.co.uk/news/entertainment-arts-42564917
I'm a little suspicious of the numbers, given Taylor Swift has a wider appeal than Kanye and her album was on all services. But you could sign up for Tidal for free, and the article is clearly wrong when it says "a claim that would have meant every subscriber played the album an average of eight times per day", so I'm inclined to say Tidal are telling the truth: if they did inflate the numbers, they would not have needed to inflate them all the way from, say, 25 million to 250 million, which is what the report seems to suggest.
[1] See the BBC Taylor Swift article for an example of how the figures are calculated
At this stage, I think as far as for metrics of this nature, it is time to stop quantifying album streams and focus on streams of tracks within the albums (in the example you gave, someone streaming the entire album should count 20x). I, myself, still buy albums in full, but I think there's a significant number of people who go for singles at this stage. Streaming, probably just as much of a divide.
Focusing at the track level is about the only way to quantify both cases.
>it is time to stop quantifying album streams and focus on streams of tracks within the albums
They already do this (or at least the RIAA does)
>In the new structure, 150 streams of a song equals one paid download, and ten paid downloads equates to an album download. So, an artist’s music will have to be streamed on any of the approved, included services 1,500 times for an album “sale” to be counted. [1]
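Under the formula quoted above, the conversion is straightforward arithmetic (a sketch using only the 150-streams-per-download and 10-downloads-per-album factors from the quoted rule):

```python
STREAMS_PER_DOWNLOAD = 150   # 150 streams = 1 paid download
DOWNLOADS_PER_ALBUM = 10     # 10 paid downloads = 1 album download

def album_units(total_track_streams: int) -> float:
    """Album 'sales' credited for a given total of track streams."""
    return total_track_streams / (STREAMS_PER_DOWNLOAD * DOWNLOADS_PER_ALBUM)

# 1,500 streams of anything on the album = one album "sale":
assert album_units(1_500) == 1.0
# A 20-track album played end-to-end yields 20 streams per listen,
# so 75 full listens already add up to one album "sale":
assert album_units(20 * 75) == 1.0
```

Note how the track count matters: the more tracks an album has, the fewer full listens it takes to reach each 1,500-stream "sale".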
> ten paid downloads equates to an album download.
But it gets fuzzy here. If the album has 10 tracks, it makes sense, but for ones with fewer or more than that, it doesn't add up.
I'm just saying for these kind of metrics they should, now, stick to the track level but group them by album (ie Taylor Swift would have x streams/downloads from 1989 and y streams/downloads from Reputation).
Albums in these metrics just muddy the water too much and don't really account for those who go for singles. Say an artist releases a really, really popular song on a terrible album that no one buys/streams. That song performs well, lots of streams and lots of downloads or buys. Should the album be considered a hit if just a single track on it performed well? Even if no one, or relatively few, streamed or bought the album proper?
> for ones with [...] more than that, it doesn't hash out.
Chris Brown released an album last year exploiting this fact. He released a forty-track double album. If that album is streamed in its entirety a thousand times, he gets four sales on the charts. Many hip-hop acts did the same last year. The last Migos album had 24 songs, and their label had just released a 30-song compilation 2 months prior. Drake's last project was dubbed a 'playlist' and had 22 songs.
Well, if these others are exclusive Tidal releases, then those numbers might be a little loose but are relatively within the ballpark, because they've narrowed access down to a single source. The bigger question is whether artists benefit from being on every single service or are better served by narrowing releases to a single distributor, whether it's Tidal, something else, or their own website.
This is not what Tether is intended to be. Tether is not pegged to the value of 1 USD. The value of Tether does not need to be stabilized. Tether is a 1 for 1 USD backed coin, meaning it's entirely within the scope of Tether for the value of 1USDT to reach $1.10, $1.20 or even $1.50 -- if the market decides that is the value of it.
The point of Tether is that for each USDT there is a corresponding USD in an account belonging to Tether. If Tether were printing USDT to keep 1 USDT valued at 1 USD then they would be violating what Tether is publicly stated to be.
But if you can buy a new Tetherbuck straight from Tether for $1USD, issued in unlimited amounts, why and how could the value of a Tetherbuck ever exceed $1USD?
And on the flip side, if Tether redeems any Tetherbuck for $1USD, on demand and quickly and without solvency concerns, why and how could the value of a Tetherbuck ever drop below $1USD?
Any deviation from $1USD value means the market disbelieves one of the above things.
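The mint/redeem arbitrage described above can be sketched as follows (purely illustrative; it assumes frictionless 1:1 issuance and redemption at exactly $1, which is the mechanism under discussion, not a claim about how Tether actually operates):

```python
def arbitrage_action(market_price: float) -> str:
    """If USDT trades away from $1 while the issuer mints and redeems
    at exactly $1, an arbitrageur profits by pushing price to the peg."""
    if market_price > 1.0:
        # Mint new USDT for $1 each, sell at the higher market price;
        # the added supply pushes the market price back down toward $1.
        return "mint-and-sell"
    if market_price < 1.0:
        # Buy cheap USDT on the market, redeem each for $1;
        # the shrinking supply pushes the market price back up toward $1.
        return "buy-and-redeem"
    return "hold"  # at the peg there is no free profit

assert arbitrage_action(1.10) == "mint-and-sell"
assert arbitrage_action(0.95) == "buy-and-redeem"
assert arbitrage_action(1.00) == "hold"
```

Both legs only work if issuance and redemption are genuinely unlimited and prompt; remove either leg and the corresponding bound on the price disappears.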
Due to network congestion and manual processing,
we have closed ticket payments using
Cryptocurrencies — Hopefully, next year there
will be more unity in the community about scaling
and global adoption becomes reality.
We have, and always will, accept cryptocurrencies
for our conferences, up to fourteen days before the
event. However, due to the manual inputting of data
in our ticketing platforms when paid in cryptocurrencies,
we decided to shut down bitcoin payments for last minute
sales due to print deadlines.
This is a longer article on the subject of poorer health outcomes for minorities, covering the situation for black women and one family's experience, which is very much worth reading, but here are the key points:
Statistics:
* A black woman is 22% more likely to die from heart disease than a white woman
* ...71 percent more likely to perish from cervical cancer
* ...243 percent more likely to die from pregnancy- or childbirth-related causes
Reasons:
* Black women are more likely to be uninsured outside of pregnancy, when Medicaid kicks in, and thus more likely to start prenatal care later and to lose coverage in the postpartum period.
* The hospitals where they give birth are often the products of historical segregation, lower in quality than those where white mothers deliver, with significantly higher rates of life-threatening complications.
* [Black women] are more likely to have chronic conditions such as obesity, diabetes, and hypertension that make having a baby more dangerous.
* Black expectant and new mothers frequently told us that doctors and nurses didn’t take their pain seriously [...] numerous studies that show pain is often undertreated in black patients for conditions from appendicitis to cancer.
* An expanding field of research shows that the stress of being a black woman in American society can take a significant physical toll during pregnancy and childbirth.
* Black women are 49 percent more likely than whites to deliver prematurely (and, closely related, black infants are twice as likely as white babies to die before their first birthday).
But this seems to be a different situation from what the article is talking about, which is the inadequate involvement of minorities in drug trials. You are highlighting the lack of adequate health coverage, which is a different issue in the US. But this brings up a question: do black people and other minorities have poorer health outcomes in other countries? Japan, China, and South Korea, for example, do their own drug testing and development, so would they find that a particular drug doesn't work on their population at all? What about the black population in the UK or France? Do they also have poorer health outcomes?
Or generally if historically oppressed minorities get poorer health outcomes.
It seems a complex question that has many facets. Two points come to mind:
1. When US sociologists and political scientists talk about “systemic and structural racism” this is one of the manifestations.
That in general the well-being of Native peoples in the US and of African Americans is devalued. It is witnessed in the exclusion from drug trials, in the diseases that pharma considers worthwhile to address, in the staffing of hospitals, in the access to healthcare, etc.
2. The unique inefficiency of the US healthcare system among those of wealthy countries has been documented in depth, so I don't know if it is possible to do an adequate comparison of progressive health systems (e.g. Japan, UK, Finland) against that of the US.
It might be worthwhile to pull in progressive health systems that focus primarily upon Black people —- Botswana comes to mind —- as a point of comparison.
>When US sociologists and political scientists talk about “systemic and structural racism” this is one of the manifestations.
Which has always bothered me, since there are other explanations (lifestyle habits, genetics, poverty) that would explain the difference, in whole or in part. It's a politically convenient assumption that goes contrary to Occam's Razor.
I'm going to quote what someone said below because this is blatantly false
"This is not only false, but dangerously false. We are in the process of discovering that certain classes of popularly-prescribed drugs (eg ACE inhibitors for blacks, certain chemotherapy drugs for Asians) are ineffective or even toxic for populations not represented in the relevant drug development research cohorts. It's not identity politics to note that pharmacokinetics can differ between individuals and populations. These differences do not explain all of the population-level morbidity and mortality differences between ethnicities, but they are significant when investigating differences between groups on the same course of treatment."
Nothing you've said seems wrong to me, but the tone of how you write is too dismissive.
> since there are other explanations (lifestyle habits, genetics, poverty) that would explain the difference, in whole or in part.
This doesn't contradict the meaning of "structural, systemic racism"; rather, it explains it. When pharmaceutical companies are making drugs that are only effective for white people, and not researching effectiveness on black people, that's structural racism almost by definition.
Obviously pharma companies are responding to financial incentives, and if it's not profitable for a company to research treatments specifically helping a minority group then they're probably not going to. Less availability of pharmaceuticals makes treatment harder, making what is available more expensive or leading to complications that require further medical treatment (and cost more money); and those who choose not to get treatment will find themselves with further medical conditions later. In the end it would cost the minority more money, which they likely cannot pay for other systemic reasons, so more often they would be denied access to a hospital outright. Everyone involved is responding to natural incentives, but the net result still becomes [minority group] is neglected because of the color of their skin.
> Did you read the whole comment?
When you write like this it feels like you're attacking the character of the person you're talking to, which makes the whole conversation more toxic to follow.
I just finished reading "Color of Law", which, while not the best formatted book, gives a really really good overview of the systematic, government led, programs designed and enforced explicitly against african americans from the late 1870s until the 1980s. It's a pretty quick read and well worth it for anyone who thinks that "systematic racism" isn't real.
Black people have worse health outcomes controlling for obesity and poverty.
Blaming genetics for such a wide range of negative outcomes is silly if you're looking for a simple explanation. Consider this[1] paper on cervical cancer. There are nine different genes linked to it. Now do that for every relatively worse health outcome. That's the opposite of simple.
Lifestyle is even more nebulous. It encompasses so much that it's a kind of verbal jujitsu to suggest it's an adequate use of Occam's Razor. There are hundreds of lifestyle factors, and you can weight them however you want to get the result you want.
All of those are about as broad and as simple as racism, which you appear to categorically disregard as a plausible explanation.
>You dismissed one assumption and named three more. Please explain which applies to Occam's razor.
Because there is a direct correlation between observable characteristics like obesity and poverty to health outcomes for people of all races. If a fat, poor white woman in Appalachia has heart problems in her 40s and receives low quality care, is that a result of systemic racism?
And as others have pointed out, we don't know the entire scope of genetic effects, but we know they exist to some degree (which was the whole point of the article).
The centuries-long existence of slavery, segregation (which was brutal oppression, including lynching), and racism isn't an "assumption", but indisputable fact. Occam's Razor is not a real arbiter of truth, but in this case it cuts the other way: Racism is the simpler and blazingly obvious explanation, backed by endless research and even the most casual observation. You really have to work to contrive explanations that don't include systemic and structural racism.
>Racism is the simpler and blazingly obvious explanation, backed by endless reearch and even the most casual observation.
Things that are wrong can be obvious to individuals and groups of people. It's certainly not obvious to half the country, and that "endless research" is tainted. How long do you get to keep your job in academia if you point out the primary drivers of black misery in the US (out of wedlock births, drugs, and violence) are self inflicted?
> How long do you get to keep your job in academia if you point out the primary drivers of black misery in the US (out of wedlock births, drugs, and violence) are self inflicted?
To "point out" something (in academia or elsewhere), that something must be a fact.
It's certainly obvious to most of the U.S., and facts are not subject to a popularity vote regardless.
> "endless research" is tainted
Easily said, but completely unsubstantiated
> How long do you get to keep your job in academia if you point out ...
Can you substantiate that such a thing is true, and then answer your question? One thing that will lose you your job in academia is making intellectually weak, baseless claims. Academia doesn't run an affirmative action program to include all political ideologies; you have to actually have evidence and good arguments.
> How long do you get to keep your job in academia if you point out the primary drivers of black misery in the US (out of wedlock births, drugs, and violence) are self inflicted?
It's almost as if the sins of the past affects the lives of people in the present somehow.
It's almost as if people like to use the sins of others long dead to excuse their own shortcomings.
In any event, if free will isn't a thing, there's no point in trying to make the world a better place, right, so we should just leave things as they are?
You've been using HN primarily for political and ideological battle. That's an abuse of the site which destroys its main purpose, so we ban accounts that do it. Would you please read https://news.ycombinator.com/newsguidelines.html, take its spirit to heart, and use HN as intended from now on?
You are suggesting that there isn't widespread racism now? What knowledge or basis do you have for all this?
> if free will isn't a thing
So either there is no free will or there are no systemic problems? Are poverty in Somalia and Kyrgyzstan systemic issues, or is it just a failure of the people there that they don't live like people in the Bay Area? In the U.S., is poverty on Native American reservations, and among almost every group that isn't white men, just due to laziness? Society, health care, schools, the economy, racism, etc. - all have no effect?
Something like the negative effects of concentrated poverty seems to fit Occam's Razor well enough. "The most obviously-shared attribute amongst clusters of extremely poor minorities across the country is the clustering of poverty, and here are some potential causal ways this can lead to different lifestyle habits, different levels of education, different access to health care, etc."
That seems way more likely to me than oft-hinted-at-by-the-"politically-incorrect" "here's a cluster of people who all made the same bad decisions or were victims of the same bad luck in the same way, for no underlying reason other than genetic factors also associated with the color of their skin." That's pretty damn "politically convenient" if you're not in the minority population, too - "hey guys, it's not our fault! They just suck!" Hard to imagine something more politically convenient to the lucky than that.
>That in general the well being of Native peoples in the US, of African Americans is devalued. It is witnessed in the exclusion in drug trials, in the diseases that pharma considers worthwhile to address...
Speaking about medical research specifically:
You seem to say these outcomes are because of white racists "devaluing" black and native peoples' well-being in an evil act of collective racism.
A simpler explanation would be: There are far fewer blacks and natives than whites in America, which means fewer sick people to help and fewer customers, which means their unique illnesses get less research focus.
The same effect happens between common diseases and rare diseases irrespective of race. Common diseases get studied first, because that's where the most good can be done. This isn't because the well-being of people with rare diseases is "devalued".
Frankly I'm concerned you jump so quickly to a mass accusation of collective racial evil (which echoes historic hatred against other high-performing ethnic groups) when a simpler explanation is so obvious.
When people try to explain why you're coming to broken conclusions due to broken reasoning, they get attacked as radical leftists for using the straightforward terminology we have for describing the phenomenon we're discussing.
Here, you've provided a perfect illustration of why we have the term "structural racism". Structural racism is the emergent discrimination arising from the circumstances that created our status quo. You'd think an audience of computer scientists would have an especially good intuition for emergent systems properties.
Here's a simple explanation for how African Americans can be discriminated against in health care without any of the doctors or nurses involved having overtly racist impulses:
Until the nineteen seventies --- within many of our conscious lifespans! --- African Americans were actively, overtly, deliberately discriminated against in real estate. They were redlined out of white neighborhoods and into low-income neighborhoods. Naturally, once real estate lenders would allow them to buy houses in any neighborhood they wanted, African Americans of means began buying houses anywhere they wanted. Unlike low-income "white" people, low-income "black" people were stuffed into neighborhoods that were first deliberately underfunded, and then further disinvested by the vicious cycle of neighborhood flight --- like a run on a bank.
The hospitals, doctors' offices, pharmacies, and medical service providers available in those neighborhoods are poorer than those in white neighborhoods due to disinvestment.
The unbelievably awful people who designed and executed on redlining are probably long retired by now. Many of them are no doubt deceased. Most of us would recoil from racial barriers in real estate lending. We all believe ourselves to be well-intentioned. Samuel L. Jackson has a retort to our best intentions.
Grandparent has been deleted and I can't see it, so I'm not sure what their post was; I'm aiming this response more at the terms being used and their underlying meaning.
>When people try to explain why you're coming to broken conclusions due to broken reasoning, they get attacked as radical leftists for using the straightforward terminology we have for describing the phenomenon we're discussing.
How much of this is caused by people having past experience with the selective application of different lines of reasoning?
For example, take the legal system's racial and sex-based discrimination. If we look at racial discrimination, it should be pretty clear that minorities have it much worse than whites, and there is a lot of research on this. If you then look at it by gender, it appears there is even stronger discrimination based on gender than on race, with males much worse off than females (and a minority male receiving the worst of each). But the treatment of this online seems quite different. While it is a personal anecdote, on multiple occasions I've been told that the racial discrimination is caused by structural racism against minorities that treats them worse than whites at every step of the system (from being more likely to be stopped and searched, to being more likely to be convicted given equal evidence, to receiving harsher sentences), and then been told that the gender discrimination is caused by sexism against women, resulting from the legal system treating women as children every step of the way (meaning they are less likely to be stopped and searched, less likely to be convicted, and receive less time). These seem like polar opposite lines of reasoning, yet I've seen both used at the same time.
I think it is at this point that you get people who become opposed to the underlying reasoning, because it appears that the group using it is starting with an assumption and then picking the logic that best fits that assumption. And I think many of the people you encounter online who use this reasoning are doing just that. People of every political and other leaning like to manipulate data to fit their world view. Combined with a lack of exposure to the actual scientists who work on this, it can color people's view of the language. To say nothing of scientists being human, and thus there being examples of scientists being very non-scientific about some issues (while I don't know of any examples on this particular issue, I did read through a case of correspondence published in a scientific journal, dealing with the classification of certain behaviors as mental illnesses, where some scientists were making very indefensible arguments about evolution, for which numerous counterexamples were available, that basically boiled down to "there is no way trait X evolved because it isn't reproductively advantageous in our environment").
And to be clear on my own stance, I do think that systemic racism exists in our current system, including in sub-systems where there are no racist members. There are agent models showing that with even a small in-group bias, completely devoid of any out-group bias, you can have a system where out-group bias is apparent. For example, a system of entities of types A and B, where A's have a certain preference for grouping with other A's but no preference against grouping with B's, ends up behaving similarly to a system where A's have a preference for not grouping with B's.
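The agent-model dynamic described above is essentially Schelling's segregation model, and it is small enough to sketch directly. Below is a minimal one-dimensional toy version (all parameters here — grid size, the neighborhood radius of 2, the 0.5 satisfaction threshold — are illustrative assumptions, not taken from any particular study): A agents move when fewer than half of their neighbors are also A, B agents are completely indifferent, and the population still tends to sort itself.

```python
import random

def neighbors(grid, i, radius=2):
    # occupied cells within `radius` positions of index i
    lo, hi = max(0, i - radius), min(len(grid), i + radius + 1)
    return [grid[j] for j in range(lo, hi) if j != i and grid[j] is not None]

def unhappy(grid, i, threshold=0.5):
    # Only 'A' agents have a (mild, purely in-group) preference:
    # they are unhappy if under `threshold` of neighbors are 'A'.
    # 'B' agents have no preference at all.
    if grid[i] != 'A':
        return False
    ns = neighbors(grid, i)
    return bool(ns) and ns.count('A') / len(ns) < threshold

def step(grid, rng):
    # move one randomly chosen unhappy agent to a random empty cell
    movers = [i for i in range(len(grid)) if unhappy(grid, i)]
    empties = [i for i, c in enumerate(grid) if c is None]
    if not movers or not empties:
        return False
    src, dst = rng.choice(movers), rng.choice(empties)
    grid[dst], grid[src] = grid[src], None
    return True

def mixing(grid):
    # fraction of adjacent occupied pairs that are cross-type;
    # lower values mean more segregation
    pairs = [(a, b) for a, b in zip(grid, grid[1:]) if a and b]
    return sum(a != b for a, b in pairs) / len(pairs) if pairs else 0.0

rng = random.Random(0)
cells = ['A'] * 30 + ['B'] * 30 + [None] * 15
rng.shuffle(cells)
before = mixing(cells)
for _ in range(2000):
    if not step(cells, rng):
        break  # everyone satisfied
after = mixing(cells)
print(f"cross-type mixing: {before:.2f} -> {after:.2f}")
```

Typically `after` comes out well below `before`: clustering emerges even though no agent ever avoids the other type, which is the point being made about out-group-looking outcomes from in-group-only preferences.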
> These seem like polar opposite lines of reasoning,
The problem is you are viewing them as reasoning about the cause from the effect alone rather than reasoning about the cause from a combination of the effect and masses of historical evidence.
This is a question in response to GP's post that:
>>masses of historical evidence
>The ones that show that being a male has long been a major disadvantage in a legal system?
This is a proposed answer to the previous question, making the claim that there is plenty of historical evidence that males have been strongly discriminated against by the legal system, even if we go back in time. Need I source a claim showing that men were more likely than women to be charged, convicted, and receive longer sentences? This has been the case for as far back as I've looked.
The reason for this is that GP acted as if I was ignoring some evidence that would justify the argument that the legal system is biased against women because it goes easier on them, and somehow make it compatible with the second argument that the legal system is biased against racial minorities because it treats them harsher. It shouldn't be hard to see that these things still appear to be in contradiction.
>Yes, going back into the past being a minority was even worse than it is today, to a point where there was absolutely no justice at all
This is to preempt a response that if you went back in time, the legal system was even more biased against minorities than it is today. Instead of waiting for that response to be potentially made, I made it myself. I preempt this based on past experiences of seeing the point made in counter to my point.
> but that doesn't have an impact on the line of reasoning used to try to say the legal system discriminates against women.
I then follow up saying that I don't see this as a counter, because it doesn't impact the half of the two statements I have a problem with. One could try to explain how racial minorities being treated worse by the legal system in the past supports the claim that the legal system going easier on women is discriminating against them, but the bare claim in the first half of this sentence is not enough on its own.
>This type of response only further reinforces the notion that the underlying reasoning and terminology is created ad-hoc to justify existing notions.
I then finish by saying that this type of response from GP, which does not explain their argument beyond claiming I forgot to take historical evidence into account, makes people more dismissive of the original line of reasoning and the terminology associated with it, that of systemic discrimination, because the historical evidence appears to support my claim, not theirs. In short, an unsatisfactory defense strengthens the opposition's argument.
From what I can tell, the person who introduced the (unrelated) gender discrimination issue to this thread is you. As to the rest of your comment: I didn't doubt that any of what you had to say was important to you. I just don't think it has anything to do with what I said upthread.
You completely missed the point. I specifically put statistics in my article to refute this.
In many clinical trials, African Americans, who contract various conditions at the same rate as or higher than White Americans, represent only 1% of the trial participants versus 15% of the population, while White Americans make up 95% of the trial but only 60% of the population. There is clearly a disparity here.
I also use the word "neglect": they haven't recently purposefully ignored minorities (although in the past they did), but the people who want to run the trials put them in the neighborhoods (read: mostly white populations) that they have worked with before and want to cover. Therefore trials aren't being run where minorities live.
This is where the "systemic racism" comes into play.
> Frankly I'm concerned you jump so quickly to a mass accusation of collective racial evil (which echoes historic hatred against other high-performing ethnic groups) when a simpler explanation is so obvious.
Do you live in America? Are you familiar with US history? If both of these are true, then I don't understand why it's so hard for you to grasp the fact that much of this was done out of malice.
> Japan, China, South Korea for example does their own drug testing and development so would they find that a particular drug doesn't work on their population at all?
Yes, it happens sometimes that certain drugs approved in the US fail to make it through Japanese clinical trials. A recent one I can think of is Prozac:
We haven't done extensive research in populations outside of the United States, but for the last 50-100 years the US (and Europe) has been leading the charge, and in both these locations minorities have made up a low percentage of research participants.
As well, many of these outcomes he mentioned are the indirect result of years of research. Think of diabetes: 100 years of research, 99% of it spent on a white population, covering prevention, diagnosis, and treatment. Populations that aren't white see detrimental effects that add up over time.
> We haven't done extensive research in populations outside of the United States, but for the last 50-100 years the US (and Europe) has been leading the charge
I'm confused, I thought that over the last ten years a lot of the medical research formerly conducted by U.S. pharma companies has been outsourced to SROs, who in turn test their drugs on poor people in India or whatever. Is that not actually the case?
That only happens for certain drugs, typically treatment-naive diabetes, IBD, and other chronic disease patients, where in the US there's a standard treatment the patients receive right away, so companies have to go overseas to find untreated patients. That being said, those treatments often still have to come back to the US to run trials later on for approval on the population here.
There is a much longer answer to that question. SK and Japan only make up 5% of the Asian population; many parts of Asia are not well tested for these drugs. We're actually in the process of working with a large pharmaceutical company on a multi-country diabetes prevention trial in Asia, because traditionally they have not been tested on as much as necessary. As well, the companies there will run many of their trials in the US as well as Asia, but our larger population (along with running the trials on mostly white continents such as Australia and Europe) still causes quite a large discrepancy. We could do a whole other article on this topic.
Japan retests, and interestingly recreates the same problems in a local way.
Getting medical checks as a foreigner or minority in Japan means the guidance numbers are mostly irrelevant. Doctors might not know very well how to deal with you, and a lot of advice for common but non-critical illness can be summed up as "you do you".
That was a fascinating, and heartbreaking, read. The statistics are astounding. I'm disappointed though that the author completely omitted any discussion of single-parent families. 66% of black families are single-parent vs. only 25% of white families. I would think that would have a huge impact on well-being and health of the mother, in general. Even just in terms of making time for doctor's visits, etc.
Sure, tons. A simple Google search turns up reams of sources (and articles about sources). A few random readings show studies on single-mother families having the highest rates of poverty, increased rates of smoking, lower health, etc.
The fully explored answer to that is an article (or book) in and of itself, I can't cover every single outcome but highlight some prominent ones in a 2000 word article.
I was actually referring to the ProPublica article linked in the parent comment of mine. I believe it's much longer than yours so I was surprised it skipped such an alarming fact. I can certainly appreciate the tradeoffs of what to mention in only 2,000 words!
Then why is asthma medication less effective on Latinos and African Americans? And why did my friend have to visit 5 doctors, and it wasn't until he found a Black doctor that his skin condition (which is common only to African Americans, regardless of weight) was properly diagnosed and treated?
I'm not saying weight is not a problem. I come from a black family where unhealthy diets are a tradition (but that leads back to the fact that traditional african american diets come from the slave food which was unhealthy but taken in as cultural meals, a much longer discussion there), but that does not mean that reducing this discussion to weight solves all or even most of the issues at hand.
Also, by your same logic: why do Asians, who typically have a lower BMI than white individuals, have higher incidences of diabetes? It's not one size fits all.
> that leads back to the fact that traditional african american diets come from the slave food which was unhealthy but taken in as cultural meals
Is this really true? What foods did slaves in the US eat that are still regularly eaten today? What about black Americans that didn’t descend from slaves, are they not affected?
Is it not more likely that this effect is due to economic reasons (ie in recent years low quality food is significantly cheaper to obtain)
It is an indirect correlation. BMI is a factor related to the likelihood of having diabetes, and what's considered a safe range for white people is an unsafe range for Asians. Thus, if you have a white person at a 25 BMI and an Asian person at a 25 BMI, the Asian person has a higher likelihood of getting diabetes.
"The educated [Asian] population knows that they're getting diabetes and hypertension and all these things at a much lower BMI, but if you're in a culture where everybody's really fat and you're thin, you tend to go around and think, 'Well, I'm protected,'"
> However, the impact of increasing BMI on risk of hypertension and diabetes was significantly greater in Asians. For each one unit increase in BMI, Asians were significantly more likely to have hypertension (OR 1.15; 95 % CI 1.13–1.18) compared to non-Hispanic whites, blacks, and Hispanics.
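To make concrete how that quoted per-unit figure adds up: odds ratios compound multiplicatively across units, so an OR of 1.15 per BMI unit implies roughly doubled odds of hypertension over a 5-unit BMI gap. (The 5-unit gap below is an illustrative figure of my own, not from the quoted study.)

```python
# Per-unit odds ratio from the quoted study: OR 1.15 per BMI unit
# for hypertension in Asians vs. non-Hispanic whites.
or_per_unit = 1.15

# Odds ratios compound multiplicatively, so a hypothetical 5-unit
# BMI difference implies odds scaled by 1.15 ** 5.
or_5_units = or_per_unit ** 5
print(round(or_5_units, 2))  # about 2.01, i.e. roughly doubled odds
```

This is why a difference that looks small per unit can still separate risk substantially between populations whose "normal" BMI ranges differ by several units.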
>Remove any notions of race and compare the stats based entirely on equivalent weight/BMI I you will find nearly all differences would disappear.
This is not only false, but dangerously false. We are in the process of discovering that certain classes of popularly-prescribed drugs (eg ACE inhibitors for blacks, certain chemotherapy drugs for Asians) are ineffective or even toxic for populations not represented in the relevant drug development research cohorts. It's not identity politics to note that pharmacokinetics can differ between individuals and populations. These differences do not explain all of the population-level morbidity and mortality differences between ethnicities, but they are significant when investigating differences between groups on the same course of treatment.
The same is true for men and women. The narrative we like to repeat is that men and women are the same except for the shapes of their genitals, but there are numerous biochemical and metabolic differences that should affect dosages for several classes of medication [1]. The real tragedy is that many drug trials were never done with women, so we may not even be sure of what the doses should be [2]. The same problem exists with children. These are not simple body weight issues.
I feel like the problem is worse with children, due to their developing brains: medication can end up changing the very person taking it. I am especially concerned with medication for mental illnesses prescribed to children, often off label, but the risk exists for almost any medication.
You're incorrect about hours. WeWork provides 24-hour access at your assigned office to all members except On-Demand members, who pay per day to access a WeWork location. Hot Desk, Dedicated Desk, and Private Office all provide 24-hour access. Front Desk / Community staff are usually present only during 9-5, but otherwise there's no difference between 11am on a Monday and 11pm on a Sunday -- I am typing this from my Dedicated Desk at 19:53.
You can see this information on the WeWork Plans page (https://www.wework.com/plans), and you can see events listed after 5pm at the WeWork Irvine location, demonstrating it is not an outlier and is definitely open after 5pm.
I'm also surprised that wasn't mentioned as they advertise a lot on YouTube too, they have over 200,000,000 combined views on their YouTube adverts. Paid advertising is a major component of their strategy.
The first rule of paid advertising is don't talk about paid advertising. Customers don't like to think they "fell" for an ad, and the advertising is often secret sauce. It is pretty easy to duplicate a technical product, and actually reasonably easy to copy the advertising. The big difference is that most tech/product guys look down on advertising. Also, many make the mistake of pricing lower than the competitor, which means lower margins to buy advertising, losing every ad auction, and never getting off the ground.
They were also very aggressive with their YouTube ads this year. For an extended period, every second ad I saw on YouTube was theirs (I live in Sweden).
"Their balance sheet is appalling, and I have no idea who would loan this company the money to purchase the land, as it goes against almost every underwriting principle. The company brought in $378k in revenue last year and had an operating loss of $1.8 million."
They have no water rights sufficient to create a farm in the middle of the desert. They bought a town which only draws enough water for personal use and some small gardens.
It's also on the way to Las Vegas, but requires you to detour on a dirt road for a significant distance. That's not going to happen -- it's simpler for tourists to just continue on their way to Las Vegas.
Calling Nipton a "town" is a bit of an exaggeration, too. From what I'm seeing in the satellite photos, it's barely a wide spot in the road, in the middle of the desert, near what might have once been a train station. I've seen truck stops larger than this.
(And I'm not exaggerating when I say "in the middle of the desert". It's right on the edge of the Mojave National Preserve. This is not a good place for anything, let alone for agriculture.)
"They have no water rights sufficient to create a farm in the middle of the desert."
They may have no water sufficient to create a farm in the middle of the desert, but I don't think it's a problem of water rights.
As a Colorado native I have a good sense of water rights, but as a farmer in California I have been (pleasantly?) surprised by how little restriction or ownership there is attached to water here ... it is very unlikely that the water rights are not attached to their parcels, and it's possible that they were never detached at any point.
All they have to do is dig a well. Maybe nothing will come out, but nobody will stop them ...
I mean, if they want some legal weed (and gambling), there is Primm at the same distance from the off-ramp you'd be taking anyway: 10 miles on a small road vs. 10 miles on a freeway to Primm.
This has all the smell of a scam. The journalists at various papers disgorging this thinly veiled PR release as if it's news should be put on your shortlist of garbage journalists.