
Would you mind sharing your site?



> Service robots depicted in the study were the same except for the use of gray or pink colors to portray male or female genders, respectively. After receiving a menu recommendation, participants then rated the robot’s persuasiveness.

So they found that different colors have an effect on how persuasive people perceive robots to be? And then they just claim that this is due to the genders people associate with the colors? Their metric of a self-reported sense of power also seems dubious to me.

This looks like junk science to me.


>To alter the gender portrayed in robots during the second study, the researchers used an iPad display showing different gendered facial features that topped a Bear Robotics Servi robot, which does not have any typical human-like features of its own. These facial features had “cute” designs, including round faces and big eyes.

Maybe read the article?

It's well studied that waitresses get tipped at a higher rate than waiters, so the idea that these human behaviors would carry over to semi-anthropic, semi-gendered robots doesn't really seem like a big surprise to me.


Also how is gray a male color? If anything I’d say gray is non-binary.


Things like this are generally assumed to be male unless there is a reason to gender them female, like being colored pink or having a bow or hearts drawn on or something. It may be sexist, but it's what people assume.


I don't think ISP caching would be a thing even without https. It would bring a lot of additional complexity and resource requirements for them. I can hardly imagine that being worth it to save some bandwidth. Maybe it made sense in a world where bandwidth was very limited.

Also I am very happy that it is not a thing and that ISPs cannot do that. When I go to a website I want to get the website from the webserver exactly as the server delivers it and not some other page that my ISP thinks is how the website should look.

Besides with global CDNs we have something very similar but better anyway. I don't get the site from the other side of the world but from the closest CDN server that does caching. The important difference is that the CDN server is authorized by the website to cache the page and the webmaster has control over what it does.


> I don't think ISP caching would be a thing even without https. It would bring a lot of additional complexity and resource requirements for them. I can hardly imagine that being worth it to save some bandwidth. Maybe it made sense in a world where bandwidth was very limited.

Transparent squid proxies were common back when most sites were on http. They let ISPs reduce the use of their limited upstream bandwidth, while also making sites load faster. The complexity and resource requirements were modest: install squid on a server, and configure the router to redirect (masquerade) all outgoing TCP port 80 connections to the port configured for squid on that server.


The internet is much bigger, more diverse and more complex today. You need a lot of storage to get any meaningful impact. Caching the HTML of Wikipedia won't get you much; you need to cache lots of YouTube videos. Or you just get them from the data center you peer with over the fat link you built.

Along with bandwidth usage, the diversity of the data retrieved over the internet has also gone up. You can't just cache the few most popular websites and save most of the bandwidth. But bandwidth capacity has scaled a lot, so you probably don't need to.


> When I go to a website I want to get the website from the webserver exactly as the server delivers it and not some other page that my ISP thinks is how the website should look.

You could have some hash check to prevent hijacking. The old method would be naive today.

There would be some privacy concerns, I guess. But it could be opt-in on the site owner's part. I think caching some videos and pictures would save a lot of power.

> Besides with global CDNs we have something very similar but better anyway.

Sure but they are some switches away.


> You could have some hash check to prevent hijacking. The old method would be naive today.

But how do you know that the cached site is up to date? How does the ISP know that? What about dynamic content? What about consistency between different requests that are part of the same page load?

> Sure but they are some switches away.

My point is that this does not matter much. Usually, at least outside of sparsely populated regions and with modern infrastructure, these switches are close and there is plenty of bandwidth capacity.

I just don't think it makes sense for ISPs to save bandwidth on these links by building their own local data centers when they peer with a CDN data center anyway.


The root HTML itself would not be cached; it would carry the hashes for the non-dynamic payloads and so govern whether the cached copies are up to date. Etc.
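
A minimal sketch of how such a check could work, assuming a scheme roughly in the spirit of Subresource Integrity (the hash comes from the uncached root HTML served by the origin, and the client only accepts an ISP-cached payload if it matches):

    # hypothetical sketch: verify an ISP-cached payload against a hash published
    # in the (uncached) root HTML, similar in spirit to Subresource Integrity
    import hashlib

    def cache_copy_is_valid(expected_sha256_hex: str, cached_body: bytes) -> bool:
        # accept the cached copy only if it matches the hash the origin published
        return hashlib.sha256(cached_body).hexdigest() == expected_sha256_hex

    expected = hashlib.sha256(b"static payload").hexdigest()   # as listed in the root HTML
    print(cache_copy_is_valid(expected, b"static payload"))    # True
    print(cache_copy_is_valid(expected, b"tampered payload"))  # False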

It would be interesting to know how much bandwidth would be saved by caching X GB of the most downloaded films, pictures and text files at a neighbourhood level.

In the 90s and early 00s I think the share was way bigger than it would be now.


Just a guess, but Windows executables probably depend on a bunch of Windows APIs that are guaranteed to be there, while Linux systems are much more modular and do not have a common, let alone stable, userspace ABI. You can probably get small graphically capable binaries if you depend on Qt and just assume it to be present, but Flatpak deliberately does not do that: it bundles all the dependencies so it doesn't rely on shared OS dependencies outside of its control. The article also mentions that AppImages can be smaller, probably because they assume some common dependencies to be present.

And of course there are also tons of huge Windows software that come with all sorts of their own dependencies.

Edit: I think I somewhat misread your comment and progval is more spot on. On Linux you usually install software with a package manager that resolves dependencies and only installs the ones not already satisfied, resulting in a small install size in many cases. On Windows that is not really a thing: installers just package all the dependencies they cannot expect to be present, and the portable version does the same.


You could do a GDPR request for the data they have on you. I would be curious what they save. Keep us updated if you want to.


> Unfortunately, using an advanced analytics package I’ve projected that around May 2026 the YouTube homepage will just be one video, and by September there will be no videos at all on the homepage.

Isn't that exactly what already exists with TikTok?


It already exists on YouTube under the Shorts tab, which is just "we have TikTok at home".


I don't think things have to be impressive to be shown. A funny little idea is all you need, no matter how simple the code. Actually I find exactly that quite neat.


> Rate Limit: 10 requests per minute per IP

I understand that one wants some rate limiting so that others don't just use this as a backend for their own service, with every single request to their service also creating an API request here. But this is about as simple and resource-light as an HTTP server gets. 10 requests per minute is just silly.

Also could it be that the limit isn't enforced against the origin IP address but against the whole Cloudflare reverse proxy?
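
For what it's worth, that is an easy mistake to make behind a reverse proxy: a limiter keyed on the socket peer address only ever sees Cloudflare's edge IPs. A minimal sketch of the difference, purely hypothetical and not the actual service's code (the helper names are assumptions):

    # hypothetical sliding-window per-IP rate limiter behind a reverse proxy;
    # not the actual service's code
    import time
    from collections import defaultdict

    WINDOW_SECONDS, LIMIT = 60, 10          # "10 requests per minute"
    hits = defaultdict(list)

    def allowed(key: str) -> bool:
        now = time.monotonic()
        hits[key] = [t for t in hits[key] if now - t < WINDOW_SECONDS]
        if len(hits[key]) >= LIMIT:
            return False
        hits[key].append(now)
        return True

    def client_key(peer_addr: str, headers: dict) -> str:
        # keying on peer_addr lumps all visitors behind the same Cloudflare edge IP together;
        # the real client IP arrives in a header like CF-Connecting-IP, which is only
        # trustworthy if requests really come through that proxy
        return headers.get("CF-Connecting-IP", peer_addr)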


Mate, it's a joke, not a serious service. The only silly thing here is going off on a tangent about the rate limit.


10 requests per minute per IP is plenty enough to play around with and have a little fun. For anything more than that you could (should!) host it yourself.


So it is just purposefully made to be less useful? Is that part of the joke?

The rate limit still almost certainly isn't applied per IP.


If it helps you, think of the rate limiter as the "no" final boss.


With the female contraceptive pill, 0.3% of women get pregnant within a year if it is taken perfectly (which is rarely the case, but the figure here is also from lab conditions). This drug leads to a 1% pregnancy rate within 4 weeks, which is much worse.
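
To put the two figures on the same timescale, here is a back-of-the-envelope extrapolation; it assumes the 4-week failure probability is independent across periods, which is a simplification:

    # back-of-the-envelope; assumes independent 4-week periods (a simplification)
    p_4wk = 0.01                    # 1% pregnancy rate within 4 weeks
    p_year = 1 - (1 - p_4wk) ** 13  # about 13 four-week periods per year
    print(round(p_year, 3))         # ~0.122, i.e. roughly 12% per year vs 0.3% for the pill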

That said, this is still great news, especially as the condom is also much less safe than the female contraceptive pill.


Condoms are also extremely effective if used perfectly, which is rarely the case. Statistics should be based on real-world experience, not the theoretical best case.


Wikipedia gives a 2% pregnancy rate for condoms within one year with perfect use, which is much worse than 0.3%.

For this new drug we only have the numbers from a lab environment with, I assume, perfect use, so we can only compare perfect-use figures.


I've always been intrigued by where those 2% come from, since condoms are a physical barrier... Teleporting sperm?


The statistics show that the 2% comes from breakage/slippage, micropores and manufacturing defects:

https://www.contraceptionjournal.org/article/S0010-7824(04)0...


Mostly misuse. The studies are all nonsense AFAIK: they rely on the participants to use the condoms correctly, which predictably doesn't happen. Problems the studies mention, like slippage and breakage, are the result of misuse, probably due to choosing the wrong condom size.

The only reason for a properly used condom to not work would be a manufacturing defect, which should be extremely rare, certainly not 2%; that's plainly ridiculous and immediately disqualifies any study that claims so.


The statistics are based on real-world experience rather than the theoretical best case. Not to pick on you, but I'm really surprised to hear people confidently express so much misinformation on this topic when it's not even particularly hard to find information on it:

https://en.wikipedia.org/wiki/Comparison_of_birth_control_me...


What throws me off about this is that it's not difficult to use them perfectly. Condoms are the only method I trust. Pull-out? You can screw it up. With birth control you're (until now) relying on your partner to use it perfectly.


It's worth measuring both ideal and real-world use, IMO.


> In male mice, the drug caused infertility and was 99% effective in preventing pregnancies within four weeks of use.

I don't know if they mean a 99% reduction compared to normal or that 99% of mice did not cause a pregnancy. Either way, this does not mean that every intercourse has a 1% chance of causing a pregnancy. Also, you are assuming an unconditional probability; it could very well be a conditional one. The drug might work completely for the 99% of mice that do not cause any pregnancies at all and not work at all for the 1%, which then cause pregnancies at the same rate as without the drug.
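
To illustrate the difference with made-up numbers: both readings below produce roughly the same aggregate pregnancy count, yet the risk attached to any single mating is completely different.

    # made-up numbers, purely to illustrate the two readings of "99% effective"
    import random
    random.seed(0)
    n_mice, matings, base_rate = 1000, 10, 0.8

    # reading 1: the drug uniformly cuts every mating's pregnancy chance to 1% of normal
    uniform = sum(random.random() < base_rate * 0.01
                  for _ in range(n_mice * matings))

    # reading 2: 99% of mice become fully sterile, the remaining 1% are unaffected
    mixed = sum(random.random() < base_rate
                for _ in range(int(n_mice * 0.01) * matings))

    print(uniform, mixed)  # similar totals (~80 each), very different risk per individual mating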

Anyway, I am looking forward to getting the Pearl Index for humans from clinical trials.

Edit: fixed wrong wording

