This honestly feels like you didn't even read the article and are responding to what you think it says. If I'm wrong I'd be happy to have you clarify why Molly is wrong, instead of just generally asserting that people misunderstand.
> It doesn't store biometric data.
She addresses this: they do in fact store the biometric scans if you opt in, and they strongly encourage you to opt in because if you don't you'll have to periodically reverify as they tweak the algorithm.
> It doesn't store any information that is useful to anyone for any purpose. The only thing it can do is tell if your iris has been scanned before. That's it. It can't reproduce what your iris looks like and can't sell any useful data about it to anyone.
As Molly points out, you're making a huge assumption that this number that uniquely identifies your iris isn't useful information to sell to someone (or for someone to hack).
> Yes, people will sell accounts. This is fine because it still solves the problem of people being able to make infinite accounts online at present. It still creates a barrier of entry for spam where there currently is often very little or none. Inauthentic behavior online will continue but not at the rampant pace it currently has.
This is where the project really needs to figure out what it's actually trying to do. If the goal is simply to reduce inauthentic behavior on the ETH chain, then it's possible that you are right that the sale of accounts doesn't matter. But if the goal is to provide some sort of UBI system, the fact that there's no way to verify who is using an account after its initial creation is a huge problem that will lead to massive corruption and harm if they succeed at implementing the kind of worldwide UBI they're talking about. Just look at what happens to humanitarian aid that goes into territory controlled by warlords: that's what we're talking about.
Again, Molly addresses this, so it feels like you didn't read her article.
You're right, I didn't read it. I'm responding to the comments in general on this article and others, which are full of misconceptions. I didn't bother with the article itself because every single one I've read is also full of misconceptions, incomplete information, uncharitable interpretations, or unsubstantiated personal attacks against Sam Altman.
> As Molly points out, you're making a huge assumption that this number that uniquely identifies your iris isn't useful information to sell to someone (or for someone to hack).
How would someone possibly use the information that you've already scanned your iris to your detriment? How does that benefit anyone?
Let me know when you've read the article, and I'll reply to your question if you're unsatisfied with her answer. I'm not going to rehash any more of the points that Molly already made.