Very fair point/question; I should have drawn this link explicitly, because my comment was ambiguous and made (bad) assumptions about shared context.
The relevance, IMHO, is that this bill is largely an ossification at the government level of the safety and alignment philosophy of the big corps, who I'm guessing mainly wrote it. It's not the specific words "safety and alignment" that matter, it's the philosophy.
If the bill were only covering AI killing machines I'd (probably) be in agreement with it, but it seems significantly more overreaching than that.
>If the bill were only covering AI killing machines I'd (probably) be in agreement with it, but it seems significantly more overreaching than that.
Just to make sure we are on the same page: my main worry is the projects ("deployments"?) that aren't intended to kill anybody, but where one of those projects ends up killing billions of people anyway. It probably kills absolutely everyone. That one project might be trying to cure cancer.
The only way I know of to avoid incurring this risk of extinction (and of mass death) is to shut down all AI research now, which I'm guessing you would consider "overreaching".
It would be great if there were a way to derive the profound benefits of continuing to do AI research without incurring the extinction risk. If you think you have a way to do that, please let me know. If I agree that your approach is promising, I'll drop everything to make sure you get a high-paying job to develop your approach. There are lots of people who would do that (and lots of high-net-worth people and organizations who would pay you the money).
The Machine Intelligence Research Institute, for example, has a lot of money donated by cryptocurrency entrepreneurs, which it has been holding on to year after year because it cannot think of any good ways to spend it to reduce extinction risk. They'd be eager to give money to anyone who can convince them that they have an approach with even a 1% probability of success.
Agreed, and I think this bill probably would help against that, albeit indirectly, by stifling research outside of the big corps. You might be winning me over somewhat: stifling that research does feel like a pretty low price to pay to avert the death/destruction of all of humanity...
I guess I need to decide how high I think that risk actually is, and that's the part I'm less sure of. Appreciate the discussion btw!