> Yeah. People overestimate the flashy threats from AI, but to me the more significant threat is killing the open exchange of knowledge, and more generally the open, trusting society, by flooding it with agents that are happy to press "defect" on the prisoner's dilemma.
I don't think societies are open/trusting by default -- it takes work and a lot of counterintuitive thinking, sustained over long periods of time.
> "High trust society". Something that took the West a very long time to construct through social practices, was hugely beneficial for economic growth, but is vulnerable to defectors. Think of it like a rainforest: a resource which can be burned down to increase quarterly profit.
I think the trust is downstream of safety (and importantly "economic safety", if we can call it that). Everyone trusts more when they're not feeling threatened. People "defect" from cultures that don't work for them -- they leave a culture they like and move to another one, usually because of some manifestation of danger.