> Social proof used to reflect crowd wisdom. Now it reflects algorithmic endorsement — triggered not by consensus, but by proximity to influence. A single interaction can distort scale, making selected content appear widely supported.
I think this is barking up the right tree with the wrong lesson - these things are the same. Elon Musk, mostly for the worse, is a social influencer. You can tell because a lot of people follow him. I am sure the algorithm is unreasonably kind to him (as he can write it), but it's also true that a lot of people care what he does, and what he does changes what people care about.
The real question here, to me, is: does this kind of mass social calculus make any real sense? Can we actually extend the idea of interest to 219,000,000 people, or does the coordinate system break down at some point? I suspect it doesn't hold up.
I am a long-time believer in the need for good algorithmic filtering. There is more happening in the world than I have attention for, and I want a machine to help me. Most solutions are quite bad because they are focused on how much money they can make instead of how much they can help. But it's a real problem, and the bad, money-grubbing algorithms that surround us now are making our lives much worse.
Ultimately I think this comes back to operationalizing human relationships. What does it mean, mathematically, for Musk to have that many followers? This is distasteful but real, I fear, in the age we live in. Social influence is clearly real, we are measuring it in flawed ways, and we should try to improve those flawed measurements.
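To make "flawed measurements" concrete, here is a minimal sketch with entirely made-up numbers and field names (nothing any platform actually publishes): two accounts with the same headline follower count can look identical under the naive metric and wildly different once you weight by how much of the audience is actually paying attention and how much the ranking algorithm boosts the account.

```python
from dataclasses import dataclass

@dataclass
class Account:
    followers: int          # raw follower count (the number everyone quotes)
    active_share: float     # fraction of followers who actually see posts (assumed)
    engagement_rate: float  # fraction of viewers who interact (assumed)
    amplification: float    # algorithmic boost applied to this account, >= 1.0 (assumed)

def naive_influence(a: Account) -> float:
    """The flawed measurement: influence == follower count."""
    return float(a.followers)

def weighted_influence(a: Account) -> float:
    """A slightly less flawed measurement: estimated interactions per post,
    including whatever boost the ranking algorithm gives the account."""
    return a.followers * a.active_share * a.engagement_rate * a.amplification

# Same follower count, very different realities (all numbers hypothetical).
boosted = Account(followers=219_000_000, active_share=0.05,
                  engagement_rate=0.01, amplification=3.0)
organic = Account(followers=219_000_000, active_share=0.20,
                  engagement_rate=0.05, amplification=1.0)

print(naive_influence(boosted), naive_influence(organic))       # identical
print(weighted_influence(boosted), weighted_influence(organic)) # far apart
```

The point isn't that this particular formula is right; it's that follower count collapses several very different quantities into one number, and any serious measure of influence has to pull them back apart.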
American politics, due to the two-party system, is fundamentally dishonest. Issues are packaged across parties, and you have to buy everything the party is selling. For example, there are probably lots of Republicans who would not mind decently run, government-subsidized healthcare, and lots of Democrats who think the government should respect their right to be armed. But because the parties don't really support these positions, there is significant pressure on people outside the party buckets to twist their public political talk. Fundamentally this makes political talk and political social media activity dishonest as well. When owners of social networks become political figures, it basically turns all the coefficients in this equation into exponents.
Elon Musk had 85 million followers in 2022[1] before he acquired Twitter. He obviously has some organic engagement / grassroots appeal, separate from whatever benefits he derives from owning the platform.