20 Comments
K. Liam Smith

You might find Meganets by David Auerbach an interesting read. I believe he holds a position similar to yours: he thinks that social media, and the internet more generally, are going to destabilize governments and institutions, regardless of whether those institutions are effective or not. He has some specific policy prescriptions aimed at regulating social media to make posts less viral and slower to spread.

theahura

thanks, will check it out!

Performative Bafflement

> The more pressing question on my mind is: what now? It's obvious to me that our prior way of being, and possibly the way we were built evolutionarily, wasn't equipped to handle the engagement economy.

Although I too think the attention economy is a problem that should probably be regulated (at the very least for kids; China is already doing this), I don't think this is actually the problem.

The problem is that we're a democracy and we have entered an equilibrium where fully half the country essentially hates and fears the other half.

https://imgur.com/a/Q78MRiL

Worse, this is a stable attractor. Even if all of our institutions were respected again tomorrow, even if the attention economy were entirely eliminated and we only had legacy media, this is the kind of dynamic where people are socially and memetically filtered enough that they'll just bubble and polarize to roughly this degree again.

And that doesn't work in a democracy. If essentially half of voters don't respect and won't listen to the other half, you're kind of boned. If you can never build anything or fix anything because either one half or the other will object, you're boned.

Now what's the answer to THAT?

I don't think there really is an answer that keeps our society, country, and institutions intact and recognizable. Either National Divorce, or Civil War, or letting AIs run us, or something worse than all of those.

Some things, and I think history, culture, and political praxis among them, are acyclic graphs. We can never go back to the 1990's or 2000's or 2010's. We can never go back to the 1950's either. There is no "getting there" from here.

Once a given culture / country / empire's institutions have decayed and become dysfunctional enough, there's no getting back as the same people / culture / country. The only country historically that's done anything close is China, and they sure weren't the same leaders, culture, and country on any resurgence: pretty much every turn of that wheel involved megadeaths and civilizational collapse before rising again. And they're the ONLY example that's even done it!

I mean, I'm personally rooting for AI running us out of those options, because "gradual disempowerment" sure sounds better than immediate megadeaths or civil war.

But you know, maybe a National Divorce could work, too. Fingers crossed.

theahura

> The problem is that we're a democracy and we have entered an equilibrium where fully half the country essentially hates and fears the other half.

This is definitely *A* problem, but I think it too is downstream of social media and culture. Like, people aren't born watching Fox News / MSNBC and knowing that they need to 'hate the libs / magas'. That's something that you learn. Part of why I've been thinking so much about epistemology recently is that it seems like all of our *national* systems of 'how we learn' are way out of whack.

We used to have news sources that we trusted to be impartial, neutral arbiters -- the NYT was the paper of record, Crossfire was an extremely popular TV show, and conservatives and liberals alike would admit the other side had a good point if the neutral arbiters of truth said so. We used to have faith in universities and our schools -- if a PhD said something, people would listen; if a teacher recommended a certain course of action for their student, parents would (depending on the school) try to help too.

We've lost alllll of that.

It seems like now, the only things people are willing to trust are things that already confirm their own biases. The 'national conversation' has become toxic because someone watching Fox and someone watching MSNBC are essentially getting unfiltered propaganda that completely misrepresents what is happening in the world. They would have a hard time agreeing on basic facts, much less complex abstract principles.

Without fixing how we learn, we won't be able to fix whether we 'hate the other side'. You need the former to reach any kind of stable ground on the latter.

> Worse, this is a stable attractor. Even if all of our institutions were respected again tomorrow, even if the attention economy were entirely eliminated and we only had legacy media, this is the kind of dynamic where people are socially and memetically filtered enough that they'll just bubble and polarize to roughly this degree again.

Maybe... I think if our institutions were respected again, it would be hard to polarize to this degree _without_ the institutions becoming disrespected. Put another way, lack of respect for institutions is a necessary precondition for this level of polarization. (Legacy media isn't a good thing because it's legacy; it's a good thing because people used to respect it.) I do think that we would see the same level of polarization because the attention economy would just repolarize everyone into their echo chambers. But if you had mechanisms to _prevent_ those echo chambers -- and misinformation more generally -- I'm less certain that you'd see the polarization.

Performative Bafflement

> Like, people aren't born watching Fox News / MSNBC and knowing that they need to 'hate the libs / magas'.

Yeah, totally - but you're positing people being born into a neutral culture here, and culture isn't neutral, right?

That was my point around "even if we eliminated the bad stuff today, we're stuck in a bad attractor."

> It seems like now, the only things people are willing to trust are things that already confirm their own biases.

Isn't this one of the founding truths of Rationalism overall, and one of the results that has actually replicated multiple times?

It's a base truth about human psychology - I personally don't think it's been amplified all that much by the attention economy, I think it was always there, and it's that ground truth that the algorithms predicting "oh, they'll definitely click / engage on this *next* video" are mining.

Like ultimately, it hasn't CREATED the drive, it's just sped it up, by pulling you into the attractor the drive was pushing you towards faster. The journey that used to take "middle aged dad slowly turning conservative" to "crotchety old man bemoaning societal decay" now just happens in 10 years instead of 40 years.

> But if you had mechanisms to _prevent_ those echo chambers -- and misinformation more generally -- I'm less certain that you'd see the polarization.

But when I think about what this looks like, it's basically legislating something like "you know what your algorithm predicts? Do the opposite of that." Or in other terms "you know what makes you money? Do the opposite of that."

Telling the trillion dollar companies to stop making money seems like a pretty hard sell on many levels. After all, aren't they just giving people what they want, via revealed preference?

> Without fixing how we learn, we won't be able to fix whether we 'hate the other side'. You need the former to reach any kind of stable ground on the latter.

I agree here, I just don't think "fixing how we learn" is an available destination from where we are. Acyclic graphs, etc.

I have an argument that AI personal assistants could help this at the margin, but I do think the impact would only be marginal, especially in any mass unemployment + UBI scenarios.

It's far outside the Overton Window and likely impossible for practical reasons, but I'd like a truly federated sort of deal where people can self-sort into the environments and cultures they most like. Sort of like a National Divorce, but with more than two entities.

Society in general has been moving to larger and larger degrees of self-chosen organization - you used to just be born into your religion, with no other options. Now you choose. People used to live their entire lives <25km from where they were born; now people change cities and states frequently. You're born into a given nation now, but some people choose to live and work and gain citizenship elsewhere.

Similarly, things like Stephensonian / Gibsonian clades and phyles (network states) would be nations you can opt into.

If you let people self-sort themselves by things other than "accident of birth" and wealth and income, which are the options now, good things pretty much have to happen.

And being among "your people" is a force multiplier - people are going to be more productive, generate more insights, start more companies, and what not. This is the advantage of this next level of organization. "Patriotism" is old and stale, and nobody really believes in it any more, because the other people in your nation are too different. But if they weren't, it would be a thing again.

Maybe we'd regain some of the state capacity we've lost. We were able to build the Empire State Building in 410 days. We built the Hoover Dam in 5 years, 2 years ahead of schedule. The Golden Gate Bridge took only 4 years, as did the Manhattan Project.

Now it takes 4 years of appeals and filling out tens of thousands of pages of paperwork to start building a single midrise building.

Total pipe dream, though - never going to happen. The problem of nukes, self defense, and the existing bureaucracy and government and power centers not being willing to give up their power / tax revenues basically forbids it.

theahura

You're totally right that confirmation bias is a built in thing. I think for a period there we had papers of record and respect for institutions, enough that you could break through the confirmation bias. I don't think we can get back to that necessarily, but it's worth thinking of what that might look like in a world that also has Twitter.

Related: I DO think we have to tell these trillion dollar companies to turn off the addiction machines. I give a few proposals in the article about how -- I think an extremely unintrusive one is making personalization default off instead of default on. And while you're right that we will see pushback, it's not the first time we've done something like that as a society -- cf cigarettes.

Performative Bafflement

> I give a few proposals in the article about how -- I think an extremely unintrusive one is making personalization default off instead of default on.

Yeah, I didn't want to crap on your ideas, because I actually agree with the thrust of your argument overall, but when I was reading / thinking about them, I could see several ways to immediately subvert them, above and beyond lobbying to block it in the first place.

For the "default off" personalization, all you do is A/B them on two streams, the default and the optimized one, and then when they've excitedly clicked for the third time in the optimized stream (easily measurable / modelable), you pop up a box saying "these better recommendations are a result of personalization, turn it on?" with a checkbox. Bam, you've got 80%+ back on personalization. All the others seem as trivially subvertible.
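To make the subversion concrete, here's a minimal sketch of that re-opt-in funnel. All names and the three-click threshold are illustrative assumptions, not any real platform's API: the platform shadow-scores an optimized stream alongside the default one and fires the prompt after the Nth click on optimized-stream content.

```python
# Hypothetical sketch of the re-opt-in funnel: serve the non-personalized
# default, track clicks on items the optimized (personalized) stream would
# have surfaced, and prompt the user once they've engaged enough times.

ENGAGED_CLICKS_BEFORE_PROMPT = 3  # assumed threshold, purely illustrative

class ReOptInFunnel:
    def __init__(self):
        self.engaged_clicks = 0
        self.personalization_on = False

    def record_click(self, from_optimized_stream: bool) -> bool:
        """Returns True when the 'turn on personalization?' box should pop up."""
        if self.personalization_on:
            return False  # user already converted; nothing to prompt
        if from_optimized_stream:
            self.engaged_clicks += 1
        return self.engaged_clicks >= ENGAGED_CLICKS_BEFORE_PROMPT

    def accept_prompt(self):
        self.personalization_on = True
```

The point isn't the details; it's that the trigger condition is a few lines of code sitting on top of a model the platform already runs, which is why a bare "default off" mandate is so easy to route around.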

I agree with your point that the government is the only tool we have in terms of addressing these externalities - I disagree that the US gov is *capable* of doing it in a way that works.

The essential problem is you've got literally an army of top-talent lawyers, 10k PhDs, and genius-level executives with trillion-dollar incentives on one end, and on the other end you've got one or two statically written laws that don't change once on the books, along with heavily lobbied legislators who don't actually want them to be all that effective.

That's a vastly unequal battle with vastly unequal optimization forces on each end, and it's basically going to net out to no effect compared to today (except maybe some regulatory barriers to smaller companies trying to get a foothold in the FAANG ecosystem, you'll def lock those guys out).

China can do it because they have one guy in charge, value cultural stuff over economics, and can call their billionaires to heel (like waterboarding Jack Ma for two weeks, or whatever equivalent thing they did to him). In our system, the billionaires call the government to heel, and nobody wants to stop economic growth and progress or sweet sweet tax revenues.

What's the solution? I don't think there's an institutional one - I think empirically we're down to individual choices and culture (like a culture of abstaining yourself and not raising your kids with smart devices).

I mean, does even the EU have a solution to the attention economy? Not really. Only authoritarian states do, and arguably that solution is worse than the problem.

theahura

But how did this work in the past when we regulated the tobacco industry? I think I might just be a bit less pessimistic -- like I think even the pop-up you describe isn't going to result in nearly as many people turning personalization back on (also you could argue that even doing the A/B test the way you describe would itself violate a "default off only" rule).

But ya, I mean, more likely than not the only way out of the hole right now is a lot of pain. We really only get an intensely unified government and populace after major events like the Civil War or the Great Depression. I don't want something like that, but weak men make hard times.

Performative Bafflement

Didn't tobacco work by mandating no advertising and increasingly gross / scary mandatory messaging on the packs, and take 30+ years to really move the needle? And that's after ~20 years of fighting about studies (the industry famously capturing R.A. Fisher for their side).

So as a parallel, you mandate a click-through box saying "social media will eat you and your kid's brains" -- you think anyone is going to read it? They're already spammed with forced EULAs and forced "please eat my brains" buttons every time they interact with websites, apps, or software.

And of course, you couldn't say that, because there are no rigorous studies yet. So let's wait 10 years for rigorous studies, then introduce clickboxes, then wait a few decades... in the meantime, there are no more FAANGs, there's some other memetic superstimulus, or AI ate the light cone, or something else made the whole thing irrelevant.

To your A/B test point, I didn't want to overfit on details, I'm more pointing to the level of optimization on either end. The details don't particularly matter, because the intrinsic smarts + money + motivation on one side so vastly outweighs the other that it's as though you did nothing.

To your last point, yeah - that's where I kind of am, too. I've been living as an expat the last couple of years because of it. Figured I might as well get some tropical beach time and distance myself from that mess, and haven't regretted it once!

theahura

Also, I think there's basically no other institution that CAN do this. Like, governments exist in part to internalize negative externalities. Companies would love to pollute rivers or log forests or whatever, if it makes them more profit. Governments step in and prevent that from happening. There's lobbying, and it's a push and pull, but it does happen, and it basically has to happen through the government.

Christos Raxiotis

Wiki is already politicized, btw. That doesn't mean it doesn't have credible information, but there is a big bias toward pro-establishment, anti-West, pro-left views. It is better to keep that in mind. Most people who edit have similar backgrounds and opinions.

theahura

Wiki is probably the most "correct" source on average, across all sources of news and information. Yes, there are times when individual articles can be skewed, but even those are often fixed quickly. My prior is that any bias presented by wiki -- including the ones you list -- is more akin to the "bias" that vaccines are good. Sorry, not all viewpoints get equal weight just because they exist.

Christos Raxiotis

I disagree, and I read a lot of wiki articles. Both Wikipedia and Scott have biases, but the fact that I trust his blog more to get true information means the highly competent and skillful folks at wiki underperform, or I suck at epistemology. And when I say bias I don't mean "earth isn't flat" bias, or I wouldn't call it that, and it should be obvious.

theahura

I was a bit snarky earlier because I wasn't sure which angle you were coming from -- generally I find people who hate on Wikipedia do so because their own pet theory isn't "well represented". Immune reaction to dumb people. Apologies.

> but the fact I trust his blog more to get true information means the highly competent and skillful folks at wiki underperform

I don't think this is true though! It's true that Scott is a source of high quality information, and that's why I also read his blog, but Scott can't and doesn't write about _everything_. He's a pretty niche expert, he probably wouldn't be the right person to go to for, like, understanding the Wnt pathway's role in oncology or the role of trade in Medieval Mongolia, or whatever. Wikipedia, by contrast, is a great source of information for just about anything! It shouldn't be surprising though that the information is less detailed -- there's an inherent cost to scale AND (attempting to be) objective.

Wikipedia should be used a bit like a cache in a database. You go to your L1 cache and see if there's a hit for the information you're looking for. If not, you go to your L2 cache (the wiki refs) and check there. If not, you go to disk (the papers published by PhDs in Nature or Science or whatever) and check there. You could always go straight to disk, but who has the time?

In that role, I think wiki is great, and whatever biases it has make it more likely to be correct than not given the amount of topics it aims to cover.
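The cache analogy above can be sketched in a few lines. The tier names and their contents are made-up illustrations; the point is just the fall-through in order of access cost:

```python
# Minimal sketch of tiered lookup: check each tier from cheapest to
# costliest and return the first hit, like an L1 -> L2 -> disk cache.

def tiered_lookup(query, tiers):
    """tiers: ordered list of (tier_name, dict) from cheapest to costliest."""
    for name, store in tiers:
        if query in store:
            return name, store[query]  # hit: stop at the cheapest tier that answers
    return None, None  # miss everywhere: time to do original research

# Illustrative tiers -- contents are placeholders, not real lookups.
tiers = [
    ("wikipedia",   {"Wnt pathway": "summary article"}),
    ("wiki refs",   {"Wnt in oncology": "cited review"}),
    ("primary lit", {"obscure result": "journal paper"}),
]
```

Most queries resolve at the first tier; only the rare deep question falls through to the expensive one, which is exactly the trade-off that makes the cheap tier worth its imprecision.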

Christos Raxiotis

Since it seems my comment wasn't fully understood: when I use the word bias, I mean offering information where accuracy takes a secondary role behind promoting specific group interests -- in Wikipedia's case, painting leftist talking points, authoritarian governments, and events carried out by non-Western countries in a positive light, where an objective presenter would do otherwise (I use Scott as an example here since we both know him, for an understandable comparison; I trust he is sincere even when I disagree with his views). I can offer some examples, like the edits to the Nakba page after Oct. 7 (made in good faith, I assume, to not fuel the conflict, but not the business of a "credible data center" anyway), but anything highly specific isn't worth my time.

theahura

> i mean offering information where accuracy takes a secondary role behind promoting specific group interests

In your personal epistemology, how do you distinguish between when wiki is being 'accurate' vs 'promoting a specific group's interests'?

Like, as far as I am aware, wikipedia does not promote vaccine skepticism. Is that because "accuracy has taken a secondary role to promoting big pharma's interests" or because that is just the most accurate representation of the world?

Christos Raxiotis

Obviously it's hard to distinguish. I need to follow a certain source for a long time and compare/contrast it with others whose judgement I value. But in your specific example I wouldn't pick either as an explanation if I suspected big pharma was held in some of Wikipedia's editors' good graces. For example, I believe there is a high chance wiki downplays human rights abuses happening in China or Saudi Arabia (from reading wiki and contrasting it with other sources), and I believe that is intentional. I could be wrong, but I don't hold wiki to a very high standard for topics that are controversial or politicised. For these subjects I trust its information more than a YouTuber who reacts to news, and less than Our World in Data.
