Ep 266: Is Social Media Making Our Teens Angry?

Andy Earle
Hey, it's Andy from Talking to Teens. It would mean the world to us if you could leave us a five-star review. Reviews on Apple and Spotify help other parents find the show, and that helps us keep the lights on. Thanks for being a listener, and here's the show. You're listening to Talking to Teens, where we speak with leading experts from a variety of disciplines about the art and science of parenting teenagers. I'm your host, Andy Earle.

Hey, we're here today with Tobias Rose-Stockwell, talking about social media and specifically how and why it has been designed to produce outrage among users. When you look at the psychology of the human brain and how it interacts with the features of social media, you start to understand on a much deeper level why these platforms are really problematic for adults, and especially for teenagers. When you really do a deep dive into this stuff, you start to develop a much higher level of awareness about what's going on, which can really protect you from a lot of the negative effects of social media. Tobias has been researching this topic for quite a long time now, and he is the author of the new book Outrage Machine: How Tech Amplifies Discontent, Disrupts Democracy, and What We Can Do About It. Tobias is a writer, designer, and media researcher whose work has been featured in major outlets such as The Atlantic, WIRED, NPR, BBC, CNN, and many others. His research has been cited in the adoption of key interventions to reduce toxicity and polarization within leading tech platforms. He previously led humanitarian projects in Southeast Asia focused on civil war reconstruction efforts, for which he was honored with an award from the 14th Dalai Lama. We're here with Tobias today to look deeper under the hood of social media sites and to understand on a much more profound level what's going on when we use social media: why using these platforms inevitably leads to feelings of outrage, why it drives a wedge between people, and why it often leads us to develop really strong beliefs about things we don't actually understand. And as we'll see, these results can be really problematic if left unchecked. I am so excited to have Tobias on the show today to talk about all that and a whole lot more. Tobias, thank you so much for coming on Talking to Teens.

Tobias Rose-Stockwell
Thanks so much for having me.

Andy Earle
I have just finished reading your book, Outrage Machine. Man, you've got a lot of really interesting stuff in here. I think it's going to be super relevant.

Tobias Rose-Stockwell
Awesome, great. I'm excited to dig in. Thank you for reading.

Andy Earle
Oh my gosh, yeah, I couldn't put this down, really. There's so much meat in here. You really got me thinking about a lot of topics. What inspired you to write this book?

Tobias Rose-Stockwell
Yeah, well, I was a little bit early in the world of social media, recognizing that it was this weird thing that was very influential and impactful on us. I had this experience of going viral when I was pretty young, in the days before social media had really taken over our lives. I had been traveling through Asia as a backpacker after college, and I met a Cambodian Buddhist monk who basically invited me out to his village in the middle of nowhere, which was a really fun, interesting opportunity to connect with a different culture and have this meaningful travel experience. Travel is such an important part of a young person's life; it really was for me. I got pulled out to this community in the middle of the rice fields in northwestern Cambodia. When I got there, he put me in this pagoda. I expected to just have a community visit, but there were actually several hundred farmers there, along with the village elders, and one by one they got up and said, thank you for coming, we've been waiting for you, we're so grateful that you've agreed to help us rebuild our reservoir. I was shocked, to say the least. I was confused. I was like, I'm sorry, reservoir? You're talking to me? Is there someone else here? This monk had basically spun up a tall tale to the rest of his community that he had met this guy, and this guy was going to help them rebuild this reservoir, which I certainly was not prepared to do. I was a backpacker with barely any money, no background in hydrology, not a trust fund kid, nothing that would have suggested I could actually do this. But I listened to their project. This monk ran a local NGO that provided services to about 6,000 farmers in that region, and they were collectively trying to rebuild this big reservoir. And I say reservoir, but it was more like a lake, a giant lake that provided a second crop of rice through irrigation to this community. They were looking for about $15,000 to do that. I spent a couple of days with them, just hearing their story, and I'm like, you definitely have the wrong person, but I will at least be an advocate on your behalf.

Andy Earle
And I'll talk to people, you know, no promises. I'll put in a word, see if I can get anybody interested, and be in touch.

Tobias Rose-Stockwell
Yeah, I wrote an email to friends and family back home. This was in 2003, 2004, so it was the days before social media. That email went out to a listserv, a bunch of people, and within that listserv there was one link out to a site a friend had designed. This friend would go on to be one of the first engineers at Twitter, and their site was a social media site, a pre-Twitter, pre-Facebook social media site. The email and the text in it went viral from that point, to friends and friends of friends, and all of a sudden I had this tremendous outpouring of support. People had read this impassioned piece I had written about the monk, and they were like, this is amazing, how can I help? Suddenly I had interest: people interested in doing anything, people interested in connecting me with engineers, this explosion of interest and empathy for this group of villagers in the middle of nowhere. They were looking for $15,000 to do this thing, and I thought, okay, maybe I can spend two months doing this. I didn't have a whole lot going on at that point in time; I was in those reflective years after college, trying to figure out what I really wanted to do with my life. So back in the United States I started doing a little bit of fundraising for this community, went back to work, saved more money, and then came back to Cambodia a couple of months later. What I thought would be $15,000 and maybe two months of my life turned into almost seven years living in Cambodia working on this project. Two months became almost seven years, and $15,000 turned into a quarter million dollars. It was this huge project, this ever-escalating commitment of work and interest and fascination. I ended up basically doing a DIY Peace Corps in Cambodia during my twenties, and that was enabled by social media. So I was really fascinated by how powerful this technology was for connecting people to previously unseen causes, for being a positive influence in people's lives, and for making a difference in the world. That was the origin story of how I got obsessed with social media, maybe a decade before everyone else did, and how I fell into this work.

Andy Earle
It's funny how in the early days of social media it was so positive. You even talk about the origin of the like button, and it was like, yeah, we wanted to make it really frictionless and easy for people to spread positive vibes and say, hey, that's cool, I like that.

So many of these things, I think we genuinely thought they were all good, that we were building a utopia where everyone's going to be connected and we're all going to be sharing things and liking stuff and supporting each other. And then here we are with the fallout of that, looking at the places where, okay, maybe it's not quite as utopian as we initially thought.

Tobias Rose-Stockwell
Right, yeah. Some of us remember what it was like back in those early days of social media, circa 2010, 2011. There was this real euphoria around what social media could be for the world, and we really felt like it was there. The Arab Spring, all these dictators being toppled in the Middle East using Twitter and Facebook. There was this outpouring of sudden new awareness of a lot of problems, harms, and causes that were previously hidden from view and away from mainstream media interest. There was this real moment of excitement and optimism about what social media could do and what it could create in the world. But it turns out that a lot of those same dynamics, the things that allow you to raise awareness for a hidden emotional cause in the middle of nowhere, can also rally people around any hidden emotional cause. They give you the ability to raise huge awareness and interest and activations and mobilizations for just about any cause, for better or worse. We ended up with this new crisis where all the ways these tools made things so good also made them really dangerous for society overall. They've caused us to be more anxious, more emotionally concerned about things that otherwise we maybe wouldn't be, and they've flooded us with a sense of urgency about the problems of the world in a way that hasn't been the case in the past. And the result of that is, I think, a sense of learned helplessness, in which we feel totally overwhelmed by the problems of the world. We've reached well beyond the saturation point at which we can parse the amount of difficulty we're facing as a species. We feel overwhelmed, we feel depressed, we feel anxious. And I think kids are absolutely showing that, in some ways more than adults, because they don't have any models for what it was like before. They're just growing into this world of social media and metrics and evaluation and viral content that is really negative for kids' mental health.

Andy Earle
You have some really interesting research all throughout the book. You break down the mechanics of how these different platforms work and then mix that with the psychology of how the human brain works, to really unpack why things have gone in the direction they have and why a lot of this has become problematic over the years. There's so much we could talk about, and a lot of it stands out to me, but one thing is what you say about emotions, or emotional content. I thought this was fascinating. You have some research in here from Upworthy with these nice quadrants of different emotions, anger, happiness, sadness, relaxation, looking, after a bunch of different tests they ran, at which quadrants are most likely to cause people to share something.

Tobias Rose-Stockwell
There's really interesting research on this. We kind of assume that if you put good stuff online and give people a choice about what they're clicking on and what they're looking at, then you're going to get the best stuff, because people want to feel good, so they'll look at the good stuff. People want to feel optimistic and happy, so they'll read those good stories and click on those things. But it actually turns out that we have a real negativity bias in the way we consume content online. The happy, hunky-dory stories that might fill us with good vibes are not actually what we click on. We tend to click on stuff that is negatively valenced, stuff with difficult emotions in it: stuff that is scary, stuff that is angering, stuff that is outrageous, stuff that is sad. Doomscrolling is a thing for a reason. We really do respond to the horrors of the world in a meaningful way, and we respond to them significantly. That's part of our wiring, I think. For our ancestors, it was more helpful to be attuned to the potential dangers and harms in their community and in their world.

Andy Earle
Everything's fine, people are doing great, people feel good: okay, no big deal. Oh, something scary, someone's mad: yeah, okay, pay attention to that.

Tobias Rose-Stockwell
That's right. There are a couple of different ways in which that works. We're highly attuned to status in our communities, and this is one way it really affects kids in a big way. Within our communities, we really want to know who is in charge, who is on top, who is down. That's the way our brains are naturally wired: we want to maintain good status in our communities and our tribes, so we focus on those signals and cues pretty regularly. That made a lot of sense for our ancestors; it was a really helpful thing. But in modern times we've built a system, social media, that has overlaid itself on our lives, our personal lives, our public lives, our private lives, in this very intrusive way. The result is that we're constantly flooded with these social metrics of acclaim, of disapproval, of judgment, of gossip, of hearsay. It really is a perfect storm for triggering some of our innate instincts to figure out what's wrong, who's doing the bad thing, how we can shame the person doing the bad thing, how we can place people in these hierarchies where we're really conscious of who's up and who's down. That's what social media has done very well: it has built itself on top of some of these very basic human desires, all in the name of engagement. And engagement, as we know, equals ad revenue on these platforms. That's a pretty basic metric: if you're using advertising to run your business, then you want to keep people on there as long as possible. Unfortunately, that's part of the way these tools are designed these days.

Andy Earle
There's some really interesting stuff you look at here, like the study of 100 million different headlines showing the top-performing ones. Some of these I loved: tears of joy, make you cry, shocked to see, give you goosebumps. It's data-driven, really just testing and seeing what works, not really malicious, just, oh hey, this is working, people are clicking more on this. It's the natural reaction that happens. And something that really fascinated me actually comes from Mark Zuckerberg. He was looking at the moderation they do on Facebook, what content is banned versus what content is acceptable, and it forms this really interesting pattern when you look at the engagement.

Tobias Rose-Stockwell
This was an unexpected pattern that Facebook found and publicly talked about in 2018; they had been researching it before that. It's a very strange phenomenon called borderline engagement. There are certain things that are always banned on these platforms: hate speech, violence, pornography, that kind of thing. There's always a content moderation line when it comes to those very problematic pieces of content. Now, if you graph the level of engagement a piece of content will get, average content, the normal feed stuff, the baby pictures from your friends, the food shot on your Instagram, the job update, gets average engagement. And of course, if you post something that's beyond the content moderation line, it gets zero engagement, because it's removed; it drops to zero. But as you approach the line of content moderation, as you approach the stuff that is not quite hate speech but everything up to hate speech, not quite violence but everything up to violence, not quite the most atrocious language but everything up to it, engagement actually spikes up exponentially. Your content is banned if you pass the line, but if you walk right up to it, and I talk about this in the book in the context of line steppers, people who have figured this out to build audiences, you can walk straight up to the line without quite passing over it and get a tremendous amount of natural engagement. That happens for a couple of reasons. Like all those things we just said, we are very attuned to the outrages in our communities, to the gossip and the juicy, salacious, dirty stuff; we're naturally predisposed to look at it. And when we see that stuff online, we feel called to respond. If someone posts an abhorrent comment, a terrible piece of commentary about a group of people you respect or care about, we feel called to comment on it, to fight them in that space: no, you are actually wrong here, you are mistaken, these are the reasons why. Now, engagement algorithms actually look for what is called MSI, which stands for meaningful social interactions. That was the original engagement algorithm that Facebook deployed, and MSI is a very powerful way of rank-ordering content. There's a reason we have engagement algorithms: we have too much information, we produce far too much of it, and it's impossible to look at everything.
You don't want to lose the important posts underneath the fifty food shots your friends made; you want to see the stuff that's most interesting to you. So there's a real useful element to why we have these engagement algorithms. But when you're optimizing for meaningful social interactions, the things the algorithm thinks you're going to respond to, say some of your friends getting into a heated argument, will potentially get more engagement from you, so they're likely to be served to you above other content. That kind of thing sticks to the top of our feeds when we open our apps: the disgusting, angry conversation, the fight about the news item, the moral condemnation of someone for doing something people think was wrong. These moments of moral disgust explode in our social networks, and it's impossible to avoid them, because even if you're not interested in the topic, if enough of your friends are talking about it, you need to be educated on it, you need to have an opinion on it, you need to pick a side in these moral disagreements. That's the core of the issue. Then there's another piece that happens on top of that, which is when old guard or traditional journalism gets involved, or web journalism gets involved. When a savvy web journalist sees a certain number of people who are angry about something, say a really heated discussion on Twitter between five individuals going back and forth, they can look at that item and say, oh, people are angry about this thing; I bet more people will be angry about it if I write a story about it. And they will. They'll take that small altercation between, say, five to ten people online and write a whole story about it, saying people are angry about X, people on Twitter are angry about X. They post that online, and that story blows far beyond those five to ten people who were angry about something. Now it's exposed to thousands of people who are like, wait, what, why are people angry about this ridiculous thing? It only takes one person to be angry about something ridiculous for everyone to take note. And everyone gets paid along the way in this process: the people engaging in the arguments might be getting likes and shares for their posts, the social media company is getting ad revenue for keeping us on the site longer, and the journalist is getting paid for pushing people to their website and getting traffic from it. We've basically built this perfect machine for taking obscure outrages that normally would just be silly altercations in a corner of the web,
and we've served them up for everyone to pick a side on. That has really elevated people's sense of anxiety and frustration and anger and, in some ways, despair when we look at the state of discourse online, because this has actually infected mainstream news in a big way. You'll see a lot of stories in major outlets that are just "people are angry about X" stories. And that's not good, especially, to bring it back to teens here, because I think it's really hard for kids to parse what is worth fighting about, what actually matters, why everyone is so angry all the time. That can be really hard on teens.
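To make the two dynamics just described a bit more concrete, here is a minimal, purely illustrative sketch of an engagement-style ranker. It is not Facebook's actual MSI formula; the field names, weights, and the shape of the "borderline" curve are assumptions invented for the example. It only shows the general idea: heated interaction is weighted heavily, engagement climbs as content approaches the moderation line, and anything past the line scores zero because it is removed.

    # Illustrative only: a toy model of "meaningful social interactions" ranking
    # plus the borderline-engagement effect. All weights and names are made up.
    from dataclasses import dataclass
    import math

    @dataclass
    class Post:
        comments: int            # replies, back-and-forth arguments
        reactions: int           # likes, angry faces, etc.
        reshares: int
        policy_proximity: float  # 0.0 = benign, 1.0 = right at the moderation line
        violates_policy: bool    # past the line -> removed entirely

    def msi_score(post: Post) -> float:
        # Heated threads (lots of comments) count far more than passive reactions.
        return 5.0 * post.comments + 1.0 * post.reactions + 3.0 * post.reshares

    def expected_engagement(post: Post) -> float:
        # Engagement rises sharply as content approaches the line, then drops
        # to zero once it crosses it (removed by moderation).
        if post.violates_policy:
            return 0.0
        return msi_score(post) * math.exp(3.0 * post.policy_proximity)

    feed = {
        "baby photo":      Post(4, 120, 2, 0.0, False),
        "heated argument": Post(90, 40, 15, 0.8, False),
        "over the line":   Post(200, 60, 40, 1.0, True),
    }

    # Rank the feed the way an engagement-driven ranker would.
    for name, post in sorted(feed.items(), key=lambda kv: expected_engagement(kv[1]), reverse=True):
        print(f"{name:16s} -> {expected_engagement(post):9.1f}")

Running this, the "heated argument" post near the line outranks the baby photo by a wide margin, and the post that crosses the line scores zero, which is the pattern the borderline-engagement research describes.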

Andy Earle
We're here today with Tobias Rose-Stockwell, talking about how social media sites have been engineered to produce feelings of outrage, to divide people against each other, and to lead us to form really strong opinions about things that we don't actually understand. And we're not done yet. Here's a look at what's coming up in the second half of the show.

Tobias Rose-Stockwell
I am obsessed with social media just like everyone else, in some ways more so, because I've been writing a book about it for such a long time. I had a couple of experiences that made it into the book that I think are really indicative of this strange, messy way that social media forces us to pick sides on issues. I didn't really have an opinion about pit bulls. I'd heard they're a dangerous breed, but I've personally known a couple of pit bulls that are just really sweet dogs. Then I was scrolling one day and there was a video, front-yard Ring cam footage of a cat sitting on a porch. A woman is walking her two pit bulls down the street; you can see her approaching from a distance, the cat isn't really looking, and the pit bulls instantly pull the woman to the ground chasing after this cat. They reach the cat and go after it. It was an instant viral sensation on Facebook. In that moment I'm just flooded with emotions. I'm like, this is terrible, this is wrong, and I have this instant judgment. So I quickly jumped to the conclusion, as I think most people do when they see that video, that all pit bulls are dangerous. It was only after a really long and kind of painful process that I reevaluated that initial impulse. If you and I are having a conversation privately, there's a circle of trust present in our conversation, and I'm more likely to listen to the core of what you're saying, the central point of your argument. If we're disagreeing in this moment, I can look at you, I can see you're a real human having a real-time conversation with me, I can hear the inflections in your voice. Even though I disagree with you strongly, I'm not going to shout at you. The way we actually find the truth is that other people check our biases. We have this superpower, which is that we can see the flaws in other people's arguments better than we can see the flaws in our own. So take this basic one-to-one conversation that we're having and erect grandstands around it, put people in bleachers on either side of us. We suddenly know that there are hundreds or thousands of people watching us have this conversation, and that's going to change our posturing and how we perform in that conversation. Even worse, if we can get points from players in the audience when we perform especially well in a way that appeals to their biases, then that is a recipe for moral grandstanding and public performance, not for trying to find the truth, not for active listening, not for good debate, not for good sensemaking. The value of living in a democracy comes from the ability to share ideas freely, and the current battleground around this is social media. Presently it's: what can we say online? Who is censoring whom? Who's being shadow banned? Who's being deplatformed? That is the current flashpoint around the entirety of the discourse on social media right now: who is in control of your speech, essentially.
And really, what it comes down to is a philosophical framework that I think is really important: recognizing, first of all, that most of us are actually wrong much of the time, and that it's really important to have other people who can hold opposing ideas and call you out when you're wrong. What social media has done, unfortunately, is make many of these mediated, more approachable pieces of our discourse toxic, and it has made it very, very hard for us to find any common ground on things. It's so important for parents and for teens to understand what these dynamics look like, because we're not getting an accurate view of the world through social media. The information the world is producing is going through a very specific set of steps that is designed to keep us attentive and keep us scrolling above all else.

Andy Earle
Want to hear the full episode? Head over to talkingtoteens.com/register for a free trial of our premium podcast. You get exclusive access to loads of great content with no obligation, and your membership supports the work we do here at Talking to Teens. Get started today with a free trial over at talkingtoteens.com/register. Thanks for listening. We'll see you next week.
