

The Social Dilemma

Film writer Vickie Curtis ’07 explains how the technology that connects us might destroy us.

By AMY MARTIN (STORY EDITOR)

The tech industry insiders who invented infinite scrolling, the Facebook “Like” button and so many of the other elements that help make social media so addictive didn’t set out to radically alter the fabric of society. But they did.

Now, Vickie Curtis ’07 is helping to warn the world. She is one of three writers of The Social Dilemma, a wildly popular Netflix docudrama that explores the dangerous human impact of social networking, as told by the very people who created the platforms. 

“What they were realizing at the time we were filming wasn’t that there were quirks to the thing they made that needed fixing; it was that they had helped create these three billion–tentacled monsters that are actually shifting the course of human history,” she said.  

Curtis joined Austin Jenkins ’95, an Olympia, Washington–based political reporter with the Northwest News Network, for a virtual conversation in November about the process of crafting the film, which premiered at the 2020 Sundance Film Festival and was released on Netflix in September. More than 400 alumni, parents, faculty, staff and students took part in the live event, which was hosted by Conn’s Office of Alumni and Parent Engagement.

CC Magazine has edited the conversation for clarity and length.

Austin Jenkins: Can you tell us a little bit about the origin of this film?

Vickie Curtis: In January 2018, the director, Jeff Orlowski, organized a group of folks to meet with [former Google employee] Tristan Harris, who features as sort of a protagonist in the film. A lot of Jeff’s friends were working for these companies in Silicon Valley, and more and more people were coming to him having left Twitter, Google and Facebook, saying, “I have regrets,” or “Things have taken a turn for the worse and I want out.” The more stories he heard, the more he wondered, “Could this be a film?”

At first the thought was, “Is this a big enough issue? Does this just mean everyone’s addicted to their phone, which we all already know?” There’s no big reveal there. But the more experts we talked to, the more we realized it is so much bigger than [someone] being advertised to, or being addicted to the phone or looking at Facebook too much. It is really an existential threat that is tearing apart the fabric of society.

AJ: When I watched the documentary, I immediately thought of the 1999 film The Insider, which was about a former tobacco industry scientist who was ultimately convinced by 60 Minutes to tell the story of what was really going on inside Big Tobacco. I’m curious whether you think there is a comparison to be made between Big Tobacco of yesterday and Big Tech of today. 

VC: There are ways in which I would say yes. It’s an industry where the incentives behind the product are not aligned with the interests of the user of that product. For Big Tobacco, they’re making cigarettes that we now know—and they did know—cause cancer. I haven’t met any cigarette smokers who smoke cigarettes to get cancer. It’s not an outcome they’re looking for. With tech, there’s a similar thing, where the product is these platforms that are addictive and that are misinforming us, making us more narcissistic, more anxious, more depressed, and that are tearing apart some of our institutions. For that reason, I would say that it’s probably more dangerous than tobacco, because it isn’t just having effects on individuals, it’s having effects on larger institutions as well.

AJ: There are all these light bulb or “aha” moments when you’re watching the film, and one is that social media, and tech in general, is the only industry outside of the drug industry that talks about its customers as “users.” I was also struck by the line, “If you’re not paying for the product, you are the product.”

VC: A great question for us at the beginning of the filmmaking process was, “If these companies are worth hundreds of billions of dollars, why? We’re not paying them, so who is paying them?” Advertisers. Advertisers are paying them to target their advertisements to people who will be the most susceptible to that advertisement.

To figure out who’s most susceptible, Facebook and Google create an avatar of you, which is a collection of up to 29,000 data points about what kind of person you are. They’re monitoring things like how fast you scroll, or how you move your mouse, or how long it takes you to absorb an article, or which things you’ve clicked on in the past or what time of day you click on certain kinds of information. Google Maps and Google phones are sending back to Google headquarters all of your real-life habits of where you physically go.

They have a ton of data on who you are as a person, and then they can group you and say, “Okay, there are 29 other people just like Austin in his neighborhood, and they all are doing this, so why don’t we advertise that to Austin, too? We know he’s likely to be susceptible to that thing.”
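To make the profiling Curtis describes concrete, here is a minimal sketch in Python of how behavioral signals might be rolled into a profile vector and matched against similar users, a crude “lookalike” grouping. The feature names, numbers and similarity measure are invented for illustration; this is not how Facebook or Google actually implement it.

```python
import numpy as np

# Hypothetical "avatar": a vector of behavioral signals per user, e.g.
# [scroll_speed, avg_dwell_seconds, night_activity, past_ad_clicks].
# Real platforms reportedly track thousands of such data points.

def most_similar(target: np.ndarray, others: dict, k: int = 1) -> list:
    """Rank other users by cosine similarity to the target profile,
    approximating a 'people just like Austin' grouping."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    ranked = sorted(others.items(), key=lambda kv: cosine(target, kv[1]),
                    reverse=True)
    return ranked[:k]

austin = np.array([0.8, 12.0, 0.3, 2.0])
neighbors = {
    "user_a": np.array([0.7, 11.0, 0.4, 2.0]),  # similar habits
    "user_b": np.array([0.1, 40.0, 0.9, 0.0]),  # very different habits
}
print(most_similar(austin, neighbors))  # user_a: advertise to Austin, too
```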

And on one level, people are like, “Oh, well, this just means the advertisements I see are relevant to me, so, great, this is a pair of shoes I want to buy.” But the algorithm isn’t perfect, nor is it trying to figure out what shoes you want to buy. It’s trying to figure out how to keep you on the platform longer, what gets you hooked. It preys on our fears, doubts and insecurities, so it’s going to show us more outrageous, salacious, fearmongering information in order to keep us there so that we will see more ads, click on more ads. That makes data this really powerful tool for shifting and manipulating people’s beliefs and behaviors.
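The incentive she describes, optimizing for attention rather than relevance, can be sketched as a toy ranking objective. Every field and weight below is invented for illustration and stands in for learned engagement signals, not any platform’s real model:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    relevance: float        # how useful or accurate the post is for the user
    predicted_dwell: float  # model's guess at seconds of attention it earns
    outrage_score: float    # fear/outrage content tends to hold eyes longer

def rank_feed(posts: list) -> list:
    # The objective is expected time-on-platform, not usefulness; the
    # weight on outrage (invented here) rewards salacious content.
    return sorted(posts,
                  key=lambda p: p.predicted_dwell + 30.0 * p.outrage_score,
                  reverse=True)

feed = [
    Post("Nuanced policy explainer", relevance=0.9, predicted_dwell=20.0, outrage_score=0.1),
    Post("Outrageous rumor",         relevance=0.1, predicted_dwell=45.0, outrage_score=0.9),
]
print([p.title for p in rank_feed(feed)])  # the rumor takes the top slot
```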

AJ: I read during the Cambridge Analytica scandal that they could get enough data points to eventually know more about you than you know about yourself.

VC: Absolutely, because a lot of it is subconscious. Like, I don’t know how fast I scroll; I don’t know what my mouse-click patterns are; I don’t know what my personality profile is based on those habits of mine. They have a whole understanding of your personality profile based on the information that they’ve collected on you, and that determines which particular conspiracy theory to show you next.


AJ: There’s an interesting creative decision made in the film. [Instead of] a traditional documentary, you actually bring in the art of fiction. You have a fictionalized family represented, and you also have individuals who, in essence, act as humanized versions of the algorithms that these companies are using.

VC: For a long time, we wrestled with how to make the film more cinematic, because it has a lot of talking heads. Jeff’s original idea was that we could personify the algorithms. So [we wondered], “Can we portray what’s happening on the other side of your screen?” There’s no person making this decision, and the people running the company don’t even know what the algorithm is doing, what decisions it’s making or why.

AJ: What are these psychological techniques they’re using to keep us on these platforms?

VC: One answer is that we don’t know, because [these tech companies are] very secretive about exactly how they go about doing some of this stuff. And like I said, they don’t even understand exactly how their algorithms are making choices. But there are other things that are absolutely known. For instance, the notifications popping up, the little red dot on your phone or the email messages that sound urgent, they are bait to make you come back. There are all of these intermittent rewards—if you pull down at the top of your feed to refresh, there’s a little skinny wait moment, and that’s actually the moment where you get a spike of dopamine, because your brain is excited about the potential of what might be there. Whether or not the thing that shows up at the top of your feed when you refresh is actually interesting to you doesn’t really affect whether or not you had that habit-building, addiction-building dopamine hit.
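The pull-to-refresh mechanic Curtis describes follows a variable-ratio reward schedule, the same intermittent-payoff pattern as a slot machine. A minimal sketch, with an invented payoff probability:

```python
import random

def pull_to_refresh(p_reward: float = 0.3) -> str:
    """Each refresh pays off unpredictably; the anticipation during the
    brief wait is where the habit-forming dopamine spike lands, whether
    or not the payoff turns out to be interesting."""
    return "exciting new post!" if random.random() < p_reward else "nothing new"

random.seed(0)
print([pull_to_refresh() for _ in range(8)])
# A run of unpredictable hits and misses: exactly the intermittent
# schedule that builds compulsive checking.
```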

AJ: There’s another line from the film that really put things in perspective for me: “We’ve gone from the information age to the disinformation age.”

VC: I think we’re living inside of a good example of the ways in which the algorithms create confusion in society. The algorithm is radically indifferent; it does not know that it’s interacting with people, it does not know that people have lives outside of looking at the content that it’s feeding us. It’s just trying to put out what you are most likely to stare at, and unfortunately, because of the way our brains are wired, we are hyperaware of things that seem to be outlandish, things that seem outrageous, things that make us feel outrage or make us feel fear or anxiety.

The algorithm has learned that those sorts of outrageous stories get more eyeballs and get your eyeballs to stay longer, and are more likely to be something you’ll share. So just purely because it’s trying to get our attention—not because it wants anything bad for humans—it is actually having these deleterious results where we have misinformation spreading six times faster than the truth, because the truth is often much more boring and nuanced.

AJ: How have you yourself changed your own behavior, your own interaction with these platforms?

VC: I used to go on Facebook all the time, I used to go on Instagram all the time. And I just started to feel so disgusted by the platforms. I felt manipulated by them. Even if I could say, “Oh, I know better than to click on an ad or be susceptible,” well first of all, I don’t. I’m just as clueless as to how the magician does his trick as every other person is. And it’s not just that; it’s that these algorithms are causing enormous social unrest; they’re causing genocides in places like Myanmar, where fake news has turned into actual violence. And I don’t want to be feeding that creature.

AJ: At the end, the film talks about starting a conversation. What comes next? Is the solution to remove the advertising revenue stream from these businesses? I know the film contemplates whether they could be taxed for the data that they possess, because that’s an asset and they’re mining data like we mine ore or gold or anything else extractive. Or do they need to be regulated?

VC: The film winds down to the fact that surveillance capitalism is out of alignment with the future of humanity, and that we are now the resource being mined. We see through climate change and the destruction of habitats that there are actual consequences to mining the earth in this exploitative, extractive model, and now we are the thing being exploited; we are the thing being extracted from—our behavior, our eyes, our attention, our beliefs—they’re extracting these from us in order to make that profit.

Now that we see that misalignment, there are three things that The Social Dilemma impact team is focusing on in order to make change. One is changing the way we use tech, giving people better resources to understand the deleterious aspects of their own relationship to technology and some tips for how they can change their own personal behavior.

How we design tech is the second branch. A lot of people who work at Facebook and Google are aware that there are problems with the way the organizations work, and they are aware that the ad business model is out of alignment with what’s best for humanity. So, can they start building safeguards into the way that tech is actually designed?

And the third way is how tech is regulated. These companies are so big and powerful at this point, they are the biggest and most valuable companies that have ever existed in human history. We do have a democracy in which we could potentially enact robust regulation of those companies in order to bring them back into alignment with humanity and our goals as a species. We’ve been talking to a few congresspeople who are aware of potential trust-busting that needs to happen, or privacy issues or censorship issues, but are not thinking yet about how the entire business model needs to shift in order to sort of realign us.

AJ: On the one hand, the film introduces the idea that these platforms can lead to things like genocide and destabilizing democracy. That is a very bleak prognosis. On the other hand, maybe we’ll figure it out, maybe we can save the next generation. Where do you land?

VC: Well, I will echo Tristan—he said the first time he felt optimistic about this was when he saw the reception that the film received. That 38 million households streamed it during its first month on Netflix means that people are ready to think about this.


