Director Jeff Orlowski ("Chasing Ice," "Chasing Coral") spent the better part of his early film career focused on the fossil fuel industry: amazed by the vast power its executives wielded, horrified by the havoc it was wreaking on the planet, stunned by how it had altered the course of human history.
Then came a new fixation for Orlowski — an entity that was somehow more lucrative, just as powerful and every bit as woven into the fabric of modern-day society: Silicon Valley.
And so the tech industry came to be the focus of "The Social Dilemma," Orlowski's latest documentary, which aired on Netflix earlier this month. The film features key tech innovators, pioneering programmers and former executives from just about every one of Silicon Valley's major players you could name: Facebook. Twitter. YouTube. Google. Instagram. Orlowski's list of sources is long and impressive.
So, too, is the testimony that they provide: "The Social Dilemma" showcases a wealth of insider insight into how these technologies were designed and calibrated to manipulate human psychology. Orlowski's documentary leaves no stone unturned as it dives deep into tech's ugly downside: social media addiction, the spread of misinformation, its negative impact on our mental health (and the mental health of our youth), as well as, according to Orlowski, its potential role in the downfall of democratic institutions.
That's no small thing to say, especially as a high-stakes election looms and a pandemic continues to ravage the globe. And yet social media platforms — even in this critical moment — continue to grow, Frankenstein-like, into powerful bad actors, Orlowski argues: Misinformation and conspiracy theories are spreading through society like wildfire, with social media platforms providing the fodder and fuel.
But Orlowski insists he's no pessimist: In fact, he describes himself as a technological optimist. He still believes society can reshape technology and social media to serve us humanely — even for our collective betterment.
We caught up with Orlowski to talk about how we proceed from here, whether Silicon Valley should be allowed to police itself — and about what the future of humane technology might really look like.
You're a Stanford alum — I'm seeing you were there from 2002 to 2007. Tell me about that time — did you engage at all with the 2000s iteration of Silicon Valley? What do you remember (or not remember) about it?
I mean, I was going down the path of working in technology myself. A number of close friends (from Stanford) and I were building a web design company together. As we were graduating, my friends were building different app companies. My friend circle was very heavily involved in technology, which is why in 2017 when I started hearing from Tristan (Harris, a former Google executive) and others about this, those are the friends I went to immediately to ask what was going on.
Ironically, the film only really exists because many of those people and I went to Stanford together. I had access to important people in the industry, and that was the only reason I was able to pursue it like this. There are ... I forget the actual count, but there are at least six people in the movie or involved in the movie who were Stanford alumni, and probably more. But yeah, in many ways, this is tech that was born out of our school. It's a very ironic full circle.
Tell me about finding the folks you chose to interview. High caliber executives willing to speak on the culture seems exceptional to me, because we often deal with this insider culture around the tech bubble. Was it the Stanford connection that opened the door?
Yeah, it was. Tristan I knew from Stanford, and Jeff Seibert, too. He and I were close friends (while at Stanford), and he's the executive who ran product at Twitter. When I started learning about all of this from Tristan, I went to Jeff and said: What do you think? Give me a fact check on him. Is there legitimacy here? Is this accurate?
I remember Jeff's response — he was skeptical at first, but the more he thought about it, the more he realized he agreed with Tristan's perspective, and then he revealed more and more. That was just a really interesting process for me, and then it was through them that we kept asking — who else do you know? Who will speak on the record?
We did anonymous interviews and background interviews that we weren't able to use but that helped inform our thinking and opened the door for more access to different executives and insiders. Some were really difficult to get. Some were difficult to schedule. With every person we interviewed, we could then go to the next one and say — we've spoken to all these people, would you consider it?
Bailey Richardson (an early employee at Instagram) was featured in a piece ... I forget which publication, but an article came out about her deleting Instagram, so we hunted her down. We were looking for any possible lead we could find.
Tell me about your own perception of social media as you were making the documentary. Did it change? Were there surprising themes or revelations that surfaced?
Oh, man. I was a huge social media addict. I loved Facebook. I used it all the time. It was the making of the film that made me look at it in a different way. It completely transformed my relationship with social media. It was understanding what the experts were saying, the engineers were saying, what the critics had to say — that was the way to understand what their business model was and what these companies are actually selling. The experience really, really transformed my perspective on technology.
Can you speak on the idea of implementing positive change to social media? Does Silicon Valley have a role in making those changes?
Yeah. Let me kick off that answer with an analogy, because I've been thinking through the fossil fuel industry for quite a while, and my background is in climate change films. When you look at fossil fuels, we discovered oil and suddenly had access to this incredible resource that allowed us to travel farther and fly, and only years later did we recognize that there were consequences. The fossil fuel industry now faces a choice: transform itself into a more sustainable energy industry, or dig in its heels to maintain its current business model.
That's the exact story I see with technology. It started so innocently. Twitter started off as an art project, fueled by the desire to connect people and share their stories. We now see the business model that got entrenched in these platforms has become so powerful — it's worth more than the fossil fuel industry, it's the richest industry in the history of money. It's so incredibly powerful that it's really, really difficult to change. Despite good intentions from the people inside these companies, the changes I see happening are Band-Aid solutions. They're not addressing the fundamental problem.
I see the business model as the problem. The business model is misaligned with society, just like burning fossil fuels is misaligned with society. I am hopeful we do make the changes, but I'm skeptical that it's going to happen from within the companies, because it requires the same ground-up rewrite that's needed in the energy industry. My hope is that there are enough employees who say no, I don't want to be a part of this — a part of the breakdown of democracy, a part of increasing harm around mental health.
There need to be enough engineers who say, this is code, we can reprogram it. Let's do things differently, let's use a different business model, let's do things with society's interest at heart. That's where my hope exists, but really, in many ways, I hope the film is a rallying cry for the industry to have this wake-up moment. That we could see leadership within the companies and start seeing changes from within, or that we see enough pressure — whether public or political — to prompt change. That's where my curiosity lies: How can we change these systems to work in society's interest?
Right. There's that interesting moment in the film where we see (Mark) Zuckerberg testifying that he believes the solution to combating the spread of misinformation is more AI — but then your film makes the point that AI can't know what truth is; it just knows how to generate more clicks.
There is a saying — a problem won't be solved by the same thinking that created it. And I do believe this is where greater diversity of perspective, greater awareness of how society and humanity function, would inform better technology. There's this conversation around humane technology — and if you're going to design technology to be humane, it has to be designed around human needs and vulnerabilities. Like the film says, these platforms weren't designed around child psychology, or around what improves the emotional growth of my child.
That's what we're really seeing — these companies have grown so much bigger, and gained so much more influence over society, than I think they were prepared for. We need them to grow up really fast. It's no excuse to say, we're in our teen years. They have to work faster.
This has become our public square, yet there's no regulatory involvement — no regulation around what the public square should look like. Private corporations now control the life experience, information and news of 3 billion people.
The film touches upon this — but can you speak at all to why this is particularly important in this moment, where we're facing not only an election but continuing to deal with the pandemic?
We wanted it to be a conversation for the public. Particularly before the election. One of our subjects, Tristan, says that he hopes the film can be a shared truth about the breakdown of shared truth. I've been referencing this other line from Roger McNamee, and I'm paraphrasing, but he basically says — Russia didn't hack Facebook. They just used it. And that's what we're seeing now, literally Russia and other foreign actors just using it. You don't need ads; the organic content is more effective. Political ads don't matter in the scheme of things, because it's the platform itself that creates these inherent problems.
There's another analogy I've been drawing upon lately: American companies have built a weapon of mass destruction on American soil, deployed on that same soil, to be used by foreign actors for practically no money. From a security perspective, that's frightening.
Anything else that comes to mind about the film or about technology itself?
I would add that I am very much a technology optimist. I believe in the power of technology. There's a line that got shortened in the film — it's from Jaron Lanier — but he basically says that people would ask him, why are you a pessimist? And he would say — I'm not, I'm an optimist. I believe it can be better. It's those who are complacent, those comfortable with the status quo, who are the pessimists.
That resonated with me, because I believe in the power of tech to serve humanity. I believe it can be a bicycle for the mind, that it can increase our capacity, skills and scale. But I think we've entered this generation where the technology is designed for someone else — it's designed for another master. And we are now the unfortunate victims of these platforms.
I was going to talk about surveillance capitalism — the idea that if you're not paying for the product, you are the product. We are the raw resource that feeds the machine. Our life experiences ... everything we do is being collected and codified, and we're being extracted for this multi-hundred-billion-dollar industry.
And so I'm hoping the film can help be a wake-up call, and help us realign technology to serve humanity.
Go to TheSocialDilemma.com for resources on the topic of tech as well as ways to participate in efforts towards humane technology.
Sarah Klearman writes for TheSixFifty.com, Palo Alto Online's sister website that covers the best of what to eat, see and do in Silicon Valley. Find more of her work at email@example.com.