Social Media & Partisanship: Polarizing Algorithms and a Blind Date with Dividends, featuring Dr. Robert Elliott Smith

Show Notes, Sources and Transcript

August 6, 2020

How do social media algorithms feed us news, entertainment, advertisements, even suggesting friends and lovers? By grossly simplifying human nature, according to featured guest Dr. Robert Elliott Smith, Senior Research Fellow in Computer Science at University College London and author of Rage Inside the Machine: How to Stop the Internet from Making Bigots of Us All.

Dr. Smith details how the same Artificial Intelligence (AI) that reaps engineering benefits has disastrous consequences for civil society. Raised in Birmingham, Alabama in the 1960s, he knows something about polarized situations. That upbringing, and his 30-year career in AI, inform Dr. Smith’s warnings about polarizing algorithms and his advice on how to mitigate that damage.

Guest 

Dr. Robert Elliott Smith, Senior Research Fellow, Dept. of Computer Science, University College London

Guest Book

Dr. Robert Elliott Smith, Rage Inside the Machine: How to Stop the Internet from Making Bigots of Us All (Bloomsbury, 2019)

Original music composed by Ryan Adair Rooney.

Electronic Sources 

Monica Anderson (7/22/20). “Most Americans say social media companies have too much power, influence in politics.” Pew Research Center. 

Emily Dreyfuss (7/24/19). “Netflix’s The Great Hack Brings Our Data Nightmare to Life.” Wired.

Robert S. Engelmore & Edward Feigenbaum. “Expert Systems and Artificial Intelligence.” World Technology Evaluation Center, Loyola University Maryland. 

“Facebook CEO Testimony Before House Financial Services Committee” (10/23/19). C-SPAN.

Maddalena Favaretto et al. (2019). “Big Data and discrimination: perils, promises and solutions. A systematic review.” Journal of Big Data 6:12.

Jeffrey Gottfried & Elisa Shearer (5/26/16). “News Use Across Social Media Platforms 2016.” Pew Research Center.

Great Lakes Broadcasting Co. v. Fed. Radio Comm’n, 37 F.2d 993 (D.C. Cir. 1930)

Michael Hameleers et al. (2017). “The Appeal of Media Populism: The Media Preferences of Citizens with Populist Attitudes.” Mass Communication and Society 20:4.

Noor Ismawati Jaafar et al. (2014). “Face-to-Face or Not-to-Face: A Technology Preference for Communication.” Cyberpsychology, Behavior and Social Networking 17:11.

Kate Kenski et al. (2017). “Broadcasting versus Narrowcasting: Do Mass Media Exist in the Twenty-First Century?” In The Oxford Handbook of Political Communication. Oxford University Press.

Young Mie Kim (2017). “Algorithmic Opportunity: Digital Advertising and Inequality in Political Involvement.” The Forum 14:4. 

Thorin Klosowski (5/28/20). “Big Companies Harvest Our Data. This Is Who They Think I Am.” The New York Times.

Andrew Marantz (10/31/19). “Facebook and the ‘Free Speech’ Excuse.” The New Yorker.

Louise Matsakis (2/15/19). “The WIRED Guide to Your Personal Data (And Who Is Using It).” Wired.

Audrey Perry (2017). “Fairness Doctrine.” First Amendment Encyclopedia.

Lee Rainie & Janna Anderson (2/8/17). “Code-Dependent: Pros and Cons of the Algorithm Age.” Pew Research Center. 

Alex Rochefort (2020). “Regulating Social Media Platforms: A Comparative Policy Analysis.” Communication Law and Policy 25:2, 225-260.

Matthew Rosenberg & Gabriel J.X. Dance (4/8/18). “‘You Are the Product’: Targeted by Cambridge Analytica on Facebook.” The New York Times.

Betsy Anne Williams et al. (2018). “How Algorithms Discriminate Based on Data They Lack: Challenges, Solutions, and Policy Implications.” Journal of Information Policy 8: 78-115.

Transcript

Robert Elliott Smith (00:02):

By the way, can I curse in this interview? 

So my dad, when I first got a job, I asked his advice about how to handle having jobs. And he said, the first thing is “don’t send me memos.” 

Robert Pease (narrator):

You’re listening to The Purple Principle. And today’s featured guest is Robert Elliott Smith, Professor of Computer Science at University College London.

Robert Elliott Smith:

And this is back before social media and back before email, even. The thing is, the further you get from face-to-face communication with another person, the more dangerous the communication becomes.1 And we all know this. It’s so easy in a memo for someone to misunderstand your meaning. When you get down to a tweet, it’s even worse. Even in a phone conversation, when you have something important to say to somebody, you’re going to talk to them face-to-face.

And the reason is, there’s a lot more to communication than simply symbolic communication through the written word, or through the abbreviated written word in Twitter. And that’s because human communication is extremely complex, as is all human interaction. 

Robert Pease (narrator):

This is Robert Pease, host of The Purple Principle, a podcast about the perils of partisanship. Today’s featured guest is Dr. Robert Elliott Smith, an expert on the polarizing effects of algorithms. He’s published a rich, unusual, and important book on this topic, entitled Rage Inside the Machine: How to Stop the Internet from Making Bigots of Us All. I’m here with reporter Emily Crocetti. Emily, welcome. Interesting guest today!

Emily Crocetti (reporter) (01:40):

Great to be here. And yes, very interesting guest.

Robert Pease (narrator) (01:43):

It seems from reading the book, Rage Inside the Machine, that Dr. Smith’s quite the Renaissance computer scientist, if that makes sense.

Emily Crocetti (reporter) (01:49):

Seems to be. He’s also an amateur actor and musician, and obviously a writer too, in addition to his work with artificial intelligence. Plus, his book makes an homage to Salvador Dalí, the surrealist painter.

Robert Pease (narrator) (02:03):

Not your everyday computer scientist.

Emily Crocetti (reporter) (02:05):

Plus he hails from Alabama, but has lived in London for the past 23 years. So that’s a different and interesting perspective.

Robert Pease (narrator) (02:13):

And he does have some insights into social media and partisanship.

Emily Crocetti (reporter) (02:16):

Definitely. And not just on the dangers of algorithms, but also ideas on how to avoid those polarizing traps.

Robert Pease (narrator) (02:25):

So remind me where we’re starting here. Is it right into algorithms?

Emily Crocetti (reporter) (02:28):

Not exactly. We’re actually starting off with a story about a blind date in Birmingham, Alabama 30 years ago, but it is surprisingly relevant.

Robert Pease (narrator) (02:39):

Well, let’s hope so. Here’s Part One of the interview Emily and I conducted with Dr. Robert Elliott Smith, author of Rage Inside the Machine and an expert on polarizing topics such as artificial intelligence and blind dates. 

Robert Elliott Smith:

So back in 1987, when I was a PhD student, one of my mentors set me up on a blind date, which was a strange thing to happen at graduate school. But he had a friend whose daughter was coming from New York to Birmingham, Alabama. And I wanted to make an impression on this woman, so I took her to a bar in Birmingham called Burly Earl’s. And there was some alternative bluegrass on that night. We went and had a conversation, and there were a couple of people there who were, you know, serious locals and her accent and my appearance probably didn’t settle them that much.

Emily Crocetti (reporter) (03:58):

When you say accent, what kind of accent did she have?

Robert Elliott Smith (04:01):

She had a New York accent.

Emily Crocetti (reporter) (04:03):

So, a New Yorker in an Alabama bar.

Robert Elliott Smith (04:06):

Exactly. So, she asked me what I did at graduate school. And I told her I was working on artificial intelligence. At that time, that was a term that not many people knew much about. And so I had to describe what I was doing. And the people across from me were really listening in and particularly this one gentleman was listening in. And when I got to the end of the description, he looked at me and said “Heil Hitler, pal!” Well that was really weird and a bit threatening. And we left the bar – I don’t think she was having a very good time anyway – but we went out to the car and she kind of called the evening off and said, “you know, I think you’re a bit morally confused about what you’re doing.” She didn’t agree with the guy in the bar, but she thought this stuff was all very questionable. And I kind of filed that away until I wrote the book. 

Robert Pease (interviewer):

That’s interesting, after all that time. So when we asked that question, we thought you were a PhD student who fully believed in the morality of artificial intelligence. Did we get that wrong? 

Robert Elliott Smith:

I think at the time, I believed, like most people probably believe, that technology doesn’t have morals; only its users do. And retrospectively, I think with regard to artificial intelligence, that’s slightly irresponsible. I think that in reality, AI is an extension of the quantification, simplification, and generalization that the quantitative social sciences have applied to people throughout the history of science. And quantifying people has never been very far from intolerance and bigotry.2 In many ways, that’s why I make the statement now that I believe algorithms are prejudiced.

And that’s not to say that I don’t think AI is useful. I think it is. I think AI is powerful and can do good things for people. However, I think it has to be used with appropriate caution. And I think currently we’re seeing some very incautious use of AI and that’s why I wrote the book.

Emily Crocetti (interviewer)  (06:11):

So you’re a computer scientist and an expert in artificial intelligence. So then why write a book? It’s kind of old school! But seriously, why did you choose to write a printed book to tell your story?

Robert Elliott Smith (06:27):

I love books and I love language. There is a chapter in the book about language and about how computers perceive and “understand” language versus the very subtle nature of meaning for human beings, and how very different those two things are. Have you ever read something in a foreign language on Twitter or on Facebook and hit the translate button? What you’ll find out is that those translations are terrible. And the reason is because speech is highly idiomatic. 

Robert Pease (interviewer):

And we love the premise of your book because it’s common for people to think of algorithms as being really complex, but your premise is that they’re actually too simplistic. 

Robert Elliott Smith:

Yeah, it’s interesting. So yes, AI is complex in its scale, in its speed, and in its incomprehensibility, but at its root, it’s highly simplifying. And that’s the reason it can yield behavior that simplifies people and is therefore really quite unsavory and in some cases dangerous.

That’s how you solve problems like recommending what you might buy on Amazon or who you might want to date on Tinder or eHarmony. Ultimately they’re about reducing people to a tractable number of variables and then doing computations around those. And of course the reduction of people to a small number of variables is at the heart of prejudice. To prejudice means to prejudge; to prejudge means to simplify and generalize, because that’s how you prejudge something. So in that sense, algorithms are prejudiced. They are making simplifying and generalizing steps. But when you point that simplification process at people, you begin to do things that place people into very simple categories.3 For instance, racial categories, religion-based categories, gender-based categories, etc.

Robert Pease (interviewer):

And political categories? 

Robert Elliott Smith:

And political categories, indeed. The reality is that people aren’t as simple as Democrats and Republicans. People’s opinions aren’t that simple, but the situation right now is that we have this algorithmically mediated media that’s trying to place us into categories largely for purposes of advertising.4 That of course feeds us our news, that aggravates our emotions. So effectively, it’s the worst kind of narrowcasting. The internet isn’t broadcasting; it’s narrowcasting. Then people can come along and exploit those effects, as we saw in the Cambridge Analytica scandal.5 

Robert Pease (interviewer):

And for our listeners, we should explain that scandal involved Cambridge Analytica, a British firm, obtaining the private Facebook data of some 87 million users and deploying it on behalf of the Trump campaign in its surprise 2016 victory over Clinton. So speaking of Facebook, it was interesting, the point you made that algorithms feed the wisdom of the crowd back to the crowd and don’t weave expertise into the dialogue. Can you talk a little bit about that feedback?

Robert Elliott Smith:

Back in the late 1960s through the 1970s and into the 80s a bit, there was something called expert systems, which is basically rule-based AI, where they tried to take the knowledge of experts and put it into a set of rules and then have programs that acted like experts by executing these rules.6

It largely didn’t work. Because it turns out that extracting the information from the heads of experts is both really hard and extremely expensive. The new AI, the stuff that people call neural networks and deep learning – largely those techniques are about statistics and gathering big data to drive statistics – is largely free.7 Some of us even pay for our data to be harvested.8 That’s a much more economically feasible form of AI, but it has its own set of problems. One of the reasons that I think populism and AI and AI-mediated media fit together so well is because it basically says “don’t trust the experts, trust the average common man.” I mean, look, I’m all for democracy, I’m all for people’s votes to be counted, but you know, one of the ways that we function as a society is by trusting people who have the time to do or know something we don’t have the time to do. And if we don’t have that kind of trust in other people, then we weaken our human abilities and that’s the situation I think we’re in now.

Emily Crocetti (interviewer) (11:28):

I find it interesting that not only are the categories becoming more simplified through these different algorithms on different websites, but also it seems like there’s a higher propensity for the users to cling to them as their identities. So in effect, their identities and worldviews themselves become more simplified.

Robert Elliott Smith (11:51):

Identity politics follows from populism, follows from simplification. If all the media you’re getting simplifies things, then unfortunately we have a tendency to follow along and simplify ourselves.9 And in fact, the form of online media encourages us to behave more like algorithms. We “like” things or we share things – very binary operations. Oftentimes people share things on headlines. They just see the headlines. They say, “I hate Trump” or “I hate Bernie Sanders”. So the way that we’re shaping our interactions is very simplified in many ways. We’re behaving like algorithms and actually increasing the effects.

Emily Crocetti (interviewer) (12:41):

Was there any kind of turning point that changed your mind about AI?

Robert Elliott Smith  (12:47):

Certainly the outcome of recent elections affected my point of view because of the influence of algorithmically mediated media. During the 2016 election, I believe that half of Americans got their news entirely from Facebook, and Facebook feeds are arranged algorithmically.10 There is no editor per se. There is only the curation of algorithms. And I think that we saw some very drastic shifts of people’s behavior because of that. 

Robert Pease (narrator): 

A lot of interesting stuff there about algorithms.

Emily Crocetti (reporter) (13:26):

And how they simplify us into categories.

Robert Pease (narrator) (13:28):

And rely on statistics, but not expertise. 

Emily Crocetti (reporter):

And we drew in blind dates. 

Robert Pease (narrator):

Or at least that one. 

Emily Crocetti (reporter):

But he does have some ideas on how to get out of this polarized mess. And so here’s a bit of that.

Robert Elliott Smith (13:39):

And so I would very much advocate that the big social media providers, for instance, should start investigating the idea of giving us stuff that isn’t so personalized, something that runs as a public service, to basically say, “I’m going to show you news that you don’t usually want.” 

Robert Pease (narrator):

Interesting, kind of like the Fairness Doctrine, but resurrected for the digital age? We should explain here that the Fairness Doctrine required broadcasters licensed by the FCC to cover controversial issues of public importance and to present contrasting viewpoints on them.11

Emily Crocetti (reporter) (14:12):

Until 1987, when the broadcast industry and the Reagan administration teamed up to remove that requirement, citing First Amendment rights.

Robert Pease (narrator) (14:20):

And expecting, or maybe hoping, greater competition in the digital age would maintain broadcast standards. Oh, well, we’ll be discussing this more with Dr. Smith and in other episodes. So let’s move on then to Part Two of our interview with Dr. Robert Elliott Smith, based on the question, how could we get less partisan, both on and offline?

Robert Elliott Smith (14:45):

And one of the things I say to people when they ask, “how can we make this better?” And I basically say, “make your interactions more human.” When you post something, try to comment on it in as detailed a way as you can; don’t post just on headlines. It’s very important to click through and understand the articles you’re posting, because oftentimes the headlines can be highly misleading. And try to know the people whose content you’re sharing. I think that knowing the name of newspaper writers is something we’ve all forgotten how to do. Columnists used to be people we knew, but now, nobody even remembers to look at their name.

Emily Crocetti (reporter)  (15:27):

In your book, you say that we have abandoned unanswered questions about ourselves and our society in favor of simplified computational models. What kinds of questions are you referring to here?

Robert Elliott Smith (15:43):

I think that the message of the 21st century is that the world is a highly complex place, and that effectively, there is no simplified strategy that will work to govern human society in an ongoing fashion. We have to hybridize. No good engineer says the solution to all problems is to use hydraulics. And oftentimes in aircraft engineering, you’ll have redundancies that are intentionally different technologies. You’ll have a mechanical, electrical, and a hydraulic system backing one another up, particularly because if one of them fails due to something you overlooked, it won’t fail in the other one because it’s completely different. And a similar philosophy probably holds for politics. For the governance of people we should hybridize strategies for robustness. 

Robert Pease (interviewer):

Great. So back to your book again, for a moment, in Rage Inside the Machine you call for changes in laws, practices, and belief systems to try and get people out of echo chambers and really mixing.

But that’s a pretty tall order. Could you give us some concrete actions that businesses or governments could take to start that process? 

Robert Elliott Smith:

I think that, in the first instance, online media needs to be regulated in the following fashion. We need to realize that Facebook, Twitter, and to some extent Google, are media companies. They’re companies that provide information for a living. That’s what they are. And media companies need media regulation. We need to return to the idea that regulation can be appropriate and good.12 So the first thing the government could do is start towards that. The fact that the content is coming from lots of different people doesn’t matter; what matters is the fact that it is being sent to lots of people. And those people are absorbing what is effectively broadcast content.


Emily Crocetti (interviewer)  (17:46):

ABC can’t broadcast overt hate speech on their nightly newscasts.

Robert Elliott Smith  (17:51):

Exactly, exactly. Facebook really does broadcast hate speech. There’s no doubt about it. They do. I don’t think any of the major media providers are actually deeply evil. I don’t think that’s true. I do think that the goals we’ve programmed these corporations with, through programming and AI, may not be compatible with having an effective society.

Robert Pease (interviewer):

Right, which is truly scary. But then again, if you look at issues like prejudice and polarization, how much of that is the effect of a flawed paradigm and how much is bad actors manipulating that flawed paradigm? 

Robert Elliott Smith:

It’s really hard to say. Here’s the thing I say about Cambridge Analytica. If you watch The Great Hack13 – and everyone should, everyone should watch The Great Hack on Netflix – what made Cambridge Analytica possible is the change in the way we talk to one another. It’s the fact that we do talk to one another largely through algorithmically-mediated media that has these very particular effects. So it is a complicated set of feedback loops between people who are exploiting the fact that we now talk to one another largely by sitting at home, looking at a screen, and saying “like” and share. But once people see that that polarization exists, they can start manipulating it. So it’s this complex series of feedback loops.

Robert Pease (narrator):

That’s our featured guest today, Dr. Robert Elliott Smith, author of Rage Inside the Machine: How to Stop the Internet From Making Bigots of Us All. He’s speaking here on the need to recognize human complexity, plus how regulation of the big social media platforms might help reduce polarization.

Emily Crocetti (reporter) (19:50):

Which kind of reminds me of Abigail Marsh’s observation in Episode Four on the fact that human beings are animals, and the importance of our senses for real communication.

Robert Pease (narrator) (20:01):

Yep, that was a great insight. Let’s play that bit for those who’ve not heard the episode with Dr. Abigail Marsh, psychologist and neuroscientist at Georgetown University.

Abigail Marsh (previously recorded) (20:10):

It’s interesting seeing the disagreements among psychologists about how disruptive the switch to heavily technologically mediated communication is going to be. We’re animals! The way that the people around us smell and sound and feel, those are all things that moderate our brain activity at a really primitive level. I do think we’re losing that and it kinda bums me out.

Robert Pease (narrator) (20:41):

So, turns out we’re a little more complex than memes or emoticons.

Emily Crocetti (reporter) (20:44):

But if we’re not aware of that, we can really get manipulated. Like what happened during the Brexit vote in the UK and the 2016 election here in the U.S.

Robert Pease (narrator) (20:53):

Let’s hope we’re a little better prepared this time, but what about our final question? Can independent-minded Americans help bridge the political and social divide? Was Dr. Smith able to address this?

Emily Crocetti (reporter) (21:04):

Indirectly, yes, but you have to remember that he’s been living in the UK for 20-plus years. He did say some interesting things about the strength of diversity and how independents can bring diversity to the polarized divide in the U.S. Here’s what I mean:

Robert Elliott Smith (21:21):

One of the things I talk about in my book is that diversity is the fuel of innovation and the fuel of robustness. Effectively, when we become isolated in communities where we’re all believing the same narrative, we are inflexible, and that inflexibility is dangerous. 

Robert Pease (narrator):

Sounds promising. Let’s hear our final bit of discussion with Dr. Robert Elliott Smith on the importance of diversity.

Robert Elliott Smith: 

We need to ensure that people hear diverse voices. And I think if we ensured that, the tenor of debate would improve. Because now, and I saw this in Alabama when I was growing up, when you divide people, they’ll only interact by screaming at one another in the streets. And I’ve got to say that the anti-segregation busing was really a teachable moment in my life. Like I said, I was scared of those black kids. But when I came into real contact with them, they had an effect on my life that was very positive, and that’s not what I expected. Then think of the Fairness Doctrine in broadcast media, which mandated that broadcasters were responsible for providing a diversity of opinions on every licensed broadcast back in the days of old-fashioned broadcasts. That was taken away in the 1980s, and I defy anyone to say that the outcome has been positive. There’s a difference between diversity and mixing. Sure, we have the voice of MSNBC, and sure, we have the voice of Fox News, which are entirely different from one another, but they don’t mix their audiences. So effectively we have polarization. There’s a difference between polarization and diversity, and that’s really key. We have lots of different kinds of opinions, but we have walls in between them, and those are impenetrable walls.

Emily Crocetti (interviewer) (23:31):

So, I wanted to ask your advice on something. Obviously this is an election year, and a lot of independents and unaffiliated voters are still undecided. How do you think people should be searching for non-polarized information to help them navigate the pressing issues?

Robert Elliott Smith:  (23:49):

I guess turn that around and say, how can they be propagating information better? Which way will allow everyone to see it better? I think the most important things you can do are: don’t just “like” and share on headlines. Read the actual article and try to reflect more and add as sophisticated a human comment as possible. Try to know the authors of the content that you share so that you form a human relationship with them. Rehumanize as much as possible your interactions. And this is a controversial one: try to unblock people. I know it’s really hard, but I find that there are people who I’ve blocked because these people are dangerously offensive, they’re saying really ugly things, they’re trolling me. Those people have to be blocked. Okay, I understand that. But then there are people I blocked in the past who I blocked because they just said something that basically didn’t fit my worldview that well. And I decided I don’t want to see any of that content. Try to ease off that a bit because our studies have definitely shown that opening up the connectivity effectively allows the information not just to flow to you but to flow beyond you. And then you’re opening up the conversation in a realistic way. I would advise that people be more human and try to open up your channels of communication to other people, because you’re a part of the way the network is structured. And if you change your network structure, you’re changing it for many, many people, not just for yourself.

Robert Pease (narrator) (25:35):

And that was our featured guest today, Dr. Robert Elliott Smith.

Emily Crocetti (reporter) (25:39):

Hold on here, we’re not done yet! There is more to the interview. 

Robert Pease (narrator):

But we answered the three main questions. 

Emily Crocetti (reporter):

There’s more about the blind date, though not much more, which, spoiler alert, is kind of the point.

Emily Crocetti (interviewer) (25:59):

So I have to ask, did you ever hear from the woman from that blind date again?

Robert Elliott Smith (26:04):

Oh no, no! I never heard from that woman ever, ever again. That was a complete dead end. She actually said she would contact me when she got more settled in, and then nothing.

Emily Crocetti (interviewer) (26:19):

I wasn’t sure if she ended up reading your book or anything.

Robert Elliott Smith (26:22):

I have no idea. And I hate to say it, but I don’t even remember her name. Which is a shame, but it was one of those dates that lasted probably 45 minutes.

Emily Crocetti (interviewer) (26:35):

In a way, I guess, the legacy of it lasted decades.

Robert Elliott Smith (26:39):

Indeed, indeed. And it did really make me think at the time. It’s interesting. In the past, I would have argued very strongly for the position that technology has no morals, that only its users do. But now I feel like we have to consider them a lot more carefully.

Robert Pease (narrator) (26:58):

And that really is the end of today’s episode with special guest Dr. Robert Elliott Smith, Professor of Computer Science at University College London, and author of Rage Inside the Machine: How to Stop the Internet from Making Bigots of Us All.14 It is not, however, the end of our discussions with Dr. Smith. Based in London these days, he was born and raised in Birmingham, Alabama during the height of racial tensions in the 1960s. In an upcoming episode, we’ll talk to him about those experiences and how he relates them to polarization both on and offline in society today. Here’s a bit of that upcoming episode with Dr. Robert Elliott Smith.

Robert Elliott Smith (27:37):

I was going to school during the anti-segregation busing in Birmingham, Alabama, which was at one time the most racist city in the world. At my elementary school we had black kids shipped in from Airport Heights near the Birmingham airport. And there were these kids who, because I grew up in a very racist background, I was really quite frightened of, and I was a bullied kid. Surprisingly to me, when I was being bullied by other white kids, a fellow student of mine, one of the black students, said to me something that was really one of the most influential things that ever happened in my life. She said, “you go around looking at your feet. If you don’t hold your head up, you’ll be beat down your whole life.” And I’ll never forget it. I mean, those are exactly the words.

Robert Pease (narrator) (28:34):

In our next episode, we’ll turn from polarizing algorithms to polarized politics and speak with former three-term Congressman Jason Altmire about the demise of the political center among both U.S. voters and their representatives in Congress.

Jason Altmire (28:49):

So you are seeing great disgust in the country with the polarization that we see all around us. Some people have chosen to disengage from the political process and just not vote and not participate. That is clearly not the right answer, but the other problem is that people have left their parties. They’ve left the Democratic and Republican party and they’ve become independents. And now they’ve disenfranchised themselves. In many states, they can’t participate in primary elections. What you find is if the only people who are voting in the primary are of one party, that represents the most extreme partisans that that party has to offer.

Robert Pease (narrator) (29:43):

Stay tuned for this and other episodes as we take a 360 degree tour of partisanship. Please check out our social media and website, purpleprinciple.com, for more info and connectivity, hopefully of the non-polarizing kind. This is Robert Pease for The Purple Principle team; Janice Murphy, senior editor; Emily Crocetti, staff reporter; Kevin A. Kline, audio engineer; Emily Holloway, research and fact checking. Our original music playing right now is by Ryan Adair Rooney.
