Conversations Live
Media Literacy
Season 13 Episode 6 | 56m 46sVideo has Closed Captions
Penn State experts Prof. Matt Jordan and Prof. Kelley Cotter discuss media literacy.
Host Bill Hallman is joined by Penn State experts Prof. Matt Jordan and Prof. Kelley Cotter as they discuss what constitutes quality journalism, choosing legitimate news sites, the role of social media on news, the potential impacts of AI on news and media, and other related topics.
Conversations Live is a local public television program presented by WPSU
Support for Conversations Live comes from the Gertrude J. St Endowment.
The James H. Olave Family Endowment and the Sidney and Helen Se Freedman Endowment.
And from viewers like you, thank you.
From the Dr. Keiko Miwa Ross WPSU Production Studio.
This is Conversations Live.
Hello and welcome to Conversations Live.
I'm Bill Hallman.
We're coming to you live from the Dr. Keiko Miwa Ross WPSU Production Studio.
These days it seems that more and more people are forming an unfavorable opinion of news media.
Public trust in media has been declining over the last 20 years and more and more people are simply avoiding news altogether.
Questions arise such as What should we consider to be good journalism, reliable news sources, and accurate reporting?
Moreover, the role of social media and now artificial intelligence on news and journalism continues to be a major focal point, especially as it concerns maintaining a healthy democracy.
Tonight, we have two experts who will bring some insight on these issues as it relates to media literacy.
Let's meet our guests.
Kelley Cotter is an assistant professor in the Penn State College of Information Sciences and Technology.
Her research explores how data centric technologies shape social culture and political life and vice versa.
Her most recent work focuses on how people learn about and make sense of algorithms, and how such insight may be mobilized in efforts to govern platforms.
Matt Jordan is an associate professor in Penn State's Donald P. Bellisario College of Communications.
He is a critical media scholar who works on the role of media in everyday culture.
He is executive producer of the Penn State Humanities Institute's Emmy nominated documentary series Human Focus, which is broadcast right here on WPSU and on the Web.
Along with being the film production and media studies department head in the Bellisario College, he is also leading Penn State's news literacy initiative, which includes hosting a podcast called News Over Noise.
And whether you're watching us on TV or streaming us online or listening on the radio.
We want to hear from you.
Call us with your questions at 814-865-2124.
Or you can email us at connect at WPSU dot org.
So, Matt, Kelley, thank you for joining us today.
I'd like to begin by setting a baseline for our audience and for our listeners.
What is media literacy and what should our viewers and listeners be thinking about?
How should they be thinking about that term?
Over the course of our conversation for the next hour. Kelley, I'll start with you.
So I think our main interest in media literacy these days is people's ability to critically evaluate media messages to ensure that they are able to tell the difference between accurate and inaccurate information, false information and true information to assess the authority of the people creating the messages, but also to be able to interpret claims that are made.
Assess who's making the message and what interest they may bring to it, as well as on the other side of things, maybe creating media as well.
So having the skills to be able to produce your own messages in different formats.
When we think about literacy as a concept, and think about media as a concept: media is the mediation between the world and the person, and the forms that media uses determine what kind of knowledge we get of it, what we can think, and how we can think with it.
And so giving people an understanding of what those forms are, what the artifices are, what the incentives are, why they're getting the picture of reality that they're getting, also gives people a handle on it.
It gives people more ability to not be overwhelmed by media content and to understand that it's just an artifice.
I'd like to begin by talking about journalism.
I worked as a journalist for more than a decade.
I have friends that are still working journalists in the field.
I know how hard they work to keep their viewers informed of what's going on in their communities.
Yet despite the strict standards that journalists today have to adhere to before they publish, print, or televise any story, public trust in journalism has been declining for decades.
What are some of the drivers of that distrust, especially here in America, Matt?
Well, the drivers of it are largely what you might call the disintegration of the mass media.
We used to have three television networks that all had a curated nightly news.
People read the newspaper in the morning, and starting in the late 1980s, you see cable news come along and just more and more and more available things to look at.
And so those things started to move from standards that were supporting a general public interest, that were kind of ecumenical, no matter where your partisan leaning was, it was going to be pitched to you, toward things that were more niche, things pitched to a certain audience and their preexisting dispositions.
Right?
So as it does that, one of the things that all of those different niche things do is they usually talk about the competitor and how bad they are, how only they tell the truth, right?
So that's become kind of normalized.
And with it, the public has started to take that on.
Right?
That notion that nobody can be trusted.
Right.
So it started as the new people on the block complaining about the mainstream media.
And it's kind of continued on as a discourse.
And so the public perception of the news kind of follows along.
Yeah, I think the only thing I'd add to that is that there are now so many diverse sources of information coming from non-journalists that look like journalists, that there may be misconceptions about what journalism is as well.
And what about the role of opinion versus fact based news?
I think you hear a lot of criticisms of certain newspapers, for instance, whether it be The New York Times, Wall Street Journal, because an editorial section is leaning one way or the other or a certain cable news program has opinion programs in the evening hours, and that shapes how people perceive those institutions.
Is there a problem that you see in people differentiating opinion content versus hard-news content?
Yeah, I think there certainly is.
And you know, most people, when they watch TV news, it's: is there a guy at a desk with some graphics over here?
And when they see that, they say this is news.
Right.
And so a lot of television programs that are punditry, right, are going to exploit that so that people will take what they say as being journalism.
They even sometimes call themselves news when really they are something more like infotainment.
So they are consciously blurring the lines between the two.
And what we know about media is that that actually is what sells, right? We're in an attention-based economy, with more and more competition for viewership, more and more noise in the channel.
And so in order to get people's attention, to be able to sell advertising and whatnot, you have to, you know, say outrageous things.
And so there are a lot of incentives to use hyperbole, to editorialize, to use the kind of moral language that you don't normally see in traditional journalism, and for good reason.
But the incentives in the system are going to give you more and more kind of punditry and opinion.
Sure.
We want to bring up a Facebook comment.
We did offer viewers a chance to weigh in on the topic of media literacy on the WPSU Facebook page.
I will read the first one here from Scott, who wrote in: The mainstream media is no longer trustworthy.
It is biased, presents false information, repeats talking points, omits facts and stories that don't support the narrative.
It has become a mouthpiece for political ideology.
This makes it propaganda.
Yes, fake news.
So we'll take that comment first.
This to me is a critique that I hear frequently.
Is it something that, you know, sounds familiar to you?
Absolutely.
Yes.
I mean, I think this goes with the declining trust in institutions generally, and particularly with news.
I think we have seen, you know, the rise of diverse perspectives and news that is not just meant to follow this traditional, well, maybe traditional in the 20th century, view of journalism as trying to be objective and following a code of ethics and certain values that were adhered to in order to present a sort of clear-eyed picture of things that are happening.
But increasingly, as Matt was talking about, there are more news outlets, news organizations, journalists whose job is to interpret the news as well.
There's a lot of things going on in the world that are complex and require analysis for people to sort of wrap their head around things.
So just giving people the facts isn't always helpful to them.
So we can have some news that tries to interpret what's happening and give people a little bit more context or scaffolding of the information that's happening.
But sometimes that means carrying also an ideological point of view.
So we have lots of partisan news sources these days, especially on social media, and independent journalists who maybe aren't part of news organizations and are also kind of freer to put their own spin on news.
But one thing that I'd love to hear from Matt, too, is this increasing awareness that it is very difficult to maintain a sort of neutral position on the news and the way that you present it, that there are ways we can approximate neutrality, but we can't always perfectly achieve it, and we see that in science as well.
So science is a way of knowing. It has a series of steps that we're supposed to follow that allow us to agree upon a way of evaluating or assessing reality, of understanding reality, that's rooted in some shared practices and criteria for the credibility of information.
And that's sort of the same in journalism, but it's never perfect.
Yeah, I mean, this comment is not a new one, right?
You start to see the emergence of the word fake news at the end of the 19th century, really, when the robber barons, the oligarchs, owned all the media mouthpieces.
And so the progressive press, the muckrakers, as we call them, started saying they weren't fake news, everything else was fake.
And, you know, soon after they started doing that, politicians started to realize: hey, that's something we can use to deflect criticism from the press.
We can call what they say about us fake news.
So it's been something that's been percolating out there in the system as a way for people to manage their trust.
When people say they don't trust things, mostly what they're saying is: they're not saying things that I already agree with.
And what we know about how niche media works is that if people aren't hearing what they already agree with, they go find it somewhere else.
Right?
So the Fox News viewer is going to go to Newsmax or One America News if they're not hearing what they want to hear.
And in fact, Fox News does minute-by-minute ratings, so they know exactly what their audience wants to hear.
And so the audience has been trained to bail as soon as they're not hearing something that confirms their preexisting biases.
So when you hear something like that, that the news is untrustworthy, it usually means: I don't trust what they're saying.
But, you know, the same kind of standards of journalism, though, they're practiced by fewer and fewer, as Kelley said, because a lot of people on YouTube, in order to get engagement, are going to, you know, be partisan in their leanings.
But that neutrality that we sometimes go for also seems to be something that's turning viewers away.
Because some of the ways that the news is presented in kind of mainstream media is a kind of feigned objectivity, where you say: here's an issue, here's what the guy over here says, point counterpoint, and you figure it out.
Right.
Instead of kind of: here's the truth, here's what I think.
And so instead of people saying, here's what I believe to be true, and telling you how they arrived at that, oftentimes they just say: here's the guy on the left, here's the guy on the right.
You figure it out.
And that does frustrate a lot of people.
Yeah, I remember, you know, about five years ago now, there was a debate in a lot of newsrooms over what you could call a lie publicly.
So, I mean, that was an interesting debate to listen to.
And what words did you come up with in terms of what you had to call it? A fabricated truth? An untruth?
Something like that.
So we'll go back to another comment with another perspective from Della that says, during any serious situation, especially political, we flip back and forth between channels that are influenced by the left and the right and figure the truth is somewhere in the less sensationalized middle ground.
And that's kind of what we're talking about here.
What are your thoughts on Della's comment?
We'll start with Matt this time.
I mean, what I always tell my classes, and I think if you think about this, is that if you ask anybody about any question in the world, the world is complicated, right?
There are never just two sides on every issue.
There are 20 sides on every issue.
But one of the things that the media does, in terms of the artifice, is it sets the world up in two sides, and then there's something in the middle, right?
It's more complicated than that.
You know, I would like to see people who are experts in the field, somebody who is a political figure in the field, somebody who's an average person in the field, and see them deliberate on it and come to some kind of consensus. I think that would make for good TV.
But we tend to fall into these ways of presenting the world as left, right, and center.
Yeah, and I like this. I'm interested in Della calling out the sensational nature of news, too, because I think that's a good intuition: if what you're hearing feels extreme, surprising, out of place with what you feel to be true, it may be a good indication that you need to look beyond the source you're considering at that moment.
Let's talk about some of the economics that might be involved in this.
So according to the Pew Research Center, the number of U.S. adults that say they closely follow the news has dropped about 13% in seven years.
We can assume that this leads to a less informed society, but there's a pocketbook issue here, too: fewer people are watching the news.
There's less money to print local newspapers.
There's less money to fund local newsrooms, hire journalists, especially investigative journalists.
How are the economics of this contributing to the decline in media literacy?
In a big way. So, you know, America is unique in that we have a very kind of free-market approach to the production of what I consider to be public-interest information.
Right.
And a lot of other countries that have much stronger democracies have a much more robust public media system that isn't prey to the economics of this.
But what we know about the economics of this is that your average newspaper used to have car ads, classified ads, all these different streams of revenue, when it was the only game in town.
Stuff moved online and the cost per ad decreased.
That kind of creates a snowball effect where the economics go down.
And then what has happened, especially in the news sector, is that once those assets become distressed, once news organizations become distressed companies, then vulture capitalists sweep in and start conglomerating.
So we've had deregulation, or reregulation toward pro-corporate policies, that has allowed enormous concentration of ownership.
And a lot of the time what you see is them coming in and buying up distressed assets, stripping them of their reporters, selling off the building, etc. And it's not that it's not profitable.
It's just it's much more profitable to sell them off and liquidate the asset.
Right.
So some of the profits for, say, Alden Capital, which is a hedge fund that owns an enormous number of newspapers across the country, or Fortress, another hedge fund that owns a lot, run 30% a year.
So the companies are doing great.
The journalists not so much.
Right.
So the economics are complicated.
And this also creates a space: when we see the decline of local news, it creates a space for new players like Facebook and Twitter, well, X now, and YouTube and all of these new platforms to kind of step in and fill a need when we don't have robust sources of local information.
So some of my research looks at what happens when those platforms sort of step in and how it changes the character of civic information that circulates within communities.
So important information like, you know, professional opportunities, things happening in government, the local economy, and commerce.
And Kelley's got a great point there, because in terms of trust, which is something we talk about a lot in relation to news and information, people trust local news because it's about the people they know, right?
They live in that community. They know the police chief. The partisan stuff doesn't matter as much.
But when local news degrades, what comes in tends to be national news, stuff with a lot of partisan cues telling people how they should feel about this or that based on their partisan affiliation.
So that's another thing that leads to distrust: the stuff starts to be detached from the worlds that people live in.
And that's a good segue.
We'll get into technology here in just a minute.
But if you're just joining us, I'm Bill Hallman, and this is Conversations Live: Media Literacy on WPSU.
Joining us tonight are Penn State professors Kelley Cotter and Matt Jordan.
Our toll-free number is 814-865-2124.
And we're ready to take your calls.
You can also send us questions by email at connect at WPSU dot org.
So let's shift into the technology side of the conversation.
And technological advances have been changing the way we consume media and have been for many, many years.
Matt, I'd like to get your perspective on some of the historical changes, first of how technology has changed, how we consume media, and then we'll talk about some more modern day examples where it's moving forward into today.
Right.
Well, I mean, one thing you could say is that every time there's a big shift in the way information is distributed, you're going to see changes in the way things are done, right?
So even with the printing press: the printing press is always associated with, you know, distributing rationality, mathematical tables, geographical data, maps, etc.
But it also is the time where you get the rise of conspiracy theories because now all of a sudden people can multiply those and hand them out.
So similarly, when you see the introduction of the electronic wire, the telegraph wire, you start to see these places where all of a sudden misinformation can spread very, very quickly right?
And so that creates certain incentives to create, you know, fake stories that track really well.
You know, even in the 1830s, right after you get the telegraph, and then moving forward, you see these series of stories which kind of capture the public's imagination, but they're completely bogus.
Right?
So in that sense, every time you see an advance in technology, there are going to be these places where there's a problem for society and how we determine reality.
And I think, as Kelley can speak to, that's really been the case now that we have this new digital economy kind of moving.
Yeah.
Yeah.
I think one of the first examples I can think of, and I'm old enough to remember, is the email forward.
So that was one of the first things I can remember where I read those and thought, This is weird.
And it was really easy to just silo those in the early days of email and say, I'm not going to read this stuff. What is this?
It's a lot more difficult now, when the technology, the feed of your Facebook page or your X account or whatever it is you're going to for your information, is getting those things blasted directly into your feed.
Yeah.
Yeah.
So, I mean, we might want to talk about one of the key actors in this equation, which is algorithms. My area.
So most major platforms, all major platforms, let's say, like Facebook and YouTube and Instagram and TikTok, all use these automated processes to try to serve people content that is going to be personally relevant and interesting to them.
And then increasingly also to do some of the evaluative work to assess whether the information is credible, accurate, abides by company or platform policies.
So these automated processes will take all the content that you could possibly see from the accounts that you're following or the people that you're connected to, and then decide which pieces to prioritize for you.
And it's all meant to capture what you're most interested in, but that is captured through a proxy, which is your activity.
So the things that you watch, the things that you click, that you like, that you comment on, that you share.
All of those things are meant to be indicators that the content you're interacting with is interesting to you, but that also has the impact of incentivizing packaging information in ways that are going to generate engagement, which doesn't always lead to hard-hitting news, which may not be as interesting as the Kate Middleton, you know, Photoshop scandal.
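The ranking process described here, scoring each candidate post by engagement signals that stand in as a proxy for interest, then showing the highest scorers first, can be sketched in a few lines of Python. To be clear, the `Post` structure, the signal names, and the weights below are illustrative assumptions for this sketch; no platform publishes its actual formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int = 0
    likes: int = 0
    comments: int = 0
    shares: int = 0

# Illustrative weights: higher-effort actions (sharing, commenting)
# are assumed to count more than passive ones (clicking).
WEIGHTS = {"clicks": 1.0, "likes": 2.0, "comments": 3.0, "shares": 4.0}

def engagement_score(post: Post) -> float:
    # Engagement activity is the proxy for "interest" -- the feed
    # never measures interest directly.
    return (WEIGHTS["clicks"] * post.clicks
            + WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # The feed surfaces the highest-scoring posts first, which is why
    # content packaged for engagement tends to crowd out hard news.
    return sorted(candidates, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("City council budget report", clicks=40, likes=5),
    Post("Celebrity photo scandal", clicks=500, likes=300, shares=120),
])
print([p.title for p in feed])
# -> ['Celebrity photo scandal', 'City council budget report']
```

Note how the incentive problem falls out of the design: whichever post draws more clicks and shares wins the slot, regardless of its news value.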
So, I mean, this is a story that just happened to me.
One of my students came into class this week.
He looked very tired, and I asked him what was wrong.
He said he stayed up till 4:00 in the morning watching YouTube, and he said he knew it was time for bed when he was watching someone steam clean dirty carpets in his feed.
So I congratulate him for finding the end of the Internet.
But I tell this story because I think there's a cautionary tale there in what we're allowing the algorithms to feed to us and how much we're getting engaged.
So any thoughts on handing yourself over to that?
Well, I mean, the word feed, right?
Yeah.
You know, it's as if that's just pouring stuff into our mouths, and it takes away some of the autonomy of the user.
It's not like the newspaper, where you're going to decide where to utilize your literacy.
It's going to get your attention.
It's going to serve you things that you're going to engage with.
And we know certain algorithms, the TikTok algorithm, the YouTube algorithm, are very good at what they do.
They can get people watching and they keep people watching because that's how they make revenue, right?
The more you're watching YouTube, the more ads you're going to see, the more revenue that's going to come in.
And as more and more of these platforms are driven by the same thing, ads, the way newspapers always were, the incentives are to make sure you're engaging with them, to make sure you keep engaging, because that's where you're going to see ads.
So congratulations to your student for finding the end of the Internet.
That steam cleaning content is quite entertaining.
Well, he loved it.
So we do have a viewer email.
Let's go and check that out from Lauren.
Lauren writes: News broke this week that two major newspaper chains, Gannett and McClatchy, are ending their content relationship with the Associated Press.
What impact do you think this will have on local news?
Will the savings be reinvested into local news?
Will anyone notice the change?
Do you see other news companies heading in this direction?
What are the pros and cons?
There are a lot of questions here, but they're good questions.
So let's take this with the first question.
What impact do you think this will have on local news?
And for anyone not familiar with this, the Associated Press is a national source that sends news via wire to subscribers around the country.
They do have reporters in cities around the country.
So they are locally embedded in some places, but in large part it is a wire service that sends content to newspapers or television studios around the world.
So what kind of impact do you see this having on local news?
It's also a collectivity, right, that if you're a member of the Associated Press, you distribute your content to one another.
And I think this is probably indicative of the kind of concentration of local media, which, when we say local media, really what we mean is the local masthead that you trust in your community.
Because more and more of that content is not local.
Right.
So more and more it's syndicated content from the central office being put in there.
And a lot of what people do see are kind of national stories anyway.
So this may mean that the Associated Press isn't as robust because there are fewer and fewer local journalists who are contributing to it.
But it also might mean that Gannett and McClatchy and some of the big news services are so concentrated that they feel like they don't need it anymore.
There might be one other thing at work here, which is that we're starting to see the use of algorithms and ChatGPT to scrape content from things like the Associated Press and rewrite it almost instantaneously, sometimes without any human interaction.
These things are now exploding on the Internet.
And this is an old story: you used to see, when the Hearst newspaper chain had its own wire service versus the Associated Press, that they would write their own content, often by stealing what the Associated Press had and rewriting it.
And these cases went all the way to the Supreme Court.
So I think what you're seeing is these power nodes, Gannett and McClatchy and the Associated Press, in competition with one another as opposed to collaborating with one another.
You know, I think there's a lot else we could dig into.
Any other questions from Lauren that we think we should dive into here?
I see that there could be some pros and cons to this.
And Matt, you did talk about that.
Do you see this as a way that those companies could reinvest and hire additional local reporters?
That's a dream scenario; that would be lovely if it happened.
That is not what we know about the way that these companies behave.
They tend to do the exact opposite.
Right.
If there's something they can sell, they will.
And in fact, there's this save-local-journalism legislation that was just up in Congress, which is meant to gather up revenue from some of the major social media platforms and reinvest it in local papers.
One of the troubles with that piece of legislation is that so many of these local papers are now owned by these huge conglomerates, and the worry is that they may get more Facebook revenue, but they're not going to reinvest it in local news.
So there needs to be reinvestment in local news.
But I'm not thinking it's going to happen from Gannett or McClatchy, from the savings of a subscription to the Associated Press.
We also have another email from a viewer. Bill writes: Do news organizations need to dumb down much of the facts today simply because the audience is not as literate as it was in the past?
And do people just want their own biases affirmed by the news they seek out now?
So we definitely know that people seek out, attend to, and tend to consume news and information that confirms existing beliefs.
So that is a phenomenon that's been around a long time; it pre-dates the modern platforms that we have for news and information.
But I think this is a problem that we've been dealing with for a very long time, and the number of choices that we have today with news organizations can be a benefit, but it also comes with the challenge of knowing that people are going to be more likely to just stick with what they know.
I mean, one of the ways that people are consuming news more and more now is through something they call vertical journalism, which is essentially what you would get in a TikTok video or Instagram Reel or something like that.
And those are interesting ways of reformulating news content and giving it to people in visual form.
Young people especially like to see things as opposed to read things.
And that doesn't necessarily mean it's dumbed down.
It can be really smart reporting that is done.
And there are some really interesting TikTok channels that repackage news in that way and make it more engaging for people.
So just because it's not being read in print form or digital print form doesn't necessarily mean it's dumbed down in any way, shape, or form.
Let's get back into the technology conversation we were having before we went to some of the viewer emails.
So we discussed that it's expensive to run local newsrooms, run a television studio, keep the lights on.
But now we have social media, we have YouTube, TikTok.
What are some of the things we need to be wary of as more people are using those social media accounts as their main source of news? Kelley?
I think the biggest thing is that there is kind of a misconception that the problem is fake news on social media, which is a problem, but the bigger problem is that most people are not seeing a lot of news.
But the bigger problem is that most people are not seeing a lot of news.
So the statistic that's been around for a little bit is that about 4% of people's newsfeeds is composed of news.
So a lot of people don't see any news at all.
And among the people that do, it tends to be very bifurcated, where you have people who consume a lot of news and might see things in the Facebook news feed, and then everyone else, who sees virtually nothing.
So the problem, I think, is the lack of circulation of news across the broader population, rather than only the issue of false information.
That said, false information is a big problem on social media, because, again, the algorithms incentivize quick, sensational headlines that people will want to click on.
So if something is surprising or novel to someone, they're going to be more interested in in reading the story and clicking on things.
And those stories also are faster to produce, especially now that we have generative AI tools like ChatGPT to create stories that may, on the surface, look like news but are not news.
So this is another issue.
In your Facebook newsfeed, if you're just scrolling through different posts, a news story is going to look quite similar to a post from your grandma or any other content coming from a non-reputable source.
So all of this information on the surface looks very, very similar and it can be difficult to distinguish in the moment between good information and bad information.
And unfortunately, most of the time when we're scrolling through our feeds, we are not thinking deeply, thoughtfully, and critically about what we're seeing, so it might be easy to let the more sensational information wash over us, the things that say: wow, really?
That's so surprising.
That's so crazy.
I need to share that with all of my friends right now, the stories that are definitely not true but sound so interesting that you just have to, you know, send that information along.
And I mean, the trust that people used to have in trusted news organizations has now kind of gone over to your Aunt Sally.
Right.
Because Aunt Sally just gave you a story that's really interesting.
You trust it, right?
And that's one of the things. There's a lot of study on what is called the news-finds-me mentality in relation to how people consume information, which is that they kind of assume that the algorithm is going to give them what they need.
Right.
They trust being on the platform and they think that the algorithm knows them pretty well and that they'll get enough news.
Right.
But as Kelley said, what we know is that they don't.
Right.
They're getting stuff that they engage with.
But the perception is that they're getting news, when actually they're not.
So we're always trying to tell them in the News Literacy Initiative: be wary of giving yourself over to the feed, be more deliberate, go search things, get off the platform, look at diverse sources, all that stuff, to give them some more power in their news habit.
There's a few more things I'd like to discuss with that.
But first, if you're just joining us, I'm Bill Hallman, and this is Conversations Live: Media Literacy on WPSU.
Joining us tonight are Penn State professors Kelley Cotter and Matt Jordan.
Our toll-free number is 814-865-2124.
And we're ready to take your calls.
You can also send us questions by email at connect@wpsu.org.
We also sent out queries to some of our students to see if they would get involved in the conversation this evening.
And we have a student response from Katelyn.
She's a first-year student in public relations, and she writes: I think it's been polarizing in many ways.
People want to be able to use social media to get news and information, but it is hard to trust who is biased or unbiased nowadays.
How should young people navigate using social media and new sources of information?
Should younger people shy away from social media, or use it to hear different perspectives?
And I think this is building off of your point, Matt.
But Kelley, do you have anything to weigh in with for Katelyn?
I don't think we need to shy away from social media, especially because we know younger generations are not consuming news in the traditional formats like TV and newspapers; they're going online.
So if we want younger generations to consume news, then we have to expect that they're going to do so on social media.
And it's not the case that social media is all bad.
So all major news outlets, politicians, people who have information that we want are all on social media.
And I mean, that's why media literacy is so important: it's the ability to sift through all of these different sources that, again, are put right next to one another and look very similar, but may have very different credentials or very different levels of quality of information.
So we want people to be able to have that critical eye when they are just scrolling through their feed, you know, semi-mindlessly; oftentimes people use social media for entertainment and are not necessarily going there for information, but often incidentally encounter information.
So if they're sort of using social media in that way, we want people to be able to reflect on the things that they're hearing.
And for social media not to be an end point, but a starting point.
And we just showed a graphic on the screen, I believe it was from the Pew Research Center as well, that showed adults under 30 are now almost as likely to trust information on social media sites as information from national news outlets.
So how should we think about a shift like that?
Matt I think it's just habits, right?
It's what people are doing; you trust what you do every day.
And I think that trust is formed through doing.
Older people watch TV more, so they trust those sources, or they read the newspaper more, so they trust that.
And then young people are on social media because that's where their friends are.
That's where they form kind of social relationships.
So that's the way that trust emerges: thinking that people understand you, that they see what you see.
So again, I think Kelley's right that it's not about one medium having priority over another or one medium being better than another.
It's giving people the literacy skills to understand that there's garbage on them all, but there's also good stuff on them all.
And to be able to tell the difference between, you know, the news and the noise.
A lot of this, too, I think, is that the mode of news consumption that we get with social media is very reliant on personalities and relationships, even if they might be what we call parasocial relationships, where people may not know a creator, somebody who's creating content, in real life, but feel a connection to them based on sustained following, knowing a little bit about their lives, knowing them as a person as they create content.
And we know that news and information passes more quickly, that it's stickier, when it's coming from somebody that you know, that you appreciate, that you feel like you have a relationship with, even if you don't actually have one.
So since social media thrives on those parasocial relationships and connections to other people, a lot of this means that when we are dealing with a media environment that is cacophonous, with so many different sources of information, so many different ideas floating around, and so many different perspectives, one way to cut through that noise is to rely on our relationships to guide us toward what we should or shouldn't believe.
So if we know somebody, again, even if it's not somebody that we know in real life but someone we feel we have a relationship with, we're going to be more likely to listen to and trust the information that we get from them, those kinds of creators.
So it's just a way that I think people use to grapple with the complexity of the media environment.
That's a good point.
We do have another email from a viewer.
Greg writes: What responsibility, if any, do social media companies have to moderate content with respect to false or misleading information?
Are they obligated to serve the public good or to serve shareholders?
And I think this is a great question.
It's very timely.
There's a few current events that we could get into after this, if you want.
What do you think about Greg's question? Well, responsibility to whom, as always, is the interesting question, right?
And I think they do have a social responsibility.
But what we do know about social media and anything in Silicon Valley is that it's always a question of scale, right?
These things are optimized for enormous scale that really goes beyond any kind of moderation and curation.
Right.
And so one of the things that Silicon Valley has done is come up with a notion of freedom of speech which is used to justify not moderating anything.
Right.
They used to say: if we moderate it, and Twitter used to call this trust and safety, right, we want people to come onto our platforms, which are private platforms, and know that they can trust what they see.
It was much easier for Elon Musk to say free speech, we're just going to let everything go.
So that is his responsibility in that case.
And the second part of the sentence is to shareholders.
And he's the major shareholder, although he's losing money hand over fist.
But it's not to the public good.
So it's that way of thinking about freedom of speech as the good, as opposed to something like the public interest being the good.
I think it should be in the service of the public interest, but I'm often overruled.
I think I certainly agree with you on that one.
I think the question is what kind of responsibility we're talking about.
Moral, ethical, social responsibility?
Absolutely.
I think many people would agree that social media outlets do have a responsibility to moderate content because it's of such consequence for our democracy.
But if we're talking about legal responsibility, the problem is that our regulatory framework in the US is very heavily reliant on the idea of free speech.
So we don't have a lot of regulation.
Well, we have laws that relieve platforms of the burden, or responsibility, of dealing with these things.
Interestingly, though, in Europe, we've seen a completely different approach.
So recently the European Union passed the Digital Services Act, which places many more demands on social media platforms for moderating content.
So platforms have legal responsibility for the content that shows up on their sites.
Now, of course, we're kind of outside of the jurisdiction of the EU here, so we may continue to see this sort of same environment that we're seeing right now.
But absolutely, there are a lot of risks to not dealing with false and misleading information very well.
So since about the 2016 election, when fake news came very prominently into the public's mind, platforms have instituted increasingly more policies and developed new technologies to be able to grapple with false and misleading information.
So they have partnered with third-party fact-checking organizations over the years to be able to vet information that is flagged by their systems and/or users as possibly containing false information.
So to be able to more quickly kind of make those determinations of, yes, this is something that is true.
No, you can ignore this, please.
And the solutions, again, aren't perfect.
So when platforms detect false information, and they don't detect everything, the response is usually one of three options.
So it'll be a warning label on the content to say this contains false information, or some warning that what you're seeing may not be completely true.
The content may be suppressed a little bit, or demoted, which means that it is less likely to show up in people's feeds.
So hopefully fewer people see it, it doesn't have as far a reach, and it doesn't travel as far and fast as some of the most pernicious, most difficult fake news stories that we've seen.
And then the third one would be the hardest response, which is to remove the content entirely.
And usually removal is restricted only to content that the platforms deem to have some significant potential for harm.
And usually it's more like physical harm, like somebody could, you know, experience physical harm as a result of the information circulating.
And I want to talk about some of those things and some contradictions coming out of Washington, DC this week.
But if you're just joining us, I'm Bill Hallman.
This is Conversations Live: Media Literacy on WPSU.
Joining us tonight are Penn State professors Kelley Cotter and Matt Jordan.
Our toll-free number is 814-865-2124.
And we're ready to take your calls.
You can also send us questions by email at connect@wpsu.org.
There's a couple of things I'd like to discuss first.
Just this week, the United States Supreme Court heard an argument from two Republican led states, Missouri and Louisiana, accusing the Biden administration of violating the First Amendment by allegedly working with tech platforms to censor content.
I think the Biden administration would argue they were working to combat the spread of dangerous misinformation.
But just how potent is that idea of government censorship when it comes to media literacy?
Governments have always been concerned about the flow of information, ever since there was the press.
Right.
And for good reason: in America, we've had a long and very robust tradition of dealing with this, especially after World War One, when the power of propaganda and modern forms of propaganda started to come into mind, right.
There were these cases, and a lot of our ownership laws, which some of the stuff in Washington is coming from, are based on the idea that a foreign owner is not going to have the public interest at heart.
And in fact, they're going to use the affordances of the technology often to sow dissent, to sow distrust, to kind of make people hate one another.
Right.
So we've often treated information, whether it's through print journalism, radio, or TV, with similar lenses.
So it's interesting now that free speech has become kind of a weaponized term.
But there's a long history of doing this.
The idea that people with bad faith and intent are going to use mass media to pollute the dialog is something we should be concerned about.
Right.
Measles are on the rise again.
I just read that there are people who won't vaccinate their puppies because they think, you know, nobody needs a vaccine.
And so the puppies are dying of parvo, right?
There are public health concerns that I think a good government should be kind of working to limit the harm of those types of things.
So the interesting thing, too, is that in the last several election cycles, let's say in the 2016 election, the campaigns had strong relationships with representatives from the platforms.
So they worked with them on their campaigns and got specialized help to communicate the information that they wanted to their voters.
So there are some relationships that exist between politicians and the platforms.
This has changed a little bit since then.
But these relationships exist because of the importance of getting this information out to users.
So they want politicians to be able to speak effectively to their constituents.
But we also know that platforms often treat elite users, let's say, differently than they treat everyday users.
So there are some allowances made for public figures and celebrities in the way platforms deal with content created by those figures, compared with everyday people.
So essentially they might be more lenient, or give more passes, to people who have these higher profiles.
And right now, Facebook is deprioritizing political news, moving it out of its main feed.
Well, first of all, why would the parent company, Meta, make that change now?
Why is it in their interest to do something?
Well, partly that's in response to attempts to put more money back in the hands of content producers, because the major platforms, Google, Facebook, Instagram, whatnot, scrape most of the revenue.
So the content producers, your local journalists, your reporters doing the work, make the story.
It gets shared on the platforms, and the platforms take the revenue.
So in Australia and in Canada, they've come up with laws, and we've kind of started to do something similar here.
In Canada, they came up with a law which said that for any news content that is shared, you have to give a percentage of the revenue back to the news organizations, to the reporters who do the work.
Facebook's response, Meta's response, was just to take news off of the feed: that's how we're going to deal with it.
There's just no news.
So it may be them trying to get out in front of it, but it also has to do with the desire of these platforms to keep people on the platforms.
Right.
So if you have news that is kind of a link and a headline, people are going to click on that and leave the platform; maybe they'll come back, maybe they won't.
Facebook would rather you just stay in their feed and not be jumping back and forth.
Yeah, I think it comes down to the economics of it, too. First of all, again, not many people are seeing news and politics in their news feeds, and it is more trouble and more expensive to deal with the fallout, the controversies, and the legitimate criticism that platforms receive for the circulation of fake news, propaganda, disinformation campaigns, all these things. It's a lot easier to wash their hands of it than to deal with that fallout.
And again, a lot of people don't want to see news and politics.
So if platforms want to keep people on the platform, it's probably the puppies that people want to see and not the politicians.
I'm a sucker for a good cat video.
Yes, absolutely.
There are stream-cleaning videos, too.
I might not get into that, but, you know, we'll see.
We'll see how late I stay up tonight.
Also filling some of the void in local news are different groups like Nextdoor, even group chats.
But they have some limitations as well.
How should viewers navigate interactions on these types of platforms, Kelley?
Oh, man.
Nextdoor has such potential to be really useful.
But let me be clear: it's not a replacement for local journalism.
It's not the same thing.
This is more like having places like libraries, coffee shops, and parks, where people in a community come together, have conversations, and are able to share information with one another, create opportunities for one another, and strengthen bonds that can help the civic health and wellbeing of a community. So that can be really good.
Nextdoor is still pretty new, though, and from what I can tell, it is still working out the kinks in the way that it moderates information and allows information to circulate.
So I don't know.
I don't know about everyone else, but in my personal experience with Nextdoor, there's a lot of garbage circulating on there.
There's a lot of good information, but a lot of garbage, too.
And I think they're still working out how best to make this a very valuable source of information.
And there are also, yes, group chats, subreddits on Reddit.
Local Facebook groups are a really big one.
And those local Facebook groups, I think, are probably becoming a mainstay within local communities because Facebook is a familiar technology for a lot of people at this point.
A lot of people are on Facebook, so it's easy to sort of tap into their local networks on those spaces.
And again, that can be a really good thing.
I think we should also mention a really good quote, if I can remember it, by Clay Shirky.
It's something like: we don't have a public sphere online, we have a corporate sphere that allows public speech, something to that effect.
Correct me if you remember it differently, but I think that's exactly it.
And I think that's the issue when we talk about platforms stepping in to play the role that was traditionally played by journalism: they're not following the same values that we maybe want for the kind of information circulating in our communities.
Let's talk a little bit with the time we have left.
I'd like to get some thoughts on regulation.
Of course, social media companies believe they'd be more successful with little to no government regulation, and we've touched on a few examples where government regulation has aided the public trust, specifically in Europe.
You mentioned some regulation in Australia that changed the way Facebook did business.
But what are some types of regulations that you think would be helpful, that we should be considering as a society, as far as social media and media literacy, keeping that specifically in mind?
I think what we all need to remember is that democracy is us, right?
And that we are the people who get to decide things.
If we have a robust democracy, we're the ones who have to figure out what the public good would be, and the corporations would love for you and me and everybody else to say: let's give up on democracy, it's messy.
And so it's easy for us sometimes to just say, let's give it over to the corporations, which know better.
And as Kelley was saying, they have their economic interests, they have shareholders.
Mark Zuckerberg needs, you know, to feed his Wagyu cattle on the island in Hawaii that he owns.
Right.
So rather than doing the dirty work of democracy, which is to figure out what we want it to be serving, right.
So in Europe, a lot of these things are considered public utilities, for exactly the reason Kelley was saying: people use these now as the public square.
So there's a sense that they need to be regulated the same way a utility would be.
Everybody needs it.
So we have to control it and we have to make sure that it's serving the public interest.
But that also requires citizens to be willing to be part of a debate as to what it means to be something in the public interest.
Yeah, and I'll just add that, again, we have a good model in some of the legislation the European Union has introduced in the last several years.
So I mentioned the Digital Services Act that actually creates new demands on platforms to moderate content in different ways.
So there are different provisions about disinformation, political ads, and things of that nature, and there are fines attached to that.
So significant fines that platforms face if they fail to properly and adequately moderate content.
At first it applied only to the very biggest platforms; it's broadened a little bit since, and actually, I think just last month it changed slightly.
But anyway, we have this model that we can look to in the US to follow.
However, again, our legislative framework, our ideological framework, is a little bit different than in the European Union.
So I'm not sure if we will be able to do exactly what they have done, but they've certainly given us a template for, you know, something that could be done.
One thing to keep in mind is that there is the kind of corporate libertarian version of free speech, which is just anything-goes Wild West.
But there's also a view of free speech emerging in legal discourse that looks to what the free speech rules were supposed to do, which is to promote a diverse set of public opinions being available.
The trouble when one or two corporations own everything, and we mostly have oligopolies of media now, is that you don't have that diversity, really.
You have only a few things.
So one of the regulations I would like to see is more antitrust regulation, where we say, like we used to, that you can't have one company owning three of the newspapers and two of the TV stations in one media market, because then you're not going to get that diversity of opinion.
You're going to get very homogenous opinion.
I just read that the Baltimore Sun was just sold to David Smith, who also happens to own Sinclair Broadcasting, which has gobbled up local broadcasting; three of the four stations in our media market are Sinclair stations.
So in Baltimore, which is his hometown, he now owns the major newspaper, three of the five local broadcasters, and a series of other regional newspapers around that.
So one company with a very strong ideological viewpoint is dominating the diversity of opinion in that market.
So you can use freedom of speech to argue that that type of conglomeration of ownership is getting in the way of a diverse and robust public dialog.
Certainly a lot to think about.
This has been a fascinating conversation.
To our viewers at home and in our live audience: thank you for being here.
We've been talking with Kelley Cotter, an assistant professor in the Penn State College of Information Sciences and Technology, and Professor Matt Jordan from Penn State's Donald Bellisario College of Communications.
I'm Bill Hallman.
Our next episode of Conversations Live will be on April 11th, when we'll talk about spring gardening.
From all of us at WPSU, thank you for joining us.
Rewatch this and previous episodes of Conversations Live and more of your favorite WPSU programs on the PBS app.
Conversations Live is a local public television program presented by WPSU