
The Dark(er) Side of Media: Crash Course Media Literacy #10

Say you have an evil twin.

They're just like you but...different – somehow evil-er.

Maybe they have a fancy twirling mustache and are just constantly listening to emo music.

Maybe they hate chocolate and fun and bubbles and the greatest film of all time, Titanic.

What if your evil twin not only looked and sounded just like you, but pretended to be you.

They stole your family and your friends and your significant other and your favorite pair of shoes.

They persuaded everyone they were the real you and you were the evil twin.

And then you were left all confused and alone and you didn't even get a fancy twirling mustache.

Sounds like a nightmare, right?

We've talked all about persuasive techniques and advertising and public relations during this course.

But we haven't talked about their evil twins: propaganda, misinformation and disinformation.

These are the big baddies of the media world, the villains you really need to watch out for.

You could call them the dark side of media.

Though, that would make advertisements that sell you things you probably don't really need the “bright side.”

Let's go with the darker side.

Either way, understanding these evil twins in their many forms is mission critical.

There's no way to be media literate without them.

But to understand evil you'll need to think evil.

Are you ready?

[Theme Music]

Remember, advertisements are public notices promoting a product, event or service.

Public relations is the management of the relationship between the public and a brand.

Both advertisers and PR people use campaigns, or planned, systematic efforts to intentionally persuade us of certain beliefs or to act a certain way.

For example, a company that makes headphones might launch an advertising campaign where a dozen celebrities are filmed and photographed using their product.

These may all be released at the same time and in different locations so that everyone sees their favorite celeb wearing them.

This campaign wants you to like their product because you like their spokespeople.

Or, a public relations firm might start a publicity campaign to get their client all over the media.

Like when your favorite actor is in a new movie and suddenly they're singing carpool karaoke and dancing with The Roots and reading mean tweets about themselves on TV.

This campaign wants you to be aware that this star has a new project coming out.

Hopefully you'll want to experience it, too.

Campaigns that saturate the media landscape with a united theme and message can be really effective.

They can convince us to buy new phones and stop buying cigarettes and vote for one candidate over another in the next election.

One of the key components of a campaign is its coordination.

For a campaign to have the biggest impact, it takes multiple people working in tandem to accomplish a cohesive goal.

But what happens when that same technique – the widespread coordination of people bent on shifting the media landscape –

what happens when that's taken up for evil?

That's where propaganda comes in.

Propaganda is information used to promote a particular point of view, change behavior or motivate action.

Sometimes that information is facts and ideas, sometimes it's opinions, or intentionally misleading or biased.

Though technically propaganda itself isn't inherently evil, it is usually associated with bad actors.

That's because it's often used to manipulate the public into things they might not naturally do,

like supporting a war or believing harmful stereotypes about others.

And most typically, the people doing the coordinated propaganda campaigns are part of governments.

During World War I, the U.S. Committee on Public Information was formed for just such a purpose – to produce pro-war propaganda.

In World War II it was the Office of War Information.

They made films and posters and advertisements and more to promote patriotism and nationalism.

The government even teamed up with advertisers to get them to push patriotic propaganda.

The propaganda focused on fulfilling one's national duty to join the war or save food for the war or buy bonds to support it.

It was like peer pressure with beautifully decorated posters.

That famous image of Uncle Sam saying “I want you for the U.S. Army”?

Oh yeah, that's propaganda.

And it was so good they brought it back for World War II.

Rosie the Riveter? Oh yeah, she is too.

Sorry if I just ruined your favorite Halloween costume.

Other types of wartime propaganda make the opposition seem violent or barbaric to stoke fear of the enemy.

U.S. propaganda during World War II sometimes featured racist depictions of Japanese people for this purpose.

Similarly, in Germany the Nazi party sent around anti-Semitic propaganda before and during World War II.

If propaganda is used to psychologically persuade,

disinformation is used to confuse and distract the intended audience using deliberately false or misleading information.

Disinformation campaigns can be used to poke and prod opposing groups and heighten the tension between them.

And just because these campaigns aren't being done by official government propaganda offices doesn't mean they're small scale, or ineffective.

With the reach of the internet, and the ability to make digital media, people all over the globe can organize themselves for coordinated campaigns.

By working together, flooding different media outlets with carefully crafted messages, a group can drastically change public information.

During the 2016 U.S. election, Russian operatives purchased misleading and extreme Facebook ads targeted to both liberal and conservative American voters.

They even appeared to organize both sides of a protest in front of a Texas Islamic Center.

So sometimes disinformation can work like propaganda, trying to get people to act.

But more often, what disinformation is best at is confusing the facts of an issue.

Disinformation can whip up a smokescreen, and disperse the attention of the masses.

This style of disinformation can also be used to excuse or dismiss bad actions or behavior.

In Beijing in 1989, students led pro-democracy demonstrations in the capital's Tiananmen Square.

The Chinese government responded violently, killing hundreds or even thousands of peaceful protestors.

Why do I say hundreds or thousands?

Because the government stymied efforts to make a full accounting of the dead.

Since the massacre, the Chinese government has called reports of the events misleading and suggested the Western media is exaggerating it just to demonize them.

They still censor information about it today.

When powerful governments become set on disinformation campaigns, it can be difficult for their citizens to discover the truth.

It can be even more difficult for outsiders to get well-sourced information, too.

Disinformation can even include magic tricks, too – well, kind of.

Let's head into the Thought Bubble for this.

Some disinformation is full of lies, like we said – but some of it is full of distraction, too.

The art of active misdirection is often used by political pundits and celebrity press agents.

They'll plant stories in the press about their party or client or the opposition to distract from something they don't want to talk about.

It's like how a magician does that funny thing with his hand to distract you from wherever he got that rabbit.

Or take, for example, this headline: Pope Francis Shocks World, Endorses Donald Trump for President.

That sounds kind of wild, right?

The Pope never gets involved in US politics like that – an attention-grabbing headline for sure.

The thing is, this headline is purely fabricated news.

Published in July 2016 by WTOE 5 News, a now defunct website, it was entirely made up by an unknown writer.

The site was actually part of a network of websites that published more than 750 similarly made up articles.

Why? Who would do such a thing?

Well, apparently lots of Macedonian teenagers distracted angry, partisan American voters with stuff like this leading up to the 2016 election.

Magician Sam Sharpe actually describes this distraction as lowering our attention vigilance.

By slightly shifting our gaze to something else, we're lulled into an atmosphere of susceptibility, making us more gullible to improbable situations.

When we find ourselves in an atmosphere we usually trust, like Facebook for example, we're less likely to question the info we find.

Plus, many of us only read headlines – 59% of links shared on social media aren't even clicked.

We just share away without a second thought.

Tricking us is like taking candy from a baby, apparently.

The moral of the story: always double check the veracity of information and sources we see, lest we become victims of misdirection.

Thanks, Thought Bubble!

The key thing to understand is just how coordinated disinformation can be today.

Not just a white lie told in a forum post, but whole networks of people working to create an alternate reality.

One of the reasons disinformation is so effective online is because of the existence of a related phenomenon: Misinformation.

This is a different beast altogether – misinformation is unintentionally inaccurate information.

Accidents, or mistakes in reporting.

Often the most egregious examples of misinformation happen during a breaking news situation.

When there's a lot of information floating around during a crisis and members of the media want to be the first to report on the news, mistakes happen.

They get it wrong. They don't double check. They make a typo.

Reputable news organizations will issue a correction when they've made a mistake like this.

Sometimes misinformation becomes a pretty funny story.

Like that time The Chicago Tribune printed 150,000 newspapers saying that Thomas Dewey had beaten Harry S. Truman in the 1948 election.

Spoiler alert: he lost. Awkward.

Misinformation has always been a problem.

As long as there have been news sources, there have been errors and corrections and updates.

But our new online media environment changes how those mistakes get made, and the impact they have on people.

Increasingly, people get information from a variety of sources online, often shared and mixed together over social media,

rather than from a small number of central institutions.

It can make for some laughable mistakes, but the darker side of media is no joke.

We base important decisions on the media every day, from what we'll buy to who we'll vote for.

Bad information can lead to bad decisions with serious consequences.

Disinformation, misinformation, and propaganda are even easier to spread in the digital age.

Media literacy scholar Renee Hobbs has even said that today, “Everyone, it seems, has become a propagandist.”

Weeding through it all can be hard to do.

Especially if the initial misinformation goes viral.

Once a consumer hears or reads misinformation, it's often hard to correct it in their minds, even when confronted with the right information.

Plus, once we've deemed a source trustworthy or safe, it's hard for us to even criticize their content.

Our brains are pretty stubborn.

What's the best way to determine if what you're seeing is from the darker side of media?

Don't worry, we're going to walk you through it in our next episode on media skills.

Until then, I'm Jay Smooth and this is Crash Course Media Literacy.

Crash Course Media Literacy is filmed in the Dr. Cheryl C. Kinney Studio in Missoula, MT.

It's made with the help of all of these nice people, and our animation team is Thought Cafe.

Crash Course is a Complexly production.

If you want to imagine the world complexly with us, check out some of our other channels like Eons, Animal Wonders, and SciShow Psych.

If you'd like to keep Crash Course free for everyone, forever, you can support the series at Patreon,

a crowdfunding platform that allows you to support the content you love.

Thank you to all of our patrons for making Crash Course possible with their continued support.

