On Modern Propaganda

The propaganda of yesteryear used to be of the "four legs good, two legs bad" kind. It praised its authors and denounced their enemies. As kids during the communist era we learned how, in capitalist countries, food is burned or dumped into the sea while children in Africa are dying of hunger. As teenagers we listened to Radio Free Europe, which taught us about human rights violations in the Eastern Bloc. From our grandparents we heard about the high standard of living under the wartime Slovak Nazi puppet state.

And while that kind of propaganda is far from dead, we are already seeing the advance of a new kind of propaganda, one that is very different from what we were used to.

Many examples can be found here, but let me quote at least a few of them.

YouTube videos of police beatings on American streets. A widely circulated internet hoax about Muslim men in Michigan collecting welfare for multiple wives. A local news story about two veterans brutally mugged on a freezing winter night. All of these were recorded, posted or written by Americans. Yet all ended up becoming grist for a network of Facebook pages linked to a shadowy Russian company that has carried out propaganda campaigns for the Kremlin, and which is now believed to be at the center of a far-reaching Russian program to influence the 2016 presidential election.

The Russian pages — with names like "Being Patriotic," "Secured Borders" and "Blacktivist" — cribbed complaints about federal agents from one conservative website, and a gauzy article about a veteran who became an entrepreneur from People magazine. They took descriptions and videos of police beatings from genuine YouTube and Facebook accounts and reposted them, sometimes lightly edited for maximum effect.

In early 2016, Being Patriotic copied and pasted a story from the conspiracy site InfoWars, saying that federal employees had taken “land from private property owners at pennies on the dollar.” The Russian page added some original text: “The nation can’t trust the federal government anymore. What a disgrace!”

This past March, another of the Russian pages, Secured Borders, reposted a video that it attributed to Conservative Tribune, part of the conservative and pro-Trump sites run by Patrick Brown. The video, which falsely claims that Michigan allows Muslim immigrants to collect welfare checks and other benefits for four wives, originated on a YouTube channel called CleanTV.

The Blacktivist Facebook page appears to have specialized in passionate denunciations of the criminal justice system and viral videos of police violence, many of them gathered from Facebook and YouTube.

The old kind of propaganda served a double purpose. First, it disseminated an ideology. Second, it pursued the geopolitical goal of weakening the enemy by sowing unrest within its ranks.

It seems that the new kind of propaganda comes from the realization that the first, ideological, component of propaganda was not particularly effective, and that the second, seditious, component can fare much better if it is not burdened by the dead weight of ideology.

Let's consider an idealized model of the new kind of propaganda — and yes, it's so apolitical that it can be expressed as an algorithm:

  1. Search the web for heated arguments.
  2. Discard irrelevant stuff (e.g. personal quarrels).
  3. Promote the most heated content on both sides of the issue.

While that by itself would be effective enough, you can go even further (see the sketch after this list):

  1. Look at opinion polls on the contentious issue.
  2. Promote the losing side.
  3. Rinse and repeat.
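
To make the point about automation concrete, here is a minimal sketch of both loops in Python. Everything in it is hypothetical: the feed, the poll numbers and the promote action are toy stand-ins, not real APIs; the point is only how little logic the scheme requires.

    # Schematic sketch only: every data source and action below is a stub.
    from dataclasses import dataclass

    @dataclass
    class Post:
        topic: str
        side: str      # "pro" or "con"
        heat: float    # how heated the argument around it is, 0..1

    def find_heated_posts(feed, threshold=0.8):
        # Step 1 (and, loosely, step 2): keep only content heated enough to be
        # worth amplifying; a real filter would also drop personal quarrels.
        return [p for p in feed if p.heat >= threshold]

    def losing_side(polls, topic):
        # Refinement: consult opinion polls and back whichever side is behind.
        pro_share = polls.get(topic, 0.5)
        return "con" if pro_share >= 0.5 else "pro"

    def promote(post):
        # Step 3: boost the content on both sides (here it just prints).
        print(f"boosting {post.side} post on {post.topic} (heat={post.heat})")

    def run_once(feed, polls):
        for post in find_heated_posts(feed):
            promote(post)                      # amplify regardless of side
            if post.side == losing_side(polls, post.topic):
                promote(post)                  # extra push for the losing side

    # Toy data; "rinse and repeat" would simply wrap run_once in a loop.
    feed = [Post("issue_x", "pro", 0.9), Post("issue_x", "con", 0.95)]
    polls = {"issue_x": 0.42}                  # the pro side is currently losing
    run_once(feed, polls)

Note that nothing in the loop cares what the issues actually are; ideology never enters the picture.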

Ideally, you want one half of the country to vigorously promote one point of view and the other half to fiercely defend the opposite one. (And if we want to go conspiracy-theoretic here, why do so many elections and referendums of late end with an approximately 50/50 result?)

There are some points to make about the system.

On the side of offense:

  1. By using content generated by local people you are always going to hit a vulnerable spot. If the content were generated by your own propaganda department, it would be much less effective.
  2. That being the case, the entire process can be fully automated. No human intervention is needed, which also means that it's super cheap.
  3. By targeting both sides of the conflict, you reach a much larger audience than you otherwise would.
  4. Also, a polarized argument creates psychological pressure to pick a side and thus eliminates the middle ground that would otherwise be a meeting place for people immune to the propaganda.
  5. There's synergy with the enemies of your enemy. Given that there's no ideological component involved, the target is going to get attacked by Russia in exactly the same way as by North Korea, China, Iran, Germany or Liechtenstein. The result would look like a concerted effort, yet it doesn't have to be premeditated.
  6. The "us versus them" mentality is so strong in human beings that even if the scheme is widely known and understood, most people would still have a very hard time not falling prey to it.

On the side of defense:

  1. The lines of attack are predictable. If the enemy can use an algorithm to pick the most contentious issues, so can you.
  2. The level of attack is predictable. Given that there are many actors, states and non-state entities alike, whether any single one of them joins in doesn't really make a difference. And given that the attack is very cheap, it's not going to be limited by the enemy's resources but rather by the amount of anger your society is willing to sustain.

To sum it up, what we see here is an extremely cheap attack on society's most costly asset: its mutual trust and its trust in political institutions.

EDIT Oct 17th: Here's a piece of news from today that is explicit about the fact that genuine woes are the ideal point of attack:

“The task wasn’t to support Trump,” one of the factory’s employees told RBC. “We raised social issues and other problems that already existed in the US, and tried to shine as bright a light as possible on them.”

October 16th, 2017
