I watched The Great Hack this past weekend – a documentary on Netflix about the Cambridge Analytica scandal. While I don’t agree entirely with the fatalist premise of the narrative, it will give anyone working in PR, marketing or media a lot to think about. For example, at what point do data and targeting cease to be marketing or PR and cross the line into manipulation?
While some of the data mining techniques described in the film are new – the methods used to act on data are not. For example, Edward Bernays, who is widely revered as the father of PR, doubled the market for tobacco by using similar techniques in 1929. Bernays didn’t have 5,000 data points on every American, but he was still able to tap psychographics to drive behavior change.
While the film is largely about political communication, the Bernays example is a reminder that there is precedent for applying these techniques in business. I think it’s easy for a business to dismiss the risk of manipulation, or worse, disinformation, as relevant only in politics – and that dismissal is exactly what makes a business vulnerable.
The concern here isn’t just the ethics of using data and psychology in business persuasion – it’s also the ability to recognize when they’re being used against you. And that’s the theme for this week’s Unscripted Marketing Links (UML), where on an occasional Saturday, I round up three links centered on one idea and worthy of your perusal.
1) The risks of disinformation in business.
The speed and openness of communications on the internet have made the spread of disinformation easier. The 2019 Cyber Threatscape Report by Accenture iDefense provides an excellent primer and literature review on disinformation.
The report summarizes disinformation as an action to:
“dismiss an opponent’s claims or allegations, distort events to serve political purposes, distract from one’s own activities, and dismay those who might otherwise oppose one’s goals.”
The goal of dismissal, distortion and distraction is to:
“divide, discredit, distract, deny, and demoralize.”
So, these things work together – doctrinal action and objective – distort to discredit, for example. In practice, some of the examples look like this:
- flood “the media with multiple versions of a story to confuse the audience”;
- publicize “scandalous information to discredit a critic or adversary”; and
- distract “from negative information, by highlighting or even creating some other crisis or scandal.”
We’ve seen this in politics and geopolitics. What’s the applicability in business? The report references several specific possibilities:
- connecting with business executives on social media, like LinkedIn, to perform surveillance, or foster an air of familiarity for social engineering;
- “breaching a target’s e-mails, analyzing the stolen content to find unflattering information, possibly distorting or embellishing that information”;
- “false hacktivism and creating inauthentic online personas and troll bots to broadcast the negative information and to influence popular opinion against the target”;
- spreading false information designed to trigger algorithmic trading on stock exchanges, which begins to undermine the integrity of the financial system; and
- promoting false information in healthcare that leads to mistrust of the medical community.
Importantly, disinformation doesn’t have to be salacious or scandalous to create a real PR crisis that distracts the business – to derail an acquisition deal, rattle investors, upset customers or rile up employees. The Accenture report is free to download without registration. While it’s 100 pages long and centers on cybersecurity, the first 30 pages or so make excellent reading for professionals in PR, marketing and journalism.
2) Do new facts reinforce or change beliefs?
Is it possible to counter disinformation with facts if the facts contradict strongly held views? It seems it is possible, according to a study by Thomas Wood, Ph.D. of the Ohio State University, and Ethan Porter, Ph.D. of George Washington University.
The two professors conducted five experiments across more than 10,000 subjects, testing 52 issues, and found, “By and large, citizens heed factual information, even when such information challenges their ideological commitments.”
These findings contradict a decade-old study of the so-called “backfire effect” where subjects discounted new facts that challenge “partisan and ideological attachments.”
“The authors found no evidence of a backfire effect in any of their experiments, even when the false claim and the speaker aligned with the participants’ own political affiliation. Participants generally used the new information to update and inform their current beliefs when provided with a correction and in no situation did the correction motivate them to strengthen their misperceptions. The findings support the notion that most people tend to avoid the mental effort of arguing against corrections.”
And so, it seems, facts do matter.
3) Diplomacy and trust as countermeasures.
Frank Shaw, who leads communications for Microsoft, said the skill of diplomacy is more important than ever for PR professionals. That’s according to a story about his PRSA presentation compiled by Ted Kitterman for PR Daily, titled, Tips from Microsoft’s Frank Shaw for fighting disinformation.
“Disinformation drives dissent,” the article quotes him as saying. “We have to build consensus. We have to bring everyone together.”
He also contends that “trust is entropic”: it’s not something that a person or an organization earns once and keeps, but something that needs continuous cultivation and a reason for retention. Shaw said trust “will decay over time unless invested in.”
He counts employees as the “best defense against misinformation,” but cautions that this cannot be an afterthought. Businesses and leaders need to engage employees “before you need it.”
* * *
There’s an interesting but dated video that circulates on YouTube called “Edward Bernays on Propaganda and Public Relations.” It runs less than eight minutes and spells out how Bernays approached his tobacco campaign, including an interview with the man himself.
If you enjoyed this post, you might also like:
Intensify or Downplay? The Hugh Rank Schema for Propaganda
Image credit: Unsplash