
The Author's Website of T. W. Cox


Blog

The Future of Fake News – Fake News 5

September 17, 2019 by tcox@svsoft.com

A new feature for Google Assistant, Duplex, stole the limelight at this year’s annual developers’ conference.[1] Duplex allows the app to book appointments for you by making phone calls on its own, speaking and sounding so lifelike that it fools just about everyone into thinking it’s a real human.
The outcry was immediate. While some saw the feature’s usefulness, others found it frightening that computers could now fill the world with robot impostors.
It’s not hard to visualize how this technology could become a part of the fake news problem, but it’s only the latest: this is, after all, the age of Photoshop, Russian Twitter bots, and phishing websites. As a Guardian article[2] points out, we now live in an era when audio and video tools such as Face2Face[3] can make public figures appear to say anything and act any way you wish. As the article puts it, ‘We’ve long been told not to believe everything we read, but soon we’ll have to question everything we see and hear as well.’
However, the theme of this series of articles isn’t to add to the chorus of dystopian voices telling us we’re doomed to a world of fake news, but to discuss how to recognize and overcome misinformation and bias. Looking to the future, if AI and technology can be part of the problem, can they also be part of the cure?
Google certainly thinks so. It recently announced the Google News Initiative[4], an umbrella project designed to tie together all of the company’s existing projects to work with journalists, such as Google News Lab, First Draft, and the Digital News Initiative. The company will ante up $300 million over the next three years on its various journalism projects. Google’s interest in combating fake news is obvious: its core business is advertising, and last year Google paid out $12.6 billion to publishers through advertising revenue splits. If users distrust and turn away from the sites the search engine takes them to, both Google and its partners suffer.
From the 20,000-foot level, Google’s efforts appear to be largely centered on steering viewers to ‘more authoritative sources’ and on educating both journalists and viewers. For example, MediaWise[5], a joint project with the Poynter Institute, Stanford University, and the Local Media Association, will help middle and high school students become smarter consumers of news.
Google’s education efforts aim to steer news consumers toward what Google calls ‘more authoritative content.’ For example, YouTube users now can have a ‘Top News’ shelf which contains, in Google’s words, ‘highlighted relevant content from verified news sources.’
While this is a laudable goal, I’m a bit of a skeptic. As a previous article [tk link to Trust No One article] points out, a news story can be completely truthful but still try to influence your views and push the author’s (or his organization’s) opinion or agenda- in other words, be biased or slanted. These days this is all too often the case for both alternative and mainstream media. It’s therefore up to you, the reader or viewer, to decide what distinguishes true from false, and belief from opinion. The good news is that software is increasingly able to help with this process.
For instance, Google’s Jigsaw[6] is an internal incubator that ‘builds technology to tackle a range of global security challenges ranging from thwarting online censorship to mitigating the threats from digital attacks to countering violent extremism to protecting people from online harassment.’ The projects created, if successful, graduate to wider use in various ways. One example is Perspective[7], an API which uses machine learning models to score the perceived impact of a comment, to provide realtime feedback to help moderators do their job better. Another is Share the Facts[8], developed jointly with Duke University, which allows readers to share fact-check information either by querying or by embedding a link in an article or blog post.
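For a sense of what using Perspective looks like in practice, here is a minimal sketch of a call to its comment-analysis endpoint. The request shape follows Perspective’s public documentation, but the API key is a placeholder, only the TOXICITY attribute is requested, and error handling is omitted:

```python
import json
import urllib.request

API_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def build_request(comment_text):
    """Build the JSON body Perspective expects: the comment text plus
    the attributes we want scored (here, just TOXICITY)."""
    return {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def toxicity_score(response):
    """Pull the overall toxicity probability (0.0 to 1.0) out of a
    Perspective API response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

def analyze(comment_text, api_key):
    """POST the comment to Perspective and return its toxicity score.
    `api_key` is a placeholder for a real Google Cloud API key."""
    req = urllib.request.Request(
        f"{API_URL}?key={api_key}",
        data=json.dumps(build_request(comment_text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return toxicity_score(json.load(resp))
```

A moderator’s dashboard might flag any comment whose score exceeds some threshold for human review, which is how Perspective is intended to be used: as an aid to moderators, not a replacement for them.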
Meanwhile, Facebook has revamped its news feed by putting more emphasis on interpersonal interactions from friends and family than on news sources. Twitter sent an email to 678,000 users informing them they may have received posts from a now-suspended Russian propaganda outfit called the Internet Research Agency. According to one paper[9], bots make it easy to generate high volumes of low quality or inaccurate information.
These actions are unlikely to end the social media giants’ controversies. Facebook argues that it’s not a news company in the traditional sense, but as long as it distributes other publishers’ content, it acts as an editor. As for Twitter, estimates suggest that as many as 48 million accounts- fifteen percent of all accounts- are actually robots. Most are benign, scheduling posts for timing, providing news alerts or customer service- but that’s still a lot of bots, and as the beginning of this article suggests, the abilities of bots can only increase. Even without bots, its users- you and I- have our biases, and they’re going to bump into other people’s biases on social media platforms.
The media giants aren’t the only groups confronting the fake news problem. Advances in machine learning and natural language processing now make it possible to develop systems that can examine news articles for factual truth and biases, and anyone can try. For example, there’s Logically[12], an intelligent news feed that ‘uses complex analytics to identify fake news, separate fact from falsehoods, and illustrate discrepancies in the sentiment of journalism from across the political spectrum.’
What’s currently possible is automatically scraping news transcripts to find claims that can be tested as true or false, then matching them against libraries of existing fact checks like Share the Facts[14], one of the aforementioned Jigsaw projects. Examples of such software are Full Fact[15], the Duke Reporter’s Lab tool[16], and Chequeado[17].
Finding assertions to fact-check is half the problem; the other half is checking them. According to a Reuters report[13], fully automated fact-checking isn’t even close to being capable of the judgment that journalists apply on a day-to-day basis. The process of fact-checking is labor-intensive; few asserted facts are checked compared to the number of assertions made, despite an increased number of fact-checker sites in recent years.
One way to get around this is to use crowdsourced volunteer fact-checkers. Perhaps the most interesting attempt at this approach is Wikitribune, created by Wikipedia founder Jimmy Wales, which plans to hire journalists and pair them with volunteers. How well this will work (and scale) remains to be seen, but if, as Wales hopes, Wikitribune can do as a news service what Wikipedia accomplished as an encyclopedia, it’s worth watching.
A paper by Chloe Lim, a Ph.D. student at Stanford University, reports little overlap in the statements that fact-checkers check[18]. Out of 1065 fact-checks by PolitiFact and 240 fact-checks by The Washington Post’s Fact-Checker, there were only 70 statements that both fact-checkers checked. The study found that the fact-checkers gave consistent ratings for 56 out of 70 statements, which means that one out of every five times, the two fact-checkers disagreed on the accuracy of a statement. One reason cited by Lim is that politicians’ statements are often vague or general, and thus subject to interpretation.
Meanwhile, there are other attempts. There’s even a competition, Fake News Challenge[11], a grassroots effort of over 100 volunteers and 71 teams from academia and industry around the world whose goal is ‘to address the problem of fake news by organizing a competition to foster development of tools to help human fact-checkers identify hoaxes and deliberate misinformation in news stories’.
As a previous article[19] points out, bias takes many forms. Fact-checkers won’t catch all bias and may have biases of their own, such as which facts to check and which to ignore. Other substantial issues include opinion vs. objectivity in reporting, how sources are identified and used, and linguistic manipulation such as slanted language, exaggeration, over-generalization, and appeals to emotion.

[1] https://ai.googleblog.com/2018/05/duplex-ai-system-for-natural-conversation.html
[2] https://www.theguardian.com/technology/2017/jul/26/fake-news-obama-video-trump-face2face-doctored-content
[3] https://web.stanford.edu/~zollhoef/papers/CVPR2016_Face2Face/paper.pdf
[4] https://blog.google/topics/google-news-initiative/announcing-google-news-initiative/
[5] https://ed.stanford.edu/news/stanford-education-scholars-create-resources-help-young-people-spot-fake-information-online
[6] Google Jigsaw https://jigsaw.google.com/
[7] Perspective https://www.perspectiveapi.com/#/
[8] Share the Facts https://www.sharethefacts.org/
[9] Measuring Online Social Bubble https://arxiv.org/abs/1502.07162
[10] Google Cloud Language https://cloud.google.com/natural-language/
[11] Fake News Challenge http://www.fakenewschallenge.org/
[12] Logically https://logically.co.uk
[13] Factsheet: Understanding the Promise and Limits of Automated Fact-Checking https://reutersinstitute.politics.ox.ac.uk/risj-review/factsheet-understanding-promise-and-limits-automated-fact-checking
[14] Share the Facts http://www.sharethefacts.org/
[15] Full Fact https://fullfact.org/
[16] Duke Reporter’s Lab https://reporterslab.org/
[17] Chequeado http://chequeado.com/
[18] Checking How Fact-checkers Check https://drive.google.com/file/d/0B_wUaJ01JSddZTNWVWpkRzVXUzg/view
[19] Recognizing Bias (my article)
[20] FakeBox https://towardsdatascience.com/i-trained-fake-news-detection-ai-with-95-accuracy-and-almost-went-crazy-d10589aa57c



Filed Under: Uncategorized

Remembrance

September 11, 2019 by tcox@svsoft.com

“Those who cannot remember the past are condemned to repeat it.” George Santayana’s quote seems especially appropriate today, September 11th, a day we should remember with care- and a remembrance some are careful to avoid mentioning.

9/11 Memorial

Filed Under: Uncategorized

Information Overload – Fake News 4

September 11, 2019 by tcox@svsoft.com

Information overload is nothing new. Ralph Waldo Emerson, contemplating the Boston public library, lamented man’s ability to read and keep up with the glut of books: ‘Should he read from dawn till dark, for sixty years, he must die in the first alcoves.’
In the 1980s, Jeremy Weisner described his educational experience this way: ‘Getting an education is like taking a drink from a fire hose.’
These days, we’re washed away by a flood of news and information Emerson or even Weisner couldn’t have imagined. Google, streaming video, tweets, texts, emails, the endless links and posts on Facebook, those viral videos we can’t help but click, the phone photos and videos we share… phew.
But while we’re drowning in the flood, we lose our ability to evaluate and make decisions- to tell what’s real and what’s fake. It’s hard to breathe underwater.
Blame the like button. Once, much of the information we took in was filtered, judged, analyzed. And we trusted the curators- the journalists, editors, and authors who did the work. Nowadays things are different- according to a recent Nature paper[3], in networks like Facebook and Twitter, accurate and inaccurate reporting have an equal chance of success, because of our tendency to view something as higher quality simply because it’s shared frequently. ‘The wisdom of the crowd’ isn’t always true.
If social media were just about cat videos, accuracy might not be an issue. But according to a Pew Research Center study[4], the majority of us get some of our news on social media, and roughly one in five does so often- and the percentages are increasing.
Nor is traditional mainstream media as immune from this problem as it once was. According to Eugenia Siapera, who lectures on new media and journalism at Dublin City and Aristotle universities, “In my view, as mainstream news organizations cut more and more staff, they need to rely on other sources; mostly, these are news agencies, but often they are alternative news sites. Secondly, a prime journalistic concern is to feel the pulse of the public – to do this, they often turn to alternative media if only to acquire a different kind of understanding. A third means of influence is a more aesthetic one, as the kind of personalized, informal, experience-based style of writing is sometimes replicated in mainstream media.”
In other words, as Daniel Levitin of McGill University puts it, “On a daily basis, the onslaught of information is preventing us from being evidence-based decision-makers, at our own peril.”[5]
Think of it this way: all of this is more information than the brain is configured to handle. The conscious mind can pay attention to three, maybe four, things at once. “If you get much beyond that, you begin to exercise poorer judgment, you lose track of things and you lose your focus,” Levitin says.[1]
Information overload also paralyzes us into a state of inaction, and if we don’t use the information we’re learning immediately, we lose up to 75% of it from memory, making much of the information we take in nearly useless.[6]
There are plenty of reasons why we should all find ways to cope with information overload. How do we go about it?
It helps to have a system. Elizabeth Harrin suggests the following approach[8]:
Identify the sources. Categorize where your data is coming from. This will probably involve email, IM, social media feeds, websites, blogs and video such as YouTube, magazines, books, and television or streaming. It may involve work-related data such as reports, ERP software, and such.
Filter the information. This requires deciding which data is significant, how significant, and how soon you need to process it. Set up a system to deal with each source. For example, create email rules to push email to separate folders to review. If your email reader doesn’t help you, take a look at apps like SaneBox [9]. File magazines for reading in downtime such as during your lunch break.
Make time to review the data. This is basically scheduling your data use. For example, you might review emails first thing in the morning, right after lunch, and just before you leave the office. You might schedule your offline reading during lunch or your commute. One advantage of scheduling this way is that it groups similar tasks- such as reviewing emails- together. It also separates ‘work time’ from time processing information.
Act on it or delete it. This is sometimes called the F.A.S.T. rule: File, Act, Suspend, or Trash. Each document you process should be ‘once and done’. Deal with it now, if you can; if it’s going to take more than a few minutes, add it as a task or suspend it to another time. Either file or trash the document. Delete everything you can. An email that remains in your in-basket or a magazine article you’ve already read is an invitation to distraction.
Turn it off. Take a second look at the information you’re receiving. Do you need it? Is it adding value? If not, turn it off. Unsubscribe, ask to be removed from copy lists, or set email rules to move to trash. Think of it this way: every piece of information you process is taking precious seconds of your life you’ll never get back. Spend only as much time on information as it’s worth.
Try to improve the quality of the information you follow. Recognize fake news and quit paying attention to it. Don’t strike at clickbait. The warning sign is a headline that entices you to click on a hyperlink or read an article. Do you really want to know the details of ‘Air Force reprimands three airmen over puppet video’? A lot of clickbait is advertising; use features like Facebook’s ‘Hide ad’ feature to turn it off. Avoid appeals to emotion, teasers (’A schoolgirl gave her lunch to a homeless man. What he did next will leave you in tears!’), celebrity gossip, anything that’s a slide show. Smart people work hard to get you to click on their links, but you’re smart too- too smart to fall for it.
Limit your time on Facebook and Twitter. Turn off non-critical alerts- think of them as Pavlovian bells.
When you actively manage your information and say no to overload, you’re saying it’s your time. Don’t waste it.

[1] The Organized Mind, Daniel J. Levitin, Plume, 2014
Getting Things Done: The Art of Stress-Free Productivity, David Allen, Penguin, 2001
[2] One Small Step Can Change Your Life: The Kaizen Way, Robert Maurer, Ph.D., Workman Publishing, 2004
[3] Limited individual attention and online virality of low-quality information, Qiu, Oliveira, Shirazi, Flammini, & Menczer, Nature Human Behavior Letters, Vol. 1, June 2017
[4] http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/
[5] https://www.theguardian.com/media-network/media-network-blog/2013/apr/10/alternative-news-cycle-mainstream-platforms
[6] https://www.psychotactics.com/art-retain-learning/
[7] https://www.csmonitor.com/Science/2017/0627/How-information-overload-helps-spread-fake-news
[8] https://www.imindq.com/blog/5-steps-for-dealing-with-information-overload
[9] https://www.sanebox.com/

Filed Under: Uncategorized

Recognizing Propaganda and Bias (Fake News 3)

August 19, 2019 by tcox@svsoft.com

“What is truth?” Pilate asked.

In today’s world, we’re frequently forced to ask the same question. Much of the information we receive, particularly in areas such as politics and social issues, is fake news- yellow journalism or propaganda designed to deliberately mislead or deceive and affect our decision making.

Much news- some would say all news- is biased, that is, prejudiced in favor of or against something or someone. Unbiased information would try to present both sides of an issue. Propaganda is a way of wording or structuring something so it appeals mostly to emotions, and distorts facts. It’s intended to win you over to a certain cause or belief. Here’s a question for you: is biased reporting propaganda? If your answer is no, can you explain the difference?

Depending on your news sources, you may feel you’re in an echo chamber, hearing the same messages with the same slant, repeated over and over. Or, given multiple points of view about an issue or story, all claiming to be the truth, you may start to feel like the blind men and the elephant.

The good news is that, with a little practice, we can usually recognize propaganda and bias when we see it. It frequently involves one or more of the following:

  • Lies masquerading as facts
  • Denial of facts
  • Opinions stated as facts
  • Presenting just one side of an issue
  • The use of unverified sources
  • The use of secondary rather than primary sources
  • The use of anonymous sources
  • False experts and witnesses
  • Slanted language
  • Exaggeration
  • Over-generalization
  • Appeals to emotion rather than reason

Here are some ways to recognize and deal with these.

Lies / Denial of Facts

If the source is a website, is it a legitimate site, or a rip-off? Does it redirect? Go to the official site from a new browser page to verify (don’t use links.)
If the source of the ‘fact’ is unknown or anonymous, flag it. This applies to mainstream ‘anonymous sources’ too. The alleged facts can’t be verified, and the motive for the leak is often questionable.

Look for disclaimers like ‘as reported to’ or ‘as received’. This is a dodge.
Is the source a personal blog? If so, it’s almost certainly an opinion.

Does the article contain grammatical or formatting errors, or other errors of fact? These indicate poor research and suggest the piece may contain factual errors.

Is the source indirect (such as a chain email)? The fact that even a friend or relative forwarded it to you doesn’t make it accurate.

Is your source a reporter or a commentator? Talking heads are expected to have opinions. Reporters are expected to report facts.

Stick to reputable and official news sources that follow good journalism practices. One way to identify such sources is to check the source on mediabiasfactcheck.com or allsides.com. Media Bias/Fact Check identifies both an indicator of bias and (usually) an indicator called “Factual Reporting”. The latter indicates how carefully facts are sourced (remember that even sites that are careful with facts can be biased in other ways.) If a story you’re interested in doesn’t come from a reliable source, try to cross-check with a more reliable source. Reputable sites often have an additional advantage: wider readership. More readers (or viewers) means more people to call out inaccuracies.

Additional resources for checking truth and falsehood are fact-checking websites. Media Bias/Fact Check has a list of these [2]. Be aware that even these sites, which do careful fact-checking, may contain logic errors of their own, such as bias in choosing what is or isn’t checked. You will frequently find that the fact you need checked isn’t. A second issue is timeliness. Stories, particularly stories involving investigative reporting, take time. A PolitiFact report on voter fraud in the 2016 federal election [3], written on December 17, 2016, found little evidence of voter fraud, but an article by the Heritage Foundation [4] written on July 28, 2018 reported much more fraud. The point here is not that the PolitiFact report is wrong, or that the Heritage Foundation report is correct, but that there is a time-line factor in reporting: early reports are frequently incomplete or inaccurate.

Opinions Stated as Facts

Is what you’re looking at a fact or someone’s opinion?
You may think this is obvious, but sometimes the lines blur.
Consider:
‘The house was painted on November 18, 1999’ is a statement of fact (which may or may not be correct), but
‘The house was painted recently on November 18, 1999, so it looks as good as new’ is an opinion, using a statement of fact as supporting evidence.
A web page from Auburn University [1] expands on this issue and how to deal with it at length.

Presenting one side of an issue

To examine this fake news component, let’s look at an example from BBC news dated May 26, 2012, an article titled Syria Crisis: Houla ‘massacre’ leaves 90 dead. [5] The article laid the blame for the massacre squarely on Syrian government forces.

But a few days later, May 28, 2012, Russ Baker’s WhoWhatWhy published an article, Syria: The Dangers of One-Sided Reporting, which faults the BBC coverage specifically because the reports were based almost entirely on the word from activists on one side in the conflict, not from journalists or neutral observers.

In other words, the reporting was one-sided. By June 8th, the BBC had released another story, Houla: How a massacre unfolded,[7] which painted a somewhat different picture, one in which responsibility for the atrocities was murkier.

It’s easy to excuse errors in reporting from a war zone, where the reports are not first hand, and the BBC article cited is not completely inaccurate. But it’s not completely accurate, either. It’s simply not complete, because it didn’t dig deep enough.

In academic writing and rhetoric, you’re expected to present two or more points of view and discuss the pros and cons of each one. You might then choose the one you favor and present your conclusions. By presenting multiple sides and arguing their merits you’re giving your audience a sense of what you used for evidence, how you weighed it, and the methods you used to reach your conclusions and try to influence theirs. Presenting just one side of an issue skips key pieces of that process.

The relative effects of one-sided reporting in news is sometimes hard to evaluate. A variation of the problem has, however, been well-studied: that of building effective survey questions.[8] Good surveys are critical in some areas such as health care and public policy, and there are of course a lot of sins in survey building, such as leading questions, loaded questions, and double-barreled questions. The study [8] asks if presenting both sides of an issue creates a better survey, and uses as an example a 1981 ABC News/Washington Post poll that asked respondents whether they favored or opposed ‘stronger legislation controlling the distribution of handguns’ as opposed to split-ballot questions like ‘Would you favor a law which required a police permit before he could purchase a handgun, or do you think such a law would interfere too much with the right of citizens to own guns?’ The study is complicated, but shows that the form of the question affects not only what the responses are but the rate of responses. It also suggests that the less educated are more willing to ‘acquiesce’ to one-sided agree/disagree forms of questions (given what we know about confirmation bias and how little critical thinking skills are taught, I’m leery of that part of the results; we may all be a little ‘less educated’ in this regard.)

One website that attempts to deal with one-sided reporting is AllSides. [9] The site reports hot-button issues and topics in a more neutral way by showing articles on an issue from all sides (left, right, neutral.) Their raison d’être is well described in a TED talk. [10]

Sources, Witnesses, and Experts

On September 12, 2017, Hillary Clinton’s book on how she lost the election, What Happened, went on sale. By noon the next day, the book had collected nearly 1,700 reviews on Amazon, close to evenly split between one-star (negative) and five-star (positive) ratings. By 3:05 that afternoon, Amazon had apparently deleted over 900 reviews.

What happened? Only 338 of the reviews logged that morning were from users who were verified purchasers of the book on Amazon, according to ReviewMedia. [11] Many of the reviews, especially the negative ones, were bogus.

Think of each book review as testimony: you read the book, and you review it; you’re witness to your own feelings about it. Except in this case, the reviews were by false witnesses.

In journalism, a ‘source’ is a person or document that provides timely and relevant information. Examples are official records, publications or broadcasts, officials’ statements, witnesses of an event, and other people affected by a news event or issue.

Developing and evaluating sources is a reporter’s job, and it’s not an easy one, but your job as reader or viewer- the consumer of the reporter’s news product- isn’t easy either. You need to evaluate the same sources, but indirectly, through the reporter’s filter- to evaluate the evaluation, so to speak. Your best advice is the same advice given to reporters: “If your mother says she loves you, check it out.”[12] Here are some tips on how to do so.

A primary source is direct or firsthand evidence of the event. This contrasts with secondary sources, which are descriptions, discussions, or interpretations based on primary sources. The vast mountains of information you face every day reduce to more manageable foothills when you begin eliminating secondary sources. With the exception of scholarly journals that publish research, and books that are collections of letters and speeches, diaries or the like, most periodical publications and books are secondary sources.

Both primary and secondary sources have their uses and problems. The problems can be lumped into categories: are the sources accessible, accurate, and understandable?
Every student knows the problem of finding an interesting reference in a bibliography and then being unable to access it. The culprits are usually time and money. Books must be borrowed or purchased. Online publications and article reprints are often hidden behind a paywall. Borrowing a book or document means a trip to the library, and worse, primary sources in archives often require interlibrary loans, which take time. Either way- buy or borrow- you’re making a significant investment.
But as a consumer of news, there’s another way to think about accessibility. Does the article or book or show you’re examining even list its sources? If so, how thoroughly? A bibliography is useful, and an annotated bibliography even better: a citation’s annotation gives you a sense of how the author evaluated the source and how relevant it was. Beware of hype in bibliographies. A spot check of one or two can tell you how relevant a citation is to the reporter’s story.

Although documents can be forged (or non-existent), accuracy is more of a problem with websites. Is the website reputable? The domain name may help tell; .edu and .gov websites are generally more credible. Is the website author reputable? Does he have publications, or is he referenced on other sites? Does he publish regularly? An intermittent blog by someone you’ve never heard of may not be the most reliable source. Is the website professional in appearance? Are there spelling and grammatical errors?
Witness accuracy is always worth skepticism. As a reader or viewer, you usually don’t have direct access to witnesses (and wouldn’t want to if you could.) But again, some heuristics can help make sense of it.

Is the witness attributable- either identified by name or title? There may be cases, such as in crime-related stories, where witnesses don’t want to be identified for good reason, but common sense should make this clear.
A ‘fact witness’ should only report things he or she has seen first-hand. Post-hoc and hearsay witnesses aren’t really witnesses at all.

Is the witness relevant? A report on a purported chemical weapon attack in Syria [13] cites unnamed doctors and aid workers as saying that up to half of those killed in the attack were children, and another source, the Syrian American Medical Society, as saying that many victims had symptoms of exposure to a chemical agent. As medical personnel these witnesses are credible as to what was done, but would be less so as witnesses to who perpetrated the attack.

Is the witness an expert? Expert witnesses are considered valuable in law cases because a trial or lawsuit’s subject is sufficiently complex: the severity of an injury, the degree of sanity, or the cause of a structural failure, for example. If an expert’s cited, a search for that person should quickly verify his or her credentials. But even in courtrooms, expert witnesses are often hired guns, and the other side may use its own expert witnesses to advocate a differing position. Which bias do you choose to believe?

How many witnesses are there? One? A few? Many? Consider a ball game: with thousands of spectators watching, there's no need for eyewitnesses to tell you which team won.

Is the witness biased? The BBC news piece mentioned in discussing one-sided arguments [5] begins this way: 'At least 90 people, including many children, have been killed in Syria's restive Homs province, opposition activists say, calling it a "massacre".' Opposition activists can hardly be expected to place blame anywhere but on government forces. If bias is evident in witnesses, it's the author's responsibility either to verify the witness or to identify the witness as unverified, for example, by presenting counter-claims.

A final word on sources. We live in a complex age, filled with complex issues, from nuclear proliferation to drug addiction. To interpret news on a complex issue, it helps to understand the issue in broad terms: to have background. One of the best uses of secondary sources is to find good survey books and articles and read them. Identifying 'good' ones is usually as simple as looking at reviews and ratings.

Misuse of Language

'War is Peace, Freedom is Slavery, Ignorance is Strength.' These words, inscribed on the Ministry of Truth building in George Orwell's 1984, introduce the reader to the idea of doublethink: as Orwell describes it, 'to be conscious of complete truthfulness while telling carefully constructed lies.'

Words are tools, and, sometimes, weapons. One way to weaponize language is to slant it. Slanting is using a statement or description in favor of a particular position. Slanting begins with the inclusion or omission of facts, but extends to the writer’s choice of words. A politician who supports a reform may be ‘enthusiastic about reform’ or ‘a fanatic about reform.’ A woman may be described in two ways: ‘a well-cut black dress draped subtly about her slender form’, or ‘a plain black dress hung on her thin frame.’

Emphasis can also be used to slant writing, down to something as simple as word order: 'he was plain but kind' is not identical to 'he was kind but plain.' When an author selects specific words to convey an implied meaning, he's taking a stand: pro, con, or neutral. If that stand is not neutral, and the writer doesn't prove or justify it, he's biasing his work.

Exaggeration is another form of slanting. It takes many forms, but is often used to dramatize or emotionally charge a story to make it more appealing to a reader. When a headline doesn't match the content of a story, it's often because the headline exaggerates. Consider the following headline from CNN: 'Marijuana legalization could help offset opioid epidemic, studies find' [14]. As HealthNewsReview pointed out a few days later, there was no way to know if anyone was actually choosing to use marijuana instead of opioids [15]. Exaggeration can also be more subtle, as in the use of loaded images and pictures [16]. Videos of protests and rallies often use camera angles to make the crowd look larger or smaller, depending on the point being made. Even the amount and type of news coverage of a story can constitute exaggeration [17].

Over-generalization is often the result of an attempt to make sense of an issue. It's another form of exaggeration; it can be found in headlines, but can also be more subtle. One way to over-generalize is to include a witness's over-generalization. A news piece on an increased rate of homicides in Baltimore included a statement from an anonymous Baltimore police officer that police were staging a work slowdown [18]. The officer's statement was an opinion about the entire police force; the reporter included it in his story, but didn't identify it as opinion or offer other evidence (such as interviews with other police officers) to support it.

Biased language often appeals to emotions rather than reason. The Knife Media offers an interesting analysis of words used in place of the neutral dialog tag 'said' in "50 ways to spin the phrase 'Trump said'" [19]. Selection and arrangement of material also play to emotion; another piece from The Knife offers an example in a story about an accident involving a self-driving car [20].

When you start looking for fake news and biased reporting, you see it everywhere. It’s easy to be cynical. But developing skill in critical reading will also make you more aware of good reporting when you see it. In today’s fast-moving and complex world, you need good information. It’s up to you to go find it.

[1] How Do You Separate Fact From Opinion?
http://www.auburn.edu/~murraba/fact.html
[2] The Ten Best Fact Checking Sites https://mediabiasfactcheck.com/2016/07/20/the-10-best-fact-checking-sites/
[3] Fact checking the integrity of the vote in 2016
http://www.politifact.com/truth-o-meter/article/2016/dec/17/fact-checking-claims-voter-fraud-2016/
[4] New Report Exposes Thousands of Illegal Votes in 2016 Election
https://www.heritage.org/election-integrity/commentary/new-report-exposes-thousands-illegal-votes-2016-election
[5] Syria Crisis: Houla 'massacre' leaves 90 dead
http://www.bbc.com/news/world-middle-east-18216176
[6] Syria: The Dangers of One-Sided Reporting
[7] Houla: How a massacre unfolded
http://www.bbc.com/news/world-middle-east-18233934
[8] Effects of Presenting One Versus Two Sides of an Issue in Survey Questions, George F. Bishop, Robert W. Oldendick and Alfred J. Tuchfarber, The Public Opinion Quarterly Vol. 46, No. 1 (Spring, 1982), pp. 69-85
[9] AllSides
https://www.allsides.com/unbiased-balanced-news
[10] Free Yourself From Your Filter Bubbles, TED Talk

[11] https://qz.com/1076357/hillary-clintons-what-happened-amazon-just-deleted-over-900-reviews-of-hillary-clintons-new-book/
[12] Blur: How to Know What’s True in the Age of Information Overload, Bill Kovach & Tom Rosensteil, Bloomsbury USA, 2010
[13] https://www.thetimes.co.uk/article/syria-attack-we-found-bodies-on-the-stairs-they-didn-t-see-the-gas-in-time-xk0d8vrnt
[14] https://www.cnn.com/2018/04/02/health/medical-cannabis-law-opioid-prescription-study/index.html
[15] https://www.healthnewsreview.org/review/cnn-leaves-out-key-limitation-to-study-on-legalizing-marijuana-and-opioid-use/
[16] http://www.straitstimes.com/multimedia/photos/in-pictures-injured-children-in-a-hospital-in-douma-syria
[17] https://www.cnn.com/2015/04/20/us/police-brutality-video-social-media-attitudes/index.html
[18] https://www.cnn.com/2015/05/26/us/baltimore-deadliest-month-violence-since-1999/index.html
[19] https://www.theknifemedia.com/world-news/50-ways-spin-phrase-trump-said/
[20] https://www.theknifemedia.com/world-news/tesla-and-self-driving-technology-how-the-media-may-deter-progress-with-faulty-reasoning/



Confirmation Bias (Fake News 2)

August 1, 2019 by tcox@svsoft.com

"What is truth?" Pilate asked. In today's world, we're frequently forced to ask the same question, albeit for different reasons. Much of the information we receive, particularly in areas such as politics and social issues, is fake news: yellow journalism or propaganda designed to deliberately mislead or deceive us and affect our decision-making. Depending on your news sources, you may feel you're in an echo chamber, hearing the same messages with the same slant repeated over and over. Or, given multiple views of an issue or story, all claiming to be the truth, you may start to feel like one of the blind men with the elephant.

Here's a thought: the problem isn't really fake news, but our ability to sort through the noise and make sense of it; in other words, our ability to think critically. Unfortunately, that ability is constantly tested, and we don't always come out the winner. As Walt Kelly put it, "We have met the enemy, and he is us."
One of the biggest problems with fake news is that we love it so. Our brains contain a reward system that gives us a dopamine hit, a feeling of pleasure, when we do something that deserves reward, such as surviving a bear attack or learning a new task. We also get that hit when we process information, such as reading an article, that supports our beliefs: information that tells us what we already know is correct. That's the basis of reinforcement learning, a good thing. But it also makes us less willing to accept facts or opinions that disagree with our own beliefs. This is called confirmation bias. It's why Democrats watch MSNBC and Republicans watch Fox, rather than vice versa.
Confirmation bias is a cousin of cognitive dissonance, the stress we feel when we’re forced to hold two contradictory beliefs or opinions at once. We humans like consistency; we each have a model of the world that works for us, and tend to avoid situations that are inconsistent with that world model. But avoiding cognitive dissonance means you’re avoiding a chance to change your mind. It’s the ‘my mind is made up; please don’t bother me with facts’ mindset.

To avoid confirmation bias, you first have to understand it thoroughly. Why do we believe as we do? This can be framed as a question. Do we first take in information, understand it, and then decide whether or not to believe it? Or do we simply believe whatever we see and hear, and then perhaps change our minds later when we come across evidence to the contrary? This was the essence of a famous disagreement between 17th-century philosophers René Descartes and Baruch Spinoza. Descartes believed that we take in information and then decide rationally whether or not to believe it. This seems proper; it would be wrong to believe something on the basis of insufficient evidence. Spinoza disagreed. He felt that understanding a new piece of information is believing it: that we tend to believe everything we hear, and have to work to 'unlearn' it if we find evidence to the contrary. This is less appealing; it implies that we're gullible, and have to work hard to root out the garbage people spew at us.

Who was right? In 1993, Harvard psychologist Daniel Gilbert and his associates ran a series of experiments to see which of these two theories was correct.[2] The conclusion they reached is discomfiting; Descartes was wrong, and Spinoza was right.

If we all have a built-in tendency to believe everything we see and hear, it follows that we should be careful about what we see and hear. Of course, a lot of information is truthful, and should be accepted prima facie; cynicism isn't always a good idea. If you're driving and you see a 'bridge out' sign, you'd be foolish not to pay attention. But the Internet and social media have enabled new ways to share information with very little regulation or editorial standards, and there is evidence that even the mainstream media are increasingly biased in some areas. Instead of cynicism, a dose of skepticism is often in order.

How do you combat your own confirmation bias? It's not easy. Nobody wants to admit being wrong, but that's exactly what you're asking of yourself: to consider that there is a truth, or at least an objective view, that isn't identical to your own opinion. We all want to be fair and objective, but it takes work. Here are some techniques to help you get there.

1. Don't take the opposite side of the issue. It doesn't work. A study by Charles Lord and colleagues [4] identified the reason: it's not that we believe only what we want to believe, but that when we filter new information, we judge it by whether it supports our existing beliefs, and tend to interpret it as confirming the conclusions we've already formed. We accept confirming evidence at face value, but subject contradictory evidence to critical evaluation. Lord called this 'biased assimilation.'

A better approach is simply to weigh the evidence as objectively and impartially as possible. Ask yourself, for each piece of evidence, whether you would evaluate it the same way if it pointed the other way. Play devil's advocate. For example, if faced with research suggesting that tariffs harm the economy, imagine the results had shown that tariffs aid the economy. Then examine the research with an eye to understanding why it produced the results it did. This approach helps you focus on the evidence and its strengths and weaknesses, rather than on whether it supports your beliefs.
As an aside, this research reinforces the notion that you'll do well to focus on more objective news sources and to avoid sources that you know are biased, even if you tend to agree with them.

2. Avoid polarity. If you look at new information as black or white, pro or con to beliefs you already hold, you’re just adding to the problem. Instead, consider the data as an opportunity for creativity. Brainstorm a bit; think of at least three explanations for the data. Three is a magic number because it steers you away from ‘this or that’ thinking.

3. Be aware of your hot buttons- any topic or issue that is highly charged emotionally. Hot buttons are magnets for social media and yellow journalism precisely because they’re highly charged: they’re click bait. But understand that the more emotional you are about a topic, the more likely you are to have biases. Awareness doesn’t mean avoiding these issues; avoidance isn’t going to help you overcome your biases. But don’t let others push your buttons.

4. Avoid looking for causes. Just take in the data and look at it as objectively as possible. The 'why' question looks for reasons and motives, and we are far too good at inventing them; most are pure conjecture. Even the people who make decisions are often poor at explaining why they made them, and people will go to great lengths to justify bad decisions. Be wary of anyone who claims to know what's going on in someone else's mind. Instead of asking a subjective why, stick to the more objective who, what, when, where, and how questions.

5. Moderation in all things. When it comes to information, the best way to practice moderation is to look for moderate sources. In politics, instead of depending on left-leaning or right-leaning news outlets, look for more centrist sources. Media Bias/Fact Check [5] provides curated ratings of news sources' biases; sources in its 'Least Biased' category show minimal bias and use very few loaded words (wording that attempts to influence an audience through appeals to emotion or stereotypes). SUJO [6], a new app available on the web as well as on both iOS and Android, displays objective, opinionated, and user-submitted articles' titles in different colors, and it's instructive to see the difference.

6. Be open to people with diverse opinions and to new experiences as a way to actively challenge your beliefs. If you can't do that from time to time, you are, by definition, narrow-minded. As John F. Kennedy reminded us, 'Change is the law of life. And those who look only to the past or present are certain to miss the future.'

References:

Critical Thinking: Tools for Taking Charge of Your Learning and Your Life, Richard Paul & Linda Elder, Prentice-Hall, 2001

[1] Why You can’t Help Believing Everything You Read
https://www.spring.org.uk/2009/09/why-you-cant-help-believing-everything-you-read.php
[2] D.T. Gilbert, R. W. Tafarodi, P.S. Malone, “You Can’t Not Believe Everything You Read”, Journal of Personality and Social Psychology Aug 1993
[3] Do We Choose What We Believe?
https://blog.oup.com/2015/05/spinoza-ethics-of-belief/
[4] Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence
http://psycnet.apa.org/record/1981-05421-001
[5] Media Bias/Fact Check
https://mediabiasfactcheck.com/
[6] SUJO: Organizing Reality
https://web.sujoapp.com/


Trust No One (Fake News 1)

June 16, 2019 by tcox@svsoft.com

There’s a PolitiFact editorial entitled “The media’s definition of fake news vs. Donald Trump’s” (http://www.politifact.com/truth-o-meter/article/2017/oct/18/deciding-whats-fake-medias-definition-fake-news-vs/) which begins:
‘When PolitiFact fact-checks fake news, we are calling out fabricated content that intentionally masquerades as news coverage of actual events.’ It goes on to say:
‘Instead of fabricated content, Trump uses the term to describe news coverage that is unsympathetic to his administration and his performance, even when the news reports are accurate.’
PolitiFact’s definition, ‘fabricated content’, is understandable: their mission is fact-checking. Trump’s claimed definition is similar to another common one: a fake news story is one that deliberately misinforms or deceives readers. The Digital Forensics Research Lab uses a definition similar to PolitiFact’s, “deliberately presenting false information as news” (https://medium.com/dfrlab/fake-news-defining-and-defeating-43830a2ab0af) which they distinguish from disinformation, “deliberately spreading false information”.
When you have differing definitions for something, it's worth paying attention to the differences. Suppose a story doesn't outright lie, but does have a point it wants to make: it's trying to influence your views, pushing the author's (or his organization's) opinion or agenda. In other words, it's biased or slanted. News is 'a report of new and noteworthy information about important events.' Suppose an outlet fails to report an important event? Or puffs something that isn't relevant into 'news'? What if someone picks and chooses which stories to tell, or how to tell them, in order to push an agenda? Is that legitimate news, or is it fake?
Words can and should be precise tools. Consider the Politifact article’s use of the word ‘unsympathetic’. What if you substitute ‘biased’, ‘opinionated’, ‘one-sided’, or ‘partisan’ for ‘unsympathetic’?
I’m not suggesting that facts don’t matter. But a perusal of many articles labeled as news is likely to show you more than facts. As Dennis Prager wrote in the National Review: ‘When it comes to straight news stories – an earthquake in Central America, say – the news media often do their job responsibly. But when a story has a left-wing interest, they abandon straight news reporting and take on the role of advocates.’ (https://www.nationalreview.com/2017/08/mainstream-media-left-wing-bias-dennis-prager-santa-monica-symphony-orchestra-new-york-times-los-angeles-times-npr)
The dictionary definition of bias is ‘prejudice in favor of or against one thing, person or group compared with another, usually in a way considered to be unfair.’ The Digital Forensics Research Lab separates disinformation (deliberately spreading false information) from misinformation (unintentionally spreading false information.) I think it’s fair to consider most biased reporting as deliberate; reporters use words for a living.
How widespread is bias in news reporting? Politico anonymously polled 63 members of the White House press corps. Of this group, 45% admitted that media coverage was biased against Trump; 2% said it was biased in his favor, and 53% said it wasn't biased at all. Politico comments: 'Let us repeat that: Nearly half of all White House correspondents admit the media's coverage is biased against Trump.' Many people might not believe the other 53%.
The article continues:
‘A post-election poll conducted by the Media Research Center found that 69% of voters don’t believe the news media are “honest and truthful,” while 78% felt that coverage of Trump was biased. Some 59% felt the media were biased for Clinton, while just 21% said they were biased for Trump.’ Michael Goodwin put it this way, in an article in the New York Post: ‘Every story was an opinion masquerading as news, and every opinion ran in the same direction.’
Needless to say, not all bias runs in one direction. There's liberal bias, but there's also conservative bias: Fox on one side, CNN on the other. There's also institutional bias, defined by Noam Chomsky as 'systematic biases of U.S. media as a consequence of the pressure to create a stable and profitable business.' (Herman, Edward S.; Chomsky, Noam. Manufacturing Consent. New York: Pantheon Books, 1988) As Chomsky describes it, American commercial media encourages controversy within a narrow range of opinion, in order to give the impression of open debate, but does not report on news that falls outside that range. Although Chomsky's book predates the 2016 election by almost 30 years, it could have been written specifically about recent political news. In a weird way, the liberal media benefits from Trump exactly as he benefits from the media: each appeals to its base by deploring the other side.
There was a time when the mainstream media were trusted sources, precisely because news organizations and journalists followed a strict code of ethics. Those codes still exist. For example, the Society of Professional Journalists ethics code contains, as one of its tenets, "Distinguish between advocacy and news reporting. Analysis and commentary should be labeled and not misrepresent fact." (https://www.pbs.org/newshour/extra/app/uploads/2014/03/mediaethics_handout5.pdf)
Is such objectivity really a goal anymore? It’s debatable. CNN reporter Christiane Amanpour stated that in some circumstances ‘neutrality can mean you are an accomplice to all sorts of evil.’ (https://www.quora.com/War-correspondent-Christiane-Amanpour-has-said-that-strict-journalistic-neutrality-can-mean-that-you-are-an-accomplice-to-all-sorts-of-evil-Is-she-right)
As I see it, the problem is that when media gets to decide what’s evil and biases their reporting to make that point, they’re reducing your ability to decide.
We’ve wandered far afield. Let’s circle back to the PolitiFact article. PolitiFact holds a Media Bias Fact Check designation of ‘Least Biased.’(https://mediabiasfactcheck.com/politifact/) It’s won the Pulitzer Prize for its reporting. But the cited editorial, by editor Angie Holan, has a few issues of its own. Let’s look more closely.
Fact-checking, PolitiFact's raison d'être, ignores other components of disinformation. It overlooks the slide from reporting facts to stating opinions masquerading as facts, a self-serving omission that excuses a lot of mainstream bias. Calling out just the news that fails PolitiFact's fact-checking also begs the question of which stories are, or are not, checked. Does PolitiFact cherry-pick? Matt Shapiro, in an article in The Federalist (https://thefederalist.com/2016/12/16/running-data-politifact-shows-bias-conservatives/), thinks so.
But also consider that Trump’s ‘definition’ of fake news is a claim Holan makes, not a statement by Trump. In other words, it’s an opinion, not a fact. That doesn’t make it wrong. I’m fairly sure Trump’s definition of fake news differs from Holan’s. But Holan has set up a straw man here. She’s arguing with a definition and examples she created. Straw man arguments are, by definition, logical fallacies, usually created for the purpose of persuasion.
Holan's precise wording in her asserted definition ('unsympathetic' as opposed to 'biased') is also slanted language: linguistic bias.
The PolitiFact article states: 'If you define fake news as fabricated content, then 2016 was the year fake news came into its own.' Apparently so; witness the Politico editorial.
This is not to say that fact checkers, including PolitiFact, aren't useful; they're valuable tools in the war against disinformation. The Internet created new ways to publish, share, and consume information and news with very little self-regulation or editorial standards. Much of this new alternative media, as the article rightly points out, uses or copies 'facts' that are in fact false. But that doesn't make the mainstream media blameless, and it doesn't make news whatever the mainstream media says it is.
Ultimately it's up to you, the reader, to distinguish true from false, and fact from opinion. My advice is to beware of opinion masquerading as news. And don't trust anybody, not even the fact checkers.

