Trust Projection on Facebook

This begins a series called Challenges in Media. We’ll also touch on ways to manage those challenges. Today’s topic is Trust Projection. We’ll parse what that means next.

As humans, for some reason, when we see printed text paired with a famous photo and an attribution, we automatically treat the text as credible, not because we trust the person who originally created the photo, but because we trust the person or venue who sent it to us. Take for example this photo of Pope Francis with an atheist quote:

False Quote attributed to Pope Francis

It is not necessary to believe in God to be a good person. In a way, the traditional notion of God is outdated. One can be spiritual but not religious. It is not necessary to go to church and give money – for many, nature can be a church. Some of the best people in history did not believe in God, while some of the worst deeds were done in his name.

Somehow, the faith we have in our friends is projected onto the random item they shared. Let’s call this phenomenon “Trust Projection”.

The above image prompted one of my friends from high school (a devout Catholic) to share the photo with the comment, "Hey, I think I like this new pope!" I pointed out that the quote is fairly atheist in intention and that Francis would never say it. The Pope Francis atheist meme is an example of someone trying to trip us up logically. At first the intentions seem nice. When you find out he would never say this, you begin to question your own belief system. It's a clever tactic, and yet this photo went viral a few months after the new pope was chosen. It was shared widely among Catholics and other Christian denominations on Facebook as a positive thing, even though the statements were entirely counter to the religious beliefs they affirm every Sunday.
My intention is not to undercut any particular faith, but to show that as humans we need to have a more critical eye when we view anything that comes across our Facebook feed. Just because your friend shared something doesn’t mean your friend took any time to actually verify it.

Image crafters spend hours or even days crafting semi-credible artwork meant to evoke just enough of one emotion or another to cause whoever sees it to repost, share, like or comment. The image makes you feel joy, makes you laugh or makes you angry. The image goes viral. Make no mistake: the entire goal of this type of image craft is to make the image go viral. It doesn't matter whether the content is true, valuable or meaningful in any way. All that matters is that the content is transferred virally through the magic of human Projected Trust.

To be sure, there are other types of viral memes: cat photos, cute kid memes and daily affirmation videos. These come across as more legitimate because they present themselves as entertainment, as seen in the image directly below.
Often, I'm happy to share something meant as entertainment because the reason for sharing is transparent. But sometimes the reasons for sharing are pretty gray. In the Facebook post below, the comment photo is so far beyond the pale of stupidity that I have to categorize it as a fake viral meme. Nobody with the ability to text or comment on Facebook could be that stupid. Yet here we are, sharing and commenting on a Facebook page that literally advertises its intent: to make us cringe. It has to be fake.
I classify the next type of post as chain-reposts. Their only goal is to propagate a particular meme for as long as possible. A perfect example is the share-a-hug style post below. The friends who share these are usually honest, if unfortunately clueless, in their intentions, but the crafters who create the meme intend purely for it to be shared indefinitely.
 
Chain reposts are passive-aggressive. First, you're told to do something: share the post for a reward (in this case a "Hug"). We're being trained to respond like Pavlov's dogs. Second, there is a weird reverse projection of trust. You are meant to infer that the friend who shared this meme will somehow know whether you hugged her or not. You might feel compelled to share and like your friend's post because your friend will like you more if you do. To me, this comes across as a twisted reason for sharing or liking any Facebook post. I generally don't share anything of this ilk. I don't like being controlled by some image or word crafter via a Facebook meme page dedicated solely to meme flow. It's highly unlikely my friend actually knows the person who created the chain repost. I am not some random image crafter's puppet, so I'm less likely to respond in any way to this type of shared post.
Make no mistake: I think no less of my friends for sharing these types of posts. They are as much victims of foul image crafters as the rest of us. I simply refuse to endorse the original creator's intent: continual meme reposting for its own sake.
The worst are political posts that have absolutely no basis in reality. Case in point: the example post below, from someone who is obviously image-crafting material designed to make people angry. There is no basis in fact for the claims made in the cartoon, but since it's in writing, thousands of people have shared it. It uses the currently popular "whataboutism" method of deflection to stoke anger. It ends up illogically implying, first, that it should have been okay for Nixon to wiretap, and second, that Obama deserves condemnation for wiretapping he never did. The image crafter didn't start with "Are you beating your wife?". He started with "When did you stop beating your wife?", which causes the reader to automatically assume that (let's say) Nixon beat his wife. Ostensibly, he now has to defend himself against that. Except it never happened.
Claims made against President Obama are illogically compared to those made against President Nixon

These are probably the most heinous items to share on Facebook. They contain not a shred of truth and are intended only to spread angry emotion. But don't take my word for it. Below are just a few of the angry replies found in the comments section of this particular Nixon cartoon.

They all assume Obama _did_ something because that's what is written in the image. One commenter denigrates Obama by calling him "Obumer". Another goes even further and blames everything on "Liberals". The point is not whether the opinions are well-founded. The point is that these reactions are rooted in anger, and that these unfortunate souls are allowing some image crafter to manipulate their emotions to such a degree that they feel it necessary to make such ugly statements. They are, in essence, puppets controlled by one image crafter whose Facebook page takes the view that anyone who dislikes a particular politician cannot be trusted. And the original page crafter may not even live in our country or espouse the values suggested by the page. In other words, the page itself might be fake.

I have no idea whether the above cartoon was posted by a Russian. I have no idea whether the Citizens for Trump page was created by an American. It looks like it came from an American, but it's hard to tell these days. Sure, you could try to determine which country a page was created in based on its IP address, but today anyone can use a VPN to appear to be posting from nearly any country in the world, including the US. Adding insult to injury, quite a number of these US-oriented political Facebook hate pages are apparently run by people who live in Russia. Content is measured and crafted by Russians who understand the emotions behind US politics better than many US citizens do. If you doubt this, here's an article from the New York Times with example ads confirmed to have been posted by Russian image crafters from Russian-owned pages. I remember some of them in my own Facebook feed. These images hit the emotional jackpot with wide and varied segments of US society, targeting not merely one extreme. It's chilling to see how our citizens have been manipulated. When we talk about fake Facebook ads, we don't mean an ad for Oops! All Berries cereal or Doritos Blaze. These look and sound like any other satirical cartoon or video put out by American political campaigners, except they're not American ads.

Below is a classic example that looks like it could have been crafted by someone in the US, except it wasn’t. It was crafted by a Russian. Regardless of my sentiments on the below image post, the most critical phenomenon to recognize is that we’re being manipulated by people outside our real friends and family on Facebook through the projected trust of our actual friends and family who share these posts with us.

Photo Courtesy The New York Times

Still not convinced? Here's another list of Russian ads from a reputable news outlet, CNN, in its "Money" section. Need more? Here's a breakdown of how the ads work, compiled from nearly 3,000 examples released by our lawmakers for public review. Facebook allows advertisers to target specific audiences, and it doesn't care whether those advertisers are American or Russian.

At this point, I give up. Rather than attempt to determine whether something is true or false, I'm skeptical of any political image or word crafting on Facebook. When I see an angry post about a particular politician, I either ignore it or research whether the claims are true. If I know the person who shared it well enough and I know the post is in error, I might point out that it's false. Otherwise, I ignore the post. I recognize that these ads are likely created solely to divide Americans along various social strata, from politics to race to gender issues, and I dismiss them all as un-American in tone and design, regardless of origin.

When I do share, like or comment, I deliberately acknowledge to myself that this is my projection of trust to friends and family on whatever I'm sharing. I own it if I'm sharing garbage. I stand behind what I share as worth their time. I will not waste my friends' time with ridiculous, useless stuff. If what I post is untrue and someone calls me on it, I'll certainly acknowledge the error and apologize, and I'll probably take the image down. If something seems like sketchy news but might be worth sharing, I'll research its validity first. If I find it's not true, I don't share it.

Finally, I don't generally unfriend people on Facebook just for sharing things I disagree with, so long as they post on their own timeline. It's important to me to understand the world they're living in, because as often as not, the things they're seeing on Facebook are entirely different from what I see. Any glimpse I can get into what's shaping their opinions helps me better understand my friends. I don't block pages or unfollow friends, because I want to be sure I'm not cocooning myself in my own little influence bubble.

For this reason, I get a lot of shared posts from opposite extremes of US politics, race and gender issues, and frankly, when the object is to make the viewer angry, they're almost always poorly informed, regardless of party affiliation or social strata. Even in instances where they are well informed, it's usually obvious that the source page is propagandist in nature: not news so much as opinion meant to change my perspective. I will be the one who decides whether my perspective changes, and I'm willing to change my opinions based on facts and reality, not lies and myths.

And Mr. Zuckerberg, you say you want to make Facebook better in 2018? Give me a way to see, in real time at the top of each ad, who paid for it, where it is actually being posted from and the specific human behind the payment. When one of my friends shares something that's obviously false propaganda, give me a way to mark it as such until Facebook can determine whether it actually is false. Facebook could compare the number of views to the number of marks for a specific ad and research it. If someone abuses the marking system, remove their marking privileges. If the ad is hateful or includes lies, take it down. If the advertisers aren't American and are influencing an American political campaign, remove the page that created the ad. Don't make me wait until you've determined the source after the damage has been done. Even better, shut down your post-boost system. The boosted-post model discourages real grassroots activism and encourages anyone with money, domestic or foreign, to game the political system. You're no better than the corporate-bought bureaucrats running Congress if you punish real and worthwhile political activism with this crummy "boost" system. Fix this.

Have you been influenced by Russians? Here’s a link to a Facebook page that will tell you whether or not any of the pages or friends you currently follow have been identified as Russian influencers.

If you have been influenced by Russian ads on Facebook, why not post a comment below with a link to the Facebook page or group in question!

Projected trust. Your friends and the things they share are separate. Don't let strangers on Facebook pull your strings through the trust of your friends. Question authority and research validity.
