Posted on June 16, 2021 (July 9, 2021) by Mikko Wolf

Dr. Fauci is a tool for the deep state. Chemtrails are controlling our weather and our emotions. Kiwis kill cancer cells. Conservatives believe that poor people should die. Liberals want a future with no straight people. Bill Gates is actually Annie Leibovitz. Bat child escapes!

Our least favorite uncle loves to share them. That woman who lived on our floor freshman year of college retweets them. Our step-grandparent-in-law posts them with their own commentary. And we engage with them and click on the link to read whatever nonsense is behind the headline or title tag and meta description. Or worse yet, we reply, begging for more comments and more engagement.

Why are we even arguing about whether or not antifa and Jeff Bezos worked together to start a forest fire to get people to get vaccinated before the next election so a pizza place in rural Nebraska can continue selling babies to the Chinese government? Or whether or not Iowa’s favorite hot dog is actually a sausage?

Whatever the reason, our behavior — our clicks — tells the algorithms that decide what content is put in front of us and millions of other users that “AOC is Actually 3 Children Stacked on Top of Each Other: Where Did Our Country Go?” is a great piece to put into everybody’s feed. “It’s engaging!” screams Mark Zuckerberg, riding a jet ski at his enormous Hawaiian estate. At least metaphorically, anyway.

This tale has repeated itself every minute of every day since The Facebook opened up the world of social media to our collectively detached once-loved ones. Soon thereafter, they all found Twitter. Suddenly the advice of our high school teachers, concerned parents, and college professors (“Not everything you read online is true!” and “Be careful what you put online! It stays there forever!”) was completely forgotten. And now here we are, living in an age of misinformation and hate-clicks.
Reading outlandish headlines isn’t new for those of us old enough to remember eyeing headlines about Hillary Clinton adopting alien babies on the covers of The Sun and The Weekly World News while standing in line at the grocery store. Thankfully, we knew what to expect when we saw those. They were very clearly fake headlines.

The hate-baiting and ill-informed clickbait headlines that drive us to engage and inadvertently spread misinformation today aren’t always so obvious. Today’s clickable content can often be found on seemingly reputable sites that play to and exploit their respective audience’s existing biases. Or on websites that feed us buzzwords from their content farms. These sites put an appropriately clickbait-y og:description in the &lt;head&gt; (IYKYK) and expect eager users to engage with content they don’t agree with. Add an algorithm hungry to drive more engagement, and you’re sure to have a fake-news barn burner spread across huge sections of our country.

It’s not always false information that spreads, either. Hateful and/or controversial hot takes flood our social media streams. “Most Overrated Films of 2020,” “Why You’re Making Coffee Wrong,” and “Why Jojos are Just Potato Wedges” are all examples of this. These traps work. We see an opinion that we don’t agree with, and we will do our best to correct what we think are ill-informed opinions.

It’s difficult for sources and sites to avoid producing polarizing content. A study by researchers from the Computer Science Department at the Federal University of Minas Gerais in Brazil and the Qatar Computing Research Institute analyzed nearly 70,000 headlines from major media outlets and the relationship between the sentiment they expressed and their popularity.
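For anyone curious what that og:description actually is: it’s an Open Graph meta tag in a page’s &lt;head&gt;, and the text it carries is what social platforms typically show in link previews — which is exactly why content farms stuff it with bait. A minimal sketch of pulling it out with Python’s standard library (the sample page and its headline are invented for illustration):

```python
from html.parser import HTMLParser

class OGDescriptionParser(HTMLParser):
    """Captures the content of <meta property="og:description" ...>."""

    def __init__(self):
        super().__init__()
        self.og_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("property") == "og:description":
            self.og_description = attrs.get("content")

# Invented sample page, for illustration only
sample_html = """<html><head>
<meta property="og:description"
      content="You Won't BELIEVE What This Headline Says Next">
</head><body></body></html>"""

parser = OGDescriptionParser()
parser.feed(sample_html)
print(parser.og_description)
```

This is the same tag a social network’s crawler reads before rendering a link card, which is why the preview text can differ wildly from the article itself.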
“We discovered,” the study’s authors wrote, “that the sentiment of the headline is strongly related to the popularity of the news.” In other words, if a headline expressed an extreme sentiment — either positive or negative — the article was more popular. Essentially, the more polarizing the sentiment in the headline, the more engagement it received. The study’s authors continue: “Our results suggest that a headline has more chance to be successful if the sentiment expressed in its text is extreme, towards the positive or the negative side. Results suggest that neutral headlines are usually less attractive.”

How can we not produce this kind of content? What have we got to lose? A lot, actually.

Content is a critical component of search engine optimization. Good content demonstrates expertise, authoritativeness, and trustworthiness. It answers user queries surrounding keywords related to a business and will dramatically improve positions in search engine results. Expertise, authoritativeness, and trustworthiness — or E-A-T — is an ever-increasing factor in search engine rankings.

Articles that rely on clickbait and outrage are extremely successful in their quest for engagement and increased traffic, but they decimate any trust a website has. You can only fool users so many times with hate-bait before they find another source and start driving traffic to competitors.

We produce and publish a lot of content here at Webfor, and we understand the importance of creating trustworthy content that doesn’t rely on clickbait. Our data-driven insights on keywords applicable to our clients’ businesses help us achieve fantastic returns on investment by driving traffic to client sites and boosting their rankings in search engine results in order to increase revenue. It is crucial that content fulfills the user’s request for information and establishes trust so that they take action and drive sales.
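The study’s core finding — that engagement tracks how far a headline’s sentiment sits from neutral, in either direction — can be illustrated with a toy scorer. This is not the researchers’ method; the word lists and headlines below are invented, and real sentiment analysis uses far richer models. The point is only that “extremity” is the absolute value of a signed polarity score:

```python
# Toy word lists, invented for illustration
POSITIVE = {"best", "amazing", "fantastic", "great", "love"}
NEGATIVE = {"worst", "overrated", "wrong", "hate", "fail"}

def polarity(headline: str) -> int:
    """Signed sentiment: positive-word count minus negative-word count."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def extremity(headline: str) -> int:
    """Distance from neutral, in either direction — the quantity the
    study found correlates with popularity."""
    return abs(polarity(headline))

headlines = [
    "City council meets Tuesday",          # neutral
    "Most Overrated Films of 2020",        # negative
    "The Best Coffee You Will Ever Love",  # positive
]
for h in headlines:
    print(h, "->", extremity(h))
```

Under this toy scoring, the neutral headline scores 0 while both the negative and the positive ones score above it — which is the study’s claim in miniature: neutral headlines are the least attractive.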
A clickbait or hate-baiting headline may drive engagement and traffic, but it will deteriorate the expertise, authoritativeness, and trustworthiness of our clients. As industry leaders, that’s not something we’re going to do.

We can all do our part in tempering the spread of misinformation and hate-baiting articles, but we’ll need to be careful. These articles play on emotions, and we don’t always see them coming. Let’s do our best to disengage and refuse to comment on the latest article our second cousin retweeted. We’ll do our part by not producing that content in the first place, so it doesn’t have the opportunity to spread misinformation and outrage far and wide.