
Section 4

Critical Creativity

In an Age of Misinformation

With contemporary advancements in technology, information is accessible almost everywhere. James Potter asks in “Why Increase Media Literacy” (2019), “We are already able to access a wide range of entertainment and information, so why would we need to learn a lot more about the media?” It's easy to say there's so much out there, so why should we be open to change? Or that we've learned enough, so why should we put ourselves in a position where we need to absorb new information? But with so much information out there and available, that abundance can be both a benefit and a liability.

​

The flip side is that information is easily manipulated or fabricated. Fact-checkers are incredibly valuable, but they are not as readily available or as widely used as the misinformation itself. This is a big problem because many people take the information they encounter as fact instead of verifying it. We fall into a vicious cycle of sharing misinformation without ever checking whether it is correct. Potter (2019, p. 10) writes, “People who do not periodically examine their automatic routines are defaulting to influences outside their control.” This reciprocal sharing of misinformation can negatively impact us as individuals and as a society. Knowing how easily misinformation spreads is essential to protecting ourselves and allows us to benefit our society as well.

​

For Millennials and Gen Xers, guarding against misinformation is essential. "Digital life" can be overwhelming, and protecting oneself from the effects of misinformation has become an increasingly valuable skill. Understanding the effects of misinformation is one way to combat it, but there are other ways to protect oneself as well. Read on for more about how misinformation spreads and what we can do about it!

Information comes to us from all directions now: social media, advertisements, targeted content, news, memes, and so on. We need to understand how information (and misinformation) is shared with us. 

The sharing of information can be exponential in some cases. More often than not, though, the information we take in is filtered through our biases. We choose the information we want to consume, who we follow or friend on social media, and what we take in and absorb. In 2024, Filippo Menczer, a co-author of an article about political bias on social media, said, “Our main finding is that the information Twitter users see in their news feed depends on the political leaning of their earliest connections.”

 

The flow of information can feel like a raging waterfall at times. Because of our biases, we want to share information with people who feel the same way we do. In some cases, we want to share information to better educate others. However, data and biases can be manipulated and used negatively in the age of misinformation. This is why verifying facts and sources is so important.

In 2023, Harvard Public Health wrote about how misinformation makes America sicker. In that article, the author writes, “On social media, misinformation circulates much faster. People with fringe ideas are much more connected now.” That interconnectedness cuts both ways: it can have positive and negative impacts. In almost every case, those impacts are shaped by the biases we carry when taking in information (or misinformation). Personal, political, and cultural biases all influence what information we absorb.

This influx of newly created media and information, combined with the reach of misinformation, can significantly distort the message. This is how misinformation spreads: much like synapses forming new connections, it takes just one piece of false content to set off a torrent of misinformation. Establishing reliable ways of fact-checking is vital.
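To make that "torrent" idea concrete, here is a minimal sketch of branching-style spread, assuming a made-up average number of re-shares per share. It is a toy illustration of exponential sharing, not a model of any real platform.

```python
# A toy branching model (not a model of any real platform): assume each share
# of a false post is, on average, re-shared by a few new people.

def shares_per_round(branching_factor: float, rounds: int) -> list[float]:
    """Expected number of NEW shares in each successive round of sharing."""
    return [branching_factor ** r for r in range(1, rounds + 1)]

if __name__ == "__main__":
    # Hypothetical numbers: each share reaches about 3 people who re-share it.
    new_shares = shares_per_round(branching_factor=3, rounds=8)
    print("New shares per round:", [int(n) for n in new_shares])
    print("Total shares after 8 rounds:", int(sum(new_shares)))  # 9,840 from a single post
```

Even with these invented numbers, the point stands: a single post re-shared a handful of times per round reaches thousands of people within a few rounds, which is why catching a falsehood early matters so much.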

 

We see media presented in many ways during nearly every waking hour. Each day, the world generates over 400 million terabytes of new data. It's also said that almost 90% of all the data ever created has been generated in the last two years alone. That's a lot of data and new media to absorb! A viewer grounded in critical media literacy can see and process that media, information, and content far more effectively.
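As a rough back-of-the-envelope check, taking the quoted figure of roughly 400 million terabytes per day at face value, the scale works out to several petabytes every second:

```python
# Rough back-of-the-envelope check, taking the quoted ~400 million TB/day at face value.

TB_PER_DAY = 400_000_000        # terabytes of new data per day (quoted estimate)
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day

tb_per_second = TB_PER_DAY / SECONDS_PER_DAY
print(f"~{tb_per_second:,.0f} TB of new data every second")          # ~4,630 TB/s
print(f"~{tb_per_second / 1_000:,.1f} PB of new data every second")  # ~4.6 PB/s
```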

​

Misinformation comes in all shapes, sizes, and platforms. In the following comic, you'll see some real-world examples of misinformation. 

[Comic: real-world examples of misinformation]

The information we obtain (the “inputs”) directly affects how we process it (the “outputs”). As I’ve previously written, our biases can directly affect how we process information. Whether those are unknown, inherent, or perceived biases, they still exist in people’s thinking processes. While the items in the image below aren’t all-encompassing, they're a sample of what could affect and influence people.

[Diagram: inward influences on thoughts and outward expressions of thoughts]

The green arrows (with blue descriptions) represent the inward influences on thoughts. The purple arrows (with red descriptions) are the outward expressions of thoughts.

How does misinformation tie into a media framework? The Center for Media Literacy’s (CML) framework consists of the following:

​

CML’s Five Core Concepts

  1. All media messages are ‘constructed.’

  2. Media messages are constructed using a creative language with its own rules.

  3. Different people experience the same media message differently.

  4. Media have embedded values and points of view.

  5. Most media messages are organized to gain profit and/or power.

​

Regarding media literacy, these influences also affect how media is perceived and applied. Media literacy helps us think and act critically toward the media presented to us. It also allows us to process media and information like "software": the information is processed as a connection (or intersection) between what was learned and how we will apply it. In the brain, neurons send signals that help us create what are essentially thoughts, and the connection points between neurons, known as synapses, allow those neurons to communicate with each other. Dr. Mendell Rimer says, “The synapse is essential for life” (Rimer, 2018). Our brains are constantly processing information, working even when at rest. In other words, our brains are always on.

​

Here we'll focus on Core Concept 5.

​

The co-founder of Android, Andy Rubin, was quoted as saying, “We don’t monetize the things we create…we monetize users” (Amnesty 2019, p. 8). Social media companies have a framework that’s not too dissimilar from a media literacy framework; the difference is that companies like Google, Meta, and Apple are run on a business model. Amnesty International writes: “The services provided by Google and Facebook derive revenue from the accumulation and analysis of data about people. Instead of charging a fee for their products or services, these businesses require anyone who wishes to use them to give up their personal data instead” (Amnesty 2019, p. 9). It’s safe to assume that social media companies are in the money-making business. By analyzing user data, interactions with others, clicks, and search history, companies are able to tailor algorithms to each individual user.
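To illustrate only the general idea, here is a deliberately simplified sketch of how a feed could be personalized from engagement signals. The field names, weights, and numbers are hypothetical; this is not the actual ranking system of any real platform.

```python
# A deliberately simplified sketch of feed personalization from engagement signals.
# Field names, weights, and numbers are hypothetical, not any platform's real system.

from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    predicted_engagement: float  # modeled chance the user will like/share it

def rank_feed(posts: list[Post], clicks_by_topic: dict[str, int]) -> list[Post]:
    """Rank posts higher when they match topics the user already clicks on."""
    def score(post: Post) -> float:
        affinity = clicks_by_topic.get(post.topic, 0)  # past clicks on this topic
        return affinity * post.predicted_engagement
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    history = {"politics": 42, "sports": 3}  # hypothetical click history
    feed = rank_feed(
        [Post("sports", 0.9), Post("politics", 0.4), Post("cooking", 0.95)],
        history,
    )
    print([p.topic for p in feed])  # ['politics', 'sports', 'cooking'] -- affinity wins
```

Notice that in this toy ranking, what the user has already clicked on outweighs everything else, which is exactly how a feed can keep reflecting our existing biases back at us.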

​

These algorithms are driven by our biases regarding the things we want to read and take in on social media. As described above, the information we obtain (the “inputs”) directly affects how we process it (the “outputs”), and our biases, whether unknown, inherent, or perceived, shape both sides of that exchange.

​

Companies could use the information they’ve gathered from us to push misinformation. Habitual social media users can be nudged to share information more frequently. Gizem Ceylan, a postdoctoral researcher at the Yale School of Management, writes that “the reward systems of social media platforms are inadvertently encouraging users to spread misinformation” (Yale 2023). The reward system of “engagement” is what drives this. Ceylan also writes that “They (social media users) post simply because the platform rewards posting with engagement in the form of likes, comments, and re-shares.”
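One way to picture that reward loop is as a toy calculation, using made-up numbers: each like slightly reinforces the habit of posting, and accuracy never enters the loop. This is a sketch of the dynamic Ceylan describes, not her model or any platform's system.

```python
# A toy calculation of the reward loop: likes reinforce the habit of posting,
# and accuracy never appears anywhere in the loop. All numbers are made up.

def sharing_habit(days: int, likes_per_share: float, reinforcement: float = 0.005) -> float:
    """Shares per day after `days`, if each like slightly strengthens the posting habit."""
    shares_per_day = 1.0
    for _ in range(days):
        likes_today = shares_per_day * likes_per_share
        shares_per_day += reinforcement * likes_today  # reward drives the habit, not accuracy
    return shares_per_day

if __name__ == "__main__":
    print(round(sharing_habit(days=90, likes_per_share=5), 1))  # ~9.2 shares/day after 3 months
```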

 

A saying often attributed to Mark Twain goes, "A lie can travel halfway around the world while the truth is still putting on its shoes."

​​

[Infographic: National Center for State Courts guide on disinformation]
  • How can we combat misinformation? While there aren’t many ways to prevent misinformation outright (at least not right now), we can take precautionary steps to avoid it.

​

  • The image to the left is from the National Center for State Courts. In an article on Fake News, Pew Research (Pew, 2016) states that: “Americans express a fair amount of confidence in their own ability to detect fake news, with about four-in-ten (39%) feeling very confident that they can recognize news that is fabricated and another 45% feeling somewhat confident. Overall, about a third (32%) of Americans say they often see political news stories online that are made up.”

​

  • By following a “trust but verify” approach to news and media, we can be proactive in helping prevent the sharing of misinformation. The NCSC list offers a clear guide that can help lessen the impact of misinformation.

It's estimated that over 1.1 billion social media posts are created daily. Misinformation can be detrimental to our consumption and sharing of media. However, if the proper steps are taken to verify the information presented, we can avoid having it negatively affect our media intake.

​

We can protect ourselves from misinformation by not taking a piece of media at face value. It’s easy to see information presented to us and believe it’s true. However, we always have to remember the importance of validating information. By checking the validity of what is presented, we further protect ourselves against misinformation.

​

​
 

[Image: Appalachian State University]

Image citation: “Appalachian State University.” CollegeVine, 2025, www.collegevine.com/schools/appalachian-state-university. Accessed 6 Mar. 2025.
