
How We Sometimes Mistake Artificial Intelligence for the Real Thing: Feelings Over Facts

Updated: May 24, 2023


I've been fascinated recently by AI-generated images of former President Trump’s arrest and appearance before a New York court. Many of these images are lifelike and convincing; others appear raw and fake.


How do we know which images are real and which aren't? Media literacy tools can help us look critically at the AI digital realm and understand how it captures our curiosity. When we learn strategies to recognize AI at work, we become more discerning viewers.


AI-generated content often relies on appeals to emotion, which can be highly effective. Stirring a target audience's emotions draws us in and creates more opportunities to persuade us to act. To figure out whether an image is authentic or was designed to provoke a particular reaction, we need to step back from the feelings it evokes and ask questions about how it was put together. Much as the internet empowered the power brokers who could use it effectively, AI shifts persuasive capacity toward those who are savvy about its potential.


We can make sense of AI-generated images if we slow down and study them. Let's analyze an image together, shall we?


Image Analysis: AI or a Real Photo?

What’s the first thing you notice as you glance at this image?


Eight police officers surround former President Donald Trump, who is at the center of the composition. The photo follows the “rule of thirds,” which divides an image into thirds both horizontally and vertically. Our eyes fall on Trump because he is at the core of the action and arrangement.


What prior knowledge do you draw upon to begin to analyze this image?



Trump surrendered and was placed under arrest on April 4, 2023, before being arraigned in a historic, unprecedented court appearance. At the arraignment, the former president heard the charges brought by Manhattan District Attorney Alvin Bragg. By all accounts, including those of conservative outlets like Fox News and the New York Post, the proceeding was routine.


What features of this image suggest it might have been altered?


Lighting is an important clue to authenticity. In this image of Trump being tackled to the ground, the lighting on his hair and face is much brighter than on the officers around him. On closer scrutiny, Trump’s face looks less like a real person’s and more like a cartoon animation than the officers’ faces do. The people in the photo wear exaggerated facial expressions. The edges of Trump’s head appear slightly distorted rather than solid; AI generation often leaves artifacts such as pixelation or inconsistent coloring. A couple of the hands don’t look human -- they’re oversized, awkwardly placed, and unusually light in color. Trump’s head also sits awkwardly on his shoulders -- another clue that the image was generated or altered by AI.
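For readers who want to experiment, the kind of visual inspection described above can be roughly complemented in software. The sketch below is only an illustration, not the method used in this post: it assumes Python 3 with the Pillow imaging library installed, uses a placeholder file name ("suspect_image.jpg"), and applies error level analysis, a common forensic heuristic in which an image is re-saved as a JPEG and the difference is amplified so regions that compress inconsistently stand out.

# A rough sketch, assuming Python 3 with the Pillow library installed.
# Error level analysis (ELA): re-save the image as a JPEG at a known quality,
# then amplify the pixel-wise difference. Regions that were edited or generated
# separately often compress differently and appear brighter in the result.
# "suspect_image.jpg" is a placeholder file name, not a file from this post.

from PIL import Image, ImageChops


def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
    """Return an amplified difference between the image and a re-saved JPEG copy."""
    original = Image.open(path).convert("RGB")

    # Re-save at a fixed JPEG quality so compression artifacts are uniform.
    resaved_path = "resaved_tmp.jpg"
    original.save(resaved_path, "JPEG", quality=quality)
    resaved = Image.open(resaved_path).convert("RGB")

    # Pixel-wise absolute difference between the original and the re-saved copy.
    diff = ImageChops.difference(original, resaved)

    # Scale so the largest discrepancy maps to 255, making it easier to see.
    extrema = diff.getextrema()  # one (min, max) pair per color channel
    max_diff = max(channel_max for _, channel_max in extrema) or 1
    scale = 255.0 / max_diff
    return diff.point(lambda value: int(value * scale))


if __name__ == "__main__":
    ela_image = error_level_analysis("suspect_image.jpg")
    ela_image.save("suspect_image_ela.png")

A heuristic like this only highlights inconsistencies; it cannot prove an image is fake, so the careful, question-driven viewing described above remains the main tool.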


Days before former President Trump turned himself in to face criminal charges in New York City, images like this one circulated on social media. They were created by Eliot Higgins, a British journalist and founder of Bellingcat, an open-source investigative organization, and he clearly labeled them as AI creations intended to draw attention to the technology's power.


The image below is an actual photo from Trump’s arrest. He is escorted and protected rather than wrestled to the ground. Shadows, lighting, body positioning, facial expressions, and pixel detail are all consistent with the actual Manhattan courthouse scene.



As a recent article in Wired notes, AI images are becoming harder to “differentiate from the real deal.” Synthetic content is making the media landscape more complex and often unsettling.


AI advancements are already fueling mis- and disinformation and being used to stoke political divisions. The technology may hasten an erosion of trust in media, government, and society. If any image can be manufactured -- and manipulated -- how can we believe anything we see?


Taking the time to determine where feelings might hide facts in images is important intellectual work. Disinformation in images can decrease public trust in a community, organization, or even the whole country, which can ultimately endanger democracy. You can spot and stop the spread of disinformation by being vigilant on social media, doing your research, and recognizing that your own biases and opinions may be a barrier to getting the full story. Don’t feel compelled to share until you’ve determined a message’s veracity. Think about the repercussions of mis- and disinformation. Reflect on the impact you can have by thinking critically about the story and finding your voice to share the facts.


Final Thoughts: Facts over Feelings


What do you accomplish when you distinguish between feelings and facts in social media imagery and posts? You come to terms with how limited a role reasoning and evidence often play in our everyday routines. Stories and emotions are powerful influences on how we think and make decisions. When we see polished AI images from influential people whose views are similar to our own, we feel as if what they say is true because we respect their competence and credibility. In the end, the actual facts can be unsettling, disappointing, or disheartening when stories that resonate with our lived experience turn out to have been falsified to persuade us.


By developing self-reflective habits as we consume social media, we step outside the mist of manipulation. We realize that our emotions can, and often do, shape our beliefs more than any logic does. That is why, to recognize facts, we need to stay calm. Doing so is a huge step toward managing our emotions -- and we can teach ourselves to stop and think at any point in our lives.


