Day 255 – Emotional Sovereignty in the Age of Snippets

Last night we had a party at our house—a fun social gathering my wife hosted in honor of one of her work colleagues. I took on the role of pizza chef, and my amateur status was on full display with the first few wood-fired disasters. But after a couple of casualties, I managed to produce a decent pizza—or at least one that wasn’t burned to a crisp on one side and raw on the other.

Speaking of being raw on one side, I had promised myself not to take the bait on any political references during the party. I wanted to be on my best behavior, carefully avoiding any political triggers that might throw off the pizza zen I had worked so hard to achieve.

Thankfully, there weren’t many. But every now and then, you could hear a subtle narrative framing or the occasional ad populum fallacy float through the conversation.

Let me pause to explain something. My first college class—which I failed—was economics. The professor was a Jesuit priest who had spent his life studying in Latin. He knew economics, yes, but what he really knew was logical fallacies. He wouldn’t let a student get away with one. When someone used flawed reasoning, he would stop the class, diagram the error on the board, and make the student rephrase their argument without the fallacy.

One fallacy he frequently paused on was ad populum—the idea that something must be true because many people believe it. That was the early ’90s, long before social media. Today, we live in bubbled ecosystems where it’s all too easy to believe that “everyone” agrees with us.

At the party, I heard it a few times—people casually referencing political issues as if they were self-evident truths shared by the whole world:

“Well, since we now live under a fascist regime…”
“Our government is full of corrupt crooks…”
“We all know how much power the NRA has…”
“Big Pharma controls everything…”

I was proud of myself—I didn’t take the bait, tempting as it was. The affective polarization was easy to spot if you listened closely. This is the process by which people increasingly base political or moral beliefs not on policy or logic, but on how they feel about “us” vs. “them.” Emotions become a kind of evidence, and people mistake their passion or disgust for moral certainty.

My wife and I often notice a similar phenomenon when we watch TV together. During a show, there’s usually some plot point that involves technology. The writers often have only a superficial understanding of the tech, so they toss in a laughably inaccurate explanation to move the story along. My favorite is the classic:

“Enhance the image.”
And suddenly, a blurry photo transforms into crystal clarity, revealing the license plate number down to the bolt.

I scoff and mock it, and my wife rolls her eyes and tells me to stop ruining the show.

Then there’s the reverse: a legal drama shows someone getting arrested, thrown into a holding cell, and having every constitutional right violated for the sake of a confession. That’s when she scoffs and yells at the TV. I turn and say, “Hey, stop ruining the show!”

This dynamic has a name, courtesy of none other than Michael Crichton. In a 2002 speech at the International Leadership Forum, he coined the term “Gell-Mann Amnesia effect,” named after physicist Murray Gell-Mann. Here’s how Crichton described it:

“You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues.

Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the ‘wet streets cause rain’ stories. The paper’s full of them.

In any case, you read with exasperation or amusement the multiple errors in the story and then turn the page to national or international affairs and read as if the rest of the newspaper was somehow more accurate.

When, in fact, it almost certainly isn’t. You turn the page, and forget what you know. That is the Gell-Mann Amnesia effect… The only possible explanation for our behavior is amnesia.”

Because my wife is a lawyer, she consumes media about the legal system with intense scrutiny—but she’ll read a technology article and just believe it. If the article claims AI is now developing failsafes to protect itself from human interference, she’ll turn to me and say, “Wow, this AI stuff is scary!”

The reverse is true for me. I question every tech article and go straight to primary sources—but when I read a story about a legal case, I swallow it whole.

This phenomenon—whether you call it tribal truth, confirmation bias, or the echo chamber effect—is being amplified by social media algorithms. Just a few days ago, I read an Associated Press article about an immigration bust at a restaurant I’d eaten at. The account painted government agents in a terrifying, Gestapo-like light. I was horrified.

But then I clicked into the comments and saw entirely different perspectives, full of overlooked details. As I read further, it became clear: both sides were appealing to emotion, using selective facts to feed the worldview of their audience.

If that had been a tech article, I would’ve second-guessed everything and gone digging for the truth. But when it’s news I expect to be true, I just… believe.

So today I reminded myself of something important:
Don’t let external voices manipulate how you feel about a subject—especially when they’re only giving you snippets designed to provoke a reaction. It’s too easy to be swept up in outrage, certainty, or righteous judgment based on someone else’s framing. But when I dig deeper, the truth is rarely salacious—it’s almost always nuanced, complicated, and inconvenient.

This is the real takeaway:
Emotional independence requires intellectual patience.
To be grounded in truth rather than manipulated by noise, I have to pause, investigate, and sit with the discomfort of complexity.

I’m tired of information manipulation: facts selectively twisted to shape how I see the world. Propagandists, politicians, media outlets, even religious leaders have been doing it for centuries. The intentions may vary, but the result is the same: distorted public perception built on partial truths.

And too often, I realize I’m a willing participant. Scrolling through social media, I’m being trained—conditioned, really—to react emotionally to the content the algorithm thinks I’ll like.

So today, I’m declaring emotional sovereignty.
I’m choosing not to let others hijack my emotions without earning that influence through full truth. I’m going to practice this inner discipline, because it’s essential in today’s world.

I won’t allow a stream of external voices to tug at my feelings all day long.

Now, of course, I want you to like, subscribe, turn on notifications, set this as your favorite, repost, react, upvote, and give me lots of ❤️s—so the Great Algorithm in the Sky will surface my content to the top of your feed.

Hypocrisy noted.

Steven Larky
4 hours ago

I’ve always been troubled by seeing errors in articles but not knowing what the errors were in other articles. It’s not exactly what you wrote: I don’t assume the rest are correct. I assume they have errors too, but I just don’t know what they are (or how important they are to the main point of the article).
Note that this problem isn’t limited to mainstream newspapers; I used to see errors in technical publications as well (EE Times).
