
Skepticism isn’t cynicism

  • Writer: Gary Domasin
  • 2 days ago
  • 4 min read

I’ve been spending a lot of time lately asking myself how we decide what’s true. Not in the abstract, but in the everyday moments when you open a browser, type in a question, and get an answer in seconds. What exactly are you trusting in that moment, and why?

The answer isn’t simple. Even for people who pride themselves on being curious, educated, and careful thinkers, separating signal from noise has become harder than ever. I’ve noticed that when I look something up and something feels off, my instinct is to pause. That hesitation isn’t proof the information is false, but it is a sign the claim hasn’t yet earned my confidence. And that pause matters.

It matters even more now that AI tools are becoming the default sources of information. I use them, and so do millions of others. But here’s the reality: when I ask AI questions I already know the answer to, it gets things right most of the time, but not all the time. Sometimes the answer is incomplete. Sometimes it’s just wrong.

That shouldn’t surprise us. These systems don’t reason. They don’t weigh truth. They generate language by pulling patterns from massive amounts of human writing, some careful, some sloppy, some biased. Accuracy isn’t built in. Which is why I’ve made myself a rule: I don’t trust an answer just because it sounds confident. If the information matters, I verify it. Ideally, I want to see the original sources, the raw material, not just a polished summary. It’s like the old days of opening multiple tabs and piecing things together yourself. The process is faster now, but the responsibility is the same.

For me, this all comes down to a commitment to truth. That commitment shapes how I read, how I write, and how I speak. I fact-check. I cross-check. I ask whether my own biases are nudging me toward answers I already want to believe. That kind of self-awareness isn’t comfortable, but it’s necessary on the internet.

Over time, I’ve built up a set of “caution flags.” They’re not automatic deal-breakers, just reminders to slow down and look closer.

Take sources, for example. Information tied to universities or research centers usually carries more weight. Not perfection, but a higher bar. When the source is less formal, I don’t dismiss it outright; I just tread carefully.

The same goes for industries defending themselves. Having skin in the game doesn’t mean someone is lying, but it does mean I listen with sharper ears. History is full of examples, tobacco being the obvious one, where self-interest bent the truth.

Podcasts are another minefield. Some hosts are brilliant interviewers who challenge claims and dig deeper. Others are more about entertainment than examination. If a bold idea gets tossed around and nobody in the room has the expertise to test it, that’s a yellow flag.

Opinions are everywhere online, and they deserve scrutiny. If a source is telling me how to feel instead of giving me the facts, I slow down. Perspectives are fine, but I want to know whether I’m being handed raw material to think with, or a conclusion I’m expected to adopt.

The problem is that opinions often come bundled with selective evidence. You might agree with the conclusion, but you don’t see what was left out. I’ve always believed opinions are healthiest when they’re built on facts you’ve examined yourself.

Context matters too. Clips and fragments are easy to consume, but they can distort meaning. If something important is being shared in a snippet, the responsible move is to find the full source. It takes more effort, but accuracy usually lives there.

Science brings its own challenges. On the cutting edge, uncertainty is normal. One study can spark interest, but until it’s replicated, it’s provisional. That’s not a flaw, it’s the process. Trouble starts when early findings are treated as settled fact, or when data gets cherry-picked to fit a narrative. The history of vaccine research offers a clear example.

Conspiratorial thinking is another red flag. Real conspiracies have existed, sure, but the structure of conspiracy claims is predictable: a conclusion first, then selective evidence to prop it up, with anything contradictory dismissed as fake or suppressed. At that point, truth becomes unreachable.

I’m equally cautious with claims that “mainstream” knowledge is inherently corrupt. New ideas don’t topple old ones by declaring war; they earn their place through evidence, replication, and peer review. That’s how science and progress have always worked.

Broad generalizations about entire groups, whether ethnic, social, or professional, are another warning sign. Reducing complex communities into villains is lazy and corrosive. It’s a caricature, not understanding.

And finally, extraordinary claims demand extraordinary proof. If someone says they’ve invented a device that will upend an entire industry, or that they alone hold world-changing knowledge, the burden of evidence is enormous. Eyewitness accounts and shaky footage don’t cut it.

At the end of the day, it all circles back to the same principle: truth takes work. There are no shortcuts. Skepticism isn’t cynicism, it’s discipline. And in a world drowning in information, that discipline might be the most valuable skill an individual has.

© 2025 by Ask Uncle Gary. All rights reserved. 