
The Expert Look Part II: What You Owe the Room

TJ Ashcraft · Mar 6 · 5 min read

In Part I, we traced the psychology of performed expertise: why people genuinely come to believe they know more than they do, how platforms accelerate that belief, and how the same laundered insight can circulate through hundreds of confident voices without ever deepening into actual understanding.


But self-deception has a limit as an explanation.


At some point — and the scale of what fills our feeds suggests that point is widely crossed — the question shifts from cognitive to ethical. Not just: why do they believe it? But: what do they owe the people they're talking to?

 

The Implicit Contract

When you present yourself as an authority on a subject — through credentials, through confident framing, through the aesthetic signals of expertise — you are making an implicit promise to your audience. Not a legal one. Not a formal one. But a real one.


You are promising that the information you're offering is grounded in something more than familiarity. That you've done the work — the reading, the research, the testing, the revision, the engagement with contrary evidence — that justifies the confidence you're performing. That the audience can make decisions based on what you're telling them without having to independently verify everything from scratch.

 

That promise is the foundation of trust. And it's the promise that cosplayed expertise systematically breaks.

 

The breaking is often invisible. The audience doesn't know what they're not getting. They don't see the missing caveats, the absent counterevidence, the shortcuts taken between source and post. They see the confidence, the polish, the credential, the social proof — and they reasonably conclude that the promise has been kept. The ethical violation isn't just in the misinformation. It's in the asymmetry: the presenter knows — or should know — the limits of their understanding. The audience doesn't.

 

Epistemic Responsibility

Philosophers of knowledge use the term "epistemic responsibility" to describe the obligations we carry as producers and transmitters of belief. The core idea is straightforward: because what we say affects what others believe, and because what others believe affects how they act, the act of assertion carries moral weight. We are not ethically neutral when we speak with confidence about things we don't fully understand.


Epistemic responsibility doesn't demand omniscience. It doesn't require that you only speak about subjects you've mastered completely — no one meets that bar, and pretending otherwise is its own form of dishonesty. What it requires is proportionality: that the confidence you perform matches the understanding you actually have, that you acknowledge the limits of your knowledge, and that you point your audience toward better sources when those sources exist.


The influencer who reposts a pharmaceutical claim without acknowledging they have no medical training. The consultant who presents a framework as original research when it's a reworded blog post. The commentator who delivers geopolitical analysis based on thirty minutes of reading. None of these are neutral acts. Each one uses the architecture of trust to transfer a belief the transmitter isn't in a position to warrant.

 

What Sincere Expertise Looks Like

It is worth naming what the ethical alternative actually looks like — not as an impossible ideal, but as a recognizable posture.


Sincere expertise shows its work. It names its sources, acknowledges their limitations, and distinguishes between what is well-established and what is contested or preliminary. It uses language that is proportionate to certainty: "the evidence suggests" rather than "it's proven that," "in my experience" rather than "the research shows" when experience is actually what's being drawn on.


Sincere expertise acknowledges the edges of its knowledge. It says "I don't know" when it doesn't, and "this is outside my area" when it is. It treats those acknowledgments not as weaknesses to hide but as markers of credibility — because only someone who has genuinely engaged with a domain knows precisely where their understanding ends.


And sincere expertise invites challenge. It doesn't perform certainty as a way of foreclosing questions. It presents conclusions as provisional — open to revision if better evidence arrives — because that's what conclusions actually are when they've been reached through honest inquiry.


This posture is harder to perform at platform scale. It doesn't fit cleanly in a carousel. It doesn't generate the engagement that confident assertion does. But it is the posture that keeps the implicit contract intact.

 

The Public's Side of the Equation

The ethical obligation doesn't run only one direction.


An audience that demands confidence over accuracy — that rewards the polished assertion and scrolls past the qualified argument — is an audience that is training its information ecosystem to produce more cosplayed expertise. The demand shapes the supply. And if the public standard for credibility is "sounds like they know," that standard will be met at scale by people who have learned that sounding is enough.


This is not a way of blaming the audience for being deceived. It's a way of naming the leverage the audience actually has. Visual literacy — the capacity to read credibility signals and ask what's actually being offered — is not just a defensive tool. It's a corrective one. Every time a person asks for the source, checks the provenance, or notices the absence of caveats, they apply a small pressure on the information environment. Collectively, that pressure changes what gets rewarded.


The public can't audit every claim. But it can develop habits of calibration: a default skepticism toward uniform confidence, a preference for sources that acknowledge limits, a willingness to sit with uncertainty rather than reach for the most confident-sounding answer available.

 

Trust Is Earned in the Texture

Here's the practical test: real expertise leaves a texture that cosplayed expertise doesn't.


The texture shows up in specificity — the example that could only come from lived experience in a domain, the counterargument acknowledged rather than ignored, the precise distinction between two things that a non-expert would treat as identical. It shows up in how uncertainty is handled — not papered over with confident delivery, but named, located, and distinguished from what is actually known. And it shows up in what the expert does with hard questions — whether they engage or deflect, whether they update their position or defend it regardless of evidence.


You can learn to read that texture. Not perfectly, and not instantly. But with practice — with the kind of attention that visual literacy develops — you can begin to feel the difference between the person who has walked the terrain and the person who has only seen the map.

 

Terrain Lens: Reading the Ethics of Expertise

When someone presents as an expert — to you, to a room, to a public audience — try this:

1.  What is this asking me to feel before I think?

2.  Is the confidence proportionate — does the certainty match the complexity of the claim?

3.  Are the limits of their knowledge visible — or has everything been smoothed into certainty?

4.  Do they name their sources, methods, or the basis for their conclusions?

5.  How do they handle hard questions or counterevidence — engage or deflect?

6.  Is this inviting understanding — or offering permission to stop thinking?

7.  What implicit promise is being made to this audience — and is there evidence it's being kept?

 

Where in your professional life do you hold an audience's trust — and what would it look like to audit the implicit promise you're making to them?



