Consumer Research · Online Reviews · April 2026

Reviews shape purchases.
No one trusts reviews.

21 shoppers reveal how the collapse of review credibility pushed them to build parallel verification systems of their own.

16 of 21
bypass text — go straight to customer photos

Executive summary — three sentences

Consumers have entered an authenticity arms race with review platforms, deploying counter-heuristics faster than platforms can introduce AI moderation. The triggers are clear: text-only reviews are presumed fake, 5-star praise is ignored, and any "incentivized" tag is treated as a disqualifier. Visual proof and negative-first reading patterns are now the baseline for trust — platforms that don't surface these signals force users to leave.

16 of 21 rely on photos, not text
13 of 21 distrust generic language
8 of 21 cross-reference on Reddit/TikTok
5 of 21 start with 1-star reviews

Cookiy AI conducted 21 one-on-one video interviews to understand how real shoppers navigate a review ecosystem they suspect is compromised. Across product categories and platform types, a consistent pattern emerged: users have developed strict personal heuristics to extract signal from a manipulated environment. The result is a parallel trust infrastructure — one built by consumers, for consumers, outside the platforms' control.

Archetypes — 01 · 4 patterns
Spectrum

From radical disbelief to cautious conditional trust

Where each participant sits on the trust spectrum when approaching online reviews

[Spectrum chart: each of the 21 participants plotted along an axis from ← Radical skeptic to Cautious accepter →]
~16 of 21 · THE VISUAL VALIDATOR

The Visual Validator

Skips text entirely; trusts only user-uploaded photos.

Navigates directly to the photo gallery and zooms into user-submitted images for material quality, fit, and real-world context. Staged or AI-generated product shots are dismissed immediately as insufficient proof.

“Because an AI or a robot can't really push a photo of something. I just feel more comfortable and confident when there are pictures.”

— P4 · apparel shopper

~13 of 21 · THE INVESTIGATIVE SKEPTIC

The Investigative Skeptic

Reads worst reviews first; hunts for dealbreakers.

Sorts reviews by 1-star and 2-star ratings before reading any positive feedback. Views negative reviews as more authentic and informationally dense than curated top reviews.

“I always go straight for the lowest star, like one star reviews, and see what their complaints are — to get the biggest picture.”

— P130 · electronics shopper

~8 of 21 · THE CROSS-PLATFORM DETECTIVE

The Cross-Platform Detective

Leaves the platform to find unfiltered truth.

When a product's price or brand is unfamiliar, exits the retail platform entirely to search Reddit, TikTok, or Google for organic reactions. Treats off-platform consensus as categorically more credible.

“I would actually go to Amazon or another platform that had the same product and looked at the reviews there, and usually those were more genuine.”

— P2 · home goods shopper

~6 of 21 · THE ANCHORED PRAGMATIST

The Anchored Pragmatist

Uses verified badges and volume as trust proxies.

Despite widespread skepticism, anchors on 'Verified Purchase' badges and minimum review thresholds (10–30+) as initial filters before investing time reading. Trusts structural platform signals more than most.

“If they are consistent, like, people are saying that the size runs true over and over or putting their personal heights.”

— P123 · apparel shopper

Findings — 02 · 5 patterns
01

User-uploaded photos are the last line of defense against fake reviews.

How many participants rely on each trust signal?
Customer photos (n=21)
16 of 21
Distrust generic text (n=21)
13 of 21
Cross-platform check (n=21)
8 of 21
Read 1-star first (n=21)
5 of 21
What we saw

16 of 21 participants described navigating directly to the photo gallery and treating user-uploaded images as their primary trust signal. They specifically favored photos with imperfect, natural backgrounds — a messy desk, an unmade bed — over staged shots, which they flagged as potentially AI-generated or brand-supplied.

Counter-signal

Approximately 3 of 21 participants still found text-heavy reviews with 'Verified Purchase' badges credible even without photos, particularly for commodity items where visual proof mattered less than star volume.

Why it matters

For categories like apparel, home goods, and electronics, text-only reviews have effectively lost their persuasive power. Platforms that don't prominently surface user photos are ceding trust to any competitor that does.

Design implication

Create a distinct, highly visible 'Customer Gallery' section at the top of the review area — above text reviews. Deprioritize brand-supplied imagery on product pages where user photos exist.

Something that doesn't have a really nice backdrop... maybe on somebody's desk... that's probably the biggest indicator to see if there's actual real life things happening.
— P129 · general shopper
02

Consumers read worst reviews first — and treat them as the most honest.

How do users perceive different review tiers?
1–2 star: most authentic (n=21)
5 of 21
Generic 5-star: distrusted (n=21)
13 of 21
Photos valued over text (n=21)
16 of 21
Objective specs trusted (n=21)
6 of 21
What we saw

5 of 21 participants explicitly filter to 1-star and 2-star reviews before reading anything else, and 13 of 21 described dismissing generic positive text outright. Users treat negative reviews as more informationally dense — a source of objective flaws and recurring defects rather than subjective praise.

Counter-signal

Approximately 4 of 21 participants described reading top-rated reviews first, specifically for well-known brands they already trusted — using negative reviews only as a final confirmation step, not a starting point.

Why it matters

Burying or moderating negative reviews doesn't just fail — it actively backfires. Users who can't find honest negatives assume they are being hidden, which increases suspicion rather than reducing it.

Design implication

Surface a 'Most Common Criticisms' summary box at the top of the review section, synthesized from 1- and 2-star content. This transparent negative signal paradoxically functions as a trust builder.

Sometimes those are the most authentic reviews. And give me more information than the positive ones.
— P5 · general shopper
03

"Incentivized" tags are a credibility death sentence.

What signals do users interpret as fake or biased?
Generic hyperbolic praise (n=21)
13 of 21
No user photo attached (n=21)
16 of 21
Incentivized tag present (n=21)
13 of 21
No specific details given (n=21)
6 of 21
What we saw

13 of 21 participants described strong negative reactions to reviews labeled 'incentivized' or suspected of being free-product submissions. They associated promotional tags with bias by definition — regardless of the review's actual content quality. Robotic phrasing, repetitive buzzwords, and five-star unanimity triggered the same skepticism automatically.

Counter-signal

Approximately 3 of 21 participants felt incentivized reviews could still carry useful data if they included photos and specific critiques — a pragmatic minority willing to discount the bias penalty when objective content was present.

Why it matters

The 'incentivized' disclosure mechanism, designed to increase transparency, has backfired — it now signals corruption rather than honesty. The disclosure itself has become the red flag.

Design implication

If incentivized reviews must be displayed, require user photos and a mandatory pros/cons structure. Plain incentivized text reviews should be visually demoted and shown in a clearly separated section.

You'll see more slang language of the reviews, which AI do not use... I can differentiate what is written by AI and what has been written by a human.
— P132 · frequent online shopper
04

High-stakes purchases trigger a full exit from the platform.

How many use external platforms to verify reviews?
Cross-platform triangulation (n=21)
8 of 21
Rely on photos as proxy (n=21)
16 of 21
Distrust platform curation (n=21)
13 of 21
Trust verified badges (n=21)
6 of 21
What we saw

8 of 21 participants described leaving the retail platform when the stakes were high — expensive items, unfamiliar brands, or suspiciously uniform reviews. They sought out Reddit threads, TikTok unboxings, and Google results as sources of organic, unfiltered opinion before returning to purchase.

Counter-signal

13 of 21 participants did not mention cross-platform behavior, particularly for low-cost, repeat purchases from brands they already trusted — suggesting platform exit is triggered by stakes, not habitual skepticism.

Why it matters

A platform that doesn't earn trust at the review layer loses the purchase moment to a different platform or influencer. The consumer leaves, validates elsewhere, and sometimes buys elsewhere too.

Design implication

Embed curated third-party signals — such as 'Top Reddit discussions about this product' — directly on the product page. Acknowledge the cross-checking behavior rather than fighting it.

I would actually go to Amazon or another platform that had the same product and looked at the reviews there, and usually those were more genuine.
— P2 · electronics shopper
05

Vague praise is ignored. Sensory specifics are trusted.

What content signals do users actively seek out?
User photos (real setting) (n=21)
16 of 21
Sensory or use-case detail (n=21)
6 of 21
Negative mentions present (n=21)
13 of 21
Verified badge + volume (n=21)
6 of 21
What we saw

6 of 21 participants explicitly cited preference for verifiable, objective data — specific dimensions, fit relative to body size, durability after extended use, or sensory descriptions like texture, scent, or weight. Specificity was interpreted as a proxy for genuine lived experience, making it credible where vague praise is not.

Counter-signal

6 of 21 participants anchored heavily on 'Verified Purchase' badges and review volume as their primary trust filter — deprioritizing content quality in favor of quantity signals, suggesting structural trust proxies haven't fully collapsed for all users.

Why it matters

Review forms that only ask for a star rating and open-ended text are generating low-trust noise. The absence of structured, objective input prompts directly reduces the trust value of the entire review corpus.

Design implication

Redesign the review submission form to prompt for objective context: product dimensions, use case, body measurements for apparel, duration of ownership. Surface these structured fields prominently in the review card display.

Specifics and use feel, you know, smells, touch... no real sort of feeling like the person actually used the product.
— P128 · general shopper
Journey — 7 steps

The skeptical shopper's review trust journey

How consumers move from product discovery to purchase decision in a low-trust review environment

01 · Arrive at product page
Initial interest — often from social media, search, or a friend referral.
02 · Scan star rating and volume
Quick filter: needs 10–30+ reviews and a 'Verified Purchase' badge to warrant further attention.
03 · Jump straight to photo gallery
Bypass all written text; zoom into user-submitted photos for real-world context.
04 · Filter to 1–2 star reviews
Hunt for recurring flaws, defects, and potential deal-breaking complaints.
05 · Scan for authenticity signals
Flag robotic language, incentivized tags, or reviews with no photos as likely fake.
06 · Cross-platform verify if high stakes
Exit to Reddit, TikTok, or Google for unfiltered organic opinions when trust is low.
07 · Purchase or abandon
Trust earned through signals → buy. Signal failure or suspicious pattern → leave the platform.
Personas — 2 types

The Visual Validator

Physical proof or nothing — photos over all words

Skips written reviews entirely and goes straight to user-uploaded photos, zooming in to check material quality, fit, and real-world context. Staged or AI-generated models are an immediate red flag that ends the session.

The Investigative Skeptic

Start with the worst; work your way up

Sorts reviews by lowest rating first, treating 1-star complaints as the most honest data source available. Cross-references claims on Reddit, TikTok, or Google for expensive or unfamiliar purchases before committing.


03 · In their words

An AI or a robot can't really push a photo of something. Physical proof matters.
P4 · apparel shopper
I always go straight for the lowest star reviews to see what their complaints are.
P130 · electronics shopper
I can differentiate what is written by AI and what has been written by a human.
P132 · frequent shopper
Something that doesn't have a nice backdrop — maybe on somebody's desk — that's the biggest indicator.
P129 · general shopper
Sometimes those are the most authentic reviews. More information than the positive ones.
P5 · general shopper
I went to another platform with the same product — and usually those were more genuine.
P2 · electronics shopper
For the product team — 04 · 5 moves

What to build differently.

Five design moves that would change the relationship between the user and the score.

01

Elevate user photos above all else

Create a dedicated customer photo gallery at the top of every review section. Deprioritize brand-supplied imagery wherever user photos exist. Flag photos showing natural, imperfect settings as high-trust signals — these are what 16 of 21 users are specifically hunting for.

02

Surface the catch proactively

Display a synthesized 'Most Common Criticisms' box derived from 1- and 2-star reviews. Users are already hunting for this; providing it transparently builds trust faster than hiding negatives. This is the single highest-leverage design change this research supports.
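One way such a box could be populated is a simple frequency count over low-star review text. This is a minimal sketch only; the complaint vocabulary and the review fields below are hypothetical stand-ins, not data or terminology from this study.

```python
from collections import Counter

# Hypothetical complaint vocabulary; a real system would mine these
# terms from the review corpus rather than hard-coding them.
COMPLAINT_TERMS = ["broke", "small", "cheap", "smell", "late", "defect"]

def most_common_criticisms(reviews, top_n=3):
    """Count complaint-term occurrences across 1- and 2-star review texts."""
    counts = Counter()
    for review in reviews:
        if review["stars"] <= 2:
            text = review["text"].lower()
            for term in COMPLAINT_TERMS:
                if term in text:
                    counts[term] += 1
    return counts.most_common(top_n)

reviews = [
    {"stars": 1, "text": "Broke after two days, feels cheap."},
    {"stars": 2, "text": "Runs small and the zipper broke."},
    {"stars": 5, "text": "Love it!"},
]
print(most_common_criticisms(reviews))  # [('broke', 2), ('cheap', 1), ('small', 1)]
```

The 5-star review is deliberately excluded from the count, which is the point of the move: the summary is sourced only from negative reviews, so users see the criticisms surfaced rather than buried.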

03

Restructure review input forms

Replace open-ended text fields with structured prompts: use case, ownership duration, body measurements for apparel, sensory observations. Objective inputs generate objective outputs — and objective outputs are what users trust when text-only reviews have lost all credibility.
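As an illustration, the structured prompts described above could map onto a schema like the following. Every field name here is an assumption made for the sketch, not an existing platform's API.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StructuredReview:
    # All fields are illustrative; none come from a real platform's schema.
    stars: int                               # 1-5 rating
    use_case: str                            # e.g. "daily commute"
    ownership_days: int                      # how long the reviewer has owned it
    pros: List[str] = field(default_factory=list)
    cons: List[str] = field(default_factory=list)
    body_measurements: Optional[str] = None  # apparel only
    free_text: str = ""                      # optional; no longer the primary input

    def is_structured(self) -> bool:
        """True when the review carries the objective context users trust:
        a use case, at least one pro or con, and a known ownership duration."""
        return bool(self.use_case and (self.pros or self.cons)
                    and self.ownership_days > 0)
```

A review card could then render the structured fields prominently and demote `free_text`, inverting today's layout where open-ended prose dominates.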

04

Quarantine incentivized content

Move incentivized reviews into a clearly separated section with distinct visual treatment. If incentivized reviews are permitted, require user photos and a mandatory pros/cons structure. Plain incentivized text reviews should never sit alongside organic content.

05

Surface long-term verified use

Badge reviews from users who owned the product for 30+ days. Explicitly downrank or mark reviews posted on the first day of ownership. Users already distrust early reviews — a structural signal that confirms this instinct becomes a meaningful trust marker.
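A minimal ranking sketch of this move: the 30-day threshold matches the badge described above, but the weights themselves are illustrative assumptions, not values from the research.

```python
def trust_weight(ownership_days: int) -> float:
    """Weight a review by ownership duration (thresholds are assumptions)."""
    if ownership_days >= 30:
        return 1.5   # long-term verified use: badge and boost
    if ownership_days <= 1:
        return 0.5   # first-day review: downrank
    return 1.0

def rank_reviews(reviews):
    """Order reviews by helpfulness votes scaled by the ownership weight."""
    return sorted(reviews,
                  key=lambda r: r["votes"] * trust_weight(r["ownership_days"]),
                  reverse=True)

reviews = [
    {"id": "day-one", "votes": 10, "ownership_days": 1},
    {"id": "long-term", "votes": 8, "ownership_days": 45},
]
print([r["id"] for r in rank_reviews(reviews)])  # ['long-term', 'day-one']
```

Note the effect: the first-day review outpolls the long-term one on raw votes (10 vs. 8) but still ranks below it once the ownership weight is applied, which is exactly the instinct the research says users already have.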

Methodology

Sample
21 participants
Devices
Method
Semi-structured 1-on-1 video interviews
Dates
April 2026
Recruitment
Participants were recruited across a range of online shopping behaviors and product categories; all 21 interviews were included in the analysis, with none excluded.

Trust collapsed online.

Users rebuilt it themselves.

The platforms didn't lose shoppers — they lost the authority to tell shoppers what to believe, and shoppers built something better in the gap.

A Cookiy Story