Incentivized Reviews
SHIPPED
UX Strategy · Loyalty · Product Page

Company

American Eagle Outfitters

My role

UX Strategist

Scope

15M+ Loyalty Members

Solving Low Customer Review Participation

Product pages had low review coverage, which limited both customer confidence and merchandising insight. I led a cross-functional initiative to introduce loyalty-based review incentives that increased participation while preserving credibility.

The program increased daily review volume by about 300 percent, reaching roughly 1,500 reviews per day.

Shipped Product - Mobile Web
From low-coverage PDPs to a living review ecosystem
Review submission to published result; points framed as appreciation, not payment
Review submission form
Loyalty opt-in integrated into the existing flow. No new platform required
Published with reward badge
"Reward Points Awarded" tag visible to all, transparent by design
01
No new platform
Earn layered into the existing review flow and tech stack. Loyalty accounts matched via submitted email; no forced sign-in
02
Transparent by design
"Reward Points Awarded" tag visible to all, transparent by design
03
Appreciation, not payment
Points awarded after moderation approval, independent of sentiment or rating, honest reviews only
THE MOMENT THAT CLOSED THE LOOP
The MoSCoW workshop model, which brought marketing, engineering, legal, and design into the same room before anyone wrote a spec, became a reference for how to run cross-functional alignment on ambiguous initiatives. That process had longer legs than the feature itself.
Overview

More reviews without making them meaningless.

Review volume was lower than expected across PDPs. Marketing and UX partnered to explore whether loyalty points could increase participation without turning the reviews section into a points farm.

The challenge wasn't technically complex. It was strategically complex. How do you incentivize behavior without distorting it? That question shaped every decision from scope to launch.

Business context
Reviews influence purchase decisions, return rates, and merchandising strategy. Low catalog coverage meant customers had less confidence buying and the business had less signal to work with. Both problems had the same fix.

I partnered with another strategist to run a cross-functional workshop using MoSCoW prioritization and additional working sessions, bringing marketing, engineering, legal, product, and UX into the same room before anyone wrote a spec.

My role and contributions

Led UX strategy and competitive research
Facilitated cross-functional alignment workshop
Defined Phase 1 MVP with documented tradeoffs
Partnered with product and engineering on loyalty platform constraints

Project context

Part of a broader effort to expand non-transactional loyalty earn
Required alignment with legal on incentivized review disclosure
Used Crowdtwist loyalty platform with limited customization options
Why this mattered to the business
Product pages with few reviews create friction for customers deciding whether to purchase.
Low review participation also limits feedback that merchandising teams use to evaluate product performance.
Increasing review volume could improve customer confidence on product pages, generate more feedback about product quality and fit, and strengthen engagement with the loyalty program.
Problem & Solution

Incentivize participation without paying for sentiment

The core tension was real: introduce an incentive strong enough to change behavior, but not so transactional that it undermined the authenticity customers rely on when reading reviews.

The problem

Review volume too low to be useful

PDPs lacked review coverage. Merchandising teams had limited qualitative signal. Customers had less confidence buying without peer input. And the business had no low-friction way to change that.

The solution

50 loyalty points per approved review, integrated into existing flows

Awarded 50 loyalty points per approved review through the existing program. Integrated the earn into current review flows. Matched reviews to loyalty accounts via submitted email. Positioned points as appreciation, not payment.

Research

Customers weren't disengaged. They just needed a reason.

I analyzed how other retailers incentivize reviews, focusing on whether incentives were transactional or loyalty-based, how they were communicated, and what guardrails were in place to protect quality.

The competitive finding was consistent: incentives increase participation, but guardrails and expectations heavily influence quality. Most retailers launched lean and iterated once they understood real behavior.

Motivated to help

Primary motivation

Customers leave reviews to help others or express strong feelings, not primarily for rewards

Framing matters

Incentive positioning

Incentives framed as appreciation drove better participation than those framed as payment

Launch lean

Competitor pattern

Launch with minimal controls, learn from real behavior, then tighten guardrails in later phases

"Incentives increase participation, but guardrails and expectations heavily influence review quality."

This shaped the Phase 1 strategy: launch with enough structure to be credible, accept known risks intentionally, and use real behavior to inform what to tighten in future phases.

WORKSHOP

MoSCoW Workshop - Snippet

Competitor Examples: Abercrombie, Adidas, and Old Navy

Using tools we already have

Additional Question in Review Form

Points Rewarded Tag

Use what we already have - Bazaarvoice

We already used Bazaarvoice to manage product reviews on the site, and the platform already supported incentivized reviews. We decided to enable that capability and update on-site content rather than go through a lengthy redesign.

Outcomes

Triple the reviews, real tradeoffs, useful learnings

The results validated incentives as an effective lever while surfacing clear opportunities for refinement. The anticipated quality tradeoff materialized exactly as expected, and the team had a real foundation for Phase 2.

300%
Review Volume Increase

From baseline to approximately 1,500 reviews per day

Both
Quality Distribution

Increase in both high-quality and low-quality reviews, anticipated and accepted as a Phase 1 learning

Measurable
Loyalty Sign-ups

Small but real increase in new loyalty account creation tied to the review flow

Beyond the metrics

The workshop model here, MoSCoW prioritization with legal, marketing, engineering, and design in the same room, became a reference for how to run cross-functional alignment on ambiguous initiatives. That process had longer legs than the feature itself.

What I'd do differently

Introduced a soft character minimum earlier. Even 20 words would have filtered the lowest-effort submissions without meaningfully reducing participation.

Tracked the ratio of high-quality to low-quality reviews as a health metric from day one, not just total volume.

What I'm proud of

Running a workshop that produced actual decisions. Cross-functional alignment sessions often end in notes. This one ended in a scoped MVP everyone had signed off on.

Framing the known risks openly rather than minimizing them. The team launched informed, not optimistic.

Building strong cross-functional relationships

Shana Shields
SENIOR PRODUCT DESIGNER (UX & STRATEGY)
© 2026 S. Shields · All rights reserved