Amazon · Prime / Offsite Marketing

Improving deal discovery during high-traffic retail events.

When shoppers land on Amazon during major retail moments, the deals they see are not random. My summer project sat with the team responsible for planning and prioritizing manual deal visibility across events like Prime Day, Big Spring Sale, Black Friday, and Cyber Monday.

The work focused on understanding how teams reviewed placement requests, which campaign patterns performed better, and where clearer guardrails could reduce time spent on manual review.

[Image: Amazon internship project visual]
Problem

How should teams decide which campaigns deserve manual support?

Manual deal placements required significant time to review, approve, and execute. With many teams requesting visibility during major retail events, the team needed data-backed insight into how to prioritize campaign placements and which placement locations were most productive.

Outcome

A playbook for future manual deal placement.

I delivered and presented an Amazon-style narrative that built stakeholder alignment around a proposed playbook for evaluating manual placements. The playbook identified where different merchandising approaches were most effective across event types and where automation could support future planning.

Process

The work was broken into three phases.

I moved from understanding how placements were planned, to analyzing campaign performance, to translating the research into a playbook and final narrative.

01

Phase 1: Understand the workflow and define the metrics

I familiarized myself with the team’s tools, planning process, and manual merchandising workflows, and interviewed product managers, marketing managers, category teams, and data partners to understand campaign review pain points, stakeholder expectations, and which metrics mattered most.

02

Phase 2: Analyze placement performance

I reviewed 400+ campaigns across Prime Day, Big Spring Sale, Black Friday, and Cyber Monday, analyzing them across placement types including Single Cell Takeovers, Shovelers, and Above the Fold and Below the Fold placements.

I worked with data partners to understand attribution limits, compare placement types, and test early hypotheses using SQL, Amazon QuickSight, and internal reporting tools.

03

Phase 3: Build and present the playbook

I consolidated the research and analysis into a six-page Amazon-style narrative (a “6-pager”), iterating with stakeholders on alignment and tradeoffs before presenting recommendations for evaluating manual placements across future retail events.

Findings

What the analysis helped clarify.

01 · Campaign fit

Some campaigns were better suited for manual visibility

Campaign fit depended on category, message clarity, event context, and whether manual support could add value beyond standard placement logic.

02 · Shopper behavior

Different events shaped how customers browsed

Discovery-led events and high-intent shopping moments created different expectations for how shoppers engaged with deal placements.

03 · Workflow support

Manual review needed clearer support

Teams needed a clearer way to review requests, compare campaign quality, and reduce time spent on low-fit placements.

Outcome · Playbook

What the final playbook covered.

01 · Guardrails

Campaign evaluation guardrails

Define clearer criteria for which campaigns should receive manual support and how placement requests should be evaluated.

02 · Automation

Review support opportunities

Identify where automation can support manual review while keeping human judgment in the final decision loop.

03 · Planning

Reusable event planning framework

Package the logic into a repeatable framework for future high-velocity retail events.

Key takeaway: small, thoughtful changes in merchandising logic can shape how customers discover products, even on a platform operating at Amazon’s scale.