Reviewing Cards

Learn how to efficiently process feedback and make accept/reject decisions.

The Weekly Review

Most teams review cards weekly. This cadence keeps each session short and your Needs Signal inbox from becoming overwhelming.

Recommended schedule: Friday afternoon or Monday morning, 30-60 minutes.

Review Workflow

Step 1: Filter to New Cards

  1. Go to Needs Signal section
  2. Filter by "Created in last 7 days"
  3. Sort by creation date (newest first)

This shows you all new feedback since your last review.

Step 2: Quick Triage

Scan titles and make snap decisions on obvious cases:

Obvious signal (clear pattern, strategic fit):

  • Accept immediately with brief rationale
  • Add tags
  • Move on

Obvious noise (out of scope, vague, spam):

  • Archive immediately
  • Add brief note explaining why (for future reference)
  • Move on

Everything else:

  • Leave in Needs Signal for deeper review

Goal: Process 50% of new cards in 10 minutes.

Step 3: Deep Review

For cards that aren't obvious:

  1. Read the full context - description, any signal notes
  2. Search for duplicates - has this been requested before?
  3. Check for patterns - tag it and see how many similar cards exist
  4. Evaluate fit - does this align with strategy?
  5. Add signal notes - document your thinking
  6. Decide: Accept, archive, or leave for more signal

Goal: Process remaining 50% in 20-30 minutes.

Step 4: Cleanup

Before ending your review session:

  1. Check for duplicates in newly accepted cards
  2. Merge if needed
  3. Tag everything (no untagged cards should remain)
  4. Add a quick note to ambiguous cards explaining why you're waiting

Goal: Leave inbox cleaner than you found it.

Decision Criteria

When to Accept

Accept if all of these are true:

  ✅ Pattern exists (multiple requests OR single strategic need)
  ✅ Aligns with product vision and strategy
  ✅ Problem is clear (even if solution isn't)
  ✅ Feasible to build (eventually, doesn't have to be soon)
  ✅ You can articulate why it matters

When to Archive

Archive if any of these are true:

  ❌ One-off request with no supporting evidence after 90 days
  ❌ Fundamentally misaligned with product direction
  ❌ Solving the wrong problem (user thinks they want X, but really needs Y)
  ❌ Not actionable ("make it better" with no specifics)
  ❌ Already exists (user missed an existing feature)
  ❌ Bug report (should be in Jira, not Distil)

When to Wait

Leave in Needs Signal if:

  ⏸ Need more signal (wait to see if pattern emerges)
  ⏸ Investigating feasibility (checking with eng/design)
  ⏸ Strategic decision pending (waiting on roadmap priorities)
  ⏸ Gathering more context (follow-up questions sent to customer)

Don't let cards sit forever. If still uncertain after 90 days, make a decision.

Efficient Review Techniques

Batch Similar Cards

Group cards by theme during review:

  1. Review all "billing" cards together
  2. Then all "performance" cards
  3. Then all "mobile" cards

Why: Patterns become obvious when you see related feedback in sequence.

Use Keyboard Shortcuts

  • j / k - Next/previous card
  • a - Accept current card
  • x - Archive current card
  • t - Add tag
  • Enter - Open card detail

Press ? to see all shortcuts.

Time-Box Your Review

Set a timer for 45 minutes. Process as many as possible in that time.

Don't aim for inbox zero in one session. Consistent weekly progress beats marathon sessions.

Flag for Follow-Up

If a card needs more investigation:

  1. Add a signal note: "Follow up with customer about use case"
  2. Tag: needs-followup
  3. Set a reminder to check back next week

Don't let "needs more info" become an excuse to avoid decisions indefinitely.

Review Formats

Solo PM Review

Format: You alone, Friday afternoon, 30-45 min

Process:

  1. Quick triage (10 min)
  2. Deep review (25 min)
  3. Cleanup (10 min)

Pros: Fast, no coordination needed

Cons: No diverse perspectives

Team Review Meeting

Format: PM + tech lead + designer, weekly 60 min

Process:

  1. PM presents top 10 cards from Needs Signal
  2. Team discusses each (5 min per card)
  3. Consensus decision on accept/archive
  4. PM documents rationale

Pros: Better decisions, team alignment

Cons: Time-intensive, requires coordination

Async Review

Format: Team members review and comment throughout the week, PM decides Friday

Process:

  1. Team members add signal notes and comments to cards
  2. Friday: PM reviews all feedback
  3. PM makes final accept/archive decisions
  4. PM documents rationale

Pros: Flexible, leverages team knowledge

Cons: Slower, requires active participation

Hybrid

Format: Async during week, 30-min sync meeting Friday

Process:

  1. Mon-Thu: Team adds signal notes async
  2. Friday: PM + team meet for 30 min
  3. Discuss only controversial or high-impact decisions
  4. PM makes final calls after meeting

Pros: Best of both (efficient + collaborative)

Cons: Requires discipline from team

Dealing with High Volume

If you're getting >50 cards/week:

Triage Ruthlessly

Create a quick filter:

  • P0 (review immediately): customer-facing, blocker, high-value
  • P1 (review this week): everything else
  • P2 (review next week): low-impact, unclear, edge cases

Review P0 daily, P1 weekly, and P2 every two weeks.
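The P0/P1/P2 rules above can be sketched as a small triage function. This is a minimal sketch, not Distil's actual behavior: the `tags`, `impact`, and `unclear` fields are hypothetical names for whatever your card export provides.

```python
def triage_priority(card):
    """Return a review-priority bucket for a feedback card (dict)."""
    p0_tags = {"customer-facing", "blocker", "high-value"}
    # P0: anything customer-facing, blocking, or high-value gets seen today
    if p0_tags & set(card.get("tags", [])):
        return "P0"
    # P2: low-impact, unclear, or edge-case cards can wait two weeks
    if card.get("impact") == "low" or card.get("unclear", False):
        return "P2"
    # P1: everything else is reviewed in the normal weekly session
    return "P1"

cards = [
    {"title": "Checkout fails on Safari", "tags": ["blocker"]},
    {"title": "Darker sidebar color", "impact": "low"},
    {"title": "Export to CSV", "tags": ["reporting"]},
]

for card in cards:
    print(card["title"], "->", triage_priority(card))
```

Encoding the buckets as a function like this also makes the rules easy to argue about and adjust as a team.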

Automate Noise Filtering

Use Distil's duplicate detection and auto-tagging:

  • Automatic duplicate suggestions
  • AI-powered tag suggestions
  • Keyword-based auto-archive rules (Enterprise feature)

Delegate Collection

Have CS/sales/support add signal notes during collection:

  • Customer tier (enterprise, SMB, free)
  • Urgency (blocker, high, medium, low)
  • Source (support ticket, sales call, survey)

PM focuses on decision, not investigation.

Batch Archive

Once per month, bulk-archive:

  • Cards in Needs Signal for >90 days with no additional signal
  • Duplicates that were merged
  • Obvious noise you missed initially
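The ">90 days with no additional signal" rule above can be expressed as a simple filter. A sketch under assumptions: cards come from a hypothetical export with an ISO-8601 `created` date and a `signal_count` field, where 1 means only the original request.

```python
from datetime import datetime, timedelta

def stale_cards(cards, today, max_age_days=90):
    """Cards sitting in Needs Signal past the cutoff with no extra signal."""
    cutoff = today - timedelta(days=max_age_days)
    return [
        c for c in cards
        if datetime.fromisoformat(c["created"]) < cutoff
        and c.get("signal_count", 0) <= 1  # only the original request
    ]

cards = [
    {"title": "Old one-off ask", "created": "2024-01-02", "signal_count": 1},
    {"title": "Growing pattern", "created": "2024-01-02", "signal_count": 6},
    {"title": "Fresh request", "created": "2024-05-20", "signal_count": 1},
]

for c in stale_cards(cards, today=datetime(2024, 6, 1)):
    print("archive:", c["title"])  # only "Old one-off ask" qualifies
```

Note that "Growing pattern" is old but has accumulated signal, so it stays in the inbox for a real decision rather than a bulk archive.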

Review Cadence by Team Size

Solo PM

  • Weekly review: 30 min
  • Monthly cleanup: 15 min
  • Total: ~2.5 hours/month

Small team (2-3 PMs)

  • Weekly review: 45-60 min
  • Monthly planning: 60 min (prioritize accepted items)
  • Total: ~4-5 hours/month per PM

Large team (5+ PMs)

  • Daily quick review: 15 min (just your area)
  • Weekly deep review: 45 min
  • Monthly cross-team sync: 60 min
  • Total: ~7-8 hours/month per PM

Red Flags in Reviews

Red Flag: Can't Decide

Symptom: Staring at a card for 10 minutes, can't decide accept or archive.

Solution: Add a signal note explaining your uncertainty and move on. Come back next week with fresh eyes.

Red Flag: Accepting Everything

Symptom: Accept rate >70%.

Problem: You're rubber-stamping. Roadmap loses meaning if everything is "important."

Solution: Raise your bar. Ask "Do we have clear evidence this matters?"

Red Flag: Rejecting Everything

Symptom: Accept rate <10%.

Problem: You're too conservative. You'll miss opportunities and frustrate customers.

Solution: Lower your bar slightly. Remember: accepting doesn't mean "build tomorrow."

Red Flag: Backlog Growing

Symptom: Needs Signal grows week over week despite reviews.

Problem: Collection rate > processing rate.

Solution: Either process faster (more review time, batch operations) or collect less (raise bar for what gets logged).

Measuring Review Effectiveness

Track these metrics:

Processing rate: Cards reviewed per week (should match or exceed collection rate)

Accept rate: ~20-40% is healthy (too low = overly conservative, too high = rubber-stamping)

Time in Needs Signal: Average <30 days, max <90 days

Backlog size: <75 cards in Needs Signal

Regret rate: % of accepted cards you later wish you hadn't accepted (<10%)
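The metrics above are simple ratios once you have weekly counts. A minimal sketch, assuming you track four numbers per week (cards collected, reviewed, accepted, and later regretted):

```python
def review_metrics(collected, reviewed, accepted, regretted):
    """Weekly review-health ratios from raw card counts."""
    processing_rate = reviewed / collected  # should be >= 1.0 (keeping up)
    accept_rate = accepted / reviewed       # healthy range: ~0.20-0.40
    regret_rate = regretted / accepted if accepted else 0.0  # aim for < 0.10
    return processing_rate, accept_rate, regret_rate

# Example week: 40 collected, 42 reviewed, 12 accepted, 1 later regretted
processing, accept, regret = review_metrics(40, 42, 12, 1)
print(f"processing {processing:.2f}, accept {accept:.0%}, regret {regret:.0%}")
```

A processing rate above 1.0 means the backlog is shrinking; below 1.0 means collection is outpacing review and the Needs Signal inbox will grow.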

After the Review

Communicate Decisions

For high-visibility cards:

  • Comment on the card explaining the decision
  • Email the requester (if customer feedback)
  • Update internal stakeholders (if from sales/CS)

Example: "Thanks for this feedback! We're accepting this for our Q2 roadmap based on similar requests from 8 other customers."

Prioritize Accepted Items

After accepting cards, prioritize:

  1. Which accepted items to push this sprint?
  2. Which can wait?
  3. Which are dependencies for other work?

Separate meeting: Don't combine review (accept/reject) with prioritization (when to build). Different mindsets.

Clean Up Tags

Weekly tag maintenance:

  • Ensure all reviewed cards are tagged
  • Fix inconsistent tag naming
  • Merge redundant tags

Advanced: Review Templates

Create reusable acceptance rationale templates:

Customer pattern: "Requested by [X] customers in the past [Y] weeks. Segments: [segments]. Aligns with [strategic goal]."

Strategic bet: "Low volume but strategically important for [reason]. Enables [future capability]."

Dependency: "Prerequisite for [accepted feature]. Blocks [X] other roadmap items."

Save these as snippets for faster reviews.
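If your snippet tool supports placeholders, the templates above are just format strings. A sketch of the "customer pattern" template; the filled-in values here are illustrative, not real data.

```python
# Reusable rationale template with named placeholders (from the text above)
CUSTOMER_PATTERN = (
    "Requested by {x} customers in the past {y} weeks. "
    "Segments: {segments}. Aligns with {goal}."
)

# Fill it in at review time with the card's actual evidence
note = CUSTOMER_PATTERN.format(
    x=8, y=6, segments="enterprise, mid-market", goal="self-serve onboarding"
)
print(note)
```

Keeping the placeholders explicit (`{x}`, `{segments}`, etc.) makes it obvious when a rationale was pasted without the evidence filled in.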

Review Anti-Patterns

Batch Reviewing Once Per Month

Problem: Backlog piles up, review takes 3 hours, overwhelming.

Solution: Weekly 30-min sessions > Monthly 3-hour marathon.

Perfectionism

Problem: Can't accept until you've fully scoped the solution.

Solution: Acceptance is about validating the problem, not designing the solution. Scope later.

Lack of Documentation

Problem: Accept cards without rationale, forget why 3 months later.

Solution: Always write 1-2 sentence rationale. Future-you needs it.

Ignoring Patterns

Problem: Review cards in isolation, miss that 5 cards are about the same problem.

Solution: Use tags and search to find related cards before deciding.

Next Steps