How Distil Processes Feedback

Understand the journey feedback takes from collection to shipped feature.

The Feedback Lifecycle

Every piece of feedback in Distil follows this path:

Collection → Evaluation → Decision → Scoping → Execution → Closure

Let's break down each stage.

Stage 1: Collection

What happens: Feedback enters Distil from various sources.

Sources:

  • Manual entry (you create cards)
  • Email forwarding
  • API integrations
  • Slack (coming soon)

Outcome: Card is created in Needs Signal status.

Who's responsible: Anyone can collect feedback (PMs, CS, sales, support).

Time: Ongoing; feedback can arrive at any time.

Learn more about collecting feedback
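For the API-integration source, a submission might carry a payload like the one below. This is an illustrative sketch only: the field names, status value, and helper function are assumptions, not Distil's documented API.

```python
import json

def make_card_payload(title, body, source, tags=None):
    """Build a feedback-card payload (all field names are hypothetical)."""
    return {
        "title": title,
        "body": body,
        "source": source,          # e.g. "api", "email", "manual"
        "status": "needs_signal",  # new cards start in Needs Signal
        "tags": tags or [],
    }

payload = make_card_payload(
    "Export to CSV",
    "Customer asked for a CSV export of reports.",
    source="api",
    tags=["exports"],
)
print(json.dumps(payload, indent=2))
```

Whatever the real schema looks like, the important invariant is the last one shown: every card enters the lifecycle in Needs Signal.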

Stage 2: Evaluation

What happens: You review the card and ask "Is this signal or noise?"

Key questions:

  • Is this a real pattern or a one-off?
  • Have we heard this before?
  • Does it align with our strategy?
  • What problem are they actually trying to solve?

Actions:

  • Add signal notes to capture context
  • Tag the card for categorization
  • Search for duplicates
  • Investigate (talk to customers, check analytics)

Outcome: You decide whether to accept or archive.

Who's responsible: Product managers (occasionally with input from eng/design).

Time: Weekly review sessions (30-60 min).

Learn more about signal vs. noise

Stage 3: Decision (Acceptance)

What happens: You commit to building this by accepting the card.

What acceptance means:

  • This is validated as worth building
  • It moves to your roadmap (Accepted section)
  • You've documented why it matters (rationale)

Requirements:

  • Clear evidence this is signal
  • Alignment with product strategy
  • Documented rationale (especially with Governance Pack)

Outcome: Card moves to Accepted status.

Who's responsible: Product managers (with governance approval if needed).

Time: During evaluation - happens as you review cards.

Learn more about acceptance

Stage 4: Scoping

What happens: You plan how to build it.

Actions:

  • Write implementation notes
  • Break down into sub-tasks (if complex)
  • Estimate effort (optional - can do this in Jira/Linear)
  • Prioritize against other accepted work

Outcome: Card is ready to push to dev tools.

Who's responsible: Product managers (with engineering input).

Time: During sprint planning or roadmap sessions.

Stage 5: Execution

What happens: Engineering builds it.

Actions:

  • Push card to Jira or Linear
  • Card status becomes Ready
  • Eng picks up the issue and builds
  • Work happens in Jira/Linear, not Distil

Outcome: Feature is shipped.

Who's responsible: Engineering team.

Time: Depends on complexity and sprint schedules.

Learn more about pushing to Jira

Stage 6: Closure

What happens: You close the loop with requesters.

Actions:

  • Mark card as shipped in Distil (optional)
  • Update public roadmap if using Visibility Pack
  • Email customers who requested it
  • Add link to release notes or changelog

Outcome: Requesters know their feedback was heard and acted on.

Who's responsible: Product managers or customer success.

Time: When feature ships.

The Role of Status

Status tracks where feedback is in this lifecycle:

Status          Stage              Meaning
Needs Signal    Evaluation         Unprocessed, needs review
Accepted        Decision/Scoping   Validated, on roadmap
Ready           Execution          Pushed to dev tools, being built
(Archived)      Rejected           Not worth building (noise)
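The status table reads as a small state machine. A sketch of the allowed moves (status names follow the table; the transition map and function are my own illustration):

```python
# Allowed status transitions, per the lifecycle table (illustrative only).
TRANSITIONS = {
    "needs_signal": {"accepted", "archived"},  # evaluation: accept or archive
    "accepted":     {"ready", "archived"},     # scoping, then push to dev tools
    "ready":        set(),                     # work continues in Jira/Linear
    "archived":     set(),                     # terminal: judged noise
}

def can_move(current: str, target: str) -> bool:
    """Check whether a status change is allowed by the lifecycle."""
    return target in TRANSITIONS.get(current, set())

print(can_move("needs_signal", "accepted"))  # True
print(can_move("ready", "needs_signal"))     # False
```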

Decision Points

Accept or Archive?

After evaluation, you have three options:

Accept if:

  • Clear pattern (multiple requests)
  • Strategic fit
  • Validated problem
  • Feasible to build

Archive if:

  • One-off request with no supporting evidence
  • Misaligned with product vision
  • Solving the wrong problem
  • Too vague or not actionable

Neither (leave in Needs Signal) if:

  • Need more signal before deciding
  • Waiting to see if it becomes a pattern
  • Investigating feasibility

Push to Jira/Linear When?

Not all accepted cards should be pushed immediately.

Push when:

  • Ready to build (this sprint or next)
  • Scoped and prioritized
  • Eng has capacity

Don't push:

  • Accepted but not yet prioritized
  • Waiting on dependencies
  • Low priority backlog items

Result: Your Jira/Linear backlog stays lean and focused.
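The push criteria above can be expressed as a simple checklist predicate. This is a sketch (the flag names are mine, and the real inputs, like engineering capacity, still require human judgment):

```python
def should_push(scoped: bool, prioritized: bool,
                eng_has_capacity: bool, blocked_on_dependency: bool) -> bool:
    """Push an accepted card to Jira/Linear only when it is truly ready."""
    return (scoped
            and prioritized
            and eng_has_capacity
            and not blocked_on_dependency)

# Scoped and prioritized, but waiting on a dependency: keep it in Distil.
print(should_push(True, True, True, blocked_on_dependency=True))  # False
```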

Time in Each Stage

Typical durations:

  • Collection → Evaluation: 1-7 days (depends on review cadence)
  • Evaluation → Decision: Minutes to weeks (depends on clarity of signal)
  • Decision → Execution start: Days to months (depends on prioritization)
  • Execution: Days to months (depends on complexity)
  • Total cycle time: 2 weeks to 6+ months

Healthy targets:

  • Evaluation time: <7 days for 90% of cards
  • Time in Needs Signal: <30 days average
  • Acceptance → Push: <90 days (don't let roadmap become a graveyard)

Bottlenecks to Avoid

Evaluation Bottleneck

Problem: Cards pile up in Needs Signal because you don't review regularly.

Solution: Weekly 30-min review sessions. Process 10-15 cards per session.

Acceptance Bottleneck

Problem: Everything stays in Needs Signal because you never accept anything.

Solution: Lower your bar slightly. Accepted doesn't mean "build tomorrow" - it means "validated as worth doing eventually."

Push Bottleneck

Problem: 200 cards in Accepted, engineering backlog is empty.

Solution: Push more aggressively during sprint planning. Accepted is for validation, not storage.

Feedback Collection Bottleneck

Problem: Fewer than 5 cards/week coming in - you're not hearing from users.

Solution: Make feedback easier to give (email forwarding, in-app widget, proactive outreach).

Feedback Velocity Metrics

Track these to understand your process:

Collection rate: New cards per week (target: 20-50 for most products)

Evaluation rate: Cards processed per week (should match collection rate)

Accept rate: % of evaluated cards accepted (target: 20-40%)

Cycle time: Days from collection to shipped (varies widely, track p50 and p90)

Backlog size: Cards in Needs Signal (target: <75)
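These metrics fall out of card timestamps. A sketch with made-up records (the field names are assumptions, not Distil's data model):

```python
from datetime import date
from statistics import median

# Each record: when the card was collected, when (if ever) it shipped,
# and whether it was accepted during evaluation.
cards = [
    {"created": date(2024, 1, 1), "shipped": date(2024, 1, 20), "accepted": True},
    {"created": date(2024, 1, 3), "shipped": None,              "accepted": True},
    {"created": date(2024, 1, 5), "shipped": None,              "accepted": False},
    {"created": date(2024, 1, 8), "shipped": date(2024, 3, 1),  "accepted": True},
]

accept_rate = sum(c["accepted"] for c in cards) / len(cards)
cycle_days = [(c["shipped"] - c["created"]).days for c in cards if c["shipped"]]
p50_cycle = median(cycle_days)  # track p90 the same way on real data

print(f"accept rate: {accept_rate:.0%}")
print(f"p50 cycle time: {p50_cycle} days")
```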

Processing at Scale

Solo PM (<100 cards/month)

  • Review Needs Signal once per week (30 min)
  • Accept/archive during review
  • Push to Jira bi-weekly during sprint planning

Sustainable and manageable.

Small Team (100-300 cards/month)

  • Distribute collection (CS, sales, support create cards)
  • PM reviews and evaluates
  • Weekly triage meeting (60 min)
  • Bulk push during sprint planning

Requires discipline but doable.

Large Team (300+ cards/month)

  • API integration for auto-collection
  • Dedicated PM time for triage (daily 30 min)
  • Team members pre-tag and add context
  • PM accepts/archives, team executes
  • Automation for duplicates and noise filtering

Requires tools and process.

The Role of Automation

What Distil automates:

  • Duplicate detection (suggests similar cards)
  • Tag suggestions based on content
  • Status transitions (e.g., push → Ready automatically)
  • Changelog generation (Visibility Pack)
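Duplicate detection is Distil's job, but the underlying idea can be sketched with simple token overlap (Jaccard similarity). This illustrates the concept only; it is not Distil's actual algorithm:

```python
def jaccard(a: str, b: str) -> float:
    """Token-set overlap between two card texts, from 0.0 to 1.0."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def likely_duplicates(new_card: str, existing: list[str],
                      threshold: float = 0.5) -> list[str]:
    """Return existing cards similar enough to suggest as duplicates."""
    return [c for c in existing if jaccard(new_card, c) >= threshold]

existing = ["Export reports to CSV", "Dark mode for dashboard"]
print(likely_duplicates("export to CSV please", existing))
```

Real systems use embeddings or fuzzier matching, but the shape is the same: score each pair, surface anything above a threshold as a suggestion, and let a human confirm.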

What you still do manually:

  • Evaluate signal vs. noise (requires judgment)
  • Write acceptance rationale (requires context)
  • Prioritize (requires strategy)
  • Decide when to push (requires planning)

Philosophy: Automate mechanics, preserve judgment.

Collaboration in Processing

PM-Led

Model: PM does all evaluation and acceptance.

Pros: Consistent decisions, clear ownership
Cons: PM is a bottleneck

Best for: Solo PM, small teams

Team-Assisted

Model: Team members add context and signal notes, PM decides.

Pros: Distributed collection, PM curates
Cons: Requires coordination

Best for: Multi-PM teams

Fully Collaborative

Model: Multiple PMs evaluate and accept in their domains.

Pros: Scales well, domain expertise
Cons: Requires clear ownership boundaries

Best for: Large product organizations

Learn more about team collaboration

Governance and Compliance

With Governance Pack enabled, the processing flow includes:

  • Enforced rationale: Can't accept without documentation
  • Audit trails: Every decision is logged
  • Card locking: Prevent changes to accepted decisions
  • Enhanced context: Jira/Linear issues show full rationale

Learn more about Governance Pack

Quality Control

Signs Your Process Is Healthy

✅ Needs Signal backlog stays <75 cards
✅ Cards don't sit unprocessed for >30 days
✅ Accept rate is 20-40% (not too high, not too low)
✅ Accepted cards get pushed within 90 days
✅ You can explain why any card was accepted or rejected

Signs Your Process Needs Work

❌ Needs Signal has 200+ cards
❌ Cards sit for months without review
❌ Accept rate is <10% (too conservative) or >70% (rubber-stamping)
❌ Accepted section has 100+ cards that never get pushed
❌ Can't remember why you accepted old cards
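These health signs map directly to threshold checks. A sketch using the numbers from this guide (the function and its inputs are my own framing):

```python
def process_health(backlog_size: int, max_days_unprocessed: int,
                   accept_rate: float) -> list[str]:
    """Flag process problems using the thresholds from this guide."""
    warnings = []
    if backlog_size > 75:
        warnings.append("Needs Signal backlog exceeds 75 cards")
    if max_days_unprocessed > 30:
        warnings.append("cards sitting unprocessed for more than 30 days")
    if accept_rate < 0.20:
        warnings.append("accept rate under 20% (too conservative)")
    elif accept_rate > 0.40:
        warnings.append("accept rate over 40% (possible rubber-stamping)")
    return warnings

# A struggling process trips all three checks.
print(process_health(backlog_size=210, max_days_unprocessed=45, accept_rate=0.72))
```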

Next Steps