Core Insights: Customer Task Feedback
Last updated: April 13, 2026
The Customer Task Feedback insight captures what your customers actually think about the tasks you're asking them to complete during onboarding. Instead of guessing where customers are getting confused or frustrated, this view surfaces their direct feedback — giving you the customer perspective on your onboarding experience.
Great onboarding is built on iteration. Customer Task Feedback is the data that drives that iteration.
What This View Shows
Feedback Distribution Chart
At the top of the view, a chart visualizes the distribution of customer feedback by type. As feedback accumulates, this chart breaks down responses by sentiment category — giving you a visual sense of whether customers are generally sailing through tasks or hitting friction.
When no feedback has been collected for the selected date range, the chart displays an empty state message: "No feedback data available — Feedback from completed tasks will appear here. Adjust filters or date range to see results." This is expected for new accounts or time periods without completed task feedback.
All Feedback Table
The main body of this view is a detailed table listing every piece of feedback customers have left on completed tasks. Each row represents one feedback response with the following columns:
- Date — When the feedback was submitted.
- Account — The customer account that provided the feedback.
- Owner — The internal team member who owns the project.
- Status — The project status at the time feedback was submitted.
- Project — The project the task belongs to.
- Task — The specific task the customer left feedback on.
- Customer — The individual customer user who submitted the feedback.
- Type — The type of feedback: Positive, Negative, or Got Stuck.
- Comment — The customer's verbatim comment, if they left one.
Sort the table by Type to group all negative feedback or "Got Stuck" responses together, making it easy to identify which tasks are causing the most friction. Sort by Task to see whether a specific task is consistently receiving negative signals across multiple customers.
Summary Stats
At the bottom of the page, five metrics summarize the overall feedback landscape:
- Total Feedback — The total number of feedback responses in the current view.
- Positive — The count of positive responses, indicating tasks customers found clear and easy.
- Negative — The count of negative responses, indicating tasks causing frustration.
- Got Stuck — The count of responses where customers indicated they were blocked or confused.
- Sentiment — An overall sentiment score representing the ratio of positive to total feedback, giving you a single headline metric for customer task experience.
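To make the Sentiment metric concrete: as described above, it is the share of positive responses out of all feedback. A minimal sketch of how these five stats relate, using hypothetical feedback rows (the field names and task names here are illustrative, not OnRamp's actual data schema):

```python
from collections import Counter

# Hypothetical feedback rows; "type" mirrors the three categories in the view.
feedback = [
    {"task": "Connect your CRM", "type": "Positive"},
    {"task": "Upload user list", "type": "Got Stuck"},
    {"task": "Connect your CRM", "type": "Positive"},
    {"task": "Invite teammates", "type": "Negative"},
]

counts = Counter(row["type"] for row in feedback)
total = len(feedback)

# Sentiment as described above: positive responses / total feedback.
sentiment = counts["Positive"] / total if total else 0.0

print(total, counts["Positive"], counts["Negative"], counts["Got Stuck"], f"{sentiment:.0%}")
# → 4 2 1 1 50%
```

With two positive responses out of four, the headline sentiment score is 50%.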
Filters
Use the filters at the top of the page to focus your analysis:
- Date range — Set the time period for which feedback is shown.
- Playbook — Filter to feedback from tasks within a specific playbook.
- Owner — View feedback for projects owned by a specific team member.
- Account — Narrow to feedback from a specific customer.
- Feedback Type — Filter to Positive, Negative, or Got Stuck responses only.
How Customers Leave Feedback
Customers leave feedback directly within their OnRamp Customer Portal after completing tasks. When a task is marked complete, customers can optionally rate the experience and leave a comment. This feedback flows directly into the Customer Task Feedback insight in real time.
To maximize feedback volume, ensure your customer-facing task descriptions are clear and set proper expectations — customers who understand what they're doing are more likely to engage with the feedback prompt.
How to Use This Insight
Find High-Friction Tasks
Filter to Got Stuck or Negative feedback and sort by Task. If the same task appears multiple times with negative signals across different customers and accounts, that's a strong signal it needs to be redesigned — clearer instructions, better supporting resources, or a different sequence in the playbook.
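The same filter-and-sort workflow can be reproduced programmatically if you export the feedback table, for example to feed a periodic report. A minimal sketch, assuming exported rows with hypothetical "task" and "type" fields:

```python
from collections import Counter

# Hypothetical rows exported from the All Feedback table.
feedback = [
    {"task": "Upload user list", "type": "Got Stuck"},
    {"task": "Upload user list", "type": "Negative"},
    {"task": "Connect your CRM", "type": "Positive"},
    {"task": "Upload user list", "type": "Got Stuck"},
]

# Count friction signals (Negative or Got Stuck) per task.
friction = Counter(
    row["task"] for row in feedback if row["type"] in ("Negative", "Got Stuck")
)

# Tasks with the most friction first — candidates for redesign.
for task, count in friction.most_common():
    print(task, count)
# → Upload user list 3
```

A task that tops this list across multiple accounts is the same "strong signal" the manual sort surfaces: it likely needs clearer instructions, better supporting resources, or a different position in the playbook.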
Read the Comments
Quantitative feedback types tell you there's a problem; the Comment column tells you what the problem is. Make it a habit to regularly read customer verbatim comments — they often reveal specific points of confusion that aren't apparent from your internal view of the onboarding process.
Celebrate What's Working
Positive feedback is just as valuable as negative feedback. Tasks with consistently high positive scores are models to learn from — what made them easy? Could other tasks in the playbook be restructured to match that experience?
Close the Loop with Customers
When you see a "Got Stuck" response from a specific customer on a specific task, use that as a proactive outreach opportunity. Reach out, acknowledge the friction, and help them move forward. Customers notice when you're paying attention.
OnRamp AI Analysis
The OnRamp AI panel on the right side of the screen analyzes your feedback data and surfaces patterns — which tasks or playbooks are generating the most friction, whether sentiment is trending up or down, and what specific improvements might have the most impact on the customer experience.
Ask the AI questions like: "Which tasks are getting the most 'Got Stuck' feedback?" or "What do customers most commonly struggle with in the Enterprise playbook?"
Explore all Core Insights views in this knowledge base for a complete picture of your onboarding performance.