DASHBOARD UX
Role: UX researcher
EXECUTIVE SUMMARY
Pickle is building a platform that helps companies better understand their customer conversations through AI-powered transcripts, with the goal of strengthening customer relationships. With the first version of the platform launched, Pickle set out to improve the user experience by redesigning a key feature: the Dashboard.
ROLE & RESPONSIBILITIES
My main role was UX researcher. I was responsible for writing research plans, conducting interviews, validating and analyzing data, creating presentations and research reports, planning workshops, and collaborating with designers, founders, and the marketing and sales teams.
TEAM
UX Researcher: Saniya
Designers: Yueyue, Wei, Ray
Co-Founder: Birch
Engineers: Matt, Mark, Jon
ESTABLISHING A STANDARD PROCESS FOR THE REDESIGN PROBLEM
Brainstorming Workshops: Prepare for and participate in "How Might We" brainstorming workshops.
Competitor Research: Study competitors to ensure each new design is unique and stands out in the market.
Analyzing Previous Designs: Review previous designs based on user experience research from the "Pickle Next Steps" project.
Creating Basic Wireframes and Workflows: Develop basic wireframes and outline workflows.
Concept Testing: Test the initial concepts.
Prototype Testing: Conduct tests on the prototypes.
RESEARCH: DESIGN PROBLEM STATEMENT
RESEARCH OBJECTIVE
Dashboard Usage Experience: Understand how users interact with and experience the dashboard.
Target User Needs and Challenges: Identify the needs and pain points of target users related to the dashboard.
Competitor Analysis: Analyze competitor dashboards to assess our market position.
Develop a Design Problem Statement: Synthesize all perspectives to create a clear design problem statement.
METHOD
Dashboard Data Analysis: Examine dashboard usage data and summarize direct feedback from previous client interviews.
User Persona and Journey Mapping: Analyze target user personas and journey maps to uncover related needs and challenges.
Stakeholder Interviews and Competitor Analysis: Conduct interviews with stakeholders and analyze competitors to understand their visions, goals, and market positioning.
'How Might We' Workshop: Run a 'How Might We' workshop to help formulate a design problem statement.
USER EXPERIENCE
Usage Data Analysis
Interview Summary on Dashboard
Main Insights
Declining Dashboard Activity: New users initially show interest in the dashboard but their engagement decreases over time.
Lack of Specific Data: The data provided (e.g., internal vs. external, types of meetings) is not detailed enough for users to effectively understand or customize graphs.
Unclear Next Steps: Users are unsure of the actions or conclusions to draw from the dashboard information.
Mismatch with Sales Needs: The dashboard does not meet the specific needs of sales teams in terms of learning and improving their sales conversations.
Mismatch with Management Needs: The dashboard does not fulfill the requirements of the VP of Sales for managing and training the sales team.
RELATED USER NEEDS & PAIN POINTS
Persona
Journey Maps
Main Insights
Enhancing Sales Conversations:
Help users learn from and improve the quality of their sales conversations.
Addressing Customer Needs:
Identify and respond to customer needs effectively, and determine the right questions to ask to enhance engagement.
Supporting Sales Team Management and Training:
Improve how the sales team is managed and trained.
Efficient Coaching and Training:
Reduce the time needed to review calls by creating a list of significant moments for training or coaching purposes. Streamline the process from reviewing data to identifying key coaching moments in specific clips.
Data-Driven Management:
Provide a breakdown of data for each team member with options for personalization, and automate the process of analyzing this data and providing feedback to representatives.
Integration with Revenue Data:
Link conversation data with revenue figures and provide guidance on which conversation metrics are most impactful.
Streamlining Feedback:
Incorporate a quick scoring and feedback system within the platform for initial discovery calls.
COMPETITOR ANALYSIS & HMW WORKSHOP
Competitor Analysis
Information Architecture
HMW Workshop
Main Insights
Competitor Analysis & Stakeholder Interviews
Focus of Competitor Dashboards:
Competitors typically analyze meeting activities, topics, and tracked words, categorizing them under conversation skills, sales skills, deal intelligence, and market intelligence.
Advanced Competitor Offerings:
A leading competitor goes further by providing best practice standards and team recommendations, though users express concerns about the origins of these conclusions.
Resource and Timeline Limitations:
There are constraints regarding the development resources and timelines for our company.
Workshop Insights
Design Team Alignment:
All three designers were aligned on the dashboard research findings.
Problem Solving and Brainstorming:
The team collaborates to categorize and refine the problem statement/user goals, brainstorm solutions, and determine metrics for the dashboard.
Information Architecture Development:
Development of an information architecture focused on variables and data types to enhance dashboard functionality.
PROBLEM STATEMENTS & USER GOALS
Enhance Meeting Understanding and Quality: Assist users in better understanding their meetings to improve overall meeting quality.
Boost Engagement: Help increase user engagement.
Support Team Management and Training: Aid users in managing their teams and facilitating effective training.
Improve Win Rates and Revenue: Help users increase their win rates and boost company revenue.
PROJECT REQUIREMENTS
Dashboard as Landing Page: The dashboard should also serve as the landing page.
Data Interaction and Exploration: Enable users to interact with and explore data more deeply within its context.
Development Constraints: The project must be managed within the limitations of available resources and timelines.
USER STORIES
Story #1:
As an EA and a VP, I want to interact with my conversation intelligence data so that I can determine my next steps to achieve my North Star Metric (e.g., reaching revenue goals, closing larger deals, converting more leads into opportunities).
Story #2:
As an EA, I want to identify useful video recordings and snapshots, and dive deep to learn best practices, understand our target customers, and learn how to address their questions or concerns to close deals.
Story #3:
As a VP, I want to be able to check the teaching dashboard, see a breakdown of conversation intelligence information for each rep, compare them to find outliers, explore factors influencing these differences, and think of ways to support my team members and address any issues.
OPPORTUNITIES & SOLUTIONS
Insight #1:
Create separate dashboards for individual and team use.
Insight #2:
Summarize key points from sales conversations, including questions, answers, customer needs, and rejections.
Insight #3:
Provide detailed data breakdowns for each team member.
Insight #4:
Integrate with revenue data to highlight important conversation metrics and provide guidance.
Insight #5:
Use data to pinpoint specific clips that showcase valuable coaching moments.
DESIGN REVIEWS (MY ROLE: SUPPORTING)
LOW FI & MID FI WIREFRAME
Layout/Data Visualization Exploration & Mid-fidelity Screens
Main Insights
For the next phase, several technical questions are crucial:
Data Relevance: What data are important and useful?
AI Accuracy: How accurate can the artificial intelligence be?
Ease of Implementation: How can we simplify the implementation process?
Idea Testing: How can we effectively test out different ideas?
RESEARCH: CONCEPT DESIGN USABILITY TESTING
GOAL
Understand design usability
Understand design feasibility
METHOD
A/B Testing: Task-based, moderated user testing
Interviews with engineers on feasibility
CONCEPT DESIGN A/B TEST
INDIVIDUAL DESIGN
Design A
Design B
TEAM DESIGN
Design A
Design B
MAIN INSIGHTS
Strengths
Team Dashboard:
The detailed breakdown of information for each representative, comparison with others, and identification of outliers are valuable for managers.
Marker Exploration:
The detailed exploration of markers ("Markers Mentioned") enhances training capabilities.
Weaknesses
Data Clarity:
Some data representations, like circle graphs, are unclear and need more explicit explanations of what the numbers represent.
Visualization Needs:
Numbers and visuals should be larger, clearer, and more eye-catching.
Relevance of Data:
While filler words are interesting, they do not warrant a major graph for exploration.
Focus on Conversation Intelligence:
Avoid duplicating functionalities that are similar to Salesforce. Prioritize conversation intelligence.
Recording Utilization:
Recent or important recordings should drive meaningful next actions.
Improvements
Data Relevance for EAs:
Data should not overwhelm EAs; limit the space devoted to simplistic data. All data should drive actionable insights — a common goal for EAs is identifying useful training videos and recordings.
Data Interaction:
Enhance interactions with data to clarify next steps and help reach the North Star Metric.
Marker Interaction Enhancements:
Improve marker exploration by clarifying the intention behind the design, making it clear that markers are clickable, and ensuring users can navigate easily between views.
Integration and Feasibility:
Interaction ideas are promising but require careful consideration of feasibility and accuracy. Consider merging the marker exploration page with other interactive elements to streamline user experience.
DESIGN & ITERATIONS 2 (MY ROLE: SUPPORTING)
DESIGN REVIEWS
Feedback on Summary Usage:
"I like the summary, but what can I do with it?"
Users appreciate the data provided but are unsure how to use it to enhance their performance and meeting quality. For example, if the win rate is low, they need more detailed analysis to understand the issue better.
Increasing App Engagement:
"How can we increase engagement with existing app features?"
Users prefer an interactive dashboard over just reading a "report." They believe that better integration with existing features would encourage more exploration of Pickle.
Concerns Over Information Overload:
"Great information, but isn't it too overwhelming?"
Users feel that the dashboard provides too much information, making it difficult to quickly understand and use. Some terms, like "deal," are not intuitive to users, adding to the complexity.
EXISTING MENTAL MODEL
Prioritize Key Recordings:
Users want to focus on recent or important recordings and snapshots to determine their next steps.
View Conversation Diagnostics:
Users are interested in diagnostics related to conversations, especially those tied to crucial metrics like opportunities and revenue.
Compare Performance:
Users want to see how everyone's performance stacks up against each other in terms of conversation quality and their use of Pickle.