
Employee Engagement Survey from 0 to 1
My Role
Sole Designer

Team
1 Design Manager
1 UX Writer
1 Product Manager
1 Engineering Team (offshore)
1 Design System Team

Problem
We were tasked with building a new feature within the Workforce Management suite that would allow client managers to:
- Send out recurring or one-time surveys with predefined questions
- Collect and visualize employee feedback
- Understand sentiment trends over time
There were no existing flows, research, or components to leverage—this was a full 0→1 product.

UX Outcome
- Delivered responsive designs (desktop, tablet, mobile) covering all edge cases
- Enabled seamless engineering handoff via Figma specs and design tokens
- Successfully launched the MVP on time, with strong adoption and positive client feedback

Timeline
4 months, 2023
Survey Scheduling – Manager Experience
Starting Point / Empty State

Survey Question Preview

Survey Schedule Specifications

Tablet/Mobile Adaptations


Survey Results – Manager Experience
Overview Page with Currently Running Instance

Survey Results – Summary View

Check-In Survey Results – Employee View

Tablet/Mobile Adaptations


Employee Experience
Entering from Dashboard or Inbox

Focused Survey Screen

Completed State

Tablet/Mobile Adaptations


Discovery & Planning
Challenge: No Foundational Research
We inherited vague ideas from leadership but had no prior discovery or clear problem framing.
My Approach
- Conducted a competitive audit of similar features in tools like Peakon and Lattice
- Led a story mapping workshop with the PM to break the problem into user journeys
- Used MoSCoW prioritization to define what was critical for the MVP vs. later phases
- Worked with the PM to define personas: People Ops leads, frontline managers, and employees
Outcome
A clearly scoped Earliest Usable Product aligned with user needs and business goals, plus buy-in from engineering for technical feasibility.
Design Execution
Handling Ambiguous Rules
Problem: Requirements like “scoring sentiment” or “editing surveys after launch” were unclear and constantly changing.
What I Did:
- Documented open questions in a shared Figma/Jira tracker
- Facilitated working sessions with the PM and engineers to align on edge cases
- Created quick prototypes to test assumptions before committing to visuals
Impact: We proactively reduced back-and-forth with developers and minimized rework by validating early.
Designing for Component Gaps
Problem: There was no existing design system component for Likert-style survey inputs.
What I Did:
- Identified the limitations in our Button Group component (fixed widths, poor label wrapping)
- Proposed a new variable-width option to the Design System team
- In the meantime, designed temporary versions using Selectable Chips with adjusted padding and behavior (sketched below)
- Coordinated with engineers on a seamless swap once the official component was ready
Impact: Unblocked development timelines while advocating for a scalable design system improvement.
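
For illustration only, here is a rough React/TypeScript sketch of how the interim Likert-style input behaved. The component name, props, and styling are hypothetical stand-ins, not our actual design system code; the real Selectable Chip and Button Group components handle this internally.

```tsx
import { useState } from "react";

// Hypothetical labels for a five-point agreement scale.
const OPTIONS = [
  "Strongly disagree",
  "Disagree",
  "Neutral",
  "Agree",
  "Strongly agree",
];

// A single-select row of chip-style buttons. Widths grow with label length —
// the variable-width behavior we asked the Design System team to add to Button Group.
export function LikertChips({ onChange }: { onChange?: (value: string) => void }) {
  const [selected, setSelected] = useState<string | null>(null);

  return (
    <div
      role="radiogroup"
      aria-label="Agreement scale"
      style={{ display: "flex", gap: 8, flexWrap: "wrap" }}
    >
      {OPTIONS.map((label) => (
        <button
          key={label}
          type="button"
          role="radio"
          aria-checked={selected === label}
          onClick={() => {
            setSelected(label);
            onChange?.(label);
          }}
          // Padding instead of a fixed width lets long labels wrap gracefully.
          style={{
            padding: "8px 16px",
            borderRadius: 16,
            border: selected === label ? "2px solid #1a7f37" : "1px solid #ccc",
          }}
        >
          {label}
        </button>
      ))}
    </div>
  );
}
```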

Data Visualization with Constraints
Problem: Engineers planned to use Highcharts, which came with visual constraints for our response and sentiment charts.
What I Did:
- Audited Highcharts capabilities and limitations
- Simplified the chart UI without losing meaning (e.g., stacked bar charts, tooltips); see the sketch below
- Provided specs and fallback states for zero data, errors, and mobile views
- Created a tokenized color scale for sentiment categories to maintain consistency
Impact: We achieved consistency across dashboards while respecting the limitations of the library—ensuring design fidelity without friction.
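
To make this concrete, here is a minimal Highcharts configuration sketch for a stacked sentiment bar chart. The question labels, counts, and hex values are illustrative placeholders, not our shipped configuration or actual token values.

```ts
import Highcharts from "highcharts";

// Illustrative stand-ins for the tokenized sentiment color scale.
const sentimentColors = {
  positive: "#2e7d32",
  neutral: "#9e9e9e",
  negative: "#c62828",
};

// Stacked horizontal bars: one stack per survey question, split by sentiment.
Highcharts.chart("survey-results", {
  chart: { type: "bar" },
  title: { text: "Responses by sentiment" },
  xAxis: { categories: ["Question 1", "Question 2", "Question 3"] },
  yAxis: { min: 0, title: { text: "Responses" } },
  plotOptions: { series: { stacking: "normal" } },
  tooltip: { shared: true },
  series: [
    { type: "bar", name: "Positive", data: [12, 9, 14], color: sentimentColors.positive },
    { type: "bar", name: "Neutral", data: [5, 7, 4], color: sentimentColors.neutral },
    { type: "bar", name: "Negative", data: [2, 3, 1], color: sentimentColors.negative },
  ],
});
```

The zero-data, error, and mobile fallback states were specced separately in Figma rather than handled in this configuration.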


Key Learnings
- Designing 0→1 features requires balancing ideal UX with practical MVP scope
- Clear async documentation (open questions, rationale) is critical when working with changing requirements
- It’s important to advocate for system-level improvements—but offer workarounds to keep teams unblocked
- Design is not just screens—it's the bridge between ambiguity and clarity