Case Study 3
Design Systems · A/B Testing · Research

Building a cross-platform design system to unify product experience across four teams and support a company-wide rebrand.

  • 68 components shipped
  • −35% design time on new features
  • 84% developer reuse rate

A scale-up with four product teams had grown rapidly without a shared design language. Each team maintained its own component library in Figma, leading to significant inconsistency across products and a costly design-to-development handoff.

The company was preparing a rebrand, which created a forcing function to consolidate — but also a tight 6-month deadline to ship a system all teams could adopt.

Designers spent an estimated 30% of their time recreating components that already existed elsewhere. Developers were maintaining four separate component implementations in code, with diverging behavior and accessibility standards.

Customer research showed users noticed the inconsistency — different button styles, spacing rhythms, and interaction patterns across products created a fragmented brand experience.

  • 4 separate Figma libraries with overlapping but inconsistent components
  • No single source of truth for color, typography, or spacing tokens
  • Accessibility compliance varied significantly between product teams
  • Design-to-dev handoff took 3× longer than industry benchmarks

We conducted an audit of all four products, cataloging every unique component variant, color value, and typography style in use. The audit surfaced 340 unique color values and 28 button variants across the product suite.

Interviews with all 8 designers and 12 frontend engineers surfaced adoption blockers — previous attempts at a shared system had failed due to perceived inflexibility and lack of documentation.

A design system's value isn't measured by its component count, but by how many decisions it removes.

Component Proliferation

340 unique color values and 28 button variants across the suite — most serving the same visual purpose with minor, unintentional differences.

Adoption Blockers

Previous system attempts failed because teams felt they couldn't customize for their context. Flexibility was cited as the #1 adoption requirement.

Documentation Gap

Engineers cited poor documentation as the primary reason they recreated components rather than reusing existing ones from other teams.

Governance Vacuum

No process existed for proposing or ratifying new components, so teams defaulted to local solutions that accumulated over time.

We developed the system in three phases: foundation tokens, core components, and pattern library. Each phase was validated with at least two product teams before moving forward.

Rather than building in isolation, we embedded a 'pilot team' approach — one team adopted early components in their live product, giving us real-world feedback before broad rollout.

01

Phase 1 — Token Foundation

Defined semantic color, typography, spacing, and elevation tokens. Validated with all 4 teams before any component work began to ensure buy-in.
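The two-layer idea behind semantic tokens can be sketched in code. This is a minimal illustration, not the project's actual token set — the names and hex values are invented for the example:

```typescript
// Layer 1: raw palette. Products never reference these directly.
// (Illustrative names and values, not the real brand palette.)
const palette = {
  blue600: "#1d4ed8",
  gray900: "#111827",
  white: "#ffffff",
} as const;

// Layer 2: semantic tokens. Names describe intent ("action",
// "surface"), so a component asks for actionPrimary rather than
// a specific hex value.
export const tokens = {
  color: {
    actionPrimary: palette.blue600,
    textDefault: palette.gray900,
    surfaceDefault: palette.white,
  },
  spacing: { sm: "8px", md: "16px", lg: "24px" },
} as const;
```

Because every decision lives behind a semantic name, debates about raw values happen once, at the token layer, instead of per component.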

02

Phase 2 — Core Components (Pilot)

Built 24 core components with the most active team. Iterated on documentation format, Figma structure, and code API based on live adoption feedback.
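A component API of this kind typically exposes a closed set of variants tied to tokens rather than free-form style overrides. The sketch below is a hypothetical shape, shown framework-free (a React component would spread the resulting class onto its root element):

```typescript
// Hypothetical variant API: a closed union keeps customization
// inside the system instead of allowing arbitrary overrides.
type ButtonVariant = "primary" | "secondary" | "ghost";

const variantClass: Record<ButtonVariant, string> = {
  primary: "btn--primary",
  secondary: "btn--secondary",
  ghost: "btn--ghost",
};

// Maps a variant to its class names; unknown variants are a
// compile-time error, which is the point of the closed set.
export function buttonClassName(variant: ButtonVariant = "primary"): string {
  return `btn ${variantClass[variant]}`;
}
```

The constrained-but-predictable API is one way to reconcile the flexibility teams asked for with the consistency the system needed.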

03

Phase 3 — Full Rollout + Governance

Expanded to all teams with a contribution model — any team could propose components through a defined RFC process with a cross-team review board.

The resulting system — built in Figma and React — included 68 components across 6 categories, with full documentation, accessibility annotations, and usage guidelines for each.

A token-based architecture meant the upcoming rebrand could be applied system-wide by updating token values, rather than touching every component individually.
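The mechanism can be illustrated in a few lines: the semantic names stay fixed while the palette behind them is swapped. All values here are made up for the example:

```typescript
// The semantic token names are the stable contract; the palette
// behind them is replaceable. (Hex values are illustrative only.)
type Palette = { brand: string; text: string; surface: string };

function buildTokens(p: Palette) {
  return {
    actionPrimary: p.brand,
    textDefault: p.text,
    surfaceDefault: p.surface,
  };
}

const legacy: Palette = { brand: "#1d4ed8", text: "#111827", surface: "#ffffff" };
const rebrand: Palette = { brand: "#7c3aed", text: "#0f172a", surface: "#fafafa" };

// Components read only semantic names, so regenerating the token
// set restyles every product without touching component code.
export const rebrandTokens = buildTokens(rebrand);
```

This is why the rebrand could propagate in days: the change surface was the token layer, not 68 components across four products.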

  • Design time on new features reduced by 35% within 3 months of adoption
  • Developer component reuse rate increased from 28% to 84%
  • Accessibility audit pass rate across products went from 61% to 94%
  • Rebrand shipped on time — token update propagated across all products in 2 days