Research & prototype the way products are scored against a design language system as a measure of adoption.
Company: Philips
Led a qualitative research study and created a new way for Philips to score its products against its design language system. In doing so, it enabled the business to identify gaps in its products and plan roadmaps accordingly.
Project Summary

Timeline & Team
4 months start to finish, with a 3-person team spread across 2 office locations. I was the user researcher, experience designer and project manager, and worked with a developer and a product owner.

Responsibilities
- Qualitative research study
- Experience design
- Prototype & usability testing

Outcome
In the first month after launch, products were scored within 2 days as opposed to a week.
OVERVIEW
To justify the cost and time of product development, products must adhere to the latest design system. With the existing process, businesses struggled to identify gaps effectively.
Measuring and monitoring adoption of the Philips Design System, carried out by product owners or designers, was time-consuming, subjective, and varied across the entire portfolio.
There was no single source of truth: scoring lived offline in a series of spreadsheets on a shared drive, onboarding was complex and time-consuming, and only one person could audit at a time, with no clear accountability for the auditor.
SUMMARY
Philips saw an opportunity to streamline the way products across the Philips portfolio were scored against its design language system.
48 mins
To complete a score, as opposed to 2 days

1 scale
A single, simple scale for measuring adoption, eliminating subjective scoring

64%
Increase in overall user happiness
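The case study does not publish the scoring formula behind the single adoption scale. Purely as an illustrative sketch (the function name, inputs and formula below are hypothetical, not the actual Philips method), one simple way to express "one scale" is the share of a product's components that use the design language system:

```python
def adoption_score(components):
    """Hypothetical single-scale adoption score (0-100).

    `components` maps component names to True if the product uses the
    DLS version, or False if it uses a custom implementation.
    This is an illustrative sketch, not the actual Philips formula.
    """
    if not components:
        return 0.0
    adopted = sum(1 for uses_dls in components.values() if uses_dls)
    return round(100 * adopted / len(components), 1)

# Example: 3 of 4 components use the DLS
print(adoption_score({"button": True, "card": True, "nav": False, "dialog": True}))  # 75.0
```

A fixed formula like this is what eliminates subjective scoring: two auditors looking at the same component list always arrive at the same number.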
RESEARCH
I planned and conducted 10 user interviews to understand the purpose, opportunities, and challenges of scoring products using the existing spreadsheets.
4 Product Owners
SAMPLE QUESTIONS
- How did you learn about the score when looking at the spreadsheet?
- Why do you think designers struggle to fill in the spreadsheet?
- What part of the system would you like to see improved?

4 Product Designers
SAMPLE QUESTIONS
- How long does the process take?
- Talk me through your process.
- What challenges do you face when filling in your spreadsheet?

2 Business Partners
SAMPLE QUESTIONS
- Why do you use these spreadsheets?
- What problems do you encounter with the adoption score?
- What could be done better?
The research interviews and analysis revealed how the whole adoption scoring service could be improved:
- Looking up the scoring history is difficult, making it a significant point of friction.
- It is not easy to see who entered what, and when.
- Users have to leave the spreadsheet and refer to the design language site for information.
- The lack of visuals makes the scores hard to read at a glance.
“It would be great if we could have a dashboard of sorts that gave a summary of everything”
Based on the research, I focused on two types of users involved in the scoring of the product:
Mike: The Product Group Owner
Oversees the products for Image Guided Therapy. Evaluates and reports the scores to leadership
WANTS TO:
Find the scores quickly for reporting
Know which parts of the product need improving at a glance
KEY INSIGHTS:
Has to check to see if 8 product spreadsheets have been filled in before he can get a group score
Mike maintains his own master spreadsheet so he can protect his work from any errors that may come through
Sharon: The Designer
Delivers designs for an ER product.
Uses components from the design system in her designs
WANTS TO:
Not have to think too much about scoring
Have all the information (design language) in one window
KEY INSIGHTS:
The spreadsheet gives insights into the areas that need updating
Sharon dreads having to fill in the spreadsheet and finds she’s scrambling to get it done
EXPERIENCE DESIGN
First, I mapped out the ecosystem to better understand the way products are scored as a “service”. Then I mapped out the experience someone has as they progress through scoring a product.



PROTOTYPE, TEST & ITERATE
Usability testing revealed gaps in the data collected in the spreadsheets that would be pertinent to painting a better picture of the score.
“To paint a better picture of the score, it would be great if we include information regarding…”
A positive result also emerged: 100% of users loved the component list, as it removed the need to take many snapshots and let them reference the DLS quickly.
“Finally, I don’t need to take snapshots of every component in my product. I can focus on ensuring the custom components are uploaded with the relevant details…”
Users can score a new product in 48 minutes instead of 2 days!
Having everything happen in the Adoption tool eliminates any risk of there being broken links to scores and does away with spreadsheets.
OUTCOMES & LESSONS
The adoption tool is proving powerful in enabling users to see where the gaps are. 58% of users required no in-person onboarding to the tool.
IN THE 60 DAYS SINCE LAUNCH OF THE TOOL
72%
Were able to score with no need for support

68%
Completed scores within the time allocated

100%
No broken links or downtime in the tool
The Adoption Tool is a crucial lens for insight and ideas about potential opportunities and optimisations in any product experience.
Research insights led to an overhaul of the adoption tool, reducing errors by 28% and increasing adoption by 37%. This improved consistency across the portfolio.
Listing the components automatically significantly reduced the overall time needed to score.
Having everything in one place improved transparency and empowered designers to identify and call out areas for improvement whilst doing their work.
Key Outcomes & Results
What I learnt
The need to scale and future-proof the tool through APIs.
Understanding the way people work and why.
The need for transparency and accountability: if someone has a question, the tool should provide all the answers, or at least a starting point.