Understanding user perceptions of in-house software while limiting fatigue

Case Study: A team-wide effort to survey the company about its perceptions of our software and provide recommendations for improvement

Skills: Survey Design, Data Analysis, Cross-Team Collaboration

Summary

Background

To support its diverse customer portfolio, Ginkgo Bioworks employs many scientists with different personas. To support these scientists, we have developed many different in-house software applications. We need to make sure we continue to build software and features that meet our scientists' needs and skills.

Challenge

How can we collect cross-cutting data and metrics to inform the direction of our products while minimizing the time and effort spent by our users and our product designers?

Solution

We worked together as a Product Design team to iterate on a survey design and create reusable processes that make it easier and faster to turn our data into presentations. Given budget and resourcing constraints, we were restricted to using Google Workspace for this process.

A screenshot of our survey in edit mode

A screenshot of our analysis template

Results

Previously, the process took at least 5 months. This new process took only 3 months to complete from beginning to end, including designing new processes and importing old data. Now that we have the questions and data analysis designed, there should be minimal design work and much faster analysis. We estimate that the next survey will take < 2 months to complete.


Background

Ginkgo has a wide variety of user personas, with different expertise and goals

  • The goal of Ginkgo is to engineer organisms that produce different materials for external customers

  • In order to do that, Ginkgo needs to plan, design, build, test, and iterate thousands of variants to find the best one (or combination of variants) 

  • Different scientists are hired to work on each part of the pipeline, with different skills, strengths and goals

Over the years, Ginkgo has built over 15 unique software applications to support its scientists

  • We’ve built software to track sequences, to interact with automation, to plan experiments, to analyze experimental data, to queue work to other functions… the list goes on

  • Any one scientist would need to use multiple software applications to get their work done and achieve their goals

As a product design team, we are uniquely positioned to work directly alongside the users we design for

  • We can easily reach out to our users to learn more about their work or ask for feedback

  • We are able to form relationships and chat with them in the hallway at work

Typically, each member of the product design team works on a specific subset of the software and has expert knowledge in their domain

  • Our team consists of 6 product designers and a head of product design

  • We are split between designing software focused on lab automation, functional R&D and scientific operations, and customer-facing R&D programs


Design Challenge

How can we collect cross-cutting data and metrics to inform the direction of our products while minimizing the time and effort spent by our users and our product designers?


Actions

2022 Individual Surveys

Each domain ran its own survey

2023 1st Mega-survey

One person ran a single mega-survey with the support of the team

2024 2nd Mega-survey

Team shared the effort and responsibility of running the mega-survey

In 2022, 3 product designers each ran their own survey focused on their own domain

  • Although the domains were different, there was some overlap with the types of users that the surveys were targeting

  • I organized, ran and analyzed the findings from the lab automation survey

A screenshot of one of the surveys

However, this approach led to major survey fatigue and pushback from users about the number of complex surveys we were asking them to fill out

  • The surveys were long and in-depth in order to capture enough information to act on

  • It was hard to recruit survey participants when there were many competing surveys being publicized at the same time

A screenshot of one of the surveys

Custom Slack emojis created in response to the large influx of surveys

It was also a lot of work for each of the product designers to run and analyze their own survey on top of the regular day-to-day tasks of managing their own projects

  • From planning to recruiting to analysis, it took me 5 months on and off to finish the lab automation survey

  • It also took the designer who ran the functional R&D and scientific operations surveys about 5 months of on-and-off work

A screenshot of my analysis. The bulk of my time was spent reading through all of the free responses and finding a clean storyline to tell. This took me two and a half months while keeping up with my regular work

We needed to iterate; it was unsustainable for our users and the team to continue maintaining multiple surveys

In summer 2023, one of the product designers decided to take the lead on this endeavor

  • She organized the project and was supported by the rest of the team whenever she needed help

We started by reviewing our surveys and found that there was overlap in questions and participants

  • Each owner of an individual survey organized their questions, and we met to compare them

  • Although some questions were specific to the software in our domain, there were many similar questions about overall job goals and responsibilities, as well as our demographic questions.

A screenshot of the spreadsheet we used to compare questions

Based on this, we decided that it would be better to run one mega-survey with conditional logic that shows each participant only the questions relevant to them

A screenshot of the Google Doc we used to organize our questions

This was much better for users, but still a major burden on the lead product designer

  • To make the process easier on the rest of us, she took on the bulk of the recruiting and analysis

  • However, the amount of information we were trying to capture made the responses very laborious to comb through. She had been working on the analysis for over a month and hadn't made as much progress as she wanted

  • She raised this at our weekly team meeting, and we all jumped in to help

  • Analysis was quick to finish with everyone collaborating, and we shared out 4 decks to relevant functions, plus an executive summary

There was still room for improvement. The product design team ran a retrospective on the difficulties we had encountered running surveys in the past

A screenshot of the retro document

We decided that it would be best to make the survey a team effort from beginning to end

  • We also needed to introduce more standardization and reusable processes for faster analysis

  • We also needed clear target audiences for our final presentations. Our individual share-outs were well received, but multiple people asked us afterwards for additional slices and cuts of the data

Each product designer worked with their product manager and software teams to add questions related to their domain

  • We divided and conquered the outreach and put all of the questions into a shared document

  • This created a list of all of the areas of interest

Our working document with all of the questions

We worked together to pare down the questions and normalize all the rating scales and terminology used

  • We met as a group and discussed all of the added questions. We wanted to balance gathering a lot of data while not wearing out users with too many questions

  • Some of us had used a 5-point scale and some of us had used a 7-point scale. We converged on a 7-point scale (see the sketch after this list for how old 5-point responses can be mapped onto it)

  • In addition, we annotated our questions by type (continuous variable, categorical, categorical comparison, etc.) for easier analysis later
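
As an illustration, a linear map lines the two scales up end to end: new = (old - 1) * 6 / 4 + 1, so 1 stays 1, 5 becomes 7, and the old midpoint 3 lands on the new midpoint 4. Below is a minimal sketch of that mapping as a Google Sheets custom function in Apps Script; the function name is hypothetical, not from our actual template.

/**
 * Maps a response on the old 1-5 scale onto the new 1-7 scale.
 * Endpoints line up: RESCALE_5_TO_7(1) = 1 and RESCALE_5_TO_7(5) = 7.
 * @param {number} value A response on the 1-5 scale.
 * @return {number} The equivalent value on the 1-7 scale.
 * @customfunction
 */
function RESCALE_5_TO_7(value) {
  if (value < 1 || value > 5) {
    throw new Error('Expected a response between 1 and 5');
  }
  return ((value - 1) * 6) / 4 + 1; // e.g. 3 -> 4, 4 -> 5.5
}

Used in a cell as =RESCALE_5_TO_7(B2), a function like this keeps older responses comparable with new ones on longitudinal graphs.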

The designer best at Google Forms added in the conditional logic

  • One of the product designers translated the Google Doc into a Google Form. Although adding in the logic itself is simple, keeping track of everything is hard! (A sketch of the branching follows this list.)

  • We did not use a different, more feature-rich surveying tool due to budget
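
To give a sense of how that branching fits together, here is a minimal Apps Script sketch of a screener question that routes each participant to their domain's section. The titles and sections are illustrative rather than our actual survey, and a real form would also need navigation at the end of each section to skip past the others.

function buildBranchingSketch() {
  const form = FormApp.create('In-house software survey (sketch)');

  // The screener question everyone answers first.
  const screener = form
    .addMultipleChoiceItem()
    .setTitle('Which area best describes your work?');

  // One page break per domain; choice navigation jumps to these.
  const automation = form.addPageBreakItem().setTitle('Lab automation');
  const operations = form.addPageBreakItem().setTitle('Functional R&D and scientific operations');
  const programs = form.addPageBreakItem().setTitle('Customer-facing R&D programs');

  // Each answer routes the participant to the matching section,
  // so they only see the questions relevant to them.
  screener.setChoices([
    screener.createChoice('Lab automation', automation),
    screener.createChoice('Functional R&D / scientific operations', operations),
    screener.createChoice('Customer-facing R&D programs', programs),
  ]);
}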

A screenshot of our survey in edit format with the conditional logic

Because I worked in person the most, I did a lot of the recruiting at our main office

  • We advertised through flyers and on Slack that we would be bringing in treats to entice users to fill out the survey

  • I baked two types of cookies and sat in the kitchen to encourage participants to fill out the survey

  • We confirmed that our audience was very treat-motivated: this time, we had about a 20% response rate among our target audience and no complaints!

The flyer I posted on walls and doors around the main office to advertise the survey (and the reward)

While we were recruiting, another product designer who was very good at data visualization helped create templates in Google Sheets to make it easier for us to analyze the data

  • This internal process development led to a huge time savings and efficiency gain during the analysis portion

  • This also created a consistent visual language across all of our slide decks

  • We did not use a different, more feature-rich analysis tool due to budget

  • It took some time for us to import all of our old data into the template for longitudinal graphs, but this is a one-time task! (A sketch of the import follows below.)
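
As a rough illustration of that import, an Apps Script snippet can append each year's per-question averages onto a single long "Longitudinal" sheet that the trend charts read from. The sheet names and column layout here are hypothetical, not our actual template.

function appendYearToLongitudinal(year, sourceSheetName) {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const source = ss.getSheetByName(sourceSheetName);
  const target = ss.getSheetByName('Longitudinal');

  // Assumes the source sheet holds one row per question,
  // [question, average score], beneath a single header row.
  const rows = source.getDataRange().getValues().slice(1);
  rows.forEach(function (row) {
    // One row per (year, question) pair keeps charting simple.
    target.appendRow([year, row[0], row[1]]);
  });
}

Storing the history as one long table like this means each new survey only appends rows instead of restructuring the sheet, which is what makes the import a one-time cost.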

The template spreadsheet he set up. He also included step-by-step screenshot instructions on the side!

After we collected the responses, the team worked together to speed up survey analysis

  • We split up the quantitative data analysis amongst 3 designers and the qualitative responses amongst the other 3

  • We were then able to create tailored slide decks for specific software groups, and when there were additional questions during those presentations, we could easily slice the data to find answers.


Solution

We created 5 share-out decks for different audiences with the data from the survey

  1. Lab Automation

  2. Functional R&D and scientific operations

  3. Customer-facing R&D programs

  4. Digital Technology Executive Summary

  5. Functional R&D and scientific operations (for project managers)


Results

From designing to sharing out the survey results, the entire process took 3 months

  • 1 month to design the survey

  • 0.5 months to collect survey responses

  • 1.5 months to analyze the results — the turnaround time for the final decks was where we gained the most time

Previously, the process took at least 5 months. This new process took only 3 months to complete from beginning to end, including designing new processes and importing old data. Now that we have the questions and data analysis designed, there should be minimal design work and much faster analysis. We estimate that the next survey will take < 2 months to complete.

Results from the surveys were used to create product recommendations for our different teams

A screenshot of our executive summary

A screenshot of our customer-facing R&D deck

A screenshot of our functional R&D deck

A screenshot of our lab automation deck

We built better team processes

  • Through trial and error, we have settled on a reusable set of questions, data-analysis templates, and deck templates that can continue to speed up the process in the next iteration

We utilized our strengths during the survey process to speed things up for everyone

  • Each member of the team had a unique combination of digital and people skills, and we leveraged each of our strengths to speed up the process (and make it more enjoyable as well)

  • We had previously only consulted on each other's projects to review designs or bounce ideas around. It was really nice and productive to actively collaborate with everyone on one project