Using Data to Improve Design
When we set out to design a new product or reimagine an existing one, deciding what to focus on and knowing whether a change is working are critical to understanding and discussing the work. Design by its very definition has intent behind it; there is a motivation and a purpose that a design serves.
So how do we measure whether a design has been successful? In advertising, it’s relatively simple: if you run a new ad campaign and sales go up, the design of the ad was a success. But how do we translate that kind of 1:1 relationship to a complex web application with multiple touchpoints and a unique sales cycle?
This is where data comes in. Almost every large product organization you can think of today is constantly running experiments and collecting usage data, tracking how small changes to an algorithm or interface drive shifts in user behavior and interaction.
Data-driven Design
Making design decisions based on data and user analytics leads to more informed and effective solutions. By analyzing usage data and behavioral patterns, designers can optimize a design for better engagement and satisfaction.
This approach equips designers to identify pain points and areas of opportunity within the user experience. Data can also surface patterns that may never come up in contextual interviews: it’s a well-documented bias that people will say one thing in an interview or survey, expecting there to be a “right answer,” even when that answer doesn’t match their actual behavior.
Using data to inform design also fosters a more collaborative environment. It provides a common baseline of understanding for discussions and decision-making among team members, and integrating quantitative data with qualitative insights yields a more holistic understanding of the user experience.
Another important aspect is the ability to test and iterate on design solutions. A/B testing and usability testing are effective ways to gather evidence on different design options and determine which provides the best outcome for users. This iterative process allows for continuous evaluation and refinement of the design.
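To make the A/B part concrete, here is a minimal sketch of how a team might check whether the difference between two variants is more than noise, using a standard two-proportion z-test. The conversion counts, visitor numbers, and function name are hypothetical, and real experiments usually run through an experimentation platform rather than a hand-rolled script.

from math import sqrt, erf

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Return (z statistic, two-sided p-value) for the difference in conversion rate."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pool the two groups under the null hypothesis that both variants convert equally.
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    standard_error = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / standard_error
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal-approximation p-value
    return z, p_value

# Hypothetical results: the control (A) vs. a redesigned flow (B).
z, p = two_proportion_z_test(conversions_a=480, visitors_a=10_000,
                             conversions_b=560, visitors_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a p-value below 0.05 suggests the lift is unlikely to be noise

The exact threshold a team uses is a policy decision; what matters for the design conversation is that “variant B performed better” is backed by more than a gut read of the dashboard.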
Using a design approach centered on data also opens the door for cross-discipline collaboration. The conversation about design becomes more holistic, factoring in areas that fall under engineering, like performance metrics. A product designer is not expected to improve time-to-interaction or the relevance of a search algorithm’s results on their own, but these metrics clearly affect how a user experiences a product.
Measuring Success
Incorporating data into the design process also helps in setting measurable goals and tracking the success of design changes over time. This continuous feedback loop supports a culture of ongoing improvement and innovation, allowing teams to adapt and evolve their designs in response to user feedback and changing trends.
A few common metrics that teams use to track the success of design changes include Top Tasks, Customer Satisfaction (CSAT) ratings, and the System Usability Scale (SUS). These metrics all have a few things in common: they’re consistent, ongoing, and robust.
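As a concrete example of what one of these metrics looks like in practice, here is a minimal sketch of how SUS responses are scored. The standard instrument is ten questions answered on a 1–5 scale: odd-numbered (positively worded) items contribute their score minus one, even-numbered (negatively worded) items contribute five minus their score, and the total is multiplied by 2.5 to land on a 0–100 scale. The example responses below are made up.

def sus_score(responses):
    """Score one respondent's System Usability Scale questionnaire (0-100)."""
    if len(responses) != 10:
        raise ValueError("SUS uses exactly ten questions")
    total = 0
    for question, answer in enumerate(responses, start=1):
        # Odd-numbered items are positively worded, even-numbered items negatively worded.
        total += (answer - 1) if question % 2 == 1 else (5 - answer)
    return total * 2.5

# Hypothetical answers from a single participant, in question order.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0

Averaging this score across respondents each measurement cycle is what gives the team a trend line to watch, which is where the three properties below come in.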
A consistent metric is one whose definition does not change over time. For each of the metrics above, the team asks the same questions, worded the same way, every time. This provides a comparable baseline: if the metric trends in one direction or another, you know it’s likely because of a change to the product, not a change in the language of the questions being asked.
For a metric to be ongoing, it needs to be measured repeatedly at regular intervals. Depending on the size of the product and the frequency of the changes being evaluated, that might mean every month, every quarter, or every half-year. The key is that these are not ad-hoc, one-off measurements. To have reliable data against which you can judge the success or failure of a design, you need to be collecting it continuously.
Finally, and perhaps most importantly, these metrics are robust: they are collected from a large enough sample of the relevant user segments. If you’re testing a product with 100,000 monthly users, a sample of 5 is not going to produce statistically meaningful results, and the chances of a few outliers skewing the data are very high. The bar needs to be set at a sample size where the numbers have stabilized and can be trusted to reflect the majority of your users.
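For a rough sense of what “large enough” means, the standard formula for estimating a proportion gives a useful floor. The sketch below assumes a 95% confidence level and ignores the finite-population correction, which changes little for an audience of 100,000; the numbers are illustrative, not a universal rule.

from math import ceil

def required_sample_size(margin_of_error=0.05, z=1.96, expected_proportion=0.5):
    """Minimum responses needed to estimate a proportion within +/- margin_of_error.
    z = 1.96 corresponds to 95% confidence; 0.5 is the worst-case proportion."""
    return ceil(z ** 2 * expected_proportion * (1 - expected_proportion) / margin_of_error ** 2)

print(required_sample_size())                      # 385 responses for a +/-5% margin
print(required_sample_size(margin_of_error=0.03))  # 1068 responses for a tighter +/-3% margin

Either way, the answer is measured in hundreds of responses, not five, which is exactly the point the robustness requirement is making.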
Challenges with Data-driven Design
Challenges do exist in implementing data-driven design. As noted above, ensuring data accuracy and relevance is crucial; decisions based on flawed data can lead to undesired outcomes. It’s very easy for teams to fall into a trap where they don’t have enough participants, or the user group is not relevant to the tasks being assessed. In the desire to use the data anyway, the wrong conclusions get drawn, and the end user’s experience suffers for it.
Additionally, there can be a tendency to over-rely on data, potentially stifling creativity and innovation. While I am a fan of indexing closer to data than dogma, we hire designers for a reason. Brands with as much personality as Figma, Robinhood, and Duolingo don’t come from data alone; there are incredible creative minds at work developing the personality and joy those brands provide.
Striking the right balance between data-driven insights and creative intuition is essential for achieving the best results.
Wrapping Up
Using data in design offers a powerful approach to creating effective, user-centered solutions. By leveraging data to inform decisions, product teams can increase user satisfaction, improve performance, and achieve better overall outcomes. The integration of data into the design process supports a more objective, evidence-based, and multidisciplinary approach, outlining a path for continuous improvement and innovation.