I often get asked questions about the relationship between service design (SD) research and user experience (UX) research. The answer is very simple, but communicating that simplicity is not easy. This post will attempt the briefest, clearest answer possible.
Explaining the difference between the two forms of research will require briefly explaining the relationship between UX and SD, so let’s start there.
It can be helpful to think about the relationship in terms of design materials. Every design discipline shapes a certain kind of material to produce value for the people who use it.
The differences sketched out above should set the stage for understanding six main differences between SD and UX research. Most of the differences are matters of degree, and of course, there are always exceptions.
One key point that should be noted is this: Service design does not in any way replace UX, nor does service design research replace UX research. Rather, service design helps UX do a better job of designing touchpoints that support the larger service experience.
Another thing worth noting: none of the six key differences I list are matters of technique. The toolbox of techniques used in service design overlaps heavily with that of UX, with only minor variations in how the techniques are applied. A UXer attending a service design research session is unlikely to see any completely novel methods but is very likely to be struck by the breadth of material covered and the rapid pace of the sessions. They might even feel anxious about an apparent lack of thoroughness. This is by design, though, and I hope that what follows (in #3 and #4) will shed light on why.
Because the material of service design is the whole organization, many people are involved in the design of a service. A typical service design may change organizational processes, IT systems, policies, physical spaces, call center scripts, even how departments are organized. To improve the chances these changes will be made, it is important that the people who will be making the changes (or will be affected by these changes) understand why these changes are worth the effort and discomfort. If people reject the research or dispute the design decisions, change will not happen. Alignment of understanding is absolutely crucial.
The best way to create this alignment is to bring as many people as possible along to help with the field research. Service design research is the ultimate alignment tool. When respected, representative stakeholders from across the organization participate in research, witness firsthand how the service affects people’s lives, and contribute their own disciplinary knowledge and perceptions to the effort, insights from the field are deeper, more impactful, and more credible across the organization.
UX also benefits from client participation, of course, but can normally win sufficient alignment with far fewer people.
It is important to note that among the most important stakeholders to include are UX designers, who will derive many of the same benefits from SD research as they would from UX research, or at least from foundational or generative UX research. (More on this below.)
Because a service is designed for both those receiving the service and those providing and supporting it, the research is done with multiple types of participants situated in different parts of the experience, both front-stage and back-stage. Additionally, because services are experienced in many places and through many different channels, it is often necessary to conduct research in multiple kinds of settings. For instance, a service design team might investigate how a service is experienced in the home and in a retail space, and how the service is delivered digitally, by voice, or in person. While UX research recruits a variety of cohorts and considers use in a variety of contexts, service design expands the number of participants, roles, and settings well beyond what is typically considered in UX.
Service design’s scope is vast compared to UX’s. Not only must we investigate more actors and more settings, we often use different approaches for each of them, to help the team get insights into how the service comes together and how it is experienced by everyone involved. Further, because service design is trying to piece together a whole experience as it unfolds over time and zig-zags across channels, the scope of each research session tends to cover longer spans of time than a typical UX project, and it investigates a participant’s experience with equal attention wherever it leads, regardless of channel. In UX research, time spans are often briefer, and non-digital channels are treated as the context for the UX design, not as something that might itself be redesigned.
Because its scope is so broad, service design does not go into the detail and depth that UX design does. This is why no service design should ever go directly into implementation. Service design only defines UX design problems; it does not resolve them. Every touchpoint, digital or otherwise, defined by service design requires further work by design specialists who have mastered the craft of designing that type of touchpoint, with service designers staying involved to ensure the service as a whole remains consistent and seamless.
UX research is concerned with gathering insights that will guide the detailed design of a digital touchpoint. It seeks deep, detailed information on the person’s use context, mental models, vocabulary, physical and perceptual abilities, and so on. Service design research only skims the surface of these questions, focusing instead on how the touchpoints are integrated with one another and with other components of the service.
In evaluative research, service design only validates concepts at a high level, concerning itself mostly with the usefulness and desirability of touchpoints in the context of the broader service, and deemphasizing usability to the greatest degree possible.
The only part of UX research that service design mostly replaces is the foundational research, and even there, only partially.
As mentioned above, service design research is the ultimate alignment tool. To stabilize and refine alignment, service design teams often conduct analysis socially and in the open, preferably in a location where members of the organization can drop in and participate. At Harmonic, we have called this an “open research studio.” The analysis is intentionally visual and easily digested. Stories and other insights gathered from the field are displayed in ways that invite conversation and direct collaboration on the artifacts.
The process of collaborating encourages cross-disciplinary conversations and produces a shared understanding that is relevant and comprehensible to everyone in the organization. And because so many people have had direct involvement in shaping that understanding, it is likely to be complete and credible.
Finally, returning to the earlier point that the material of service design is organizations: a material this complex is too much for any single mind to contain or any single talent to shape. The whole organization must be mobilized in redesigning itself to deliver better experiences, and everyone must learn to function as a member of a design team. To make this transition as intuitive as possible, many service design research outputs are designed specifically to serve as large-scale collaboration tools. Most service design research findings include various kinds of experience maps, meant to be physically hung on a wall or otherwise shared so that teams of people can use the surface as a canvas for collaborative work.
Of course, design research findings are always tools used for designing. But because the interpreters of other kinds of design research are usually experienced designers who already know how to turn findings into design decisions, researchers are free to emphasize the content of the findings over their form. The fact that many of the people doing the design in service design are inexperienced in working that way places special demands on service designers to think more about form, and to make their findings not only easy to understand but easy to use in support of opportunity identification, ideation, and evaluation.
Wow, that was not brief after all. But I hope it was at least clear. I’ll make one more attempt at brevity with a summary: