
Human-Centered Design: Research

* This content was originally published prior to N. Harris Computer Corporation’s 2022 acquisition of the Allscripts Hospital and Large Physician Practice business segment. Our business is now known as Altera Digital Health.


Editor’s note: This is the third blog in a series on Human-Centered Design (HCD).

Research. My first love.

When I was a young designer creating visually collaborative software, I loved learning about new ways of displaying information on a screen. At the time, everything was brand new in the industry, and we were all learning about visual patterns: what worked, and what didn’t.

What I loved most was listening to people’s stories, experiencing their lives with them, finding out more about their core values and needs in their work and home lives. Sitting in their living rooms, watching their behavior patterns as they went through their days. I would sit for hours and watch, take copious notes, and later look for emerging patterns across the many humans I observed. This is where Human-Centered Design truly comes to life for me. Our success as Designers starts in the Research.

HX Research, what is it?

Human-Centered Design (HX) Research is the practice of understanding the behavior of a system, a group, or a single human at a deep, intrinsic level.

There are many different methodologies and practices to get to that deep understanding, and choosing the right ones depends on the outcomes the business wants to achieve. Below, I’ll walk through a few HX Research methods we use at Allscripts to achieve our business goals. I’ll also examine a couple of methods that aren’t as helpful as they may appear.

Literature Review.

Good for: Understanding a domain enough to ask the right questions later.

A literature review involves a deep gathering of data and write-ups to capture all of the goodness about a certain subject and communicate the information internally in a digestible way. This helps the team gather inspiration from various domains, social comparisons and analytical research to shape the field work and design directions. It also helps communicate to executives that the team has done its homework and each member is a credible thinker. The team brainstorms which direction it wants to pursue based on its Mission Statement and then divides the work. Each member returns with many data resources to share with the team, including a short summary of key findings and patterns seen across all of the referenced literature. Don’t forget to communicate the references to the group. It’s surprising how many people will look for those papers later.

HX Competitive Analysis.

Good for: Understanding competitors (from an HX perspective) to build a “best practice” framework for the team, identify key innovations, and learn what not to do or repeat.

This competitive analysis is not typical market research. Rather, HCD looks at screen design, usage, reasoning and the goals a person may have when using a product. It answers, “What’s easy?” “What’s hard?” “Where is the ‘disrupting’ gap in the products?” and “What patterns are competitors following that this design team may want to avoid?” It’s also important in the competitive analysis to look to other industries for inspiration. It’s possible there is a best practice that the team can apply to this industry to make the product better. A short summary of the key findings/patterns from the competitive analysis helps executives and clients further their design mindset.

Contextual Inquiry.

Good for: Understanding humans, their true needs, core values, environment, relationships, and specific behaviors; understanding why people do something so the team can design the “what” for them later.

First described in the early 1990s by Hugh Beyer and Karen Holtzblatt as a method for human behavior data collection and analysis, Contextual Inquiry is the deepest and most detailed of all of the HX Research methods. It involves planning, collecting and synthesizing human behavior data by watching people perform a task in the context of their day. The method has an ethnographic style borrowed from anthropology, and helps the team understand the intrinsic and extrinsic needs of the people who participate. Once the data is collected, the team makes the data visible and synthesizes it through a series of models, looking for patterns in culture, environment, relationships and sequential actions. The outcome is a data-rich set of key findings that drive innovation.

Using this method catapults the HX Researcher into an empathizing state, where they not only understand the problems and needs, but they actually feel what the research participants feel.

Heuristic Evaluation.

Good for: Finding problems and sticking points with an interface or product.

A bit of a sleeper method, Heuristic Evaluation is very useful if the team wants to change an existing interface or design something new and better.

Heuristic Evaluation is the analysis of an interface to determine its level of compliance with recognized usability principles. It’s performed by expert HX Designers to give direction to Design and Development teams for next steps and interface fixes. The method uses Jakob Nielsen’s “10 General Principles for Interaction Design” and enables the design team to do three things: examine the screens through an HX lens, evaluate areas for improvement and recommend a path to fix or improve. An example: A team of five skilled designers reviews a product, making note of the “violations” of the heuristics; they compare notes and create a piece of communication showing the collective patterns and group recommendations. This method can help guide the product, design and development teams to create a better product based on human needs, desires and modern-day comfort with technology. Really, it’s foolproof, and can help executives see the problems with their product firsthand.
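To make the note-comparison step concrete, here is a minimal Python sketch of how a team might tally heuristic “violations” across independent evaluators to surface the collective patterns. The reviewer names, heuristic labels and screens are invented purely for illustration; real teams typically do this in a shared document or spreadsheet rather than code.

```python
from collections import Counter

# Each evaluator independently notes violations as (heuristic, screen) pairs.
# Reviewers, heuristics and screens below are hypothetical examples.
evaluator_notes = {
    "reviewer_1": [("Visibility of system status", "Login"),
                   ("Error prevention", "Order entry")],
    "reviewer_2": [("Visibility of system status", "Login"),
                   ("Consistency and standards", "Settings")],
    "reviewer_3": [("Visibility of system status", "Dashboard"),
                   ("Error prevention", "Order entry")],
}

# Count how many evaluators flagged each heuristic at least once. Issues seen
# by several independent reviewers are the collective patterns to fix first.
violation_counts = Counter()
for notes in evaluator_notes.values():
    for heuristic in {h for h, _screen in notes}:  # one vote per reviewer per heuristic
        violation_counts[heuristic] += 1

for heuristic, count in violation_counts.most_common():
    print(f"{count}/{len(evaluator_notes)} evaluators flagged: {heuristic}")
```

The real deliverable is still the shared write-up with screenshots and recommendations; the point of the sketch is only that convergence across independent reviewers is what separates a genuine pattern from one person’s opinion.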

Helpful, and not-so-helpful, methods

SME Interviews. The act of sitting in a conference room and listening to a Subject Matter Expert (SME) talk about what they want to see on a screen. This is not to discredit SMEs, but rather to take their expertise and integrate their perspective as a guide rather than a deciding factor.

The problem? I love SMEs. Please don’t get me wrong. What people say they want and what they actually need are two completely different things. SMEs (or what I call Internal Validators) are wonderful contributors to the Design knowledge base. Their perspectives can guide the team to understand a domain. SMEs help the design team formulate a best guess to guide the research. The issue comes in when the SME perspective is the only perspective that drives the design. One person’s word is not enough to make design decisions. Watching many participants in the field is a better method for data gathering.

Focus Groups. Gathering five or so stakeholders or SMEs in a room to give feedback together. Usually the facilitator has the product in hand and calls out questions to engage the group.

The problem? In this setting you are all but guaranteed to hear only from the loudest one or two people in the room, while quieter personalities may sit silent. If this happens, Designers don’t get what they need, and in some cases may be pointed in a direction that is incompatible with a quieter participant’s needs. I would take Contextual Inquiry over a Focus Group any day.

Caveat: A skilled facilitator can find ways to quiet the loud and encourage the quiet.

Surveys. The act of putting together questions in written form and sending them out to a self-selected group for solicited feedback. (See SME interviews above.)

The problem? Survey questions are extremely difficult to write well. There’s a lot of psychology behind them, and every word in a survey question counts. If questions aren’t written with skill, following a specific set of guidelines, the findings may show false positives or negatives. Humans also get survey fatigue quickly, especially in our “sound bite” culture. Any more than three questions, and you may not get the answers you need to guide your process. To gather participant needs, it’s much more informative to watch someone do their work.

Caveat: Well-written surveys are good for many things, though, like mass data capture of high-level opinions.

This is all to say, pick your methods carefully and with intention. Have a clear outcome in mind. With HCD, process is everything. How you travel is the journey.

Up next! The “secret sauce” of Human-Centered Design. Synthesis!

With Gratitude.

Editor’s note: Here are the previous blogs in the series:

Human-Centered Design: Where compassion meets technology

Human-Centered Design: The craft of scoping

 

Sources

  1. Beyer, H., & Holtzblatt, K. (1998). Contextual design: Defining customer-centered systems. San Francisco, CA: Morgan Kaufmann.
  2. Nielsen, J. (1994). Enhancing the explanatory power of usability heuristics. Proceedings of ACM CHI ’94 Conference (Boston, MA, April 24-28), 152-158.

 
