Hey, I’m looking for some early visionary designers or marketers involved with hospital websites to try out what I believe could become a useful way to baseline how well healthcare websites serve patients.
Here’s the scoop: at Connective DX, I’ve worked with teams on both coasts as they’ve conducted primary research with patients to understand their needs and how digital tools can help them connect with health resources. We frequently assess how healthcare sites perform against one another, and we’re always on the lookout for great work that shows the direction for improvement. So, over the last few months, I’ve assembled dozens of tests that can help identify how well served patients are by hospital websites, and tested them internally to see which really point to improved patient interactions.
What I’ve found is that there is simply no substitute for human experience. Earlier efforts assessed hundreds of websites with automated tests; this accomplished broad coverage, but produced results with little connection to actual patient experiences. For example, recency of page updates was taken as a sign of content quality. By that thinking, the Bible or Shakespeare would be inferior content to a politician’s constantly updated Twitter account. The presence of a photo carousel on the home page was counted as evidence of brand sophistication. We quickly found that user testing can’t be mechanically replicated without users.
So, the approach I’ve advanced for the HDX Index combines expert visitors testing how users would accomplish specified tasks; a wide range of external measures showing how these sites are used by real visitors; tools that measure things such as the reading level required to understand content; and tests of how well pages adapt for mobile viewing, along with how quickly they render on a variety of devices. Results are organized into four separately scored sections, which combine for an overall score on a 100-point, non-normalized curve.
Making tools and rubrics that peers and clients adopt to advance digital practices generally is a big part of the culture at my agency, Connective DX. My management consulting work has helped prove out a number of assessments, such as the DX-7 digital maturity self-assessment. Our development teams open-source frameworks, and in that same spirit we’re undertaking this effort to provide a way for others to understand their sites, and as a potential shared way to explain just what makes an element a “best practice.” I can already tell that the accumulation of data and visual examples over time may provide insights on the direction and speed of change in our industry.
Over the next few months, I’m looking to spend time with digital teams who are willing to exchange insights. We want to see which parts of the Index are useful and interesting to teams, and find any places where we might see things differently or need to take extra care in assessing these digital experiences.
You can download an abstract that shares more detail on which measures fit with which sections. And if you’d like to find out more, I’d encourage you to get in touch with me, or with the talented and wise Sydney Woods, who has done great work in helping us move forward on this.