Heuristic evaluation of a funding search tool
Aim: Identify potential usability issues in the back-end interface of a university funding search tool by assessing the system's compliance with common usability heuristics, and recommend solutions for the major issues.
The University of Cambridge (UoC) University Information Services (UIS) wanted to create a form that would allow colleagues to add a new funding award to the Funding Search Tool database. Given my experience, my independence from the decision-making for this project, and the limited resources available, I was asked to evaluate the proposed user interface for usability issues. I chose to conduct a heuristic evaluation because this technique can identify up to 50% of usability issues, including the most severe, before development starts.
I created an expanded version of Nielsen's 10 usability principles to address some limitations in their scope and flexibility.
I walked through the proposed flow of screens and recorded potential problems, taking note of the principles they violated.
I summarised and ranked each problem I encountered in the flow, along with severity ratings and potential solutions for each.
While Nielsen's 10 usability principles are the most commonly used and well-known list for conducting heuristic evaluations, I've found them limited in scope and flexibility, and some of the principles are ambiguous and open to interpretation. I therefore supplemented them with insights from other recognised usability experts (Ben Shneiderman and Arnold Lund) and created a detailed description of each principle, complete with examples. In sum, these principles were:
Feedback (visibility of system status): The system keeps the user informed (e.g., with notifications) and provides feedback for every action (e.g., drag-and-drop functionality).
Match with mental models (match between system and real world): The system uses language and concepts that are familiar to users, such as analogies from the real world, e.g., "emptying the trash can".
Consistency and standards: Users can apply existing digital experience to new system functionality for learnability, intuitiveness, and efficiency, such as the same actions and commands, e.g., Command+S for "Save" on a Mac.
Aesthetic and minimalist design: The system avoids extraneous information, unnecessary images, and superfluous text, and applies a visual layout that respects Gestalt principles of human perception. Less is more.
Minimise memory and cognitive load (recognition rather than recall): The system minimises the information users have to take in and remember. This applies to both the interface (e.g., provide a toolbar with appropriate grouping and labelling of available options) and the content (e.g., allow users to make new objects by copying, pasting, and editing existing ones).
Appropriate guidance (help and documentation): System guidance is context-relevant and embedded in the UI, e.g., with hover-over text or hyperlinks. When providing instructions, the guidance refers to concrete actions based on the task at hand.
Speed for frequent actions (flexibility and efficiency of use): The system maximises efficiency depending on the user's experience and context, with shortcuts for experts (e.g., keyboard commands) and for frequent actions (e.g., auto-fill).
Navigation and escape (user control and freedom): The system clearly shows where the user is and where they can go, with easy exits to encourage exploration.
Error prevention: The system reduces unconscious errors, and errors made due to a mismatch between the user's mental model and the design, with warnings and helpful constraints.
Error recovery: The system provides an understandable, human message and a constructive solution to an error.
I evaluated the proposed designs against the above set of heuristic principles. The evaluation surfaced 15 problem areas that violated usability principles, which I ranked on a severity scale of 0 to 4. I determined each score from the frequency of the problem at different points of use, its impact (the ease with which it can be overcome), and its persistence (whether users will be repeatedly bothered by the same problem once they know about it).
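Nielsen does not prescribe a formula for combining frequency, impact, and persistence into a single severity score, and the evaluation itself relied on judgement rather than automation. Purely as an illustration of the ranking logic, the prioritisation step could be sketched as follows; the issue names, the 0 to 4 factor scales, and the rounded-mean rule are all my own assumptions, not part of the original method:

```python
from dataclasses import dataclass

@dataclass
class Issue:
    """A usability problem found during the heuristic walkthrough."""
    description: str
    frequency: int    # 0-4: how often the problem occurs at different points of use
    impact: int       # 0-4: how difficult the problem is to overcome
    persistence: int  # 0-4: whether users are repeatedly bothered once they know about it

    @property
    def severity(self) -> int:
        # Assumed combining rule: the rounded mean of the three factors,
        # yielding an overall severity on the same 0-4 scale.
        return round((self.frequency + self.impact + self.persistence) / 3)

# Hypothetical issues, for illustration only.
issues = [
    Issue("No feedback after submitting the award form", 4, 3, 3),
    Issue("Internal jargon in field labels", 2, 1, 1),
    Issue("No way to undo deleting a draft award", 3, 4, 4),
]

# Rank the most severe problems first, as in the final report.
for issue in sorted(issues, key=lambda i: i.severity, reverse=True):
    print(f"[severity {issue.severity}] {issue.description}")
```

In practice the scores came from expert judgement, but making the combining rule explicit, even a simple one like this, keeps the prioritisation transparent and easy to revisit.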
I delivered the following outcomes. As well as giving quick insight into usability issues, they provide a foundation for more extensive user testing at a later date, should the project owners choose to pursue it:
A list of usability issues, described and given a severity rating.
Prioritisation of issues based on severity and frequency.
Potential solutions and recommendations for improvement.
Challenges and learnings
This project was an attempt to validate a system that was already being built around the tools available, with little consideration for the goals and needs of the user. A decision needed to be made about whether committing resources to fixing the usability problems identified in the heuristic evaluation – some of which may seem minor yet require a lot of development effort – was the right approach to development.
Fixing all the usability issues I outlined was not necessarily the most appropriate response from an overall project perspective. In my report, I made clear that my recommendations might fix individual usability issues at a micro level but didn't guarantee that the system would then meet user needs. The results give structure and priority to some usability issues over others, but they don't address larger issues underpinning the system's usability, such as whether the developers were using the appropriate platform for creating the form, which limited their design options.