
Heuristic evaluation of a funding search tool

Aim

Identify potential usability issues in the backend of a university funding search tool by assessing the system's compliance with common usability heuristics, and then recommend solutions to tackle the major issues.

Problem

 

The University Information Service (UIS) at the University of Cambridge (UoC) wanted to create a form that would allow colleagues to add a new award for funding to the Funding Search Tool database. I was asked to evaluate the user interface for usability issues, and I chose to conduct a heuristic evaluation. This technique can identify up to 50% of usability issues, including the most severe, before development has started.

Process

1

Develop principles

I created an expanded version of Nielsen's 10 usability principles to account for some limitations in scope and flexibility.

2

Heuristic evaluation

I walked through the proposed flow of screens and recorded potential problems, taking note of the principles they violated.

3

Report

I summarised and ranked each problem I encountered in the flow, along with severity ratings and potential solutions for each.

While Nielsen's 10 usability principles are the most commonly used and well-known list for conducting heuristic evaluations, I've found them to be limited in scope and flexibility, and some of the principles are ambiguous and open to interpretation. I therefore decided to supplement these principles with insights from other recognised usability experts (Ben Shneiderman and Arnold Lund), and created a detailed description of each, complete with examples. In short, these principles were:

  1. Feedback (visibility of system status): The system keeps the user informed (e.g., with notifications) and provides feedback for every action (e.g., drag-and-drop functionality).

  2. Match with mental models (match between system and real world): The system uses language and concepts that are familiar to users, such as analogies from the real world, e.g., "emptying the trash can".

  3. Consistency and standards: Users can apply existing digital experience, such as the same actions and commands, to new system functionality for learnability, intuitiveness, and efficiency, e.g., Command+S for "Save" on a Mac.

  4. Aesthetic and minimalist design: The system avoids extraneous information, such as unnecessary images and text, and applies a visual layout that respects Gestalt principles of human perception. Less is more.

  5. Minimise cognitive load (recognition rather than recall): The system minimises the information users have to take in and remember. This applies to both the interface (e.g., provide a toolbar with appropriate grouping and labelling of available options) and the content (e.g., allow users to make new objects by copying, pasting, and editing existing ones). 

  6. Appropriate guidance (help and documentation): System guidance is context-relevant and embedded in the UI where possible, e.g., with hover-over text or hyperlinks. When providing instructions, the guidance refers to concrete actions based on the task at hand.

  7. Speed for frequent actions (flexibility and efficiency of use): The system maximises efficiency depending on the user's experience and context, with shortcuts for experts (e.g., keyboard commands) and for frequent actions (e.g., auto-fill).

  8. Navigation and escape (user control and freedom): The system clearly shows where the user is and where they can go, with easy exits to encourage exploration.

  9. Error prevention: The system reduces unconscious errors, and errors made due to a mismatch between the user's mental model and the design, with warnings and helpful constraints.

  10. Error recovery: When issues occur, the system provides an understandable, human message and a constructive solution to the error.

I evaluated the usability of the proposed designs against the above set of heuristic principles. The evaluation surfaced 15 problem areas that violated usability principles, which I ranked on a severity scale of 0 to 4. Each score was determined by the frequency of the problem at different points of use, its impact (the ease with which it can be overcome), and its persistence (whether users will be repeatedly bothered by the same problem once they know about it).
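As an illustration only (it was not part of the original report), the ranking step can be sketched in code. The issue names, factor scores, and the simple averaging of the three factors below are all hypothetical; in practice the severity rating is a single expert judgement informed by frequency, impact, and persistence:

```python
# Hypothetical sketch: ranking usability issues by a severity score
# combining frequency, impact, and persistence (each rated 0-4).

def severity(frequency: int, impact: int, persistence: int) -> float:
    """Average the three factors into a 0-4 severity score.

    A plain average is one possible weighting, used here only for
    illustration; an evaluator would normally assign the rating directly.
    """
    return round((frequency + impact + persistence) / 3, 1)

# Example issues (invented for illustration)
issues = [
    {"issue": "No feedback after form submission", "frequency": 4, "impact": 3, "persistence": 2},
    {"issue": "Ambiguous field labels", "frequency": 3, "impact": 2, "persistence": 4},
    {"issue": "No undo for deleted awards", "frequency": 1, "impact": 4, "persistence": 3},
]

for item in issues:
    item["severity"] = severity(item["frequency"], item["impact"], item["persistence"])

# Highest-severity problems first, so fixes can be prioritised
ranked = sorted(issues, key=lambda i: i["severity"], reverse=True)
for item in ranked:
    print(f'{item["severity"]:>4}  {item["issue"]}')
```

Keeping the three factor scores alongside the combined rating makes the prioritisation transparent: two issues with the same severity can still be distinguished by, say, persistence.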


Outcomes

 

The evaluation provided the following outcomes:

  • A list of usability issues, described and given a severity rating.

  • Prioritisation of issues based on severity and frequency.

  • Potential solutions and recommendations for improvement.

As well as quick insights into usability issues, the resulting report provides a foundation on which to base more extensive user testing at a later date, should the project owners choose.

Challenges and learnings

 

I wasn't directly involved in the decision-making for this project and was asked to do a usability evaluation with limited resources dedicated to the effort. As such, this evaluation was an attempt to validate a system that was already being built with the tools available, with little consideration of the goals and needs of the user. A decision needed to be made about whether committing resources to fixing the usability problems identified in the heuristic evaluation (some of which might seem minor and yet require a lot of development effort) was the right approach to development.

 

Fixing all the usability issues I outlined was not necessarily the most appropriate way of dealing with the issues from an overall project perspective. In my report, I made it clear that my recommendations might fix individual usability issues at a micro level, but didn't guarantee that the system would then meet user needs. The results give structure and potential priority to some usability issues over others, but don't necessarily deal with larger issues underpinning the usability of the system, such as whether the developers were using the appropriate platform for creating the form, which limited their design options.
