Credit Union Tellers’ Counter Program – Part 1 – Analysis of Existing Solution

COMPANY: Credit Union (Fundamentals in UX Design CA for IADT)
TEAM: Agnieszka Przygocka, Jill O’Callaghan
MY ROLE: UX Researcher, UX Designer
TOOLS: Google Docs, Google Forms, Figma, Miro, ScreenFlow
METHODS: Heuristic Evaluation, Task Analysis, Usability Testing, Think Aloud, SUS, Paper Prototype
TIME: Nov – Dec 2020

This part contains a critical appraisal of a digital product in terms of its usability and the user experience it provides, based on contemporary usability heuristics and user experience principles.

UNDERSTANDING THE PROBLEM

We were provided with a video demonstrating the Counter Program by Wellington-IT.
Our first impression was that the system is very complex, outdated, and impossible to use in an intuitive way.

Because of its complexity, we decided to take screenshots from the video provided and map each of the completed tasks step by step. This gave us a deeper understanding of the current system and helped us decide which methods to use in the next steps.

Existing Application Flow Analysis

We decided to work on this task individually for learning purposes. It was crucial that both of us understood how the system works before the next phases of our project.

From an interview with one of the stakeholders, we were able to factor in some of the business goals. The most important were reducing transaction processing time and providing a streamlined experience to credit union tellers, which gives credit unions a chance to reduce costs. Onboarding, in-system help, and ease of use were also important to our stakeholders and could reduce the cost of training.

COMPETITIVE ANALYSIS

To better understand the industry, we analysed multiple competitors' solutions for loan processing and banking applications, and even collected slips from TSB to understand how transactions are processed and what type of information is required. We also analysed indirect competitors, such as MicroBiz POS, looking for interesting solutions and features.

The competitive analysis helped us identify common patterns in the design of enterprise software and dashboards, and how some of the operations are processed.

Competitive Analysis
HEURISTIC EVALUATION

After analysing the video, it became apparent that the current system does not adhere to multiple UX and accessibility guidelines and does not provide help for its users. There were problems in multiple areas, and to identify specific issues we decided to conduct a heuristic evaluation.

We used the heuristic evaluation process recommended by Euphemia Wong (Wong, 2020).

Heuristic Evaluation - Steps

The most time-consuming part was compiling a list of relevant heuristics. I used Nielsen's 10 Usability Heuristics (1994) as categories for our more specific heuristics.

I chose relevant guidelines from the four main sources:

  • Andy Budd’s 9 Heuristics for Modern Web Application Development
  • Roman Zadyrako – 5 Heuristics of User Onboarding
  • Userfocus – 247 web usability guidelines
List of Heuristics

A new learning for me was adding accessibility-focused guidelines to the list. Deque webinars were an invaluable source of knowledge on how to assess accessibility during the design phase.

  • Deque – A Practical Framework for Evaluating Designs for Accessibility

Because of the limitations of this project, both of us acted as evaluators, although the practice recommended by Euphemia Wong is to recruit four experts to evaluate the system.

We collated the results of our evaluation in a spreadsheet and calculated the percentage to which the system complies with the guidelines. Each checklist item was rated -1 (doesn't comply with the guideline), 0 (partially complies), or 1 (complies). If a guideline was not relevant, the field was left blank.
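The scoring scheme above can be sketched in code. The write-up doesn't document the exact spreadsheet formula, so this sketch assumes each rating on the -1..1 scale is rescaled to 0–100% and blank (not relevant) items are excluded before averaging; the function name and the rescaling are my assumptions, not the original spreadsheet's.

```python
def compliance_percentage(ratings):
    """Return the percentage to which a group of heuristics is met.

    ratings: list of -1 (doesn't comply), 0 (partially complies),
             1 (complies), or None (not relevant, excluded).
    Assumed mapping: each rating is rescaled from -1..1 to 0..100%.
    """
    scored = [r for r in ratings if r is not None]
    if not scored:
        return None  # no applicable guidelines in this group
    # Rescale each rating to 0..1, average, and express as a percentage.
    return sum((r + 1) / 2 for r in scored) / len(scored) * 100


# Example: one evaluator's ratings for a group of five guidelines,
# one of which was not relevant.
print(compliance_percentage([-1, 0, 1, None, -1]))  # 37.5
```

Averaging per group of heuristics (rather than over the whole checklist) is what makes it possible to see which of Nielsen's categories the system fails most badly.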

 
Heuristic Evaluation Results

The spreadsheet template we were using was created by me for my past projects. It was a great help when collaborating with my partner who was new to heuristic evaluation. Jill had no problems using it and we completed the task without issues.

The results of the evaluation were summarised in the same spreadsheet. In the Summary tab, a score for each group of heuristics was calculated. The average scores given by each evaluator were quite low, 0–44%; that is the percentage to which the Counter Program conforms to the listed heuristics.

Heuristic Evaluation - Results Summary

I collated the findings in two lists: positive findings and issues.

Identified Issues
Positive Findings

Based on these findings, I mapped solutions and recommendations in an affinity diagram and discussed them with my partner Jill. That map helped us when designing the new solution.

Potential Solutions to the Problem

More details and documents on heuristic evaluation can be found in Appendix A – Heuristic Evaluation.

Heuristic Evaluation Process
PROCEDURAL TASK ANALYSIS

Initial analysis of the tasks presented in the video gave us a good understanding of what the Counter Program is all about. Procedural task analysis was our attempt to extract information from our screenshots, strip away the outdated UI, and focus purely on the functionality. Jill's experience from the credit union was invaluable: she understands how transactions are handled and was able to analyse the tasks in great depth. She put together the Task Analysis.

Procedural Task Analysis of Existing Application

I later came up with a simplified version, which helped us stay on track with the design without the distraction of unnecessary steps.

Procedural Task Analysis of Existing Application

With all the complicated menus, we were quite aware that the part of the application we were working on is just a small fragment of a bigger system. I thought that understanding how broad and deep the system is could help us make better design decisions in the next phases.

It was a great learning experience for me to use the Task Hierarchy method recommended in “User and Task Analysis for Interface Design” by JoAnn T. Hackos.

Task Hierarchy

From the Task Hierarchy, we concluded that we needed to design a solution that is easy to scale, provides navigational components to accommodate more options, and is designed for consistency, with components that can be reused on other application screens.

Part 2 - User Research

Collecting and examining information in order to empathise with users and identify their needs and scenarios of use.

Part 3 - Building Paper Prototype


Designing and constructing a low fidelity prototype of a solution to address a user need by applying principles of design thinking, problem-solving, and critical thinking.

Part 4 - Evaluating Final Solution

Usability testing and heuristic evaluation of the final prototype.