UTSC Library Website — Card-Sorting Study

Posted 21 April 2020 by Joshua Shum

This is one of four UX projects conducted during my time as the User Experience Librarian Intern at the University of Toronto Scarborough (UTSC) Library.

As part of an ongoing process to facilitate user-centred services and spaces at the University of Toronto Scarborough (UTSC), I partnered with the Web & User Experience (UX) Librarian to conduct Phase 3 of the UTSC Library Website redesign project. In the spirit of design thinking, library spaces and services should undergo iterative testing after implementation.

Previously, Phase 1 (Summer 2016) and Phase 2 (Summer 2018) involved physical card-sorting and task-based usability testing conducted with undergraduate students at UTSC. These led to several positive information architecture changes, including the following:

  • Revised the UTL course reserves module (prioritizing course code over course name/instructor).
  • Recategorized pages (e.g. "Old Exams Repository" moved from Research to Services).
  • Combined the top-level Visit and About menu categories.
  • Added new sub-menus (e.g. Technology).
  • Eliminated unclear or unnecessary sub-menus (e.g. Collections, Manage your content, and Use our spaces).

However, both phases identified persistently problematic areas, or "inconclusives". This necessitated Phase 3, with the following goals and objectives:

  • Test whether changes made during Phase 1 & 2 were effective.
  • Gain clarity on “Inconclusives” from Phase 1 & 2.
  • Gather evidence on where best to place new web pages.
  • Focus on card-sorting over task-based usability testing.

Methodology

To gather more reliable results, we aimed for a methodology that would allow us to increase the number of participants without requiring a large amount of time and resources from moderators.


We decided to use Optimal Workshop, an online suite of usability-testing tools, to host our card-sorting study. With a promotional educational license, we were able to design a digital hybrid card-sorting study.

UTSC Library Website — Hybrid Card-Sorting Study

How:    Optimal Workshop
Who:    Any individual affiliated with UTSC (18+)
What:   Participants sort cards (webpage titles) into preset or custom categories (main menu labels)
When:   February 20, 2020 – March 20, 2020
Where:  Virtual/digital card-sorting study
Why:    To facilitate a more positive user experience for website visitors

User Groups

As Optimal Workshop offers automatic data processing and analysis, we were able to expand our research scope to include all UTSC-affiliated individuals aged 18 or older, rather than only undergraduate students. To this end, we identified the following user groups:

  • Part-Time Undergraduate Student
  • Full-Time Undergraduate Student
  • Part-Time Graduate Student
  • Full-Time Graduate Student
  • Alumni
  • Staff
  • Faculty

TABLE 1. Why a Virtual Card-Sort?

                       Phase 1 & 2                Phase 3
Participants           15 per phase (30 total)    73 (80 total; 7 rejected)
Time Commitment        45-60 min. per sort        8 min. (average) per sort
Moderators Required    2                          0
Cost                   15 x $10 gift cards        1 x $25 gift card + 1 x free educational license*

* Value of $153–$199 USD per user, per month.


Promotional Materials

The study was promoted through the following channels (click the images below to enlarge):

  • Social Media posts [Twitter (Fig. 1); Instagram (Fig. 2)]
  • Flyers posted in the library and in washroom stalls [Fig. 3]
  • The weekly UX Pop-Up Table [Fig. 4]

STUDY DESIGN

Contents

  • 2 screener questions
  • 9 pre-study questions (demographics)
  • 37 cards (existing & draft pages)
  • 6 preset categories
  • 8 post-study questions (feedback & qualitative questions)

Card-Sorting Study Questionnaires, Cards, & Categories


Card-Sort Instructions

by Optimal Workshop
Take a look at the list of items on the left. We'd like you to sort those items into groups and list them in an order that makes sense to you.

Use the groups provided or create your own by dragging and dropping an item from the left into the space on the right.

There is no right or wrong answer. Just do what comes naturally. When you're done click "Finished" at the top right.

VIDEO DEMO

Above is a video demo of the card-sorting study in action. Want to see how it works? Click the button below to try it yourself!


KEY FINDINGS

To analyze our data, we examined the fully completed card-sorts (n=42). Using Optimal Workshop's data analysis tools, we charted these data sets into the Standardization Grid (Figure 6) below. A 60% agreement score, based on Nawaz (2012), was initially used to identify "actionable" and "inconclusive" items.


Accordingly, we initially categorized cards along a Classification Scale (Table 2), where the number (out of 42) refers to the highest number of participants who placed a given card in the same category, as also seen in the Filtered Standardization Grid (Figure 7):

TABLE 2. Classification Scale

CLASSIFICATION                   HIGHEST PARTICIPANT DISTRIBUTION
Actionable                       26+ (rounded up from 25.2, or 60% of 42 participants)
Reasonably Actionable            20–25
Somewhat Reasonably Actionable   15–19
Inconclusive                     11–14
No Agreeance                     0–10

In this context, a 60% agreement score equated to 26 of the 42 participants. At this threshold, 9/37 cards were initially categorized as "actionable" and the remaining 28/37 as "inconclusive".
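
As an illustration of the classification logic above, here is a minimal Python sketch. It assumes a simple mapping from each card to its per-category placement counts; the classify_card helper and the example counts are hypothetical and not part of Optimal Workshop's tooling or export format.

    # Minimal sketch of the classification thresholds described above
    # (hypothetical; not Optimal Workshop code or its export format).
    import math

    N_PARTICIPANTS = 42
    ACTIONABLE_CUTOFF = math.ceil(0.60 * N_PARTICIPANTS)  # 25.2 rounded up to 26

    def classify_card(category_counts):
        """Classify a card by the highest number of participants who agreed on one category."""
        top = max(category_counts.values())
        if top >= ACTIONABLE_CUTOFF:
            return "Actionable"
        if top >= 20:
            return "Reasonably Actionable"
        if top >= 15:
            return "Somewhat Reasonably Actionable"
        if top >= 11:
            return "Inconclusive"
        return "No Agreeance"

    # Hypothetical example: 27 of 42 participants placed "Old Exams Repository" under "Services".
    print(classify_card({"Services": 27, "Research": 10, "About": 5}))  # -> Actionable

Under these thresholds, a card is only treated as "Actionable" when at least 26 of the 42 participants place it in the same category.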


To seek further clarity, we conducted a qualitative analysis of the participant questionnaires to identify common themes, including positive and negative feedback. We also re-examined all cards and categories in their existing contexts to identify any factors to consider before deciding on changes.


Through this qualitative assessment (Q.A.), we were able to act (or not act) on 19/37 cards, changing or maintaining their locations in the navigation menu with the support of reliable user research data:

CLASSIFICATION                   INITIAL        POST-Q.A.
Actionable                       9/37 cards     9/37 cards
Reasonably Actionable            10/37 cards    8/37 cards
Somewhat Reasonably Actionable   12/37 cards    2/37 cards
Inconclusive                     6/37 cards     17/37 cards
No Agreeance                     0/37 cards     0/37 cards

Although most participants expressed confidence in their sorting, their placements were often inconsistent with the existing card/category groupings, and the distribution of cards varied widely. This indicates that there is room for improvement.


For more detail, please click the button (below) or see the following sections.

Click here to view the full results and analysis of the study!  

NEXT STEPS

In total, we were able to act on 19/37 cards (categorized as "Actionable", "Reasonably Actionable", or "Somewhat Reasonably Actionable"). We were pleased to see that the locations of the majority of these cards already aligned with the existing website's navigational structure, warranting no further action. However, the remaining 18/37 cards will require further testing.


Our next step is to formally propose some of these changes to the library administration. Beyond relocating certain pages under different main menu labels, we are proposing the removal of the UTSC menu label (Figure 8), which has proven to be a significant source of confusion. Because this menu label links to campus-specific resources rather than UTSC Library-related pages, we believe that removing it from the main navigation would reduce confusion for website visitors.


Following the example of Robarts Library—the University of Toronto Libraries' (UTL) main library—we propose moving the UTSC link to a separate area in the header (Figure 9).


PROJECT TEAM

UTSC Library Website — Card-Sorting Study

Joshua Shum
User Experience Librarian Intern
joshua.shum@mail.utoronto.ca
Sarah Guay
Liaison Librarian · Web & UX Librarian
sarah.guay@utoronto.ca