
User Experience
UX Group - User Testing Four Designs
Learning From Students: Usability testing the new Adelaide University course home page. We tested four elements (home page variants including Dynamic Home, course information layouts, the course roadmap, and banners) with 42 students. One-on-one testing proved most valuable; the results are now shaping the template and navigation.
Project background
In early 2025 our UX subgroup formed to address a long-standing pain point: page titling and navigation clarity in Canvas, particularly where numbers couldn't be used because of stackable curriculum requirements.
We met weekly in Miro to collect examples, frame a shared design challenge, and prototype naming conventions that improved readability, accessibility, and wayfinding. As those explorations progressed, Tim Churchward produced an early Dynamic Home concept—an integrated course home page that surfaces the essentials (course info, modules, assessments) without forcing students to detour through Modules.
That pivot set us up to validate four approaches directly with students and let the evidence guide the next iteration of the template.
Research goals
- Check whether our home-page patterns are fit for purpose on desktop and mobile.
- Identify navigation pain points and points of confusion.
- Learn student preferences for course information layouts, roadmap readability, and banner usage.
- Use findings to shape the default design, not run a popularity contest.
Methods
We used a mixed-methods approach:
- 1:1 usability tests (n≈10) with regional/online students.
- JotForm survey (n=32) run during on-campus sessions, plus small-group discussions.
- Facilitator + note-taker setup, with de-identified notes and AI-assisted transcription/synthesis.
Combined, 42 students contributed to the findings below.
Biggest lesson: one-on-one testing surfaced first-impression friction we couldn't see in surveys alone.
What we tested
We put four elements in front of students:
- Home page variants — the current template, a numbered home-page variant, and Tim's Dynamic Home.
- Course information — a three-separate-pages pattern vs a single tabbed page.
- Course roadmap — a table showing weeks, events, and assessments.
- Banners — five banner treatments and image-type preferences.
Key findings
Home page
- Clear preference for the Dynamic Home. Students described it as "easier on my brain," with better module & topic numbering and less visual clutter.
- Mobile matters: many students access the LMS on phones; single-column layouts and compact sections help.
- Students wanted: a quick assessment area, concise course summary/contacts, and on-page navigation (avoid the detour to Modules).
Course information
- 80% preferred the tabbed page over three separate pages—fewer clicks, easier scanning.
- Usability tests showed why: routing via the Modules page added an unexpected middle step and cognitive load.
- Improvements students asked for: CLOs above PLOs, clearer table typography, and consistent list styles.
Course roadmap
- Roadmaps are valued (most students said they're useful), but readability issues caused errors: some students misread event dates or couldn't spot due dates quickly.
- Expectations: explicit dates/times for assessments, clickable items that lead to details, and lighter, snappier tables (some effects lagged on low-powered devices).
Banners
- No single visual winner. Many students preferred no image or authentic course-related images over marketing/stock.
- Strong expectation that the course code appears on the banner.
- Overall sentiment: banners are nice-to-have, not a priority—get structure/navigation right first.
Design changes we made
- Home page: adopt Dynamic Home as the default direction; refine it for compactness and integrated on-page navigation (minimising page hops).
- Course information: reorder to CLOs → PLOs, unify list styles, and rebuild tables for readability and mobile.
- Roadmap: increase type size, tighten spacing, remove hover/cursor effects, show date ranges more clearly, and give Modules its own row to behave better on mobile.
- Banner tool: endorse use of the Media Team's tool, add a slimline height option, and include a course-code field.
What we'd do differently next time
- Split testing into two tracks:
  - Continuous improvement: small, frequent 1:1 sessions run by the UX Group.
  - Verification testing: larger cohorts for major changes.
- Bring stimulus sheets for group discussions to focus conversation (screens/printouts).
- Treat "don't make me think" as a design constraint—every extra hop or ambiguous label compounds cognitive load.
Acknowledgements
Final student-testing report compiled by Tim Churchward, with contributions from Rich Bartlett, Kelli Knuth, and Alex Price, and support from Kat Alchin, Josh Cramp, and Andrew Beatton. Thanks to the Media Team for advice on banner production and template feasibility.
Gallery

User testing in progress (1:1 session)

Group discussion after testing

Students completing the JotForm survey

Building the JotForm survey (1)

Building the JotForm survey (2)

Building the JotForm survey (3)

UX subgroup's Miro board (overview)

Framing the design challenge

Initial research lane

Ideation Station

Prototype solutions

Rich's pre-design home-page contribution

Early Dynamic Home Page design by Tim
Interested in improving the student experience?
I can walk you through the research artefacts and testing process on request.
Contact me