Lost Before They Ever Applied

A moderated remote usability study of the University of Maryland Global Campus mobile site — observing how prospective students searched for degree programs, admissions details, and contact options, and where the experience caused them to give up before getting answers.

Client: University of Maryland Global Campus

Engagement: ~60-Hour Usability Study

Method: Moderated Remote User Testing

Platform: UserZoomGo

Participants: 8 Moderated Sessions


Problem · Implication · Recommendation

Prospective students were abandoning the site before their research was complete

For a university, the mobile website isn't just a marketing channel — it's often the first and most critical touchpoint in a prospective student's enrollment journey. When that experience fails, potential applicants don't reach out. They just leave.

Problem

Visitors couldn't locate key program, admissions, and contact information on mobile

Voice-of-Customer (VoC) feedback indicated prospective students were abandoning sessions before completing important research tasks — unable to find degree details, admissions events, or how to get in touch with the university.

Implication

Friction in the research phase increases the risk of losing applicants before they ever connect

When program details, admissions timelines, and contact paths are hard to find, the research process becomes uncertain. Prospective students who can't get answers leave — and often don't come back.

Recommendation

Use moderated testing to observe real research behavior and identify where navigation breaks down

Watching prospective students navigate the mobile site in real time reveals not just where they fail, but why — and what specific changes to navigation, labeling, and content visibility would remove the friction.

Project Background

The mobile site had become the front door — and it wasn't fully working

As prospective students increasingly turned to mobile devices to research degree programs, UMGC's client team began receiving a consistent signal from Voice-of-Customer feedback: visitors were struggling to find what they needed and leaving before completing their research. The site wasn't broken — but the experience wasn't reliably getting people to the information that would move them toward enrollment.

The stakes were clear. Program research isn't passive browsing — it's an active decision-making process. When prospective students can't locate degree specifications, admissions timelines, or a way to ask a question, they don't just experience inconvenience. They lose confidence in the institution before they've even made contact.

The Problem

Prospective students were abandoning mobile sessions before completing research tasks, with VoC feedback pointing to difficulty finding programs, admissions events, and contact options.

The Goal

Observe real mobile research behavior through moderated testing to identify exactly where navigation broke down and deliver specific recommendations the team could act on.

My Role

End-to-end ownership of the study — from screener design and task scenario development through moderation, analysis, findings synthesis, and delivery of both a stakeholder presentation and written research report.

Participant Profile

Eight moderated sessions with prospective students across a range of ages and educational research experience, including participants with military affiliations — reflecting UMGC's actual applicant population.

My Role

Full research ownership — from study design to stakeholder delivery

This engagement required end-to-end research leadership rather than a single-phase contribution. Each phase built directly on the last, and the quality of the final recommendations depended on the rigor applied throughout.

01 Designed the moderated mobile usability study — including participant screener criteria and task scenarios built to reflect realistic prospective student research behavior

02 Moderated all eight remote testing sessions via UserZoomGo, with participants navigating the live UMGC mobile site on their own devices while verbalizing their process

03 Analyzed navigation patterns, information discovery behavior, and task completion challenges across all sessions to identify recurring themes

04 Synthesized research findings into actionable recommendations targeting navigation clarity and information findability

05 Delivered findings through both a stakeholder presentation and a written research report documenting the full analysis

Methodology

Real users. Real devices. Real research behavior.

The study used moderated remote usability testing through UserZoomGo — participants completed tasks on their own mobile devices while narrating their experience in real time. This approach captured something heuristic evaluation alone cannot: the actual decision-making process of prospective students navigating a research journey they were genuinely motivated to complete.

Task scenarios were developed to mirror the specific workflows UMGC's target audience most commonly follows — locating program details, requesting information, identifying admissions deadlines, finding contact options, and accessing military tuition information. Each task was designed to test a distinct pathway through the site without leading participants toward a correct answer.

The moderated format allowed follow-up questions when participants paused, backtracked, or expressed uncertainty — turning moments of friction into rich data about not just where the experience broke down, but what users expected to happen instead.

8 Moderated remote sessions conducted with prospective students on their own mobile devices

6 Task scenarios designed to simulate the full range of prospective student research behavior

7 Usability areas evaluated across navigation, findability, labeling, and path discoverability

60h End-to-end engagement from study design through written report and stakeholder presentation

Research Process

01 Study Design

Screener criteria, task scenarios, and moderation guide developed to reflect real prospective student research behavior

02 Moderated Testing

Eight remote sessions conducted via UserZoomGo on participants' own mobile devices with think-aloud protocol

03 Behavioral Analysis

Navigation patterns, hesitation points, and task completion challenges identified across all sessions

04 Findings & Delivery

Themes synthesized into actionable recommendations and delivered via presentation and written research report

Evaluation Scope

Tasks and evaluation criteria grounded in the actual enrollment research journey

Every task scenario and evaluation criterion was tied directly to what UMGC's prospective students actually need to accomplish on mobile — not theoretical edge cases, but the core information-gathering steps that precede enrollment.

Tasks Participants Completed

Locate master's degree program specifications

Request program information

Identify registration deadlines

Locate admissions webinar information

Find ways to contact the university

Identify military tuition information

Usability Areas Evaluated

Navigation and orientation clarity

Information findability

Search and navigation behavior

Labeling and messaging clarity

Task completion efficiency

Alternate path discoverability

Key Insight

Primary paths worked. Secondary content was effectively invisible.

The most important pattern to emerge from the research wasn't a broad site failure — it was a specific structural gap. Participants could generally navigate to core program information. Where they struggled was in locating secondary content: admissions webinars, alternate contact pathways, and supporting information that lived outside the primary navigation structure.

Central Finding

When important supporting content is difficult to discover, users don't conclude it's missing — they conclude they're looking in the wrong place. This triggers exploratory navigation and repeated searches that increase cognitive load and create doubt about whether the site can answer their questions at all.

This distinction — between primary path success and secondary content invisibility — was critical for prioritizing recommendations. Rather than calling for broad structural changes, the research pointed to targeted visibility and labeling improvements that could dramatically reduce friction without requiring a full redesign.

Key Findings

What worked, what didn't — and what the behavior revealed

Findings were organized to give the client team a complete picture: not just where the site fell short, but where it was already succeeding — which helped the team allocate improvement effort where it would have the most impact.

Program information was generally discoverable along primary paths

Most participants located master's degree specifications without significant difficulty. Primary program discovery pathways within the navigation were largely effective at guiding users to the right content.

Working well — maintain and reinforce

Request-information workflows supported multiple successful approaches

Participants were generally able to locate and complete the request-information flow. Multiple entry points allowed users who approached the task from different starting positions to succeed regardless of their navigation path.

Working well — multiple entry points are an asset

01 Admissions webinar information was not visible enough to be found reliably

Participants frequently struggled to locate webinar information — resorting to search or exploring multiple pages before finding it. The content existed on the site, but lacked the navigational visibility needed for users to find it without effort.

Add direct navigation link and surface events within program pathways

02 Contact pathways were inconsistently signposted across the site

Participants could eventually find contact options, but the routes they took varied widely — navigation menus, footer searches, direct phone number hunting. The wide variation in approach indicated that contact options lacked clear, consistent signposting within the interface.

Clarify contact labels and provide consistent context about where each leads

03 Alternate navigation paths to the same content were not clearly indicated

While primary paths worked, users consistently struggled when asked to find alternate routes to the same information. Navigation items did not signal when additional subpages existed, and the hierarchy of the menu structure was not visually reinforced.

Indicate subpages in navigation and reinforce structural hierarchy visually

Recommendations

Four targeted improvements to reduce research friction

Each recommendation was tied directly to observed participant behavior — grounded in what users actually did during sessions, not hypothetical user journeys. The goal was precision: improvements that would have high impact without requiring broad structural change.

Admissions Events Visibility

Make webinars and events discoverable within the research path

  • Add a direct navigation link to admissions webinar information

  • Surface upcoming events within program exploration pathways so users encounter them naturally

Contact Labeling

Clarify what contact options exist and where they lead

  • Clarify labels associated with phone numbers and contact links throughout the site

  • Provide clearer context about the destination or purpose of each contact option

Navigation Signals

Help users understand the depth of the navigation structure

  • Indicate visually when navigation items contain additional subpages

  • Reinforce the structural hierarchy within navigation menus on mobile

Filter & Selection Clarity

Improve feedback within program comparison and filtering tools

  • Provide clearer indicators for selected options within program comparison tools

  • Reinforce visual feedback when filters are applied so users can confirm their selections

Delivery & Presentation

Findings delivered in two formats — for the meeting and beyond it

Findings were presented to the UMGC client team through a stakeholder presentation and accompanied by a written research report. The two-format delivery was intentional: the presentation drove discussion and allowed stakeholders to explore participant behavior in context, while the written report gave the team a reference document they could use to brief additional stakeholders and track implementation progress over time.

The moderated nature of the research made the presentation particularly effective — rather than describing participant behavior in abstract terms, the team could be shown exactly how users navigated, where they hesitated, and what they said when they encountered confusion. This kind of observable evidence creates alignment quickly and gives implementation teams the context they need to prioritize correctly.

Outcomes

A majority of recommendations were implemented following delivery

The UMGC client team implemented a majority of the recommended improvements after the presentation, focusing on navigation clarity, labeling adjustments, and improved visibility of key admissions information — the three areas where observed participant behavior most clearly indicated friction.

🎓Navigation & Labeling Improved

Adjustments to navigation structure and contact labeling made key pathways clearer for prospective students researching programs on mobile — directly addressing the most common friction points observed in sessions.

📅Admissions Content Made More Visible

Visibility improvements to admissions events and webinar information reduced the exploratory searching behavior observed during testing, making it easier for prospective students to encounter critical information naturally.

📄Written Report Extended the Impact

The written research report allowed the client team to share findings with stakeholders beyond the original presentation, supporting a broader conversation about mobile experience improvements for prospective students.

Strategic Takeaway

The most revealing finding wasn't where users failed — it was how. When they couldn't find something, they didn't give up immediately. They searched harder, retraced their steps, and started to doubt themselves. Reducing that moment of doubt is exactly what good usability work is for.

— Core principle applied throughout this engagement

This project reinforced a principle that shapes how I approach every research engagement: the difference between a site that "works" and a site that works well often lives in secondary content — the information users need after the main path has run its course. Primary navigation gets most of the design attention, but it's the moments when users step off that path that determine whether they stay and complete their goal or leave with their questions unanswered.

University of Maryland Global Campus · Mobile Program Research Usability Study

Moderated User Testing · Mobile UX · Higher Education · Information Architecture · UserZoomGo
