Hobsons builds software that helps universities accept, process, and manage applications from prospective students. Their software was designed principally for admissions officers.
Over time, however, they noticed that admissions officers were sharing passwords with faculty so that applications could be reviewed online. In other cases, Hobsons observed schools building custom application review software.
Realizing a new product opportunity, they asked Viget to help envision and build a tool that would streamline college application reviews.
I led the design of this new product, App Review. Among our team of developers and visual designers, I was responsible for:
- Observing the myriad ways faculty and admissions officers review applications at different universities.
- Developing a conceptual model of the application and prioritizing features based on technical complexity and user value.
- UI & IxD design: wireframing and annotating all meaningful screens, states, architecture, and user flows.
- Conducting moderated, task-based usability testing using a prototype of App Review.
Reviewing admission applications requires two distinct groups to collaborate:
- Admissions Officers (AOs) receive and assign a group of applications to a pool of faculty reviewers. They keep tabs on progress and help usher the review along.
- Faculty Reviewers review the applications whenever they can find time. Often, they need to collaborate with other reviewers to do so.
The issue? Each school had a unique review process. To ensure App Review supported them all, I worked with admissions officers to investigate and define the common denominator of activities and roles.
Thinking about the common architecture enabled us to accommodate a broad variety of use cases without over-engineering for edge cases.
I used a sequence diagram to convey this collaboration process.
I also needed to understand how faculty members reviewed applications. In one-on-one interviews, I talked with them about their processes. Going one step further, I had them review applications while I observed.
I captured this information in a mental model, which we later used as a foundation for brainstorming functionality.
This group of users is time-pressed and, as a result, has developed interesting strategies for reviewing large volumes of applications, including:
- Piling and Subdividing - creating subgroups for easier comparison, or to review during small windows of free time.
- Comparing and Contrasting - comparing similar candidates on a variety of attributes.
- Annotating and Dog-earing - adding Post-Its and marginalia to distinguish candidates and remember vital information.
These observed — yet unstated — requirements were important for a productive user experience. They led to features like tagging, starring, and private notes.
Forming a concept
Having conducted enough research to start planning, I brought the team together to sketch out ideas.
This exercise helped us develop a shared understanding of the product and its concepts. Though I was remote, I used an IPEVO document camera to sketch live with the client.
With a common vision in place, I turned my attention to identifying the flows and views that the application would need. This established scope and allowed us to accurately estimate effort.
Designing screens and flows
Section by section, I wireframed the necessary views. Each week I held a work-share session where I presented what had been completed. Critically, these sessions were attended by Hobsons' product, development, and QA teams. This kept everyone in the loop and helped us course-correct when needed. Outside of these, we had ad-hoc video conferences and Basecamp discussions.
Because this was a handoff project, full annotations were necessary. Some notable pages follow:
The inbox is the hub of the experience. We relied on the mailbox metaphor because it was familiar to most users.
The application detail contains two primary parts. First, it displays information about the applicant in the format chosen by the admissions officer. Second, it provides a questionnaire for faculty members to record their interim notes and final decision about the applicant.
An admissions officer could assign one or many people to review applications. This interface supported picking reviewers, assigning roles, and defining the order in which they review applications.
Each program cares about different application details. Some emphasize standardized test scores while others couldn’t care less.
The interface allows application fields to be added and reordered. This is done in a mock UI to help admissions officers visualize how it will look to reviewers.
Similarly, the questions asked of reviewers vary among schools and programs. To accommodate this, we devised a set of common question types based on what was most frequently asked. These can be added and reordered to suit.
Testing & delivery
About midway through the design phase, we had enough screens to test the core functionality of the app.
I set up an online clickable prototype and ran moderated, task-based usability tests using GoToMeeting. The client team participated as silent observers.
After each test, we held a brief recap where we discussed our observations and recorded them in Trello. Later, we clustered these, assigned severity, and prioritized improvements.
Based on the testing, we tweaked the interface, comped the screens, and built out the front-end. Hobsons took these assets and engineered the application.
- UX Design: Todd Moy
- Visual Design: Mark Steinruck
- Front-end Development: Jeremy Frank
- Project Management: Kevin Powers