Discovery, structure, and design:
a case study in redesign.

As UX Architect on the College of Agricultural and Environmental Sciences' web team, I was tasked with redesigning the student recruitment section of the college's website. The site focuses on high-level, general information for prospective and current students. Example content includes overviews of all academic programs, a listing of extracurricular activities, academic services, a calendar of academic events, and career resources for current students.

After our project team was formed, we set out to tackle the problem, roughly following the workflow described in Kelly Goto and Emily Cotler's Web ReDesign 2.0: Workflow that Works. The project manager outlined a project map to communicate with our client while allowing for more flexible, iterative, agile development cycles within Goto's phase-based workflow.

Technical goals for this redesign project included modernizing the site for mobile users and integrating it into a complex content management system.

  • This case study focuses only on our discovery, structure, and design phases. This work was done in close collaboration with the design and client teams, with check-ins approximately every two weeks as the project unfolded.
  • The development and implementation phases will be examined separately as they reveal the process of turning a design into a product via a content management system.

Needs Discovery

Initial conversations focused on understanding the website's current state, its audiences, and the client's business goals. Part of the needs discovery phase was to reveal ideas, biases, and expectations; identify gaps between the current site and its desired state; and establish a common understanding for the tasks ahead.

Define objectives.

What are the goals for the website? Who is the primary audience? What do they want to know?

Before undertaking any new designs, evaluations, or rewrites, the client was asked to define the objectives, audience, and key content on the site. This would help to evaluate the current site, reframe items out of alignment, and ultimately make visiting the site a more successful and pleasing experience.

The client identified prospective students as the primary audience and the college's academic programs as the most important information. These site objectives were defined:

  • Recruit new students to CAES academic programs.
  • Provide information to current students and other non-faculty audiences.

With these objectives explicitly stated, each decision in the design process could be evaluated against the site's goals.

Inventory content.

What content exists? Who is the expert on it? What needs updating? What's missing?

I created an inventory of all the content on the website to provide a complete picture of what did and did not exist on the site. The content inventory included a link, page title, descriptive title, and brief annotations where needed. It revealed the extent of the site's content to the entire team.

Also included with the inventory:

  • A summary of the kinds of pages on the site, their content, update frequencies, calls to action, and links. This described content patterns and revealed areas of concentrated interest.
  • An analytics summary of most and least visited pages.
  • Popular questions appearing in organic search traffic to the site, such as: "What can you do with an Animal Science degree? How do you study for applied biotechnology? What do you study in college?"
  • Notes on known content that existed outside of the website (e.g., job postings, news and events in weekly email newsletters, questions emailed to subject matter experts).

All items provided in the content inventory helped the client understand the scope of their own site, its high and low points, and users' expectations.

Content types on the existing site, with proposed update frequency, calls to action, and key links.
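A content inventory like this is essentially tabular data. As a small sketch of how page-type summaries like the one above could be tallied (the rows and field names here are hypothetical, for illustration only):

```javascript
// A few illustrative inventory rows: URL, page title, content type,
// and expected update frequency. (Hypothetical data.)
const inventory = [
  { url: '/students/majors/animal-science', title: 'Animal Science', type: 'major', updates: 'yearly' },
  { url: '/students/majors/horticulture', title: 'Horticulture', type: 'major', updates: 'yearly' },
  { url: '/students/calendar', title: 'Academic Calendar', type: 'calendar', updates: 'weekly' },
];

// Tally pages by type to reveal content patterns and areas of concentration.
function countByType(rows) {
  const counts = {};
  for (const row of rows) {
    counts[row.type] = (counts[row.type] || 0) + 1;
  }
  return counts;
}
```

Even a simple tally like this makes it obvious where the bulk of the site's content lives and which sections are thin.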

Capture current state.

How is the site currently performing? How is it organized? Are users finding valuable information? What are users' pain points?

I compiled a high-level Google Analytics report to capture the site's current performance. Key findings were:

  • The site was seeing a ~10% annual increase in users, though the rate of increase had been slowing since the last redesign in 2010.
  • Users who visited the 4-year curriculum plans spent more time on the site and viewed more pages per session, indicating an audience engaged with detail-rich, major-specific content.
  • ~60% of all sessions came from users in Georgia, meaning out-of-state traffic still made up a significant share of the audience.
  • The form used to interact with the student recruiter received high traffic but had low completion rates.

I produced a site map to show the current organization and taxonomy of the site. This would be reevaluated following the competitive analysis and content audit.

Sitemap before: Original site structure was wide and shallow.

Review site. Analyze competitors.

What are the strengths and weaknesses of this site? How does it compare to peer institutions?

Being very familiar with the site, I conducted an expert review of the existing site structure, content, and user flow. Observations made about the current site included:

  • The site was comprehensive in its scope, including high-level information for all things student-related.
  • Parent site elements dominated the page (global navigation, college-wide search functionality).
  • The major pages were overloaded and weighed down with accessory information.
  • Department websites had varying amounts of information for academic programs.
  • Items in the navigation failed the "sniff test", being either too vague ("People", "Graduate") or too obscure ("Deans' Promise: Enrichment Opportunities").
  • Calls to action were buried within content.
  • Paths to communicate with the recruiter were not well defined.
  • The website was generally static, with very little change over time.

Before: Original site design was static and nested within a parent website.

Before: Majors pages were dense, outdated, and lacked clear calls to action.

From client interviews, I knew the client wanted to strengthen the website's content and reorganize its navigation. As part of the discovery process, I collected and evaluated websites from peer institutions based on:

  • Kind of content
  • Quality of content
  • Site navigation and taxonomies
  • Overall look-and-feel
  • Integration of site with parent institution
  • Distribution/shared assets with sibling and child websites
  • Interaction with prospective students (call-to-action buttons, forms)

Insights were also gathered from eduStyle's The EduStyle Guide to Usable Higher-Ed Homepage Design, Research Report on Navigation Trends in Higher Ed, Website Standards in Higher Ed, and the Nielsen Norman Group's College Students (Ages 18-24) on the Web, 2nd Edition.

Information synthesized from these exercises informed navigation patterns, content organization, taxonomies, content breadth, and task modeling throughout the project life cycle.

Establish Site Structure

Following the needs discovery phase, the client identified primary content. Together we developed a lean site structure with navigation targeting the primary user base.

Audit content.

What is the value of the current content: keep, delete, modify, or add?

The client team audited every page in the site inventory, noting what should happen to it, the quality of the content, and any outstanding problems. The project manager and I reviewed their audit to produce a list of things to fix now, fix later, and discuss further.

The audit served as a common document recording decisions made about existing content. It was used to inform content reorganization and navigation, and to prioritize content updates throughout the redesign process.

Organize content.

How should the site be organized?

Information gleaned from the competitive analysis, content audit and internal expert review indicated a number of issues with the current site structure.

  • Unchecked growth had produced a wide, yet shallow site organization.
  • Information used to engage current students via email was not included.
  • Content editors had abandoned some sections of the site.

I went through a number of site map iterations with the team, starting with the current state and narrowing it to a very focused structure. My goal was to logically group the most important elements of the site with easy-to-understand labels while leaving room to expand for special initiatives.

Ultimately, the content was separated by audience (prospective and current students), by program (undergraduate and graduate), and by campus. A news area was created for events, one-off articles, and announcements.

I brought the site map to all meetings and work sessions, adjusting it only when absolutely necessary. It served primarily as a shared reminder of the site's direction and a tool to track content and pages as they were developed.

After: Final site map focused content into narrow but deep categories.

Determine site navigation.

How do users move through the site? Do they understand the chosen taxonomy and organization?

Like the site map, the site navigation moved from a light reorganization to a complete overhaul, with many peer- and client-critiqued iterations in between. This was documented separately from the site map to help the client see how a user would interact with content.

Proposed navigation: labels used for tree testing.

Once the team agreed on a navigation scheme, I conducted a tree test with Optimal Workshop's Treejack tool to measure how well users could find information using only the navigation taxonomy in the new site structure. I developed a series of tasks to test specific aspects of the navigation; each participant was given a random selection of 6 tasks to perform. The test recorded every click made in the navigation and revealed the degree of difficulty and directness experienced with each task.

Tree testing: Table of tasks, acceptable answers, test goal, and user success rate. Items in red and orange had lowest success rates, indicating potential problems with our proposed navigation.

Tree testing: diagram shows erratic user paths for a specific task. This unsuccessful task revealed that a single menu item to request information might be difficult to find.

Tree testing had two parts. First, I asked colleagues to complete the test to check for issues with the mechanism and the questions. Then users were recruited from the college's Facebook groups and student listserv. Testing revealed points of success and confusion, which I incorporated directly into the site structure, navigation, and page design.

Proposed navigation: labels adjusted after tree testing.
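Treejack reports success and directness per task; the same measures can be approximated from recorded click paths. A minimal sketch in plain JavaScript (the data, field names, and helper are hypothetical illustrations, not Optimal Workshop's API):

```javascript
// One tree-test task: the question, acceptable answers, and the length of
// the shortest click path to a correct answer. (Hypothetical data.)
const task = {
  question: 'Find out what you can do with an Animal Science degree',
  correctAnswers: ['Careers'],
  shortestPathLength: 3,
};

// Recorded participant sessions: every node clicked, plus the final selection.
const results = [
  { path: ['Academics', 'Majors', 'Careers'], answer: 'Careers' },
  { path: ['Admissions', 'Academics', 'Majors', 'Careers'], answer: 'Careers' },
  { path: ['Admissions', 'Visit'], answer: 'Visit' },
];

// Success: the final selection was an acceptable answer.
// Directness: among successes, the share that took the shortest possible path.
function summarize(task, results) {
  const successes = results.filter(r => task.correctAnswers.includes(r.answer));
  const direct = successes.filter(r => r.path.length === task.shortestPathLength);
  return {
    total: results.length,
    successRate: successes.length / results.length,
    directness: successes.length ? direct.length / successes.length : 0,
  };
}
```

A task with low success or directness, like the "request information" task described above, flags navigation labels that need rework.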

Design. Get Feedback. Design Again.

With the content and structure reasonably defined, I could begin creating designs that would bring the site together.

Sketch lo-fi visuals. Get feedback.

Following the site map as a guide, I drew simple sketches for all primary page types of the site. These paper and pencil designs were cheap to produce, served as excellent visuals for site discussion, and were easy to discard if necessary.

These rudimentary wireframes were used to demonstrate user paths through the site, reusable/repeating content, deviations from the existing design, solutions for usability issues, and other ideas that had been discussed in the team.

This allowed me to uncover weak ideas and abandon them without taking further steps into development of page content, design, or code.

Sketch lo-fi visuals: Wireframe sketches demonstrate general page layout, kinds of content, and main ideas. These were used in a live demo to show flow between pages.

Create interactive prototype. Get feedback.

After presenting and gathering feedback on the pencil sketches, I reproduced the proposed wireframes in Axure. This showed more sophisticated linking, visual layering, and other interactions.

This prototype established the general look and feel of the site, including typography and simulated interactions such as a home page slideshow, flip-cards for majors, and how users would drill down through pages. It allowed me to share ideas again for immediate feedback, without investing in code development.

Working with the application analyst, I also used this prototype to identify the modular, reusable parts of the site for the content management system. With this prototype, almost all interactions expected in our development phase were defined.

Interactive prototype: used as a discussion point for development, leading to early agreement on almost all interactions needed for development (print outs shown here with notes from component discussions).

Interactive prototype: Major page content, notes outline components with developer.

Design in browser. Get feedback.

Once the design and interaction ideas were agreed upon, it was time to move the design into the browser. Using Zurb Foundation as a CSS framework, I was able to rapidly fit the designs demonstrated in Axure onto an established grid system. For functionality that extended beyond Foundation, I researched, tested, and integrated jQuery plugins or wrote simple JavaScript solutions.
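As a sketch of the grid pattern, assuming Foundation 5-era row/columns classes (the page content and URLs here are hypothetical, for illustration):

```html
<!-- Major page: main content with a call-to-action sidebar.
     Columns stack on small screens; the sidebar sits beside
     the content on medium screens and up. -->
<div class="row">
  <div class="small-12 medium-8 columns">
    <h1>Animal Science</h1>
    <p>Program overview and curriculum highlights…</p>
  </div>
  <aside class="small-12 medium-4 columns">
    <a href="/apply" class="button expand">Apply</a>
    <a href="/visit" class="button expand">Plan a Visit</a>
    <a href="/request-info" class="button expand">Request Information</a>
  </aside>
</div>
```

Keeping the calls to action in a dedicated sidebar column made them easy to surface consistently across page types.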

As features were brought into the front-end design, they were continuously shared with clients and colleagues for feedback.

The HTML prototype was fundamental to the development phase, as it provided the final output and structure for all components designed in the content management system.

After: final home page design targeted current news and events.

After: majors listing page included filters to narrow by campus and direct links to complete key tasks (apply, plan a visit, or request information).

After: individual major pages included content unique to students in that program. Again, clear calls to action appeared in the upper sidebar.


CAES Students

Technical team:

  • Project manager
  • UX architect/designer
  • Application analyst
  • System administrator

Client team:

  • Project lead/owner
  • Administrative assistant/content expert
  • Content editor/current student