THE PROJECT

OpenOakland is a three-year-old chapter of Code for America whose goal is to empower citizens to ideate and develop projects that improve their community.

OpenOakland.org was in need of a redesign, so the executive board assembled a team of volunteer designers, developers, content marketers, and project managers to launch a new website.

MY ROLE

I was brought on to lead the user research arm of the project. My responsibilities include:

  • User interviews
  • Usability testing
  • Survey creation
  • Data analysis
  • Research report creation

CHALLENGE

There are a lot of stakeholders involved with this project: OpenOakland members, Oakland citizens, sponsors, and city representatives. With so many perspectives in play, constraints sometimes change abruptly in response to stakeholder feedback. We are also working against a predetermined timeline.

ACTION

The design team crafted a process that required each page of the website to be redesigned at least three times, with user research occurring between iterations.

I was tasked with facilitating each research phase and collecting qualitative and quantitative data to help prioritize and inform the team's design decisions.

RESULT

This project is still ongoing, with an estimated end date of January 2016. We are still following our research-iterate-repeat process and plan to launch OpenOakland.org on time.


PROCESS

One of the main functions of my role is to conduct regular user testing of the wireframes for each page of the site. To make sessions more efficient and to increase the ROI of each one, I developed the following process for collecting and compiling test data:

  1. Walk through each wireframe myself before testing with users to identify elements that I hypothesize will be confusing or frustrating.
  2. Organize hypotheses into three categories:
    • Visual – icons, pictures and videos
    • Interactive – buttons, links and tags
    • Content – website copy, image captions, titles and labels
  3. During testing, take note of which hypotheses are supported or refuted by user feedback.
  4. Take note of additional usability issues that I didn’t anticipate.
  5. Rank each frustrating or confusing element on a predetermined scale (see next section).

The goal for each page was to get its frustration percentage (explained below) under 40%.
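
To make steps 2-5 concrete, here is a minimal sketch of how a hypothesis chart like the one shown later could be represented. The field names and example entries are hypothetical, not the actual chart or test data:

```python
from dataclasses import dataclass
from typing import Optional

# Categories come from step 2 above; everything else is illustrative.
CATEGORIES = ("Visual", "Interactive", "Content")

@dataclass
class Hypothesis:
    element: str                      # e.g. "project-card tags"
    category: str                     # one of CATEGORIES
    prediction: str                   # what I expect to confuse users
    supported: Optional[bool] = None  # filled in during testing (step 3)
    rank: Optional[int] = None        # 0-5 usability rank (step 5)

# Hypotheses drafted before a session (hypothetical examples):
chart = [
    Hypothesis("color-coded triangles", "Visual",
               "users won't connect triangle color to tag color"),
    Hypothesis("interest/skill tags", "Interactive",
               "users won't know whether tags are clickable"),
]

# During a session: note whether feedback supported the hypothesis, then rank it.
chart[0].supported, chart[0].rank = True, 4
```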


TESTING WIREFRAMES

I conducted remote moderated sessions to test each wireframe. Users were tasked with walking through a high-fidelity wireframe while "thinking aloud."


The goal was to test the hypotheses I gathered during my initial walkthrough and to surface frustrating elements that I hadn't anticipated.


During testing, I ranked each frustrating or confusing element on a scale of 0 to 5.

My usability scale:
0 – Users did not notice or acknowledge this element
1 – Users acknowledged this element and immediately understood it
2 – Users acknowledged this element as "confusing" but quickly figured out its meaning
3 – Users acknowledged this element as "confusing" and took a few minutes to figure out its meaning
4 – Users acknowledged this element as "confusing" and only figured out its meaning after moving on
5 – Users acknowledged this element as "confusing" and never figured out its meaning
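
A quick way to keep this scale consistent across sessions is to encode it. A minimal sketch; the enum and its labels are my own shorthand for the levels above:

```python
from enum import IntEnum

class UsabilityRank(IntEnum):
    """The 0-5 scale above, encoded so ranks can be compared and tallied."""
    UNNOTICED = 0         # users did not notice or acknowledge the element
    UNDERSTOOD = 1        # acknowledged and immediately understood
    QUICK_CONFUSION = 2   # confusing, but quickly figured out
    SLOW_CONFUSION = 3    # confusing, took a few minutes
    LATE_CONFUSION = 4    # confusing, only understood after moving on
    NEVER_UNDERSTOOD = 5  # confusing, never figured out

# Ranks of 2 or higher flag an element as confusing (my reading of the scale).
print(UsabilityRank.SLOW_CONFUSION >= UsabilityRank.QUICK_CONFUSION)  # True
```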


Here is one of the wireframes that I tested:

[Wireframe image]

I organized my hypotheses into this chart:

[Hypothesis chart]

This is the completed chart after testing was over:

[Completed chart after testing]


After compiling the data, I determined that this page had a "frustration percentage" of 62%. Based on the data and user feedback, I wrote my recommendations and presented them to the design team.
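
The write-up doesn't state the exact formula behind the frustration percentage. Here is a minimal sketch of one plausible reading, assuming the percentage is the share of observed elements ranked at a confusing level (2 or higher) on the 0-5 scale; the function name, threshold, and sample ranks are all illustrative:

```python
def frustration_percentage(ranks: list[int], threshold: int = 2) -> float:
    """Share of observed elements ranked at a confusing level.

    Assumption: an element counts as frustrating when its rank on the
    0-5 scale is >= threshold; the exact formula isn't given above.
    """
    if not ranks:
        return 0.0
    frustrating = sum(1 for r in ranks if r >= threshold)
    return 100 * frustrating / len(ranks)

# Illustrative ranks for one page (not the real test data):
print(frustration_percentage([0, 1, 2, 3, 4, 5, 3, 2]))  # -> 75.0
```

Under this reading, redesigns that lower the ranks of individual elements push a page toward the 40% goal.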

PRESENTING RESEARCH

Below is an annotated wireframe that I presented to the design team along with the corresponding recommendations based on test data and user feedback.


OpenOakland Annotated Wireframe
  1. The intro copy confuses users, and next steps are unclear. The call to action reads, "If you're interested, you'll find next steps on the project page"; however, users became frustrated because they thought they were already on the project page.
  2. Relying on color coding to establish meaning left many users unaware of the connection between elements. During testing, users who didn't notice the color of the triangles did not understand why the interest and skills tags were different colors.
  3. The content hierarchy of the project cards is confusing for users. The information in the cards should be divided into primary and secondary tiers; anything that isn't primary should be reserved for a separate screen. Too much information too soon frustrated users during testing.
  4. Users could not distinguish "tags" from "buttons" and were unsure which were interactive. The page doesn't clearly establish which elements respond to clicks, so users were reluctant to click anything ambiguous.

TECHNIQUES

In addition to collecting quantitative data, I used other techniques to gather qualitative feedback. A few of these were:

  • 5-Second Tests - The design team had a specific message we wanted each page to convey. By conducting a 5-second test, I was able to document when that message was clear to users and when they missed it.
  • "On this page, what's clickable?" Tests - An issue we ran into early was inconsistency among links, buttons, and static information. I used this interaction test to encourage the design team to create a standard for the entire site so that users didn't have to learn a new set of rules on each page.

WHAT'S NEXT

Right now I'm designing the protocol for in-person usability testing of the site. All of the pages have gone through the necessary redesigns, and the site is currently in development.

