Foresight Accessible Rideshare App
UI/UX designs for a system that helps people with visual impairments locate and board rideshare vehicles at centralized pickup and dropoff points.
Industrial Design x Mechanical Engineering Capstone Project
August 2021 - December 2021
Roles:
Me: UX Design, UI Design, UX Research
Team: Darye Ji, Divita Chillakuru, Samantha Weinberg (Industrial Design) and Josh Lee (Mechanical Engineering)

About This Project

This was my semester-long capstone project for a joint industrial design and mechanical engineering studio. I had the opportunity to work not only with engineering students but also with a sponsor, Foresight AR (FAR). FAR is an Atlanta-based startup whose goal is to build accessible smart cities for people with visual impairment (PVI). The company planned to present our work as part of its application for phase two of the US DOT Inclusive Design Challenge. Having been selected as a finalist in phase one, FAR challenged us to create a realistic pickup and dropoff system that would be easier for everyone to use, especially PVI.

FAR wanted our interdisciplinary team to work on a system for helping PVI identify and locate rideshare vehicles at centralized pickup and dropoff zones within a city without additional human intervention. The company was already developing ultra-wideband (UWB) communication protocols to accurately determine the relative position of a signal. We were tasked with applying and testing their proprietary UWB technology. Our proposed system consisted of CAD designs for physical transmitters and receivers, an app prototype for navigation guidance and rideshare booking, and UWB and Bluetooth Low Energy (BLE) signal strength feasibility testing.

I led the team in designing the app and figuring out how users with a variety of visual impairments would interact with it. This involved making it visually accessible for low-vision users and designing it to be easily navigable with screen readers. I learned about working with clients and interdisciplinary teammates, and about starting UI designs with accessibility in mind so that, down the line, the app could be coded to be fully usable by, and useful to, all users.

The Challenge

FAR asked us to take the UWB technology they were developing and conduct interviews and testing to decide whether it could contribute to a rideshare locating system. No one on our team had worked with UWB before, so we conducted research and interviews to close that knowledge gap. None of us had designed physical or digital products for people with total blindness either, so there was also a learning curve around implementing tactile and auditory feedback.

The sponsors also challenged us to imagine this as a solution for a future where rideshare vehicles are autonomous—a world without drivers to assist the PVI in finding their cars. So on top of the technology and design patterns we had to familiarize ourselves with, we needed to envision our solution for a futuristic use case.

And, of course, like many long projects, this one came with tensions and time-management stress. As a team, we had to navigate project management and hold each other accountable while maintaining amicable relationships among ourselves, with our sponsor, and with our professors.

Solution

Our solution had four main pieces: a wearable UWB anchor for the PVI, a UWB anchor inside the rideshare vehicle, a UWB beacon attached to pickup/dropoff signage, and a phone app to guide the user. Ideally, the wearable anchor would eventually become unnecessary, since modern smartphones are starting to include UWB capabilities that would replace the need for PVI to buy and wear an additional device. The poster below summarizes the project as a whole. I decided to focus most of this portfolio entry on the app design, since that was my main contribution, but I will explain the rest of the system at a high level to provide context.

Project overview poster

Discovery

We started the project with lengthy primary and secondary research to understand the unfamiliar technologies and specific user base we were designing for.

Needfinding

The team conducted nine interviews in total. Seven of our interviewees were people with low vision, total blindness, legal blindness, or multi-sensory loss. Five of those interviews were informational and held online, where we could ask the PVI about their commute and navigation habits. Two were observational and conducted “in the field,” where we watched how one individual with total blindness, and another with total blindness who was also hard of hearing, navigated their day-to-day lives. We also spoke with two urban planning professors who specialize in geographic information systems (GIS) to gain domain knowledge about choosing pickup and dropoff spots within a city.

Shadowing pictures and observations user #1
Shadowing pictures and observations user #2

We used affinity mapping to extract three key insights:

  1. Traveling solo requires planning far in advance
  2. Memory retention is strained and cognitive load is high, especially in unfamiliar environments
  3. Transportation preference is correlated with cost and time
Affinity map

This led us to conclude that people with visual impairments want to be as independent as possible when traveling alone, but traveling is mentally exhausting and transportation choices hinge on cost and time.

Secondary Research

Our secondary research involved looking into existing tech, academic papers, and adjacent applications. We focused on three main topics, dividing and conquering to learn as much as we could in a limited time:

  1. Accessibility and mobility—tools or strategies, state of accessible rideshare
  2. Smart technology and data—AV technology, current app solutions
  3. Municipality codes—government regulations on parking, public transportation shelters, and pickup and dropoff areas

I researched topic #1, looking into what Uber and Lyft currently provided for PVI and how people could leverage MARTA Mobility, the local paratransit service. In general, existing solutions, like Uber and Lyft’s wheelchair-accessible vehicles, were geared toward people with motor disabilities; support for PVI, who can’t see where their vehicle is or where it is going, was lacking. The focus seemed to be on boarding the vehicle, not finding it. Uber and Lyft provide assisted rides, but those depend on drivers receiving special training, leading to complaints that supply is low.

Evaluation of Lyft, Uber, and MARTA Mobility accessible ride features

Design Criteria

Once we had a better grasp of the user group, their needs, and the technological limitations, we were able to create a set of criteria for our system.

Criteria tables for each piece of the design system

These criteria outlined how we wanted each piece of the system to perform. We assigned them a priority based on how crucial the requirement was for the system to function properly, grouped them by requirement type (i.e. function, ergonomic, aesthetic, etc.), and described how we might measure success for each. This table was immensely useful for making sure each piece of the system got fully fleshed out, for planning the project milestones, and for distributing tasks among teammates.

UX Design

With discovery done and the project scoped, we designed and tested each piece of the system and how the pieces worked together.

System Design

After many iterations (describing and illustrating them all would take twice as long as this entire write-up), we designed a four-piece PVI rideshare navigation system.

The beacons would be attached to signage placed in popular pickup and dropoff areas. Because they are bolted to one spot, they can serve as a fixed point of truth for location. We designed the beacon to be solar-powered and calculated what type of solar panel would be needed to provide sufficient power throughout the year without much maintenance or large batteries. The signage design was also tested with low-vision users to make sure the color contrast and graphics communicated effectively.

Beacon signage and UWB chip housing renderings with callouts and exploded view
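
As a rough illustration of the kind of sizing calculation involved, here is a minimal sketch; every number in it is an assumption for demonstration, not a value from our report.

```typescript
// Illustrative solar sizing for the beacon. All values are assumptions
// for demonstration only, not the numbers from our actual report.
const avgDrawW = 0.05;                // assumed average draw of the UWB chip + controller (50mW)
const dailyEnergyWh = avgDrawW * 24;  // energy needed per day: ~1.2Wh
const worstCaseSunHours = 3.5;        // assumed winter peak-sun-hours
const derating = 0.7;                 // losses from charging circuitry, dirt, panel angle
const panelRatingW = dailyEnergyWh / (worstCaseSunHours * derating);
console.log(`Required panel rating: ${panelRatingW.toFixed(2)}W`); // ~0.49W
```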

To send their location to the beacon, users would wear a UWB-transmitting anchor. This would be a temporary piece of the system until UWB technology in smartphones is reliable enough to provide location information. We tested the UWB signal distance and found that its effective range in a city was only around 50m (roughly 165ft), and that buildings and cars significantly obstructed the signal. Because of this, we decided to supplement UWB with GPS: GPS gets the user within 50m of the pickup point, effectively onto the correct city block and the correct side of the road, and from there the anchor communicates with the beacon over UWB to locate the pickup spot to within 10cm.

Wearable UWB anchor renderings with callouts and exploded view
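
To make the GPS-to-UWB handoff described above concrete, here is a minimal sketch of the mode-switching logic. The 50m threshold reflects our range testing; the types and function names are hypothetical placeholders, not part of any real SDK.

```typescript
// Sketch of the GPS-to-UWB handoff. The 50m constant reflects the urban
// UWB range we measured; everything else is a hypothetical placeholder.
type Guidance = { mode: "gps" | "uwb"; distanceM: number };

const UWB_RANGE_M = 50;

function chooseGuidance(
  gpsDistanceToBeaconM: number, // coarse distance from phone GPS
  uwbDistanceM: number | null   // precise UWB range, null if beacon not in range
): Guidance {
  // Switch to UWB as soon as the wearable anchor can actually hear the
  // beacon; otherwise fall back to block-level GPS guidance.
  if (uwbDistanceM !== null) {
    return { mode: "uwb", distanceM: uwbDistanceM }; // ~10cm accuracy
  }
  return { mode: "gps", distanceM: gpsDistanceToBeaconM };
}
```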

To guide the autonomous vehicle (AV) to meet the user at a defined pickup spot, the AV will also have a UWB anchor similar to the users’. This anchor will provide the beacon with an accurate location of the vehicle, which it can then convey to the user’s phone. With the UWB signal’s 10cm accuracy radius, it would also be feasible to guide the user directly up to the door of the car, resolving any confusion about where to board.

AV UWB anchor renderings with callouts and exploded view

The user’s smartphone will process all the location data and give directions to the user. The user can use the phone to book a rideshare pickup and get timely notifications about the AV’s arrival. If the user needs to walk to the pickup spot, the phone will start directing them so that the car and user arrive as close to the same time as possible, minimizing wait time for the user and the time the AV spends stopped at the curb, thereby reducing the impact on traffic. The phone connects to the beacon and the wearable anchor via BLE signals, which do not require much power. Using the distances and directions of the wearable anchor and AV anchor with respect to the beacon, the phone gives the user directions to meet their rideshare vehicle.

App screens from the prototype
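
To illustrate that last step, here is a minimal sketch of how the phone could convert beacon-relative positions into a walking instruction; the coordinate convention and function are hypothetical, assuming UWB ranging yields 2D positions relative to the fixed beacon.

```typescript
// Sketch: turn beacon-relative 2D positions (meters) into a walking
// instruction. Positions are hypothetical values UWB ranging might yield.
type Point = { x: number; y: number };

function directionToVehicle(user: Point, vehicle: Point) {
  const dx = vehicle.x - user.x;
  const dy = vehicle.y - user.y;
  const distanceM = Math.hypot(dx, dy);
  // Bearing measured clockwise from north (the +y axis).
  const bearingDeg = ((Math.atan2(dx, dy) * 180) / Math.PI + 360) % 360;
  return { distanceM, bearingDeg };
}

// Example: user at the beacon (0, 0), vehicle at (3, 4) ->
// { distanceM: 5, bearingDeg: ~36.9 }, i.e. "5 meters ahead, slightly right"
console.log(directionToVehicle({ x: 0, y: 0 }, { x: 3, y: 4 }));
```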

The systems chart is displayed below, showing how all the pieces work together and what signals connect them.

Systems chart

App Design

As I mentioned earlier, I took charge of the app design. I not only created all the mockups but also became the team’s expert in designing apps for PVI. This meant conducting competitive and secondary research, developing testing methodologies, and reaching out to experts to validate my designs.

Competitor App Analysis

I started by creating a template to evaluate competing navigation apps made for PVI. The goal was to determine the key functions of each app, take notes on its design and structure, and capture screenshots of the UI. To supplement this, I also watched talks and read articles about app navigation for PVI and best practices for creating accessible designs. Some of my learnings included:

  1. There are three levels of accessibility: inaccessible, accessible, and usable
  2. Navigational cues need to be considered in design and built into code
  3. Multiple-column layouts don’t register well with screen readers
  4. Label everything descriptively, but concisely
  5. Annotations for labels and voiceovers are typically created by designers to guide developers (sketched below)
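
As a concrete example of learnings 3-5, here is a minimal sketch in React Native, which exposes accessibility props for exactly this purpose; the component and its copy are illustrative, not screens from our actual prototype.

```tsx
// Minimal React Native sketch of learnings 3-5: a single-column layout with
// a descriptive but concise label, plus a hint that tells the screen reader
// what the control does. Component and copy are illustrative only.
import React from "react";
import { Pressable, Text, View } from "react-native";

export function BookRideButton({ onPress }: { onPress: () => void }) {
  return (
    <View>
      <Pressable
        accessible
        accessibilityRole="button"
        accessibilityLabel="Book ride"
        accessibilityHint="Requests a vehicle to your selected pickup point"
        onPress={onPress}
      >
        <Text>Book ride</Text>
      </Pressable>
    </View>
  );
}
```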

With all this in mind, I created an interaction flow to ensure our prototype would satisfy the design criteria mentioned earlier.

Competitive app analysis example for Be My Eyes app

Interaction Flow

I used a screen flow diagram to determine the needed screens and to share interaction behavior with the team, sponsor, professors, and users. The flow was refined multiple times, since I was designing the app while the team built CAD models and calculated technological constraints. Weekly sponsor meetings also brought new knowledge, so the flow often needed updates and alterations.

For the final flow (depicted below), I read about different APIs’ costs and implementation methods to better understand how the back end of the app could be built. This helped inform economic and technical feasibility. We decided to present this flow, along with information on the candidate APIs, in addition to a Figma prototype that could more easily be tested with PVI.

App interaction flow diagram

Screen Development Process

I created the app screens following a 6-step process:

  1. Use sticky notes to list out all the necessary screens
  2. Add features and content that each screen should have
  3. Sketch out various layout ideas for each screen
  4. Create low-fi screens in Figma and further iterate on design
  5. Ask team to vote on favorite layouts
  6. Create consistent set of mid-fi screens based on voting and feedback from team

Step 4 allowed me to try a lot of ideas informed by the best practices and competitive research mentioned before. Step 5’s team voting opened a discussion about which screen flow was most user-friendly. The second round of mockups in step 6 let me work out the content details on each page and establish a more consistent visual design language.

Screen creation process imagery

User and Expert Feedback

Once I had the mid-fi mocks, I could bring them in front of low-vision users and more experienced UI designers to gather feedback. I was able to show my mockups to a group of 10 people with low vision, and some of the insights that drove updates included:

  1. Dark mode is easier to see because many people with low vision also have light sensitivity
  2. Magnification options would need to be supported since text is normally already magnified for people with low vision
  3. Images of storefronts would be useful. Even if they can’t see the details in the image, they can still get a general idea of what to look for at their destination
App edits informed by user and expert testing

I was also able to get advice from a UI designer with experience working on accessible apps. He advised creating developer notations that include details on how to label components, what the screen reader should say, etc.

App notation examples for developers
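
One way such a notation might translate into code, assuming a React Native implementation: a dynamic announcement the screen reader speaks as the vehicle approaches. The wording is illustrative, not our prototype’s actual script.

```tsx
// Hypothetical dynamic announcement for screen reader users, spoken when
// the vehicle's UWB-derived distance updates. Wording is illustrative.
import { AccessibilityInfo } from "react-native";

function announceVehicleDistance(distanceM: number): void {
  // Round to keep the spoken message short and easy to process.
  const rounded = Math.round(distanceM / 5) * 5;
  AccessibilityInfo.announceForAccessibility(
    `Your vehicle is about ${rounded} meters away.`
  );
}
```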

Once I had these notations created, we met again with a totally blind user and read out the screen reader scripts. This was a low-tech way to check that our descriptions and labeling made cognitive sense. Talking through the flow surfaced a few gaps, including the need to schedule a ride for later and to tell the user which side of the vehicle to exit from rather than which side of the road the destination is on.

Overall app flow changes

We suggested to the sponsor that it would be important to build an app prototype with screen reader functionality and check whether blind users could understand and navigate the structure of the app. This would ensure the app is not just accessible, but also usable.

Results

At the end of the semester, we compiled a 67-page report for our sponsor: 15 pages of writing, 6 pages of resources, and 46 pages of appendices with diagrams, tables, renderings, and more. We also sent FAR our CAD files, Figma designs, and signal-testing data so they could continue to refine and develop the system.

We presented our project at the Fall 2021 Georgia Tech Capstone Expo and won first place for best project in Industrial Design and Mechanical Engineering. We also submitted our project to the Richard John Livingston Martin Humanitarian Design Contest in Spring 2022 and were awarded second place.

Pictures of our team at the Capstone Expo

I learned a lot from this project about accessibility and app creation for PVI, and I also honed soft skills for collaborating with cross-functional teammates and communicating with sponsors. I know this was a long one, so thanks for reading!
