Revisiting an old project with some UI updates
Original Project Timeline: July 2013 - July 2015
Class: UX Generalist
The artifact
I designed and built this functional prototype while working as a research practitioner at Georgia Tech. It's a mobile web app that lets users explore situated media around Midtown Atlanta via augmented reality.
These screenshots from the original web app feature situated media using augmented reality. The top left shows the main system menu, the top right shows a video about a sculpture situated at the sculpture's geolocation, and the bottom row shows a building plan overlaid on its future building site.
This video features functions and designs proposed during the development process.
Revisiting the Design
I updated my previous designs using Apple's iOS Human Interface Guidelines and the experience I've gained as a UX designer since I first built the app. I applied more current usability conventions and made sure the UI directs users' attention to the app's primary purpose: exploring Midtown Atlanta through situated content and augmented reality features. Each icon is a module representing a themed set of augmented reality content or apps relevant to navigating Midtown Atlanta.
A. Apple’s iOS Human Interface Guidelines recommend keeping the system status bar visible unless hiding it offers a clear benefit.
B. More screen real estate is given to the modules, the primary purpose of the app.
C. There are few things more satisfying than the feeling of pushing a big button.
D. Users can add more modules here. The original app tried to integrate other apps and services as a one-stop shop for places and things of interest around Midtown Atlanta.
E. Users can make changes to their modules, get help, and set overall app preferences from the toolbar.
F. Screen elements are minimized to favor a larger viewport for augmentations.
G. Media content markers pop out a bit more with a container and outline.
H. Darker icons denote previously viewed media artifacts.
I. Windowed icons at the edge of the screen point toward nearby content that isn’t immediately in frame.
J. Users can filter content types, toggle augmentations, get help, find recommended viewing spots, and adjust settings via the toolbar.
K. Users can still see what’s situated in AR in the viewport while reading about it below.
L. Text content slides up from the bottom when its in-field icon is pressed. This info “drawer” retracts when the icon is pressed again.
M. Users can also tap the viewing field to retract popup content.
N. When a video icon is pressed, a video is overlaid in the environment so the user can peer through a window in the wall of reality to see the hidden layer “underneath.”
O. Rotating the device pulls the video into full screen mode. Reversing this action returns the system to the default view state.
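The windowed icons in (I) imply a classification step: given the user's position, a marker's geolocation, and the device's compass heading, decide whether the marker falls inside the camera's field of view or should be pinned to a screen edge. Here is a minimal sketch of that logic in JavaScript; the function names, data shapes, and the 60° field-of-view default are illustrative assumptions, not code from the original app.

```javascript
// Sketch of in-frame vs. windowed marker classification.
// All names and the FOV default are illustrative, not from the original app.

const toRad = (deg) => (deg * Math.PI) / 180;
const toDeg = (rad) => (rad * 180) / Math.PI;

// Compass bearing (degrees clockwise from north) from the user to a marker,
// using the standard great-circle bearing formula.
function bearingTo(user, marker) {
  const lat1 = toRad(user.lat), lat2 = toRad(marker.lat);
  const dLon = toRad(marker.lon - user.lon);
  const y = Math.sin(dLon) * Math.cos(lat2);
  const x = Math.cos(lat1) * Math.sin(lat2) -
            Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Classify a marker given the device's compass heading and a horizontal
// camera field of view (~60° is a plausible phone-camera default).
function classifyMarker(user, marker, heading, fovDeg = 60) {
  // Signed angular difference in (-180, 180]: negative = left of center.
  const diff = ((bearingTo(user, marker) - heading + 540) % 360) - 180;
  if (Math.abs(diff) <= fovDeg / 2) {
    return { state: "in-frame", offset: diff };
  }
  return { state: "windowed", side: diff < 0 ? "left" : "right" };
}
```

For example, with the device facing north, a marker due east of the user would come back `windowed` on the `right`, which is where its edge icon would be drawn.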
Using the native platform's best-practice guidelines freed up space on the screen and space in my mind; instead of redesigning common components that iOS already provides, I could focus my attention on designing the novel, augmented reality parts of the app. I found that I was more innovative and deliberate in my design decisions the second time around.
Next Level
Revisiting this project was a lot of fun, and I'd like to continue it with the following next steps:
Continue updating existing features and design others that weren't realized in the first version due to time constraints.
Make an interactive prototype of the new designs to explore animation options and get closer to realizing a field-ready system.
Rebuild the functional parts of the app in a native mobile environment, such as Apple iOS, with a newer augmented reality tool like Apple's ARKit or ARToolKit + Unity3D.
Restart Level