
JARVIS

Joint Augmented Reality Visual Informatics System (JARVIS) is a heads-in display (HID) visual aid for the Exploration Extravehicular Mobility Unit (xEMU) spacesuit that enables crew EVA autonomy. This component is part of the xEMU Informatics Subsystem.

Role: Operations Team Lead

Mainly contributed to user research, scenario concepts, and requirements development.

Period: 2019 - 2020

DESIGN GOAL

As Augmented Reality technology emerges, the JARVIS team needed to identify its applications and prove its utility as a way to increase the crew's independent task authority.

DESIGN PROCESS

[Image: design process]

RESEARCH

NASA has been using essentially the same spacesuit design since the early 1980s.

With the US government's announcement of "Boots on the Moon" by 2024, it is the perfect time to apply new technology to the spacesuit. JARVIS aims to support future space operations for in-suit astronauts by providing a communication medium through a Heads-In Display and AR technology inside the spacesuit.

To find the right use cases and prove the system's usefulness, we set three research goals and conducted market research and a requirements review of high-level spacesuit development; we are also developing user scenarios and conducting A/B testing.

Key Research Goals

Identify technical limitations and requirements for applying AR to JARVIS

Suggest potential operational uses of the JARVIS system to support exploration missions

Explore how JARVIS would affect task performance and crew independence

Market Research

To capture design considerations for Augmented Reality, I reviewed UX design requirements and guidelines from industry, captured the relevant guidelines, and categorized them into now, next, and future in order to prioritize them by feasibility given the current state of space mission technology.

[Image: JARVIS capability roadmap]
[Image: guideline review]
[Image: guidelines]

Requirement Review

We reviewed high-level spacesuit requirements to discover what information needs to be displayed on the helmet to support spacesuit functionality.

[Image: benefits]

Brainstorming

To identify the details of the design features in each use case, our team held a deep-dive session to brainstorm the potential information that could support user awareness during a space mission. Then, based on environmental restrictions and available technical support, we categorized the ideas from the session into now, next, and future.

[Image: brainstorming]

User Scenario

To identify how users can interact with a visual reference on JARVIS to perform their exploration mission, we first analyzed the task flow of each use case and linked each task with the relevant capabilities. Then we selected a specific task from the task flow and described how the user could interact with the JARVIS system to perform it.

[Image: photo/video task flow]
[Image: user scenario]

A/B Testing

To explore the effect of a visual reference on task performance and crew independence in the procedure use case, we performed A/B testing with three types of visual reference: no visual reference, a paper-printed reference on the wrist, and a paper-printed reference in front of the user's view. The procedure, materials, and results are still confidential.

[Image: A/B testing]

© 2022 by Yunkyung Kim.
