- Oculus Research Project
(now called 'Facebook Reality Lab')
Fartalk is a virtual reality environment designed to enable research toward achieving a photorealistic social VR experience.
I was responsible for user and background research, defining the high-level structure of the project together with the team, creating the information architecture, and designing interactions as well as both graphical and environmental interfaces. I also conducted regular usability tests and interviews in line with the development timeline, gathered feedback, and made design changes to improve the experience.
User and Market Research
User Interface Design (both graphical and environmental)
Usability Tests and Interviews
There are three core goals of this project:
1. Data-Collection: Enable data collection of real social interactions in virtual reality for social prediction and other research studies at the Oculus Research Pittsburgh (ORP) lab.
2. External Demo System: Provide a VR environment for users to conveniently test and experience the head mounted capture (HMC) system and reconstructed social signals such as facial motion.
3. Internal Evaluation System: Provide quantitative and qualitative means of evaluating the accuracy, precision, and efficiency of the head mounted capture system, the avatar construction, animation, and prediction sub-systems.
The lab's research projects progressed in parallel with ours, which made our team's work much more complex: we had to communicate continuously with other teams about their research and understand what we could actually put into the experience. Sometimes completely new research results needed to be integrated into the experience.
The challenge as a designer is to look at the big picture and design a product that not only provides users a good experience (which in turn supports data collection), but is also minimally affected by new research and technologies integrated along the way. New research results should improve, not disrupt, the experience. How do we create an experience that users are willing to spend time in, so that the collected data can serve social prediction research? How do we create an experience and structure sustainable enough for future research integrations?
After interviews and research on the target users, who are mainly internal employees, I learned that we are targeting people who are familiar with VR to a certain extent and will have the HMC system installed by their desks. Target users will use Fartalk mostly at their workplaces, so the default posture will be seated. In terms of 'social', the application can serve meetings as well as chatting with friends.
So we decided to design Fartalk mainly as a chatting and meeting tool in which actual facial movements are captured and presented to friends and co-workers in VR.
There were many uncertain factors in the early phase of development. I talked with several people across different teams to make sure I had all the information: how the system would work on the backend, and what the possibilities were. On the hardware side, I tested the HMC system thoroughly to understand the tracking technology, as well as the future plans for the tracking and headset design.
At first, I designed for an ideal situation. As I learned more details and limitations along the way, I started to cut unnecessary steps and restructure the flow.
In the beginning, users needed to enter their name on the desktop so that others could see who was in which room in the experience. In the later version, users only need to enter their name once on their own machine.
BEFORE — We focused heavily on the research perspective (almost as much as on the actual user side).
AFTER — After several iterations, we realized it was more important to focus on the users, since they might not all be researchers. So we decided to hide the research features and make them unobtrusive.
BEFORE — We used to let users choose their avatars first and then pick from a number of different chatting rooms.
AFTER — We introduced totally different cartoon-style avatars on top of the realistic ones. Since we wanted the room style to match the avatar style, we decided to let users choose a room before choosing their avatar. There is also a blackbox, a space with the best lighting environment, where users can clearly see all facial and eye movements.
The greyed-out parts are functions that we haven't implemented yet but plan to add in the near future.
INTERACTION AND INTERFACE DESIGN
LOGIN INTERFACE ON PC
MENU DESIGN (MOCK-UP IN MAYA)
2D UI with 3D animations
3D UI with animations
In this version, the watch serves as a mirror for users to check their face and facial expressions; it can also be used for debugging purposes.
In a future version, the watch can serve as quick access to more features.
USABILITY TESTS AND INTERVIEWS
Usability tests and interviews were conducted internally in the lab every 2-3 weeks to get feedback from our actual future users and improve the experience. These sessions also helped us push for a stable test build and catch unknown bugs in time.
I found many issues and new ideas throughout the testing sessions and documented all of them. We fixed some of the issues; however, due to our schedule and the implementation effort required, some of the ideas were deferred to future plans.