Medical Simulated Interactive Manikin

Client: National University of Singapore


KEY WORDS: Mixed Reality, Augmented Reality, HoloLens, Medical, Educational



I conducted user research and medical background research, defined the high-level structure based on the client's requests and vision, designed the user flow and customer journey, created the storyboard for the physical-examination phase, designed the product's interactions and interface, and conducted usability testing and interviews.

I worked alongside a 3D Artist, a Producer, and three Software Engineers.

  • Research
  • User Flow & Customer Journey
  • Storyboard
  • Interaction Design
  • Visual Design
  • Usability Testing
  • Presentation


MediSIM gives students the opportunity to see an interactive virtual patient, lending greater verisimilitude to the experience. The main purpose of this project is to let medical students and trainees practice abdominal examinations and prepare them for examining real patients.



Our target demographics are medical students and trainees, who would be the main users of our product, as well as medical teachers, who would set up scenarios for students and guide them as they go through the experience.





The target users of our product are medical students and trainees who already know about the internal organs, the correlation between diseases and symptoms, and the overall process of an abdominal examination. However, they still lack clinical examination skills and experience in approaching real patients. When I was doing research, a few medical students told me that it is really hard for them to apply their textbook knowledge, because no one likes to be touched and the teaching assistant is often the only person they can practice on.



The internal organs inside Abe are all life-sized abnormal organs, which move to simulate breathing. The reason for only including abnormal organs is that a doctor shouldn't be able to feel anything while palpating a healthy patient (whose organs are normal).




1. Bring Abe to life

A virtual patient is superimposed on Abe so that students can complete the steps of the abdominal examination that involve examining the eyes, hands, and nails, and so that they can practice their skills in context with an interactive patient.

2. Impression of 3D vision

Providing a 3D view of the internal organs alongside related disease information helps students diagnose the disease and understand the shape and arrangement of the (deformed) organs.




The poster on the left has a sci-fi/futurism touch to it, but we decided to go with the one on the right because it conveys the meaning of our project better.



 Final Version



The last one is our finalized logo. I chose the medical symbol, the Caduceus, instead of the internal organs, and green and blue-ish colors to match the clinical theme. I also set our name on one line, with the dot on the "i", for readability and consistency.



We want to help students with not only practicing the procedures, but also learning the diagnostic knowledge in context. To achieve this goal, we designed two different modes, Exam mode and Teach mode.

In Exam mode, medical students have to perform the procedures of the abdominal examination and make a diagnosis based on their observations (the virtual patient's symptoms and the abnormal organs inside Abe). In Teach mode, students have access to more information about various symptoms of related organs and diseases. Moreover, they may choose to take a closer look at realistic 3-dimensional internal organs and pick up any individual organ to manipulate.



Since this experience may be used in class or in a study group, we have to think more broadly.

The customers of our experience are not necessarily the actual user of our product. Thus, we need to design for not only the users of the product but for everyone that's involved in the experience.


Design the experience for 4 devices and 3 groups of users in 2 different modes:





Initial design for Meta 1

Version 1

Version 2 - Blue

Version 2 - White

Given that a relatively dark environment is required while using the AR device (especially the Meta 1), I chose bright colors so that the UI would stand out and be more readable and clear. The first version didn't work well due to the limited field of view and resolution of the Meta 1. After that first failure, I realized that text is really hard to read through the goggles, so I designed a second version with only large text. It turned out to be much more legible and looked good in the device.


Design for HoloLens


State - Disabled

State - Hover

State - Selected


 Menu with instruction




During the migration to the HoloLens, we changed and optimized most of the interactions based on the features of the HoloLens.

For instance, the menu follows your head as you move around, and you can also choose to pin it in 3D space so that you can always come back and find it whenever you want.


To let students take a closer look at the details of the organs, we designed the feature shown in the GIF below: students can walk toward the organs and pick up individual organs to manipulate by saying “rotate”, “move”, or “scale”.

The user can also click the "X-RAY" button to see through the patient's body, in order to learn the positions of the organs inside the body while palpating at the same time.
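As a rough illustration of this keyword-driven interaction, the sketch below maps recognized voice keywords to manipulation modes and toggles the X-ray view. The keywords mirror the commands described above, but the `OrganController` class itself is a hypothetical stand-in for whatever object applies the transforms in the engine:

```python
# Hypothetical sketch of a voice-keyword dispatcher for organ manipulation.
# The keywords ("rotate", "move", "scale", "x-ray") come from the design
# described above; everything else is illustrative.

class OrganController:
    def __init__(self, name):
        self.name = name
        self.mode = None           # current manipulation mode
        self.xray_enabled = False  # see-through view of the patient's body

    def handle_keyword(self, keyword):
        """Map a recognized voice keyword to a manipulation mode."""
        keyword = keyword.strip().lower()
        if keyword in ("rotate", "move", "scale"):
            self.mode = keyword
        elif keyword == "x-ray":
            self.xray_enabled = not self.xray_enabled
        # unrecognized keywords are ignored
        return self.mode

liver = OrganController("liver")
liver.handle_keyword("Rotate")
print(liver.mode)          # rotate
liver.handle_keyword("x-ray")
print(liver.xray_enabled)  # True
```

In the real project the recognized keywords would come from the headset's speech recognizer; the dispatch logic stays the same either way.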


NEW UI (Use lines to lead the user)


The line comes out from the organ. Since the field of view is quite limited, the line guides the user to look around in the "world".
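The underlying check is simple: if the angle between the user's gaze direction and the direction to the organ exceeds half of the device's field of view, the organ is off-screen and the guiding line should be shown. A minimal sketch of that test (the 30° half-angle is an illustrative value, not the HoloLens spec):

```python
import math

def needs_guidance_line(gaze_dir, to_target, half_fov_deg=30.0):
    """Return True when the target lies outside the user's field of view,
    i.e. when a line from the organ should guide the user to look around.

    gaze_dir and to_target are 3D direction vectors (need not be unit
    length); half_fov_deg is an illustrative half field-of-view angle.
    """
    dot = sum(g * t for g, t in zip(gaze_dir, to_target))
    norm = (math.sqrt(sum(g * g for g in gaze_dir))
            * math.sqrt(sum(t * t for t in to_target)))
    # clamp to avoid domain errors from floating-point rounding
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle > half_fov_deg

# Organ straight ahead: no line needed.
print(needs_guidance_line((0, 0, 1), (0, 0, 5)))  # False
# Organ off to the side: show the guiding line.
print(needs_guidance_line((0, 0, 1), (1, 0, 0)))  # True
```

A production version would run this per frame and anchor the line's endpoints at the organ and the edge of the visible frustum.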



MediSIM keeps track of all the necessary steps in the abdominal examination procedure and provides students with a report at the end showing the correct and missing steps.
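Conceptually, that report is just a comparison of the steps the student performed against the required checklist. A minimal sketch, with illustrative step names rather than the actual MediSIM protocol:

```python
# Hypothetical checklist tracker: records steps as the student performs
# them and reports which required steps were done and which were missed.

REQUIRED_STEPS = [  # illustrative names, not the real examination protocol
    "inspect hands and nails",
    "inspect eyes",
    "light palpation",
    "deep palpation",
    "percussion",
    "auscultation",
]

class ExamSession:
    def __init__(self):
        self.performed = []

    def record(self, step):
        """Record a step once, in the order it was first performed."""
        if step not in self.performed:
            self.performed.append(step)

    def report(self):
        """End-of-exam report: correct steps and missing steps."""
        correct = [s for s in REQUIRED_STEPS if s in self.performed]
        missing = [s for s in REQUIRED_STEPS if s not in self.performed]
        return {"correct": correct, "missing": missing}

session = ExamSession()
session.record("inspect eyes")
session.record("light palpation")
print(session.report()["missing"][0])  # inspect hands and nails
```

The real system would also need to check ordering for steps where sequence matters, but the correct/missing split above is the core of the report.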


At first, we used an extra physical button as a trigger for some interactions. However, after some user testing, we found that it was not intuitive and felt superfluous, especially in Augmented Reality. So we changed it to voice control: the user says certain words (e.g., "check") to interact with the patient. The last icon is the final one we used.


After we got the HoloLens, we made the interactions more natural by using hand gestures instead of spoken commands, which gives users more control and brings the experience closer to a real examination.



I designed this to make sure that the user is wearing the HoloLens correctly and can see the whole display.



We received a lot of positive feedback from medical students and from guests in the medical and education industries.



VR Storytelling | Oculus Story Studio




Prologue explores emotional engagements in the realm of Virtual Reality.

The project creates an immersive experience with story beats and progression of events to evoke emotional reactions from players using various methods including movement, mechanics, and visual and audio components.

Our goal is to find a systematic way to evoke specific emotions in a story experience and to create and explore novel affordances for creating emotions for future VR developers.



I led the design of PROLOGUE from initial research to final deliverable. I worked alongside a 3D Artist and two Software Engineers.

  • Product Research
  • User Flow
  • Storyboarding
  • Interaction Design
  • User Interviews & Usability Testing





Options to consider for brainstorming:

  • Movement:
    • What is the impact of moving the player on the experience of the story/emotion?
    • What is the impact of allowing the player to move vs not allowing them to move on the experience of the story/emotion?
    • What are different forms of representing movement in VR (swimming, flying, bouncing) and how do they relate to emotion and storytelling? Can you invent a new kind of movement for VR experiences?
    • How can the audience be permitted to move objects in the VR space in order to evoke specific emotions or storytelling beats?
  • Mechanics:
    • What actions create the desired feeling for your story?
    • What kinds of player actions evoke specific emotions?
    • How can combining actions create mechanics that express hybrid emotions?
    • How can performing certain actions or mechanics in a particular order evoke specific emotions or story experiences?
  • Visual & audio affordances:
    • What can you do to direct the audience's attention to particular places without being heavy handed?
    • What sorts of visual cues create mystery or drive the player to move or take certain actions?
    • What visual or audio cues suggest that something is interactive and/or how to interact with it?
    • How can gaze triggers be used to seamlessly create pacing and ensure that the audience doesn't miss a critical plot point?
    • How fast or slow should a story progress (be paced) to allow the audience to absorb everything but not get bored?
  • Aesthetics:
    • How can music be uniquely used to evoke specific emotions in VR?
    • How does proximity or movement intersect with music or sound in VR?
    • What visual effects can create certain emotional experiences or progressions uniquely in VR?
    • How can the use of space, movement, and light intersect uniquely in VR to effect particular emotional progressions or story experiences?
    • How can the shape of objects in the VR space be used to create specific emotions or story experiences?
    • How can transitions between scenes be embellished with aesthetic elements that enhance the core emotion or story experience?

I spent most of the first week of pre-production doing research, including material on how to tell an emotionally engaging story as well as short films and experiences with strong storytelling. I was greatly inspired by the Black Mirror episode "White Bear" (season 2) and other short films with similar concepts. In the end, we agreed on creating a looping story in which the beginning and the ending are largely the same, but seen from a different perspective.

Initially we had two different ideas for the story and interactions. One was about reincarnation: the player would create a doll/robot and, in the end, become (or "give birth to") the thing they had just created. The other was a darker story in which the player makes some "wrong" moves without realizing it at first and, at the end of the experience, must decide whether to frame another person or sacrifice themselves (whether to become good or evil).



Based on our initial idea, I created this emotional arc that we want players to experience. We want the player to feel curious at first, then experience a set of positive emotions followed by negative emotions. The twist is a sudden transformation that includes a change of scale and perspective. Finally, we want the player to come away with some insight after the whole experience.



In the first scene, the player enters a foggy forest and sees some fireflies in the distance. Then the player notices a little creature crying in the forest. The creature looks back and forth between the player and the dead flowers on the ground, asking for a flower. Once the player picks up a dead flower, it comes alive in the player's hand.

Two things we learned from the iterations in this part are that 3D sound and feedback are essential in our experience, since they let us indirectly steer the player along the main story line. For example, the player hears the creature's directional crying at the beginning, so they know where to find the sound source: the creature.

In the early playtest sessions, many playtesters said they wanted to interact with the creature, but it did not respond to their actions. Based on that feedback, we added more animations to make the creature more responsive.





Constant playtesting is essential for us, since we need to test how effectively the experience conveys the story and evokes the intended emotions.

So we created a playtest schedule for the whole semester. We planned to test the transformation and the first half of the experience during our halves presentation, test iterations from halves on 11.4, complete the experience by 11.11, test the emotions throughout the experience on 11.17, and test the emotion toward the ending during our soft opening.


After every playtest, we gathered feedback and learned what worked and what didn't, and iterated on the experience accordingly.


We tested our experience with 54 people in total. 49 out of 54 playtesters understood the general story. Most playtesters found the transformation effective and impactful. The creature at the beginning seemed cute or neutral to most playtesters. About half of the playtesters didn't feel scared, mostly those from the soft opening. And only a few experienced a dilemma at the end, though different dilemmas than we expected.



Indirect Control:

Two things we learned from the iterations are that 3D sound and feedback are essential in our experience. For instance, in the final experience the player hears the creature's directional crying at the beginning, and also hears scary sounds from the monsters coming from the indicated direction.

Initial Ideas:

It’s important to perform thorough research during the first few weeks of pre-production. Our programmers researched the Unreal Engine pipeline while the designers went through a lot of emotionally engaging stories. This way, we were able to identify the limitations of our platform and work around them. We also identified the classic emotional arc of a story and built our experience on it.

Dilemma creation:

It’s not easy for players to develop sympathy for a character in a short time, and it’s even harder to make them choose between themselves and a game character within a short period. In virtual experiences, players tend to think mostly about themselves; getting them to step outside themselves and think about a game character requires a fairly long period of relationship building.





Final Choice:

Simple; Delicate; Film Feeling

The thin lines also hint at the insights behind our story
(the story after the prologue), which is what we want to convey through our experience.



Mysterious; Arouse Curiosity; Related to the story