Medical Simulated Interactive Manikin

Client: National University of Singapore


KEY WORDS: Mixed Reality, Augmented Reality, HoloLens, Medical, Educational



I conducted user research and medical background research, defined the high-level structure based on the client's requests and vision, designed the user flow and customer journey, created the storyboard for the physical examination phase, designed the interaction and interface for the product, and conducted usability testing and interviews.

I worked alongside a 3D Artist, a Producer, and three Software Engineers.

  • Research
  • User Flow & Customer Journey
  • Storyboard
  • Interaction Design
  • Visual Design
  • Usability Testing
  • Presentation


MediSIM provides students with the opportunity to see an interactive virtual patient, lending greater verisimilitude to the experience. The main purpose of this project is for medical students/trainees to practice abdominal examinations and to prepare them for conducting examinations on real patients.



Our target demographics are medical students and trainees, who are the main users of our product, as well as medical teachers, who set up scenarios for students and teach alongside them as they go through the experience.





The target users of our product are medical students and trainees who already have knowledge of the internal organs, the correlations between diseases and symptoms, and the process of the whole abdominal examination. However, they still lack clinical examination skills and experience approaching real patients. I talked with a few medical students during my research, and they told me that it's really hard for them to practice what they learn from books, because no one likes to be touched and the TA is the only person they can practice on.



The internal organs inside Abe are all life-sized abnormal organs, which move to simulate the motion of breathing. The reason for only putting in abnormal organs is that a doctor shouldn't be able to feel anything while palpating if the patient is healthy (i.e., the organs are normal).




1. Bring Abe to life

A virtual patient is superimposed on Abe so that students can complete the steps of the abdominal examination that involve examining the eyes, hands, and nails, and so that they can practice their skills in context with an interactive patient.

2. Provide 3D visualization

We provide a 3D view of the internal organs together with related disease information to help students diagnose the disease and understand the shape and arrangement of the (deformed) organs.




The poster on the left has a sci-fi/futuristic touch to it, but we decided to go with the one on the right because it better conveys the meaning of our project.



Final Version



The last one is our finalized logo. I chose the medical symbol, the caduceus, instead of the internal organs, and used green and blue-ish colors to fit the clinical theme. I also set our name on one line, with the dot on the "i", for readability and consistency.



We want to help students with not only practicing the procedures, but also learning the diagnostic knowledge in context. To achieve this goal, we designed two different modes, Exam mode and Teach mode.

In Exam mode, medical students perform the procedures of an abdominal examination and make a diagnosis based on their observations (the symptoms of the virtual patient and the abnormal organs inside Abe). In Teach mode, students have access to more information about the symptoms associated with particular organs and diseases. Moreover, they may take a closer look at realistic three-dimensional internal organs and pick up any individual organ to manipulate.



Since this experience may be used in a class or a study group, we had to think more broadly.

The customers of our experience are not necessarily the actual users of our product. Thus, we needed to design not only for the users of the product but for everyone involved in the experience.


We designed the experience for 4 devices and 3 groups of users across 2 different modes:





Initial design for Meta 1

Version 1

Version 2 - Blue

Version 2 - White

Given that a relatively dark environment is required while using the AR device (especially the Meta 1), I chose bright colors so that the UI would stand out and be more readable and clear. The first version didn't work well due to the limited field of view and resolution of the device we were using (the Meta 1). After that failure, I realized that text is very hard to read through the goggles. So I designed a second version with only large text. It turned out well and looked good in the device.


Design for HoloLens


State - Disabled

State - Hover

State - Selected


Menu with instruction




During the migration to the HoloLens, we changed and optimized most of the interactions to take advantage of the HoloLens's features.

For instance, the menu follows your head as you move around, and you can also pin it in 3D space so that you can always come back and find it whenever you want.
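The follow-then-pin behavior can be modeled roughly as follows. This is a simplified Python sketch of the idea, not our actual HoloLens code; the follow distance and smoothing factor are illustrative assumptions:

```python
# Simplified model of a "tag-along" menu that drifts toward a point in
# front of the user's head each frame, until the user pins it in place.

def lerp(a, b, t):
    """Linear interpolation between two 3D points."""
    return tuple(a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b))

class TagAlongMenu:
    def __init__(self, distance=1.5, smoothing=0.2):
        self.distance = distance    # meters in front of the head (assumed)
        self.smoothing = smoothing  # how quickly the menu catches up
        self.pinned = False
        self.position = (0.0, 0.0, 0.0)

    def pin(self):
        """Freeze the menu at its current world position."""
        self.pinned = True

    def unpin(self):
        self.pinned = False

    def update(self, head_position, head_forward):
        """Called every frame: drift toward a point in front of the head."""
        if self.pinned:
            return  # the menu stays where the user left it
        target = tuple(p + f * self.distance
                       for p, f in zip(head_position, head_forward))
        self.position = lerp(self.position, target, self.smoothing)
```

The smoothing step is what makes the menu feel like it "catches up" with the user rather than being rigidly glued to the gaze.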


To let students take a closer look at the details of the organs, we designed the feature shown in the GIF below: a student may walk toward the organs and pick up an individual organ to manipulate by saying "rotate", "move", or "scale".
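The voice interaction boils down to mapping a small set of recognized keywords onto manipulation modes. A minimal Python sketch of that dispatch, assuming the speech recognizer hands us a keyword string (the mode-handling details are illustrative):

```python
# Map recognized voice keywords to manipulation modes for a picked-up
# organ. The keyword set matches the spoken commands described above;
# everything else about the handling is an assumption for illustration.

MODES = {"rotate", "move", "scale"}

class OrganManipulator:
    def __init__(self):
        self.mode = None  # no manipulation mode active yet

    def on_voice_command(self, keyword):
        """Switch manipulation mode when a known keyword is heard.

        Returns True if the command was recognized, False otherwise.
        """
        keyword = keyword.lower().strip()
        if keyword in MODES:
            self.mode = keyword
            return True
        return False  # unrecognized command; keep the current mode
```

Keeping the command vocabulary this small is deliberate: with only three distinct words, recognition errors are rare and students don't have to memorize a command list.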

As for the organs, the user can click the "X-RAY" button to see through the patient's body, in order to learn the positions of the organs inside the body while palpating at the same time.


New UI (using lines to lead the user)


The line comes out from the organ. Since the field of view is quite limited, I use the line to guide the user to look around in the "world".
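The decision of when to show such a guide line reduces to a visibility test: if the organ lies outside the headset's field of view, draw a line leading the gaze toward it. A rough Python sketch of that test, where the half-FOV angle is an illustrative assumption rather than the device's exact specification:

```python
import math

# Decide whether an organ is outside the field of view, in which case a
# guide line should lead the user's gaze toward it. The FOV value below
# is an assumed placeholder, not the HoloLens's exact specification.

HALF_FOV_DEG = 15.0

def angle_between(v1, v2):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to avoid domain errors from floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def needs_guide_line(head_pos, gaze_forward, organ_pos):
    """True if the organ falls outside the view cone, so a line should
    be drawn from the gaze point toward the organ."""
    to_organ = tuple(o - h for o, h in zip(organ_pos, head_pos))
    return angle_between(gaze_forward, to_organ) > HALF_FOV_DEG
```

When the test returns True, the line is anchored at the organ and extended into the visible region, so some part of it is always on screen for the user to follow.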



MediSIM keeps track of all the necessary steps in the abdominal examination procedure and provides students with a report at the end showing the correct and missing steps.
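Conceptually, the report is a comparison between a required checklist and the steps the student actually performed. A minimal Python sketch of that comparison; the step names here are illustrative placeholders, not the actual MediSIM examination protocol:

```python
# Build a post-exam report by comparing the steps the student performed
# against the required checklist. Step names are illustrative only.

REQUIRED_STEPS = [
    "wash hands",
    "inspect eyes",
    "inspect hands and nails",
    "light palpation",
    "deep palpation",
]

def build_report(performed_steps):
    """Return which required steps were done ("correct") and which were
    skipped ("missing"), preserving the checklist order."""
    done = set(performed_steps)
    return {
        "correct": [s for s in REQUIRED_STEPS if s in done],
        "missing": [s for s in REQUIRED_STEPS if s not in done],
    }
```

Preserving checklist order in both lists matters for the report: it lets a student see not just which steps were skipped, but where in the procedure they fell out of sequence.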


At first, we used an extra physical button as a trigger for the user's interactions. However, user testing showed that it was unintuitive and superfluous, especially in Augmented Reality. So we changed to voice control: the user says certain words ("check") to interact with "the patient". The last icon is the final one that we used.


After we got the HoloLens, we made the interactions more natural by using hand gestures instead of spoken commands, which gives users more control and brings the experience closer to the real situation.



I designed this to make sure that the user is wearing the HoloLens correctly and can see the whole screen.



We received a lot of positive feedback from medical students and from guests in the medical and education industries.