











MotionLab
2023
Collaboration with UPitt Bio Outreach
AR
Hands-on, lab-based science experiments are critical in STEM education but are often expensive and inaccessible for students with learning disabilities (LDs). Our project aims to create an interactive virtual simulation of a biology lab experiment that is more affordable and focused on an accessible experience for students with LDs. We are building off of the Pitt Bio Outreach (PBO) program's use of multisensory videos to deliver lab instruction, which incorporates audiovisual aids to increase accessibility.
MotionLab uses the Leap Motion sensor to map a user's hand gestures into a digital 3D environment, walking students through an existing experiment developed by PBO that teaches students how to use micropipettes in a biology lab context. The two main learning goals of this experiment are (1) to correctly use the micropipette and (2) to understand lab protocol, such as hygiene and best practices.
Demo of Final Experience


Final Experience Layout


Pitt Bio Outreach Researchers Using Final Prototype


Users Participating in Contextual Inquiry


Final Prototype User Testing Session
Research


The focus of our initial research was to gain a comprehensive understanding of the affordances and limitations of existing science education tools and methods, as well as potential improvements to these tools through the use of XR. In addition, we wanted to understand the student experience of learning biology through lab experiments.
Research Insights:
Students switched attention back and forth between multiple areas during steps.
Most errors occurred from forgetting steps or improper use of the equipment.
Immediate feedback is crucial to error reduction and learning.
Expert Interviews: We conducted 2 structured interviews with experts in the field of science education and learning disabilities to develop a conceptual model of teaching and conducting science experiments and understand the most important procedures.
Semi-Structured Interviews: We interviewed 4 college students who have lab experience to understand how they feel about the learning process and discover any common frustrations or helpful elements.
Contextual Inquiry: We observed 4 undergraduate college biology students performing the PBO micropipette experiment using multisensory video instruction in order to uncover low-level details of experiment procedure and improve our understanding of students' science experiment processes.
These methods were supported by secondary research methods including literature reviews and competitive analysis.






Contextual inquiry methodology
Lo-Fi Prototyping and Testing
To gather more feedback on our choice to use the Leap Motion system, we conducted 3 prototype testing sessions with college students of various backgrounds. Most participants had limited experience using micropipettes, and we were able to recruit one participant with ADHD to align with our target user's profile.




We used the Wizard of Oz testing method to simulate an interactive virtual environment that reacts to participants' hand motion. After the participants finished all the steps, they were given a real micropipette and ran through the prototype again so that we could evaluate how well skills and knowledge transferred from the digital tool to the physical one.



The Wizard of Oz testing was supported by a transfer-of-learning assessment with real micropipettes
Gesture and Transfer of Learning


The Leap Motion sensor translates hand gestures into 3D space, allowing for realistic interaction with 3D objects. However, due to the sensitivity of the sensor and lack of haptic feedback (such as resistance when pressing a micropipette plunger), we needed to abstract gestures to help in communicating the use of various science tools.
Key Gestures were designed to reinforce transfer of learning
Grabbing Objects: Many steps in the experiment require users to pick up objects and move them around the table. By performing a grabbing gesture near a movable object, users can lock an object to a set orientation in their hand.
Plunging and Sliding: Simulating the usual thumb movement for pushing down a pipette plunger was not possible with the Leap Motion; instead, we allowed users to push down the plunger using their non-dominant hand and provided feedback using a vertical bar on the left of the interface.
Dynamic UI: We also carried several UI elements from the lo-fi prototype into the 3D environment. Users can see labels and instructions for objects they may not be familiar with, and can open a menu by opening their non-dominant hand's palm to perform actions such as restarting the experiment or resetting the position of the micropipette.
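The grabbing interaction described above can be sketched roughly as follows. This is a hypothetical Python sketch, not the project's code: the classes, thresholds, and helper names here are illustrative, with `grab_strength` only loosely modeled on the Leap Motion SDK's hand-tracking data (0 for an open palm, 1 for a closed fist).

```python
from dataclasses import dataclass
import math

# Hypothetical sketch of the grab interaction; all names and thresholds
# are illustrative assumptions, not the project's actual implementation.

@dataclass
class Obj:
    position: tuple                     # (x, y, z) in metres
    movable: bool = True
    rotation: tuple = (0.0, 0.0, 0.0)

@dataclass
class Hand:
    palm: tuple                         # palm position (x, y, z)
    rotation: tuple = (0.0, 0.0, 0.0)
    grab_strength: float = 0.0          # 0 = open palm, 1 = closed fist

GRAB_RADIUS = 0.08      # how close the hand must be to pick an object up
GRAB_THRESHOLD = 0.9    # fist closure required to count as a grab gesture

def update_grab(hand, objects, held=None):
    """Lock the nearest movable object to the hand while the grab is held."""
    if hand.grab_strength >= GRAB_THRESHOLD:
        if held is None:
            reachable = [o for o in objects
                         if o.movable and math.dist(hand.palm, o.position) < GRAB_RADIUS]
            if reachable:
                held = min(reachable, key=lambda o: math.dist(hand.palm, o.position))
        if held is not None:
            held.position = hand.palm       # object follows the palm...
            held.rotation = hand.rotation   # ...locked to a set orientation
    else:
        held = None                         # opening the hand drops the object
    return held
```

Locking the object's orientation to the hand, rather than simulating full physics, is what keeps the interaction forgiving for the sensor's noisy tracking.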




To further aid transfer of learning and simulate haptic feedback, we included exaggerated forms of feedback using audiovisual cues.
Haptic Feedback Simulation: A major part of recognizing the difference between the first and second stops on a micropipette plunger relies on physical feedback from the pipette itself, so we had to find a way to simulate this in a digital space where users aren't actually holding a physical object. We accomplished this through exaggerated feedback in the plunger slider: the user has to push their hand down with more "force" to go past the first stop. In addition, when the user brings their hand close to the first stop position on the slider, it immediately "snaps" to the first stop to simulate a "clicking" sensation.
Audio and Visual Feedback: In addition to the pressure feedback for the slider, we designed audio feedback where the slider makes a "click" noise every time the user moves their hand to either the first or second stop. Audio feedback also lets the user know when they've completed a task successfully.
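The snap-and-resistance behavior described above can be sketched as a mapping from raw hand depth to the displayed plunger position. This is an illustrative sketch under assumed, made-up threshold values (`FIRST_STOP`, `SNAP_ZONE`, `RESISTANCE`), not the project's actual tuning; the "click" sound would be triggered whenever the returned position first reaches a stop.

```python
# Illustrative plunger-slider mapping; all thresholds are assumed
# normalized values, not the project's actual implementation.

FIRST_STOP = 0.6     # normalized depth of the micropipette's first stop
SECOND_STOP = 1.0    # full depression (second stop)
SNAP_ZONE = 0.05     # raw depths this close to the first stop snap onto it
RESISTANCE = 0.15    # extra raw travel needed to push "through" the first stop

def plunger_position(raw_depth):
    """Map raw hand depth (0..1+) to the displayed plunger position.

    Near the first stop the slider snaps and holds, simulating the click
    and resistance a real plunger gives; pushing further than RESISTANCE
    breaks through toward the second stop.
    """
    if raw_depth < FIRST_STOP - SNAP_ZONE:
        return raw_depth                   # free travel before the first stop
    if raw_depth <= FIRST_STOP + RESISTANCE:
        return FIRST_STOP                  # snap to / hold at the first stop
    # past the resistance band: remap the remaining travel between the stops
    overshoot = raw_depth - (FIRST_STOP + RESISTANCE)
    span = SECOND_STOP - (FIRST_STOP + RESISTANCE)
    return min(SECOND_STOP,
               FIRST_STOP + (SECOND_STOP - FIRST_STOP) * overshoot / span)
```

The dead zone between the snap and the break-through point is what exaggerates the "force" the user must apply, standing in for the physical resistance of a real plunger.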



Mid-fi prototype testing with basic Leap Motion interaction and transfer-of-learning assessment
UI and Design Systems
With accessibility at the core of our project, our user interface was designed and iterated upon with a primary focus on students with learning disabilities.
Multisensory Learning: The user interface and experiment instructions were delivered through both a text display and audio narration, so users could process the instructions through the mode they prefer. In future iterations, more personalization of this feature, such as the ability to turn the narration off or adjust its speed, would help users adapt the tool to their needs.




Accessible UI Design System: Using a combination of video game UI principles and accessibility heuristics, we designed a UI system which complemented our interaction systems and environmental design while being accessible to students with learning disabilities.


Accessible UI Design System