Pianoverse: Immersive Piano Learning with Wearable Haptic Feedback Glove

UX DESIGN | RESEARCH | PROTOTYPING 

How can we improve the music learning experience in Virtual Reality?

The project aims to revolutionize piano learning by introducing a VR piano with a haptic glove add-on. This virtual experience eases the traditional challenges of learning piano by marking the notes and chords to play directly on a virtual keyboard.


The haptic glove enhances the learning process by guiding finger movements through note sequences, helping build muscle memory for chords and songs. Targeted at beginners with limited resources, the system provides a cost-effective and space-efficient alternative to traditional pianos and lessons.

OVERVIEW

Demo

We decided to work on piano learning in VR.


Ideation and Brainstorming

Wearables

Muscle stimulation

Muscle memory

Guidance

Visual Cues

Haptic Feedback

Error correction

Ease of learning

Final Idea

THE PRODUCT

The Virtual Reality Piano Tutor with EMS Feedback is an innovative educational technology project designed to revolutionize piano learning experiences. The primary goal is to create an immersive virtual reality environment where users can learn to play the piano with real-time feedback provided through an electric muscle stimulator (EMS) integrated into a specialized glove.

Wearable EMS (Electric Muscle Stimulation) feedback gloves for piano learning in VR

The gloves are meant to hold the fingers back when the EMS is activated.

Steps to restrain fingers:

A servo motor rotates to wind up a piece of string tied around the servo horn.

The string is connected to a ring that can be fastened around the user's finger.

This restricts the user's range of motion, so that they can't bring the targeted finger all the way down.
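To make the mechanism above concrete, here is a minimal Arduino-style sketch of the string-and-servo restraint. It is an illustrative sketch only: the ESP32Servo library, the GPIO pin, and the horn angles are assumptions rather than the prototype's actual values.

```cpp
// Minimal sketch of the finger restraint: rotating the servo horn winds a string
// tied to a ring on the user's finger. Pin and angles below are placeholders.
#include <ESP32Servo.h>

const int SERVO_PIN = 13;        // hypothetical GPIO driving the finger servo
const int RELEASED_ANGLE = 0;    // horn position with the string slack
const int RESTRAINED_ANGLE = 90; // horn position with the string wound tight

Servo fingerServo;

void restrainFinger() {
  // Winding the string pulls the ring back, limiting how far the finger can press.
  fingerServo.write(RESTRAINED_ANGLE);
}

void releaseFinger() {
  // Unwinding slackens the string so the finger can move freely again.
  fingerServo.write(RELEASED_ANGLE);
}

void setup() {
  fingerServo.attach(SERVO_PIN);
  releaseFinger();
}

void loop() {
  // Demo behaviour: alternately restrain and release the finger every two seconds.
  restrainFinger();
  delay(2000);
  releaseFinger();
  delay(2000);
}
```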

Implementation

STAGE 1

Low-fi prototype to visualise the idea

STAGE 2

Testing different individual parts of the idea

VR Environment Setup:

Established a virtual reality (VR) environment using Meta Quest 2, incorporating precise hand tracking for interactive experiences with virtual objects.

Integrated a pre-made Piano prefab into the VR space, customized it for individual pressable keys, and synchronized it with the headset's hand tracking capabilities.


Wearable Device Prototype - EMS Placement:

Explored and tested optimal locations for Electric Muscle Stimulation (EMS) pads on the hand and arm to target specific fingers and nerves.

Experimented with using EMS to induce arm movements, enhancing sensory feedback for users regarding hand positioning during piano play.


Sound Integration for Virtual Piano:

Edited and added sound to the Piano prefab, ensuring that each key produces a distinct sound upon interaction with virtual hands/fingers.

Enabled seamless integration between the virtual piano and the Meta Quest 2's built-in hand tracking system.


Communication Integration in Progress:

Reached a stage where both the VR headset and the Wearable Prototype functioned independently, yet efforts were ongoing to establish communication between the two systems.

Working towards seamless integration to enable real-time coordination between the virtual piano simulation and the EMS feedback provided by the wearable device.

STAGE 3

Final Prototyping

Control Mechanism:

Integrated the wearable device with the VR headset over Wi-Fi using an ESP32, allowing real-time communication for finger control. Created a peer-to-peer network, enabling the VR headset to send instructions to the ESP32, which controlled both the servo motors and the EMS.
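The write-up doesn't detail how the peer-to-peer network was formed. One common approach, sketched below under that assumption, is for the ESP32 to host its own Wi-Fi access point that the headset joins directly; the SSID and password here are illustrative placeholders rather than the project's actual configuration.

```cpp
// Sketch: ESP32 hosting its own access point so the headset can connect directly.
// SSID/password are placeholders; the real network details are assumptions.
#include <WiFi.h>

void setup() {
  Serial.begin(115200);

  // Start a software access point; the VR headset joins this network,
  // so no external router is required.
  WiFi.softAP("Pianoverse-Glove", "piano1234");

  // Print the ESP32's IP (typically 192.168.4.1) so the Unity client
  // knows where to open its WebSocket connection.
  Serial.print("AP IP address: ");
  Serial.println(WiFi.softAPIP());
}

void loop() {
  // Servo/EMS control and WebSocket handling would run here (see later sketches).
}
```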


VR Practice Mode:

Implemented a "Practice" mode in the VR scene, displaying a falling key visualization for users to follow the note sequence. Visual feedback included turning keys green for correct presses and red for incorrect ones.


Communication Protocol:

Established communication between the VR headset and the ESP32 using a WebSocket server. The Native WebSockets library for Unity facilitated message exchange, sending note sequences from the VR scene to the wearable device.
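The server arrangement isn't spelled out beyond "a WebSocket server". One plausible setup, assumed here for illustration, is to run the server on the ESP32 using the widely used arduinoWebSockets (Links2004) library, with the Unity scene connecting as a WebSocket client; the port number and the plain-text message format (note names such as "C4", plus a stop command) are assumptions.

```cpp
// Sketch: WebSocket server on the ESP32 receiving note names from the VR scene.
// Uses the arduinoWebSockets (Links2004) library; port and message format are assumed.
#include <WiFi.h>
#include <WebSocketsServer.h>

WebSocketsServer webSocket(81);   // assumed port
String latestNote = "";           // most recent note name sent by Unity

void onWebSocketEvent(uint8_t client, WStype_t type, uint8_t *payload, size_t length) {
  if (type == WStype_TEXT) {
    // Copy whatever the VR scene sent, e.g. "C4" or "STOP".
    latestNote = "";
    for (size_t i = 0; i < length; i++) {
      latestNote += (char)payload[i];
    }
    Serial.print("Received: ");
    Serial.println(latestNote);
  }
}

void setup() {
  Serial.begin(115200);
  WiFi.softAP("Pianoverse-Glove", "piano1234");  // same assumed network as above

  webSocket.begin();
  webSocket.onEvent(onWebSocketEvent);
}

void loop() {
  webSocket.loop();  // service incoming WebSocket frames
  // latestNote then drives the servo/EMS logic described below.
}
```

On the Unity side, under this assumed setup, the scene would open a connection to the ESP32's IP and send each expected note name as a text message.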


Sequenced Learning:

Utilized Unity to assign specific piano keys for the note sequence. The VR headset sent the expected note names to the ESP32, guiding the user through the sequence. After completing the sequence, a stop command was sent, and the scene reset.


Arduino Code Logic:

On the Arduino side, the code stored the most recent note value and triggered the play function. The play function activated EMS at intensity level 1 and adjusted the servos according to the mapped finger sequence. A stop command reset the servos and set EMS back to level 0, concluding the learning sequence.
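A condensed sketch of that play/stop logic is shown below. The note-to-finger mapping, the servo pins, and the idea of switching the EMS channel through a single GPIO (e.g. via a relay or transistor) are illustrative assumptions; the prototype's actual wiring and intensity control may differ.

```cpp
// Condensed sketch of the Stage 3 play/stop logic described above.
// Pins, angles, the note-to-finger map, and the EMS switching method are
// illustrative assumptions, not the prototype's exact wiring.
#include <ESP32Servo.h>

const int NUM_FINGERS = 5;
const int SERVO_PINS[NUM_FINGERS] = {13, 12, 14, 27, 26}; // hypothetical GPIOs
const int EMS_PIN = 25;                                   // assumed EMS enable line
const int RELEASED = 0, RESTRAINED = 90;                  // placeholder horn angles

Servo servos[NUM_FINGERS];
String latestNote = "";   // filled in by the WebSocket callback (previous sketch)

// Assumed mapping from note names to fingers (thumb = 0 ... pinky = 4).
int fingerForNote(const String &note) {
  const String notes[NUM_FINGERS] = {"C4", "D4", "E4", "F4", "G4"};
  for (int i = 0; i < NUM_FINGERS; i++) {
    if (note == notes[i]) return i;
  }
  return -1;  // unknown note
}

void play(const String &note) {
  int finger = fingerForNote(note);
  if (finger < 0) return;

  digitalWrite(EMS_PIN, HIGH);        // EMS "level 1", modelled here as one enable line
  servos[finger].write(RESTRAINED);   // wind the string on the mapped finger
}

void stopAll() {
  digitalWrite(EMS_PIN, LOW);         // EMS back to level 0
  for (int i = 0; i < NUM_FINGERS; i++) {
    servos[i].write(RELEASED);        // reset all servos
  }
}

void setup() {
  pinMode(EMS_PIN, OUTPUT);
  for (int i = 0; i < NUM_FINGERS; i++) {
    servos[i].attach(SERVO_PINS[i]);
  }
  stopAll();
}

void loop() {
  // Act on the most recent note from the VR scene; "STOP" ends the sequence.
  if (latestNote.length() > 0) {
    if (latestNote == "STOP") stopAll();
    else play(latestNote);
    latestNote = "";
  }
}
```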


This project successfully merged VR technology with a wearable device, providing an immersive and interactive platform to teach piano muscle memory through a combination of servo motor control and EMS feedback.

Fin.

Context

At the conception of the project, we were unsure of where to take our ideas. As a group we were set on combining VR or AR technology with haptics to create an immersive experience, but hadn't yet decided how to apply those technologies in a meaningful way. We wanted to experiment with VR and haptics in the context of music and instruments, and came up with a few ideas.

Team

Team of 5 students

Timeline

Aug '23 - Dec '23

Tools

Unity, Visual Studio Code
