I modeled an astronaut avatar in Blender to symbolize exploration and limitless possibilities. I built a complete skeleton for it, articulating every bone so the rig moves fluidly and each bone smoothly deforms its region of the mesh.
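A minimal sketch of that rigging step, using Blender's bpy API (the bone name and coordinates are illustrative, not the actual rig):

```python
import bpy  # Blender's Python API

# Create an armature and add one example bone. In the real rig every limb
# gets its own bone, and the character mesh is parented to the armature
# with automatic weights so bone motion deforms the surrounding mesh.
bpy.ops.object.armature_add(location=(0.0, 0.0, 0.0))
armature = bpy.context.object
bpy.ops.object.mode_set(mode='EDIT')

bone = armature.data.edit_bones.new("upper_arm.L")  # hypothetical bone name
bone.head = (0.2, 0.0, 1.4)
bone.tail = (0.5, 0.0, 1.4)

bpy.ops.object.mode_set(mode='OBJECT')
# With the mesh and armature both selected (armature active), parenting with
# automatic weights would be: bpy.ops.object.parent_set(type='ARMATURE_AUTO')
```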
After rigging the avatar with an armature in Blender, I imported it into Unity and configured its rig. The avatar became the foundation for the subsequent program development, in which wearable sensor data is mapped to the avatar's motion via distinct bone identifiers.
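That mapping can be pictured as a lookup table from sensor IDs to bone names. The identifiers and the `set_bone_rotation` helper below are hypothetical stand-ins for the Unity-side transform update:

```python
# Hypothetical sensor-to-bone table; the real identifiers in the Unity rig
# may differ. Each wearable IMU drives one bone of the avatar.
SENSOR_TO_BONE = {
    "imu_left_wrist":  "hand.L",
    "imu_right_wrist": "hand.R",
    "imu_left_ankle":  "foot.L",
    "imu_right_ankle": "foot.R",
}

def apply_sensor_frame(avatar, frame):
    """Apply one frame of sensor orientations to the mapped bones.

    `avatar.set_bone_rotation` is an assumed helper standing in for the
    engine-side bone transform update.
    """
    for sensor_id, rotation in frame.items():
        bone = SENSOR_TO_BONE.get(sensor_id)
        if bone is not None:
            avatar.set_bone_rotation(bone, rotation)
```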
I designed a spaceship-inspired stage and added multiple spotlights to give the scene a dynamic ambiance. Through scripting, I gradually modulated the spotlights' intensities over time, creating a play of light that seems to flicker to life and gently dim away.
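The actual scene script is a Unity behaviour; this Python sketch only illustrates the shape of the modulation curve, with parameter values as assumptions:

```python
import math

def spotlight_intensity(t, base=1.0, amplitude=0.8, period=6.0, phase=0.0):
    """Oscillate smoothly between dim and bright over each period,
    so the light appears to flicker to life and fade away."""
    return base + amplitude * math.sin(2.0 * math.pi * t / period + phase)

# Giving each spotlight a different phase offset keeps the stage lights
# from pulsing in lockstep, which reads as more dynamic in the scene.
```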
The game's interaction design follows a clear path: on entry, a menu modal appears in the initial scene. Selecting the "start game" button with the joystick takes users into the second scene, where they can synchronize their movements with the avatar. Through scripting, I managed the behavior of the VR menu buttons, handling both their press and release actions.
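A hedged sketch of that press/release handling, with the threshold value and callbacks as illustrative assumptions:

```python
class VRButton:
    """Fire callbacks when the controller trigger crosses a press threshold."""

    def __init__(self, on_press, on_release, threshold=0.6):
        self.on_press, self.on_release = on_press, on_release
        self.threshold = threshold
        self.pressed = False

    def update(self, trigger_value):
        # Fire only on the state transition, not continuously while held.
        if not self.pressed and trigger_value >= self.threshold:
            self.pressed = True
            self.on_press()
        elif self.pressed and trigger_value < self.threshold:
            self.pressed = False
            self.on_release()
```

Wiring the "start game" button to a scene change is then one line, e.g. `VRButton(on_press=load_dance_scene, on_release=lambda: None)`, where `load_dance_scene` is a hypothetical scene-loading call.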
A combination of hardware and software components supports movement prediction and user feedback. The architecture tracks and predicts movement in real time, enhances the experience with haptic feedback and visual information through the VR headset, and stores data in Firebase for analysis and access by other applications.
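The storage side might look like the following firebase-admin sketch; the database paths, key file, and sample schema are assumptions, not the project's actual layout:

```python
import firebase_admin
from firebase_admin import credentials, db

# Hypothetical service-account key and project URL.
cred = credentials.Certificate("serviceAccount.json")
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://<your-project>.firebaseio.com"
})

def store_sample(session_id, sample):
    # Append one motion sample under its session for later analysis.
    db.reference(f"sessions/{session_id}/samples").push(sample)
```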
After collaborating on movement-data collection, ML model training, and sensor calibration, we integrated an algorithm that translates users' motions into corresponding avatar movements. ESP32 boards with embedded MPU9250 sensors in the wearables collect accelerometer and gyroscope data, which travels over WiFi through Firebase to the Oculus Quest, keeping the avatar's actions aligned with the user's physical gestures.
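On the wearable side, reading the IMU can be sketched in MicroPython on the ESP32. The pin numbers are illustrative; the I2C address, register offsets, and the ±2g scale factor come from the MPU9250 datasheet:

```python
# MicroPython sketch for the ESP32 (pins are illustrative). The MPU9250
# answers at I2C address 0x68; PWR_MGMT_1 is register 0x6B and the
# accelerometer readings start at register 0x3B.
from machine import I2C, Pin
import struct

i2c = I2C(0, scl=Pin(22), sda=Pin(21))
MPU_ADDR = 0x68

i2c.writeto_mem(MPU_ADDR, 0x6B, b'\x00')  # wake the sensor from sleep

def read_accel():
    """Return one accelerometer sample in g (at the ±2g full-scale range)."""
    raw = i2c.readfrom_mem(MPU_ADDR, 0x3B, 6)  # X, Y, Z: 2 bytes each
    ax, ay, az = struct.unpack('>hhh', raw)    # big-endian signed 16-bit
    return ax / 16384.0, ay / 16384.0, az / 16384.0
```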
Within a demanding 10-week timeframe, managing team expectations emerged as a linchpin of SyncDance VR's success. Through clear communication and pragmatic scope assessment, we adapted to challenges, navigated technological complexities, and reallocated resources while adhering to a refined scope.
Transitioning from designer to VR developer, I embraced an expanded role. Despite a steep learning curve, I picked up new skills in Blender modeling, rigging, Unity scene design, and VR deployment. This adaptability kept our team cohesive and the project advancing. I pride myself on being a versatile team player who actively acquires new skills to meet team needs.
The project's evolution continues with plans to cultivate a thriving dance community. With remote participation enabled, users will be able to connect from anywhere to share their VR dance experiences. This expansion promises to foster a global community united by a passion for dance and technology.
Looking ahead, the project aims to integrate gaming elements tailored for Gen-Z users. By infusing interactive challenges, leaderboards, and collaborative dance quests, the experience will engage the gaming-savvy generation in a novel fusion of dance and competition within the virtual realm.
Improving the ML algorithm's accuracy involves widening the scope of data collection. Training on data from a more diverse participant pool will let us fine-tune the model and produce notably more accurate motion predictions.