
Starlab

About:

For the majority of my graduate degree, I contributed to a research contract called Distributed Autonomous Robotic Experiments and Simulations (DARES). This research was conducted at Starlab, a motion capture stage on campus within Texas A&M University's Department of Mechanical Engineering. In my graduate research, I developed projects that combined AR/VR with live motion capture data from full-body performers, robots, and props in Unreal Engine and Unity. In addition, I used point cloud data from a LiDAR scanner, along with photogrammetry, to create clean, accurately retopologized 3D environments and detailed textures for use in simulations. Outside of the DARES project, I also collaborated on several virtual production projects in Unreal Engine during this position, including set dressing, recording and editing full-body motion, lighting, and rendering. For both DARES and the virtual production projects, I operated the motion capture stage: calibrating cameras, placing markers on subjects and props, and working with Vicon cameras and Shogun Live/Post or with OptiTrack cameras and Motive.


My research was conducted under the supervision of Professor Michael Walsh. Learn more about Starlab here: Link


Projects:

Responsible for real-time tracking of the virtual reality headset, robot, and other props in the motion capture volume using a Vicon system and Unity. Also created all models and textures for VR, excluding the tree models.

Responsible for real-time tracking of the virtual reality headset, hands, and chair in the motion capture volume using an OptiTrack system and Unreal Engine. Set up spatial interactions so that a hand overlapping a hit box in the play space would trigger an event in the scene.
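For context, the overlap-driven interaction described above is commonly wired up in Unreal Engine with a trigger volume whose begin-overlap event fires the scene reaction. The sketch below is illustrative rather than the project's actual code: the AInteractionTrigger class, the "Hand" actor tag, and the OnSceneEventTriggered hook are all hypothetical names.

// InteractionTrigger.h -- a minimal, illustrative sketch; not the project's code.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "InteractionTrigger.generated.h"

UCLASS()
class AInteractionTrigger : public AActor
{
    GENERATED_BODY()

public:
    AInteractionTrigger()
    {
        // Box volume acting as the hit box in the play space.
        TriggerBox = CreateDefaultSubobject<UBoxComponent>(TEXT("TriggerBox"));
        RootComponent = TriggerBox;
        TriggerBox->SetCollisionProfileName(TEXT("OverlapAllDynamic"));
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // React when any component begins overlapping the box.
        TriggerBox->OnComponentBeginOverlap.AddDynamic(
            this, &AInteractionTrigger::OnHandOverlap);
    }

    UFUNCTION()
    void OnHandOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                       UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                       bool bFromSweep, const FHitResult& SweepResult)
    {
        // Only respond to the tracked hand, identified here by a "Hand" tag
        // (the tag name is an assumption for this sketch).
        if (OtherActor && OtherActor->ActorHasTag(FName(TEXT("Hand"))))
        {
            OnSceneEventTriggered();
        }
    }

    // Hypothetical hook so the scene reaction can be bound in Blueprint.
    UFUNCTION(BlueprintImplementableEvent)
    void OnSceneEventTriggered();

private:
    UPROPERTY(VisibleAnywhere)
    UBoxComponent* TriggerBox;
};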

Responsible for rigging the robot model and for capturing, retargeting, and editing the motion capture data.

Responsible for capturing, retargeting, and editing motion capture data, as well as creating and texturing the 1:1 model of the lab used in Unreal Engine and creating the custom MetaHuman mannequin materials.

Contributed to developing and implementing the workflow for a virtual camera and real-time, full-body motion capture using a Vicon system, MotionBuilder, and Unreal Engine.
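As an aside, a real-time pipeline like the one above typically begins with polling the Vicon DataStream SDK for each new frame of segment data before the stream is retargeted in MotionBuilder or consumed in Unreal Engine. The sketch below shows only that first step; it is a minimal, illustrative example, and the host address and console output are assumptions rather than details from the project.

// A minimal sketch of polling a Vicon DataStream server for live segment
// positions. Error handling is trimmed; the host address is hypothetical.
#include <iostream>
#include <string>
#include "DataStreamClient.h"

using namespace ViconDataStreamSDK::CPP;

int main()
{
    Client client;
    // Connect to the machine running the Vicon tracking software
    // (address and port are assumptions for this sketch).
    client.Connect("localhost:801");
    client.EnableSegmentData();

    while (true)
    {
        // Block until a new frame of tracking data arrives.
        if (client.GetFrame().Result != Result::Success)
            continue;

        unsigned int subjectCount = client.GetSubjectCount().SubjectCount;
        for (unsigned int s = 0; s < subjectCount; ++s)
        {
            std::string subject = client.GetSubjectName(s).SubjectName;
            std::string root = client.GetSegmentName(subject, 0).SegmentName;

            // Global translation of the root segment, in millimeters.
            auto t = client.GetSegmentGlobalTranslation(subject, root);
            if (!t.Occluded)
                std::cout << subject << ": " << t.Translation[0] << ", "
                          << t.Translation[1] << ", " << t.Translation[2] << "\n";
        }
    }
}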
