Currently, our final project (Capturing Movement in American Football) is underway and making good progress. We had our third motion capture session last Saturday, October 4th. For this session, I invited a friend who is an actor, Nick Yañez-Atkins, to don a suit and play the role of a coach in my football concept. He had been wanting to try Mocap out, and this proved to be a great opportunity!
This time, we tried adding the Manus gloves to Nick for hand capture, along with Unreal's Live Link for facial capture and for streaming live data into Unreal Engine. After calibrating the space, we successfully got the Manus gloves working in Motive.
We also got the Live Link facial capture working, with my phone mounted in the Rokoko headrig to capture Nick's facial expressions as the coach.
However, the body data from Motive did not sync up well in Unreal: we could not automatically retarget the Motive skeleton to the Unreal MetaHumans. After an hour or so of troubleshooting and trying to pair the MetaHuman with the Motive skeleton, we decided to shelve the live Unreal streaming so we didn't waste Nick's time.
So, we ditched the Live Link facial capture and focused on the body and hand capture in Motive. This turned out great, thanks in large part to Nick's expressive physical acting as a coach, using gestures and his whole body. Galt also did well falling on the crash pad (provided by the Media Commons) and acting out injuries. I made sure to talk through the choreography beforehand and asked if he was comfortable doing it; he said he loved it, and we got a lot of really good takes out of the session.
We have one more Mocap session on Friday, 10/10. At this point, we have most of the movement assets for the video collage, so after tomorrow I need to shift to editing the piece together. I will write the narrative piece today and edit it alongside helping Michael and everyone stitch together the movement data from Motive to MotionBuilder to Unreal Engine. We will include TouchDesigner for live audience feedback, but we are unsure whether the final output will be easier in Unreal or TouchDesigner; we will have to run trial and error this weekend as we race to get something put together before our last class on Wednesday.