StudioV is working on a mixed reality production platform. Taking advantage of the real-time rendering capability of the Unity game engine, we produce CGI content that synchronizes the environment, skeleton-tracked character animation, and facial expression animation in real time. During production, actors are also immersed in the virtual environment, which helps them get into the mood of the scene.
The end product can be distributed to multiple platforms, including VR, AR, and 360 video.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
Things you need to install the software, and how to install them:
- Unity 2017.2
- Motion capture system: e.g. OptiTrack motion capture system + Motive 2.0 software
- VR system: e.g. HTC Vive + SteamVR
A step-by-step series of examples that tell you how to get a development environment running.
Note: these instructions only cover the components we are using. If you want to use other hardware or software, you can try to adapt them on your own or contact the studio.
- Prepare the motion capture system: a motion capture system is required to track skeleton and rigid-body movement and stream the tracking data to Unity. Follow the instructions at the following link to set it up: https://wiki.optitrack.com/index.php?title=OptiTrack_Documentation_Wiki
- Prepare the VR system: the VR system lets actors see the scene, the other actors, and themselves immersively. You need a VR-compatible computer to run the VR application. Follow this link to set it up: https://support.steampowered.com/kb_article.php?ref=2001-UXCM-4439
- Prepare face tracking
  - Eye tracking (hardware required): we use Tobii VR for eye tracking, with the Tobii Pro SDK for Unity for development. https://www.tobii.com/tech/products/vr/ https://www.tobiipro.com/product-listing/tobii-pro-sdk/
  - Facial expression tracking (hardware required): we use BinaryVR for facial expression tracking. http://www.binaryvr.com/
- Download this project and open it in Unity 2017.2
- Import the Unity assets/plugins you need; see Appendix A for the list of assets we use
- Import Photon Unity Networking (PUN), version 1.87: https://www.assetstore.unity3d.com/en/#!/content/1786
- To use PUN you need an App ID; follow these instructions to obtain one for your application: https://doc.photonengine.com/en-us/realtime/current/getting-started/obtain-your-app-id
- Initial setup of PUN: https://doc.photonengine.com/en-us/pun/current/getting-started/initial-setup
- Go to PhotonServerSettings and make sure the Hosting protocol is set to TCP.
- In PhotonServerSettings, find the RPC list, press "Clear RPCs", and then "Refresh RPCs".
- Import MicroLibrary: download the source code from https://www.codeproject.com/Articles/98346/Microsecond-and-Millisecond-NET-Timer and copy the file MicroLibrary.cs to Assets/Scripts/Tool/ in the project
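The MicroLibrary copy step above can be sketched in the shell, run from the project root. The download location below is an assumption; substitute wherever you extracted the CodeProject archive. The placeholder file is only there so the sketch runs end to end; skip that part if you have the real MicroLibrary.cs.

```shell
# Assumed location of the extracted MicroLibrary sources; adjust as needed.
SRC_DIR=${SRC_DIR:-$HOME/Downloads/MicroLibrary}

# Stand-in so the sketch is self-contained; remove if you have the real file.
mkdir -p "$SRC_DIR"
[ -f "$SRC_DIR/MicroLibrary.cs" ] || echo "// MicroLibrary placeholder" > "$SRC_DIR/MicroLibrary.cs"

# Copy the timer class into the project's tool scripts folder.
mkdir -p Assets/Scripts/Tool
cp "$SRC_DIR/MicroLibrary.cs" Assets/Scripts/Tool/
```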
- Import the SteamVR SDK: https://www.assetstore.unity3d.com/en/#!/content/32647
- Open the Launcher scene, go to the Assets/Prefabs folder, and drag the [CameraRig] object into the Launcher scene
- Find the TobiiPro_Host object in the scene and assign the EyetrackerCam field with the Camera(eye) object under [CameraRig]/Camera(head)
Note: after clicking Clear in the Console, there shouldn't be any compile errors once these 3 assets are imported.
- Import the BinaryVR SDK (recommended): the SDK is available after purchase
- Go to binarysdk/examples/Unity/Assets/ExampleScene/Scripts/ and copy FaceExpressionController.cs into Assets/Scripts/AvatarSpecific/ in the project, replacing the existing file.
- Go to binarysdk/examples/Unity/Assets/BinaryFaceHMD/, copy this folder, and place it under the Assets folder in the project.
- Go to binarysdk/examples/Unity/Assets/StreamingAssets/ and copy the file model.bfh from that folder to Assets/StreamingAssets/ in the project.
- Refer to the example scene in binarysdk for how to set it up
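The three BinaryVR copy steps above can be sketched in the shell, run from the project root. BINARY_SDK is an assumed path to the unpacked SDK; the `touch`/`mkdir` stand-ins only make the sketch self-contained and should be skipped if you have the real SDK.

```shell
# Assumed location of the unpacked BinaryVR SDK; adjust to your install path.
BINARY_SDK=${BINARY_SDK:-$HOME/binarysdk}
EX="$BINARY_SDK/examples/Unity/Assets"

# Stand-ins so the sketch runs end to end; skip if you have the real SDK.
mkdir -p "$EX/ExampleScene/Scripts" "$EX/BinaryFaceHMD" "$EX/StreamingAssets"
touch "$EX/ExampleScene/Scripts/FaceExpressionController.cs" "$EX/StreamingAssets/model.bfh"

# 1. Replace the project's FaceExpressionController.cs with the SDK version.
mkdir -p Assets/Scripts/AvatarSpecific
cp "$EX/ExampleScene/Scripts/FaceExpressionController.cs" Assets/Scripts/AvatarSpecific/

# 2. Copy the whole BinaryFaceHMD folder under Assets.
cp -r "$EX/BinaryFaceHMD" Assets/

# 3. Copy the face model into StreamingAssets.
mkdir -p Assets/StreamingAssets
cp "$EX/StreamingAssets/model.bfh" Assets/StreamingAssets/
```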
This project is licensed under the MIT License - see the LICENSE file for details
Read the documentation file for details