
Face tracking is an important part of AR with many practical use cases, including face effects, filters, and "try-ons" that let the user simulate makeup, eyeglasses, or different hairstyles. MARS applications have face-tracking ability when you deploy your app to an Android or iOS device.

In the Unity Editor, MARS applications can use face tracking in Recorded mode or Live mode simulations. The MARS package includes default Session Recordings, which are videos with pre-recorded face-tracking data; you can play these in Recorded mode to test against recorded face data. MARS also supports face tracking against a plain video in simulation, where the video feed comes either from a camera in Live mode or from a custom video in Recorded mode. To use this kind of face tracking, you need to download and install ULSee, a third-party plug-in. You can still create facemasks in the Editor without ULSee.
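Because the same Scene runs both on a deployed device and in the Editor's Live or Recorded simulations, it can help to log which context you are in while testing. The snippet below is a small, hedged sketch that uses only standard Unity APIs (Application.isEditor and Application.platform); it is not part of the MARS API.

```csharp
// Minimal sketch: report whether face tracking will come from a deployed device
// or from the Editor's Live/Recorded simulation. Uses only core Unity APIs.
using UnityEngine;

public class FaceTrackingContextLogger : MonoBehaviour
{
    void Start()
    {
        if (Application.isEditor)
        {
            Debug.Log("Running in the Editor: face tracking comes from Live (webcam) or Recorded (video) simulation.");
        }
        else if (Application.platform == RuntimePlatform.Android ||
                 Application.platform == RuntimePlatform.IPhonePlayer)
        {
            Debug.Log("Running on an Android or iOS device: face tracking uses the device's camera.");
        }
        else
        {
            Debug.Log("Face tracking may not be available on this platform.");
        }
    }
}
```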
Using a facemask template

From Unity's main menu, go to Window > MARS > Choose Template and select the Facemask option. This opens a template Scene with all the elements you need to incorporate face tracking in your app, which you then need to save separately (menu: File > Save As).

To test face-tracking capabilities while designing your AR app, make sure the Simulation view is open and set its mode to either Live (to get a stream from the first webcam MARS detects) or Recorded (if you have a pre-recorded video to work with). Face tracking works with both of these options.

To load your own video clip for testing inside MARS, you must create a Session Recording asset that references the video clip. In the Project window, right-click the video clip, then select Create > MARS > Session Recording from Video Clip. After you've done this, you must refresh Recorded mode environments: from Unity's main menu, go to Window > MARS > Developer > Refresh Session Recordings. You should now see your video in the Environment drop-down list of the Device view.
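If you refresh recordings often, the menu step above can also be triggered from an editor script. The sketch below is a hedged example: it assumes the menu path named above ("Window/MARS/Developer/Refresh Session Recordings") is a standard Unity menu item that responds to EditorApplication.ExecuteMenuItem; if it is not, the call simply returns false and logs a hint.

```csharp
// Editor-only utility: triggers the "Refresh Session Recordings" menu item from code.
// Assumption: the menu path below (taken from the steps above) is registered as a
// standard Unity menu item; ExecuteMenuItem returns false if it cannot be found.
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class MarsRecordingRefresher
{
    [MenuItem("Tools/Refresh MARS Session Recordings")]
    public static void Refresh()
    {
        const string menuPath = "Window/MARS/Developer/Refresh Session Recordings";
        bool executed = EditorApplication.ExecuteMenuItem(menuPath);
        Debug.Log(executed
            ? "Refreshed MARS session recordings."
            : $"Could not execute menu item '{menuPath}'. Check the exact path in the Window menu.");
    }
}
#endif
```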
The Facemask template places a head model in the middle of your Scene. To create facemasks, decorate this model as if it were a mannequin. When you drag a Prefab from your Project onto the face, key "landmarks" (such as the eyebrows or nose) light up as the cursor hovers over them. Release the mouse button to anchor the GameObject to that particular feature; MARS places it as a child GameObject of the face landmark in the Transform hierarchy. To reposition a Prefab or move it to a different landmark, select the GameObject, hold down the Shift key, and drag it to its new position. To control how dragged GameObjects snap and align to the face, use the Placement Options section of the main Unity toolbar. The options available are Snap to Pivot and Orient to Surface.
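At the Transform level, anchoring a decoration to a landmark amounts to parenting it under the landmark's Transform and then choosing how it snaps and aligns. The sketch below is conceptual and not the MARS API: the `landmark` field, the class name, and the two booleans are illustrative stand-ins for the tracked-feature Transform and the Snap to Pivot / Orient to Surface placement options described above.

```csharp
// Conceptual sketch (not the MARS API): what anchoring a decoration to a face
// landmark amounts to at the Transform level. `landmark` stands in for whatever
// Transform is driven for the tracked feature (e.g. the nose or an eyebrow).
using UnityEngine;

public class LandmarkDecoration : MonoBehaviour
{
    public Transform landmark;          // hypothetical reference to the tracked landmark
    public bool snapToPivot = true;     // mimics the "Snap to Pivot" placement option
    public bool orientToSurface = true; // mimics the "Orient to Surface" placement option

    void Start()
    {
        if (landmark == null) return;

        // Parenting keeps the decoration following the landmark, mirroring how the
        // dragged GameObject becomes a child of the landmark in the hierarchy.
        transform.SetParent(landmark, worldPositionStays: !snapToPivot);

        if (snapToPivot)
            transform.localPosition = Vector3.zero;   // sit exactly on the landmark's pivot

        if (orientToSurface)
            // Align the decoration's up axis with the landmark's up axis,
            // treating that axis as the surface normal at the feature.
            transform.rotation = Quaternion.FromToRotation(transform.up, landmark.up) * transform.rotation;
    }
}
```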
