
Facial AR Remote (Preview)

About

Facial AR Remote is a tool that lets you capture blendshape animation directly from your iPhone X into Unity, using a companion app running on the phone. This repository is a fork of unity-technologies/facial-ar-remote.

Experimental Status

This repository is tested against the latest stable version of Unity and requires you to build your own iOS app to use as a remote. It is presented on an experimental basis; there is no formal support.

Download

Get the latest release from the Releases tab.

How To Use/Quick Start Guide

The project is built with Unity 2018+, the TextMesh Pro package (via Package Manager), and the ARKit plugin. Note The ARKit plugin is only required for the iOS build of the remote, so you may find it convenient to build the remote from a separate project. For best results, use the Bitbucket tip of the ARKit plugin.

This repository uses Git LFS, so make sure you have LFS installed to get all of the files. Note that the large files are therefore not included in GitHub's "Download ZIP" option, and the example head model, among other assets, will be missing from such a download.

iOS Build Setup

  1. Set up a new project, either from the ARKit plugin project on Bitbucket or from a new project with the ARKit plugin from the Asset Store.

  2. (Unity 2018.1) Add TextMesh-Pro to the project from Window > Package Manager. The package is added automatically in Unity 2018.2 and above.

  3. Add this repository to the project and set the build target to iOS.

  4. Set up the iOS build settings for the remote. In Other Settings > Camera Usage Description, enter "AR Face Tracking" or something to that effect. Note You may need to set the Target Minimum iOS Version to 11.3 or higher, and you may also need to enable Requires ARKit Support. Note The project defaults to ARKit 2.0. To use ARKit 1.5, add ARKIT_1_5 to Other Settings > Scripting Define Symbols; this is only required if you have not updated your remote app to support ARKit 2.0. Note You may need to update your version of the ARKit plugin and update to Xcode 10 or greater for ARKit 2.0.

  5. Open Client.scene and, on the Client game object, set the correct Stream Settings on the Client component for your version of ARKit.

  6. When prompted, import TMP Essential Resources for TextMesh Pro.

  7. Enable "ARKit Uses Facetracking" in UnityARKitPlugin > Resources > UnityARKitPlugIn > ARKitSettings.

  8. Set Client.scene as your build scene and build the Xcode project.
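The ARKIT_1_5 scripting define from step 4 is a standard Unity conditional-compilation symbol, so scripts can branch on it at compile time. A minimal sketch of how such a define is typically consumed (this class is illustrative only, not part of the plugin's actual API):

```csharp
using UnityEngine;

public class ARKitVersionInfo : MonoBehaviour
{
    void Start()
    {
#if ARKIT_1_5
        // Compiled only when ARKIT_1_5 is present in
        // Player Settings > Other Settings > Scripting Define Symbols.
        Debug.Log("Using the ARKit 1.5 code path.");
#else
        // Default path when the symbol is absent: ARKit 2.0.
        Debug.Log("Using the ARKit 2.0 code path.");
#endif
    }
}
```

Because the symbol is evaluated at compile time, switching between ARKit versions requires a rebuild of the remote app, not just a settings change at runtime.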

Editor Animation Setup

Install and Connection Testing

  1. Add TextMesh-Pro to your main project or new project from Window > Package Manager.

  2. Add this repository to the project. Note You should not need the ARKit plugin to capture animation.

  3. To test your connection to the remote, start by opening ../Examples/Scenes/SlothBlendShapes.scene.

  4. Be sure your device and editor are on the same network. Launch the app on your device and press play in the editor.

  5. Set the Port number on the device to the same Port listed on the Stream Reader component of the Stream Reader game object.

  6. Set the IP on the device to one of the addresses listed in the console debug log.

  7. Press Connect on the device. If your face is in view, you should now see your expressions driving the character on screen. Note You need to be on the same network, and you may have to disable any active VPNs and/or disable firewalls on the ports you are using; this may be necessary on your computer and/or on the network. Note Our internal setup used a dedicated wireless router attached to the editor computer, or a Lightning-to-Ethernet adapter.
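If the device will not connect, it can help to confirm basic TCP reachability between the phone and the editor machine before blaming the app. A sketch of such a check (the IP, port, and class name here are placeholders, not values from this project; use the IP from the editor console and the port shown on the Stream Reader component):

```csharp
using System.Net.Sockets;
using UnityEngine;

public class ConnectionTest : MonoBehaviour
{
    // Placeholder values; substitute your editor machine's IP and
    // the port listed on the Stream Reader component.
    public string editorIp = "192.168.1.2";
    public int port = 9000;

    void Start()
    {
        using (var client = new TcpClient())
        {
            try
            {
                // Plain TCP connect: succeeds only if the editor machine
                // is reachable and something is listening on the port.
                client.Connect(editorIp, port);
                Debug.Log("Editor machine is reachable on " + editorIp + ":" + port);
            }
            catch (SocketException e)
            {
                Debug.LogWarning("Could not reach the editor (check VPN/firewall): " + e.Message);
            }
        }
    }
}
```

A failed connect here points at the network (VPN, firewall, or separate subnets) rather than at the remote app itself.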

Known Issues

  1. Character Rig Controller does not support Humanoid Avatar for bone animation.

  2. Animation Baking does not support Humanoid Avatar for avatar bone animation.

  3. Stream source can only connect to a single stream reader.

  4. Some network setups cause an issue with DNS lookup for getting IP addresses of the server computer.
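Issue 4 arises because resolving the server computer's address through DNS can fail or return the wrong interface on some networks. Enumerating the machine's network interfaces directly avoids the DNS lookup entirely; a sketch of that approach (this utility is an illustration, not the tool's actual code):

```csharp
using System.Net.NetworkInformation;
using System.Net.Sockets;
using UnityEngine;

public static class LocalAddressUtil
{
    // Logs every IPv4 address on an active interface,
    // without performing any DNS lookup.
    public static void LogLocalAddresses()
    {
        foreach (var nic in NetworkInterface.GetAllNetworkInterfaces())
        {
            if (nic.OperationalStatus != OperationalStatus.Up)
                continue;

            foreach (var addr in nic.GetIPProperties().UnicastAddresses)
            {
                if (addr.Address.AddressFamily == AddressFamily.InterNetwork)
                    Debug.Log(nic.Name + ": " + addr.Address);
            }
        }
    }
}
```

On a machine with several active interfaces (Wi-Fi plus Ethernet, for example), this lists all candidate addresses, and the one on the same subnet as the phone is the one to enter on the device.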

Note: History edits were made on 10/29/2018. If you cloned this repository before that date, please rebase before submitting a Pull Request.

Contributors

foobraco, jonathan-unity, mtschoen, mtschoen-unity
