VIDEOS 1 TO 50
How To - DIY Facial Motion Capture Rig for $15
Published: 2018/02/28
Channel: Red Belly Media
The Last Of Us 2 Facial Motion Capture Technology
Published: 2016/12/04
Channel: PlayStationReport
Industrial Light & Magic show off new facial capture - BBC Click
Published: 2017/09/05
Channel: BBC Click
After Effects Facial Motion Capture (face replacement)
Published: 2012/04/25
Channel: MotionStream
FACIAL MOTION CAPTURE TEST
Published: 2017/01/15
Channel: KOMOTION STUDIOS
HOW TO - Homemade Facial Motion Capture
Published: 2015/07/09
Channel: Front Effects
Live 2.5 – Realtime Animation Software Key Product Updates
Published: 2017/05/22
Channel: Faceware Tech, Inc.
Face2Face: Real-time Face Capture and Reenactment of RGB Videos (CVPR 2016 Oral)
Published: 2016/03/17
Channel: Matthias Niessner
Unreal Engine 4 - Realistic Face Motion Capture Test
Published: 2017/07/22
Channel: Braulio Fg
Faceshift: Markerless Motion Capture
Published: 2013/08/13
Channel: Autodesk
CGI & VFX Tech Showreel: "Facial Motion Capture Reel" - by Faceware Technologies
Published: 2017/08/28
Channel: TheCGBros
Facial capture with no-cost head pose during audio recording (playblast)
Published: 2015/04/23
Channel: Cubic Motion - computer vision that works.
How to do Facial Motion Capture | After Effects CC
Published: 2015/10/25
Channel: Creativid Studios
iClone 7 - Facial Mocap & Editing System
Published: 2017/07/28
Channel: Reallusion
Cheap Facial Motion Capture
Published: 2012/03/27
Channel: Krish Kachhwaha
FACIAL MOTIONCAPTURE MAKING
Published: 2014/07/04
Channel: Sagar Rajan Lonkar
"Old Man" with Actor - 3D Motion Capture Facial Animation
Published: 2008/11/16
Channel: OzzybanOswald
Real-Time High-Fidelity Facial Performance Capture
Published: 2015/08/05
Channel: DisneyResearchHub
iPhone X Facial Expression Capture test PART 2 - Data into Maya
Published: 2017/12/05
Channel: Kite & Lightning
CGI Facial Mocap Re-Targeting Demo : "Gollum Project" by Fabrice Visserot
Published: 2013/08/29
Channel: TheCGBros
iClone Faceware Facial Mocap Tutorial - Overview & Quick Setup: Out of the Box
Published: 2017/09/19
Channel: Reallusion
CGI Real-Time Facial MoCap Demo : "Markerless Real-Time Facial Mocap" by - Snapper Systems
Published: 2013/07/16
Channel: TheCGBros
Autodesk MAYA real time markerless facial mocap pipeline tutorial f-clone
Published: 2017/06/13
Channel: bluegarage f-clone
FaceAnimator: a markerless, monocular, real-time, 3D facial motion capture system
Published: 2017/09/20
Channel: Nicola Garau
iPhone X Facial Capture test PART 3 - FULL BODY
Published: 2018/01/10
Channel: Kite & Lightning
CGI Facial Mocap : PF Track | Maya
Published: 2015/03/04
Channel: Chetal Gazdar
Facial mocap test
Published: 2016/01/31
Channel: marc mordelet
Facial Motion Capture - part1
Published: 2013/05/25
Channel: Boris Samaras
iPhone X Face Motion Capture into Houdini (Projection mapping)
Published: 2017/11/13
Channel: Elisha Hung
Facial Motion Capture with Blender
Published: 2013/02/20
Channel: blazraidr
Facial mocap test with iphone
Published: 2016/07/27
Channel: marc mordelet
iClone Faceware Realtime Facial Mocap System - Demo Video
Published: 2017/09/21
Channel: Reallusion
Avatar: Motion Capture Mirrors Emotions
Published: 2009/12/24
Channel: Discovery
Windup character facial capture using Unity and IPhone X
Published: 2018/06/27
Channel: Yibing Jiang
Dynamixyz Facial Motion Capture
Published: 2018/01/11
Channel: Puppeteer Lounge
iClone Facial Mocap Plugin
Published: 2017/10/26
Channel: 3D Tutorials and News
Cinema Face Cap - Facial Motion Capture for Unity
Published: 2016/06/22
Channel: Cinema Suite
FaceShift Markerless Motion Capture Facial Animation Software Beta for Kinect
Published: 2012/10/28
Channel: Truebones Motions
Using FaceShift Facial Motion Capture Animation Data in Autodesk MotionBuilder 2013
Published: 2012/06/07
Channel: Truebones Motions
f-clone webcam and kinect markerless facial mocap (motion capture) software
Published: 2017/06/30
Channel: bluegarage f-clone
f-clone webcam and kinect markerless facial mocap (motion capture) software
Published: 2017/05/24
Channel: bluegarage f-clone
Motion capture for games - iClone Live Mocap with Perception Neuron
Published: 2018/04/15
Channel: Sparckman
Dynamixyz' real-time facial motion capture in Unity
Published: 2014/05/05
Channel: DynamixyzTeam
iClone 7 Facial Motion Capture Webinar _OCT 17, 2017
Published: 2017/10/18
Channel: Reallusion
Squadron 42: Facial Animation Technology
Published: 2015/10/10
Channel: Star Citizen
Markerless Facial Motion Capture Tech Demo
Published: 2012/01/06
Channel: Motekentertainment
motion capture & Facial animation
Published: 2017/01/19
Channel: Rumbling Games Studio
Moviemation Facial Motion Capture English
Published: 2011/04/19
Channel: MOVIEMATIONde
Super Girl - Faceshift - facial motion capture 3D - mascotte
Published: 2015/07/13
Channel: herve gaerthner
Mimic Productions Body and Facial Motion Capture Showreel
Published: 2017/02/22
Channel: Mimic Productions

WIKIPEDIA ARTICLE

From Wikipedia, the free encyclopedia

Facial motion capture is the process of electronically converting the movements of a person's face into a digital database using cameras or laser scanners. This database may then be used to produce computer graphics (CG) animation for movies, games, or real-time avatars. Because the motion of CG characters is derived from the movements of real people, the result is more realistic and nuanced computer character animation than if the animation were created manually.

A facial motion capture database describes the coordinates or relative positions of reference points on the actor's face. The capture may be in two dimensions, in which case the capture process is sometimes called "expression tracking", or in three dimensions. Two-dimensional capture can be achieved using a single camera and low-cost capture software such as Zign Creations' Zign Track. This produces less sophisticated tracking and cannot fully capture three-dimensional motions such as head rotation. Three-dimensional capture is accomplished using multi-camera rigs or laser marker systems. Such systems are typically far more expensive, complicated, and time-consuming to use. Two predominant technologies exist: marker-based and markerless tracking systems.
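The idea of a capture database as per-frame coordinates of reference points can be sketched in a few lines of code. The point names, coordinates, and helper functions below are purely illustrative, not part of any real capture system:

```python
import numpy as np

# Toy 2D facial-capture "database": for each frame we store the (x, y) image
# coordinates of a fixed set of reference points on the face.
POINT_NAMES = ["left_eye", "right_eye", "nose_tip", "mouth_left", "mouth_right"]

def make_database(frames):
    """Stack per-frame point dicts into a (T, N, 2) array of coordinates."""
    return np.array(
        [[frame[name] for name in POINT_NAMES] for frame in frames], dtype=float
    )

def displacement(db, frame_idx):
    """Displacement of every point relative to the first (reference) frame."""
    return db[frame_idx] - db[0]

# Frame 0 is a neutral pose; frame 1 moves the mouth corners out and up.
neutral = {"left_eye": (120, 90), "right_eye": (180, 90), "nose_tip": (150, 130),
           "mouth_left": (130, 170), "mouth_right": (170, 170)}
smile = dict(neutral, mouth_left=(125, 165), mouth_right=(175, 165))

db = make_database([neutral, smile])
print(db.shape)             # (2, 5, 2): 2 frames, 5 points, 2 coordinates each
print(displacement(db, 1))  # nonzero only at the mouth corners
```

A 3D system would store (x, y, z) triples per point instead, but the frame-by-point-by-coordinate layout is the same.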

Facial motion capture is related to body motion capture, but is more challenging because of the higher resolution required to detect and track the subtle expressions produced by small movements of the eyes and lips. These movements are often less than a few millimeters, requiring even greater resolution and fidelity, and different filtering techniques, than are usually used in full-body capture. The additional constraints of the face also allow more opportunities for using models and rules.

Facial expression capture is similar to facial motion capture. It is the process of using visual or mechanical means to manipulate computer-generated characters with input from human faces, or to recognize emotions from a user.

History

One of the first papers discussing performance-driven animation was published by Lance Williams in 1990. There, he describes 'a means of acquiring the expressions of real faces, and applying them to computer-generated faces'.[1]

Technologies

Marker-based

Traditional marker-based systems apply up to 350 markers to the actor's face and track the marker movement with high-resolution cameras. This has been used on movies such as The Polar Express and Beowulf to allow an actor such as Tom Hanks to drive the facial expressions of several different characters. Unfortunately, this is relatively cumbersome and makes the actors' expressions overly driven once the smoothing and filtering have taken place. Next-generation systems such as CaptiveMotion utilize offshoots of the traditional marker-based system with higher levels of detail.

Active LED marker technology is currently being used to drive facial animation in real time to provide user feedback.

Markerless

Markerless technologies use features of the face such as the nostrils, the corners of the lips and eyes, and wrinkles, and then track them. This technology is discussed and demonstrated at CMU,[2] IBM,[3] the University of Manchester (where much of this started with Tim Cootes,[4] Gareth Edwards and Chris Taylor) and other locations, using active appearance models, principal component analysis, eigentracking, deformable surface models and other techniques to track the desired facial features from frame to frame. This technology is much less cumbersome, and allows greater expression for the actor.
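The principal component analysis mentioned above underlies the shape-model half of an active appearance model: aligned landmark configurations are decomposed so that any face shape can be approximated as a mean shape plus a few mode weights, and the tracker only has to estimate those weights per frame. A minimal sketch on synthetic toy data (the landmark count and training set here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_shape_model(shapes, n_modes):
    """shapes: (S, 2N) flattened landmark vectors. Returns mean and top modes."""
    mean = shapes.mean(axis=0)
    # SVD of the centered data yields the principal modes of shape variation;
    # the rows of vt are orthonormal directions ordered by variance explained.
    _, _, vt = np.linalg.svd(shapes - mean, full_matrices=False)
    return mean, vt[:n_modes]

def project(shape, mean, modes):
    """Encode a shape as mode weights (the low-dimensional tracked parameters)."""
    return modes @ (shape - mean)

def reconstruct(weights, mean, modes):
    """Decode mode weights back into a full landmark configuration."""
    return mean + weights @ modes

# 50 synthetic training shapes: 6 landmarks (12 numbers) with small jitter.
base = rng.normal(size=12)
shapes = base + rng.normal(scale=0.1, size=(50, 12))
mean, modes = fit_shape_model(shapes, n_modes=3)

w = project(shapes[0], mean, modes)  # 12 numbers compressed to 3 weights
approx = reconstruct(w, mean, modes)
```

A real tracker would fit these weights to image evidence in each frame, combined with an appearance (texture) model, rather than projecting known landmarks.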

These vision-based approaches can also track pupil movement, eyelids, occlusion of the teeth by the lips, and the tongue, which are obvious problems in most computer-animated features. Typical limitations of vision-based approaches are resolution and frame rate, both of which are becoming less of an issue as high-speed, high-resolution CMOS cameras become available from multiple sources.

The technology for markerless face tracking is related to that of facial recognition systems, since a facial recognition system can in principle be applied sequentially to each frame of video, resulting in face tracking. For example, the Neven Vision system[5] (formerly Eyematics, now acquired by Google) allowed real-time 2D face tracking with no person-specific training; their system was also amongst the best-performing facial recognition systems in the U.S. Government's 2002 Facial Recognition Vendor Test (FRVT). On the other hand, some recognition systems do not explicitly track expressions, or even fail on non-neutral expressions, and so are not suitable for tracking. Conversely, systems such as deformable surface models pool temporal information to disambiguate and obtain more robust results, and thus could not be applied to a single photograph.

Markerless face tracking has progressed to commercial systems such as Image Metrics, which has been applied in movies such as The Matrix sequels[6] and The Curious Case of Benjamin Button. The latter used the Mova system to capture a deformable facial model, which was then animated with a combination of manual and vision tracking.[7] Avatar was another prominent performance-capture movie; however, it used painted markers rather than being markerless. Dynamixyz is another commercial system currently in use.

Markerless systems can be classified according to several distinguishing criteria:

  • 2D versus 3D tracking
  • whether person-specific training or other human assistance is required
  • real-time performance (which is only possible if no training or supervision is required)
  • whether they need an additional source of information, such as the projected patterns or invisible paint used in the Mova system.

To date, no system is ideal with respect to all of these criteria. For example, the Neven Vision system was fully automatic and required no hidden patterns or per-person training, but was 2D. The Face/Off system[8] is 3D, automatic, and real-time, but requires projected patterns.

Facial expression capture

Technology

Digital video-based methods are becoming increasingly preferred, as mechanical systems tend to be cumbersome and difficult to use.

Using digital cameras, the input user's expressions are processed to estimate the head pose, which allows the software to then find the eyes, nose and mouth. The face is initially calibrated using a neutral expression. Then, depending on the architecture, the eyebrows, eyelids, cheeks, and mouth can be processed as differences from the neutral expression. This is done, for instance, by looking for the edges of the lips and recognizing them as a unique object. Often contrast-enhancing makeup or markers are worn, or some other method is used to make the processing faster. Like voice recognition, even the best techniques are only good about 90 percent of the time, requiring a great deal of tweaking by hand, or tolerance for errors.
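The neutral-calibration step described above can be sketched as follows. Everything here, from the landmark names to the pixel thresholds, is a made-up toy; a real system would derive such offsets from its tracker and a per-user calibration:

```python
# Capture one neutral frame up front, then express later frames as
# differences from it. Coordinates are image-space pixels (y grows downward).
NEUTRAL = {"brow_y": 80.0, "upper_lip_y": 160.0, "lower_lip_y": 170.0}

def classify(frame, neutral=NEUTRAL, tol=3.0):
    """Label coarse expression changes as offsets from the neutral pose."""
    labels = []
    # Brows moving up in the image means their y coordinate decreases.
    if neutral["brow_y"] - frame["brow_y"] > tol:
        labels.append("brow_raise")
    # Mouth opening widens the gap between upper and lower lip edges.
    mouth_gap = frame["lower_lip_y"] - frame["upper_lip_y"]
    neutral_gap = neutral["lower_lip_y"] - neutral["upper_lip_y"]
    if mouth_gap - neutral_gap > tol:
        labels.append("mouth_open")
    return labels or ["neutral"]

print(classify({"brow_y": 80.0, "upper_lip_y": 160.0, "lower_lip_y": 170.0}))
# -> ['neutral']
print(classify({"brow_y": 72.0, "upper_lip_y": 158.0, "lower_lip_y": 182.0}))
# -> ['brow_raise', 'mouth_open']
```

The calibration makes the deltas person-independent: the same thresholds work whether the user's resting brow sits high or low, because only the change from neutral is measured.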

Since computer-generated characters don't actually have muscles, different techniques are used to achieve the same results. Some animators create bones or objects that are controlled by the capture software and move them accordingly, which, when the character is rigged correctly, gives a good approximation. Since faces are very elastic, this technique is often mixed with others, with the weights adjusted differently for skin elasticity and other factors depending on the desired expression.

Usage

Several commercial companies have developed products that have been used in production, but they are rather expensive.

Facial capture is expected to become a major input device for computer games once the software is available in an affordable format, but the hardware and software do not yet exist, despite the last fifteen years of research producing results that are almost usable.

References

  1. ^ Williams, Lance (1990), "Performance-Driven Facial Animation", Computer Graphics, 24 (4), August 1990
  2. ^ AAM Fitting Algorithms from the Carnegie Mellon Robotics Institute
  3. ^ Real World Real-time Automatic Recognition of Facial Expressions
  4. ^ Modelling and Search Software ("This document describes how to build, display and use statistical appearance models.")
  5. ^ Wiskott, Laurenz; J.-M. Fellous; N. Kruger; C. von der Malsburg (1997), "Face recognition by elastic bunch graph matching", Lecture Notes in Computer Science, Springer, 1296: 456–463, doi:10.1007/3-540-63460-6_150
  6. ^ Borshukov, George; D. Piponi; O. Larsen; J. Lewis; C. Tempelaar-Lietz (2003), "Universal Capture - Image-based Facial Animation for "The Matrix Reloaded"", ACM SIGGRAPH
  7. ^ Barba, Eric; Steve Preeg (18 March 2009), "The Curious Face of Benjamin Button", presentation at the Vancouver ACM SIGGRAPH chapter
  8. ^ Weise, Thibaut; H. Li; L. Van Gool; M. Pauly (2009), "Face/Off: Live Facial Puppetry", ACM Symposium on Computer Animation

Wikipedia content is licensed under the GFDL and (CC) license