VIDEOS 1 TO 50
Finger Tracking without Cameras
Published: 2017/08/02
Channel: Polhemus
Finger Tracking using Kinect v2
Published: 2016/01/05
Channel: Vangos Pterneas
Hand and finger Tracking using OpenCV Computer Vision
Published: 2013/08/12
Channel: Simen Andresen
Tutorial - How to create Finger Beams, w Mocha Tracking - After Effects
Published: 2012/10/16
Channel: 2004BeachBum
LEAP Motion Controller Orientation Demo Video - Finger Tracking & Painting Feature
Published: 2013/08/01
Channel: PhoneRadar
Manus VR brings arm, hand and finger tracking to VR
Published: 2016/07/28
Channel: Manus VR
The Motiv Ring brings fitness tracking to your finger
Published: 2017/01/22
Channel: Digital Trends
Finger Tracking With OpenCV
Published: 2016/05/08
Channel: Andrew Ke
Visionary Render 1.2 Finger Tracking using ART
Published: 2016/03/01
Channel: Virtalis
finger tracking test 02
Published: 2013/07/14
Channel: daito manabe
Full hand/arm/finger tracking for oculus Rift / VR for $50
Published: 2014/10/03
Channel: Travel Trousers
Hand Finger Motion Tracking
Published: 2012/12/20
Channel: Stan Melax
Realtime Hand Tracking from Video
Published: 2017/04/26
Channel: Perceptual Computing Laboratory
Finger tracking and drawing with OpenCV
Published: 2017/11/02
Channel: 唐培文
Robust finger tracking and finger click detection using Kinect V2
Published: 2014/03/19
Channel: TcBoY Yeo
finger tracking.wmv
Published: 2010/07/24
Channel: Juan Diego Gómez
Finger & Hand Tracking
Published: 2015/02/05
Channel: Polhemus
Colored Finger tracking mouse
Published: 2011/01/12
Channel: Chinni Krishna
Laser Finger Tracking Test
Published: 2016/05/03
Channel: daito manabe
3D Wiimote Finger Tracking
Published: 2009/12/11
Channel: bridleman4
Basic Finger Tracking with Myo Armband
Published: 2015/01/09
Channel: Chris Zaharia
Valve finger-tracking VR controllers, LG bendy displays, Overwatch kills aimbots
Published: 2017/06/23
Channel: NCIX Tech Tips
Finger tracking with Kinect SDK
Published: 2012/04/13
Channel: Fran Trapero Cerezo
Full 3D Skeletal Hand and Finger Tracking
Published: 2012/10/12
Channel: Stan Melax
AR Tower Defense Using Finger Tracking
Published: 2016/12/08
Channel: John
Finger Tracking and Bare Hand Human Computer Interaction
Published: 2010/02/01
Channel: Christian von Hardenberg
HoloPanel: Hand/finger tracking + Augmented reality panel interface (Final test #3)
Published: 2015/06/08
Channel: Abraham Botros
Optitrack flex 3 finger tracking
Published: 2017/09/15
Channel: Ivan Cueto
Tracking the Finger Movements of a Guitarist
Published: 2016/10/04
Channel: Polhemus
SkinTrack: Using the Body as an Electrical Waveguide for Continuous Finger Tracking on the Skin
Published: 2016/05/05
Channel: Future Interfaces Group
Wearable 3-DOF Cutaneous Haptic Device with Integrated IMU-Based Finger Tracking
Published: 2016/08/18
Channel: SNU INRoL
Finger Tracking Using KinectCoreVision
Published: 2012/04/28
Channel: rhkrdbswhd
free air finger tracking in use
Published: 2009/04/12
Channel: cloudflint
Finger Tracking + Grasp detection
Published: 2012/06/07
Channel: Raymond Lo
HoloPanel: Hand/finger tracking + Augmented reality panel interface (Final test #5)
Published: 2015/06/08
Channel: Abraham Botros
Possible VR input scheme for same hand translational control and finger tracking
Published: 2014/06/21
Channel: Edward Mercieca
PMD[vision] CamBoard nano - finger tracking. Touchless writing with your finger
Published: 2012/11/30
Channel: pmdtechnologies ag
Kinect Finger Tracking Demo
Published: 2011/12/04
Channel: Lukas Taves
Finger Tracking with Kinect V2
Published: 2017/06/06
Channel: Son Tran Dinh
Finger Tracking
Published: 2018/05/07
Channel: Rob Somogyi
Finger tracking in OpenCV (Development Stage)
Published: 2011/04/13
Channel: Mandar Shinde
Hand Tracking Finger Tracking (Kinect OpenNi EmguCV OpenCV)
Published: 2012/03/03
Channel: TcBoY Yeo
Oculus Rift DK2 with Leap Motion Hand/Finger Tracking
Published: 2016/02/19
Channel: Steve Walker
Visionary Render 1.2 Finger Tracking
Published: 2015/12/16
Channel: Virtalis
PlayStation VR REZ Infinite playing with LEAP Motion finger tracking: Area X
Published: 2016/10/16
Channel: THEGAMEVEDA
Bloculus.de | Microsoft's "Digits" -- Finger Tracking without Data Gloves
Published: 2013/10/08
Channel: Behind VR
control an on screen marionette using finger tracking
Published: 2012/10/28
Channel: Eli Elhadad
free air finger tracking demonstration
Published: 2009/06/12
Channel: cloudflint
Kinect hand gestures, finger tracking
Published: 2012/07/29
Channel: johnbobpoker
Leap Motion real-time finger tracking at ASUS in COMPUTEX2013
Published: 2013/06/09
Channel: Qi Ding

WIKIPEDIA ARTICLE

From Wikipedia, the free encyclopedia
[Figure: Finger tracking of two pianists' fingers playing the same piece (slow motion, no sound).[1]]

In image processing and human-computer interaction, finger tracking is a high-resolution technique for determining the successive positions of a user's fingers, which can then be used to represent and manipulate objects in 3D. Finger tracking can also act as a computer input device, playing a role similar to that of a keyboard or mouse.

Introduction

Finger tracking systems focus on user-data interaction: the user manipulates the volume of a 3D object directly with the fingers. These systems arose from the problem of human-computer interaction, with the objective of making communication with the machine more intuitive through gestures and hand movements. They track the 3D position and orientation of the fingers, or of markers attached to them, in real time, and use these intuitive movements and gestures as input.

Types of tracking

There are many ways to implement finger tracking, and a considerable number of theses in this field have attempted a general classification. Broadly, the techniques divide into tracking with an interface, which relies on an intermediate external device worn on the hand and used as a tool for executing instructions, and tracking without an interface, which estimates the hand pose from a sequence of images by separating the hand from the background.

Tracking with interface

Tracking with an interface relies on inertial and optical motion capture systems.

Inertial motion capture gloves

Inertial motion capture systems capture finger motion by reading the rotation of each finger segment in 3D space. Applying these rotations to a kinematic chain, the whole human hand can be tracked in real time, wirelessly and without occlusion problems.

Hand inertial motion capture systems, such as Synertial mocap gloves, use tiny IMU-based sensors located on each finger segment. Precise capture requires at least 16 sensors. There are also glove models with fewer sensors (13 or 7), for which the remaining finger segments are interpolated (proximal segments) or extrapolated (distal segments). The sensors are typically inserted into a textile glove, which makes them more comfortable to wear.

Because inertial sensors capture movement in all three directions, flexion, extension and abduction can be captured for all fingers and the thumb.

Hand skeleton

Since inertial sensors track only rotations, these rotations have to be applied to a hand skeleton to obtain proper output. To get precise output (for example, fingertips that can touch), the skeleton has to be scaled to match the real hand. This can be done by manual measurement of the hand or by automatic measurement extraction.
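The idea of applying per-segment rotations to a scaled skeleton can be sketched as forward kinematics along a single finger chain. This is a minimal sketch under simplifying assumptions (one flexion axis per joint, planar finger); the function names are illustrative, not any vendor's API:

```python
import numpy as np

def rotation_z(angle_rad):
    """Rotation about the z-axis (the single flexion axis in this simplified model)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def fingertip_position(bone_lengths, joint_angles):
    """Forward kinematics along one finger chain.

    bone_lengths: lengths of the proximal, middle and distal segments,
                  taken from a skeleton scaled to the real hand.
    joint_angles: flexion angle at each of the three joints, in radians
                  (as an IMU glove would report them).
    Returns the fingertip position relative to the knuckle.
    """
    pos = np.zeros(3)
    orientation = np.eye(3)
    for length, angle in zip(bone_lengths, joint_angles):
        orientation = orientation @ rotation_z(angle)      # accumulate joint rotation
        pos = pos + orientation @ np.array([length, 0.0, 0.0])  # advance along the bone
    return pos

# A straight finger ends at the sum of its bone lengths along x:
tip = fingertip_position([4.0, 2.5, 2.0], [0.0, 0.0, 0.0])
```

This illustrates why scaling matters: the same joint angles produce a different fingertip position for every set of bone lengths, so an unscaled skeleton cannot reproduce fingertip contact.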

Fusing data with optical motion capture systems

As described below, finger tracking is the most challenging part for optical motion capture systems (such as Vicon, OptiTrack and ART) because markers on the fingers are frequently occluded during capture. Users of optical mocap systems report that most post-processing work is due to finger capture. Since a properly calibrated inertial mocap system mostly needs no post-processing, the typical approach for high-end mocap users is to fuse data from an inertial system (fingers) with an optical system (body and position in space).

Fusing mocap data is based on matching the time codes of each frame from the inertial and optical data sources. Any third-party software (for example MotionBuilder or Blender) can then apply the motions from both sources, independently of the mocap method used.
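The time-code matching step can be sketched as pairing each optical frame with the nearest-in-time inertial sample. The stream format and the `fuse_streams` helper below are hypothetical, intended only to show the matching logic:

```python
import bisect

def fuse_streams(optical_frames, inertial_frames, tolerance=0.005):
    """Pair each optical frame with the nearest-in-time inertial frame.

    Each frame is a (timestamp_seconds, data) tuple; both streams are
    assumed sorted by timestamp. Optical frames with no inertial sample
    within `tolerance` seconds are dropped.
    """
    times = [t for t, _ in inertial_frames]
    fused = []
    for t_opt, body_pose in optical_frames:
        i = bisect.bisect_left(times, t_opt)
        # candidates: the inertial samples just before and just after t_opt
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(times) and (best is None or
                    abs(times[j] - t_opt) < abs(times[best] - t_opt)):
                best = j
        if best is not None and abs(times[best] - t_opt) <= tolerance:
            fused.append((t_opt, body_pose, inertial_frames[best][1]))
    return fused
```

A production pipeline would also have to align the two clocks (genlock or a shared time-code generator); the sketch assumes the timestamps are already on a common timebase.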

Hand position tracking

On top of finger tracking, many users require positional tracking of the whole hand in space. Several methods can serve this purpose:

  • Capturing the whole body with an inertial mocap system, with the hand skeleton attached to the end of the body skeleton's kinematic chain; the position of the palm is derived from the body.
  • Capturing the position of the palm (or forearm) with an optical mocap system.
  • Capturing the position of the palm (or forearm) with another positional tracking method, such as those widely used in VR headsets (for example, the HTC Vive Lighthouse).
Disadvantages of inertial motion capture systems

Inertial sensors have two main disadvantages for finger tracking:

  • They cannot capture the absolute position of the hand in space on their own (already covered above).
  • They are susceptible to magnetic interference: metal objects tend to disturb the sensors, which is noticeable because hands are often in contact with things made of metal. Current generations of motion capture gloves withstand considerable magnetic interference, although the degree of immunity depends on several factors: the manufacturer, the price range and the number of sensors in the glove.

Optical motion capture systems

These systems track the 3D locations of markers and patterns: the system identifies them and labels each marker according to the position of the user's fingers. The 3D coordinates of the labelled markers are then delivered to other applications in real time.

Markers

Some optical systems, such as Vicon or ART, capture hand motion through markers, with one marker on each "operative" finger. Three high-resolution cameras are responsible for capturing each marker and measuring its position, which is only produced while a camera can actually see the marker. Visual markers, usually known as rings or bracelets, are used to recognize user gestures in 3D; as the classification indicates, these rings can also act as a 2D interface.

Occlusion as an interaction method

Visual occlusion is a very intuitive way to provide a more realistic view of virtual information in three dimensions, and such interfaces provide more natural 3D interaction techniques than a baseline six-degree-of-freedom device.

Marker functionality

Markers operate through interaction points that are usually set in advance, so the regions involved are already known. It is therefore not necessary to follow every marker at all times; multiple pointers can be treated the same way as a single operating pointer, which avoids many problems. To detect pointers during an interaction, ultrasound or infrared sensors can be used. Under difficult conditions such as poor illumination, motion blur, marker deformation or occlusion, the system can keep following the object even when some markers are not visible: because the spatial relationships between all the markers are known, the positions of the hidden markers can be computed from the ones that are visible. There are several methods for marker detection, such as the border-marker and estimated-marker methods.
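Recovering a hidden marker from the visible ones, given the known rigid layout of the marker set, can be sketched with a least-squares rigid fit (the Kabsch algorithm). The function names and the marker-set representation are illustrative assumptions, not part of any tracker's API:

```python
import numpy as np

def fit_rigid_transform(ref_pts, cur_pts):
    """Least-squares rotation R and translation t mapping ref_pts onto
    cur_pts (Kabsch algorithm)."""
    ref_c = ref_pts.mean(axis=0)
    cur_c = cur_pts.mean(axis=0)
    H = (ref_pts - ref_c).T @ (cur_pts - cur_c)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = cur_c - R @ ref_c
    return R, t

def recover_hidden_marker(ref_layout, visible_ids, visible_pts, hidden_id):
    """Estimate an occluded marker from the visible ones, using the known
    rigid layout of the marker set (at least 3 non-collinear markers visible)."""
    R, t = fit_rigid_transform(ref_layout[visible_ids], visible_pts)
    return R @ ref_layout[hidden_id] + t
```

The same fit also yields the rigid-body pose of the whole marker set, which is how one pointer can stand in for many.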

  • The HOMER technique combines ray selection with direct manipulation: an object is selected, and its position and orientation are then handled as if it were attached directly to the hand.
  • The Conner technique presents a set of 3D widgets that permit indirect interaction with virtual objects, through a virtual widget acting as an intermediary.
Articulated hand tracking

This technique is interesting because it is simpler and less expensive, requiring only one camera; the price of that simplicity is lower precision than the previous techniques. It provides a new basis for interaction in modeling, animation control and added realism. It uses a glove covered with a set of colors assigned according to the positions of the fingers. The color pattern is designed for the computer's vision system: from the captured image and the positions of the colors, the pose of the hand is recovered.

Tracking without interface

In terms of visual perception, hands and legs can be modeled as articulated mechanisms: systems of rigid bodies connected by joints with one or more degrees of freedom. This model can be applied at a reduced scale to describe hand motion and at a wide scale to describe full-body motion. A particular finger motion, for example, can be recognized from its usual joint angles, independently of the position of the hand relative to the camera.

Many tracking systems are based on a sequence-estimation formulation: given a sequence of images and a model of how the hand changes, estimate the 3D configuration for each frame. All possible hand configurations are represented by vectors in a state space that encodes the position of the hand and the angles of the finger joints. Each hand configuration generates a set of image features via the occluding contours of the finger links. The estimate for each image is then the state vector that best fits the measured features. The finger joints add 21 states to the six of the palm's rigid-body motion, which raises the computational cost of the estimation accordingly. In one such technique, each finger link is modeled as a cylinder; the axis of each cylinder is constructed, and its projection yields the projection of the joint. Three DOF are used per joint, since there are only three degrees of movement.
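The search for the best-fitting state vector can be sketched as scoring candidate configurations against the measured image features. The 27-dimensional layout (6 palm DOF plus 21 joint angles) follows the text, while the `predict_features` callable stands in for the rendering of occluding contours, which a real system would implement:

```python
import numpy as np

def best_state(candidates, predict_features, observed):
    """Pick the hand-state vector whose predicted image features best
    match the measured ones (minimum squared error).

    candidates:       array of shape (n, 27): 6 palm DOF + 21 joint angles
    predict_features: function mapping a state vector to predicted features
    observed:         measured features from the current image
    """
    errors = [np.sum((predict_features(s) - observed) ** 2) for s in candidates]
    return candidates[int(np.argmin(errors))]
```

Real trackers do not enumerate candidates exhaustively; they search locally around the previous frame's estimate, which is where the 21 extra joint states make the cost grow.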

As in the previous category, there is a wide variety of implementations on this subject, so the exact steps and processing techniques differ depending on the purpose and needs of the user. Nevertheless, in very general terms, most systems carry out the following steps:

  • Background subtraction: all captured images are convolved with a 5x5 Gaussian filter and then downscaled to reduce pixel noise.
  • Segmentation: a binary mask is applied that represents the pixels belonging to the hand in white and the background in black.
  • Region extraction: the left and right hands are detected by comparing the candidate regions.
  • Feature extraction: the fingertips are located and each candidate point is classified as a peak or a valley. To classify a point, the two contour directions at it are converted to 3D vectors in the xy-plane (usually called pseudo-vectors) and their cross product is computed: if the sign of the z component is positive, the point is a peak; if it is negative, a valley.
  • Point and pinch gesture recognition: a gesture is associated with the visible reference points (the fingertips).
  • Pose estimation: the position of the hands is identified using algorithms that compute the distances between positions.
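The cross-product test from the feature-extraction step can be sketched directly. Note that which sign means "peak" depends on the direction in which the contour is traversed, so the sign convention below is an assumption:

```python
import numpy as np

def classify_contour_point(prev_pt, pt, next_pt):
    """Classify a candidate point on the hand contour as a fingertip
    ('peak') or a between-finger point ('valley').

    The two contour edges at the point are lifted to 3D vectors in the
    xy-plane; the sign of the z component of their cross product tells
    which way the contour turns. The sign convention assumes a
    particular traversal direction of the contour.
    """
    v1 = np.append(np.asarray(prev_pt, dtype=float) - pt, 0.0)  # edge toward previous point
    v2 = np.append(np.asarray(next_pt, dtype=float) - pt, 0.0)  # edge toward next point
    z = np.cross(v1, v2)[2]
    return "peak" if z > 0 else "valley"
```

In practice the three points would be taken several contour samples apart, so that the test measures the overall turn of a finger rather than pixel-level jitter.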

Other tracking techniques

It is also possible to track fingers actively. The Smart Laser Scanner is a marker-less finger tracking system using a modified laser scanner/projector, developed at the University of Tokyo in 2003-2004. It can acquire three-dimensional coordinates in real time without any image processing at all: essentially, it is a rangefinder scanner that, instead of continuously scanning the full field of view, restricts its scanning area to a very narrow window precisely the size of the target. Gesture recognition has been demonstrated with this system. The sampling rate can be very high (500 Hz), so smooth trajectories can be acquired without filtering (such as a Kalman filter).
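The narrow-window idea can be sketched as recentering the scan window on the last detection each frame. The `next_window` helper and its parameters are hypothetical, not part of the actual device:

```python
def next_window(last_target, window_size, field_of_view):
    """Scan window for the next frame: centered on the last detected
    target and clamped so it stays inside the scanner's field of view.

    last_target:   (x, y) of the last detection
    window_size:   (w, h) of the narrow scan window (about the target size)
    field_of_view: (width, height) of the full scan range, origin at (0, 0)
    Returns (xmin, ymin, xmax, ymax).
    """
    w, h = window_size
    fx, fy = field_of_view
    x = min(max(last_target[0] - w / 2, 0.0), fx - w)  # clamp horizontally
    y = min(max(last_target[1] - h / 2, 0.0), fy - h)  # clamp vertically
    return (x, y, x + w, y + h)
```

Scanning only this small region is what allows the high sampling rate: almost no time is spent sweeping empty space.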

Application

Finger tracking systems are used above all to represent virtual reality. Their application has nevertheless extended to professional 3D modeling, with companies and projects dedicated directly to this area; because of their high price and complexity, such systems have rarely been used in consumer applications. In any case, the main objective is to make it easier to issue commands to the computer via natural language or gesture interaction.

The guiding idea is that computers should be easier to use if they can be operated through natural language or gesture interaction. The main application of this technique is in 3D design and animation, where software such as Maya and 3ds Max employs these kinds of tools, allowing more accurate and simpler control of the instructions to be executed. The technology offers many possibilities, of which real-time sculpting, building and 3D modeling with the computer are the most important.

References

  1. ^ Goebl, W.; Palmer, C. (2013). Balasubramaniam, Ramesh, ed. "Temporal Control and Hand Movement Efficiency in Skilled Music Performance". PLoS ONE. 8 (1): e50901. doi:10.1371/journal.pone.0050901. PMC 3536780. PMID 23300946.
