Previously I mentioned a nice tutorial on using the facial detection feature of the iOS SDK on a live video stream.
Here’s a nice example from Danny Shmueli showing how to use the facial gesture recognition added in iOS 7 on streaming video to detect blinking and smiling gestures.
One of the key issues with using this facial gesture recognition feature is performance, and the example does a nice job of handling it. The logic has been nicely extracted, so you can use this gesture recognition in your own projects.
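For reference, the underlying iOS 7 feature is Core Image's face detector with the smile and eye-blink options. Here's a minimal sketch (not DSFacialGestureDetector's own implementation) of checking a single frame for a smile or blink using the standard `CIDetector` API:

```swift
import CoreImage

// Create the face detector once and reuse it; creating a CIDetector is
// expensive, which is part of the performance concern the post mentions.
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

// `frame` would typically be a CIImage made from the camera's sample buffer.
func checkGestures(in frame: CIImage) {
    // The smile and eye-blink options were added in iOS 7.
    let features = detector?.features(in: frame,
                                      options: [CIDetectorSmile: true,
                                                CIDetectorEyeBlink: true]) ?? []
    for case let face as CIFaceFeature in features {
        if face.hasSmile {
            print("Smile detected")
        }
        if face.leftEyeClosed && face.rightEyeClosed {
            print("Blink detected")
        }
    }
}
```

In a live-video setup you would call something like `checkGestures(in:)` from the capture output callback, ideally throttling how often frames are analyzed, since running the detector on every frame is what causes the performance issues noted above.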
Here’s a video showing DSFacialGestureDetector in action:
You can find DSFacialGestureDetector on GitHub here.
A nice example showing how to use the iOS facial gesture recognition feature on streaming video.
- Using Core Image Filters In iOS 5 On A Live Video Feed
- Example: Using Core Image Face Detection On A Live Video Feed
- Open Source: Easy Multistroke and Single Stroke Complex Gesture Recognition On iOS
- An Improved Implementation of the $1 Unistroke Gesture Recognizer For iOS
- An iOS Framework Providing Amazing High-Performance Trainable Object Recognition
Original article: Example: Using The iOS Facial Gesture Recognition On A Live Video Feed
©2014 iOS App Dev Libraries, Controls, Tutorials, Examples and Tools. All Rights Reserved.




