Example: Using Core Image Face Detection On A Live Video Feed

Some time ago I posted a tutorial on the basics of using Core Image face detection on the iOS platform.

Something that was asked in the comments thread for that post was how to use Core Image face detection on a live video feed – and whether it was fast enough.

Recently I received a submission from Jeroen Trappers showing how to use Core Image face detection on a live video feed, and place a moustache on the faces found within the video stream.
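The basic pattern is to run a CIDetector over each frame that AVFoundation hands to your sample buffer delegate. The sketch below is not Jeroen's code (his tutorial is Objective-C from the iOS 5 era); it's just a minimal Swift outline of the same idea using current API names, with the moustache drawing left as a placeholder:

```swift
import AVFoundation
import CoreImage

final class FaceDetectionDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    // Low accuracy is the usual trade-off for keeping per-frame detection fast.
    private let detector = CIDetector(ofType: CIDetectorTypeFace,
                                      context: nil,
                                      options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

    // Called once per frame by AVCaptureVideoDataOutput.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let frame = CIImage(cvImageBuffer: pixelBuffer)

        // Find faces in the current frame.
        let features = detector?.features(in: frame) ?? []

        for case let face as CIFaceFeature in features where face.hasMouthPosition {
            // A moustache would sit around the detected mouth position, scaled
            // to the face's width. How it is drawn (UIImageView, CALayer, or a
            // Core Image composite) is up to the rest of the app.
            let centre = face.mouthPosition
            let width = face.bounds.width * 0.6
            _ = (centre, width) // placeholder: hand these to your overlay code
        }
    }
}
```

One thing to watch: CIFaceFeature reports positions in Core Image's bottom-left-origin image coordinates, so you still need to convert into your preview layer's top-left-origin coordinate space before placing any overlay.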

Here’s an image of the example in action from Jeroen’s site:

[Image: the moustache overlay placed on a detected face in the live camera feed]

You can find the tutorial here.

The speed is respectable on newer iOS devices with a 640×480 capture resolution.
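For reference, the capture side can be pinned to that resolution with a session preset. Again, this is only a sketch of a standard AVCaptureSession setup (the helper name is made up for illustration), not the tutorial's own code:

```swift
import AVFoundation

// Hypothetical setup helper: keep capture at the 640×480 preset mentioned above
// and deliver frames to the face detection delegate.
func makeCaptureSession(delegate: AVCaptureVideoDataOutputSampleBufferDelegate,
                        queue: DispatchQueue) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.sessionPreset = .vga640x480          // 640×480 capture resolution

    guard let camera = AVCaptureDevice.default(for: .video) else {
        throw NSError(domain: "FaceDetectionDemo", code: -1)
    }
    session.addInput(try AVCaptureDeviceInput(device: camera))

    let output = AVCaptureVideoDataOutput()
    output.alwaysDiscardsLateVideoFrames = true  // drop frames rather than fall behind
    output.setSampleBufferDelegate(delegate, queue: queue)
    session.addOutput(output)

    return session
}
```

Discarding late frames matters as much as the resolution choice: if detection can't keep up on older hardware, it's better to skip frames than to build up a backlog.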

Added to the Core Image page.
