Using Core Image Filters In iOS 5 On A Live Video Feed

For those non-Mac developers unfamiliar with Core Image: Core Image is an API for pixel-accurate image processing through a high-level syntax, and it arrived on iOS devices with the release of iOS 5.
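
To give a flavor of that syntax, here's a minimal sketch of running a single built-in filter over a still image. This isn't code from the tutorial; it's written in modern Swift for readability (iOS 5-era code would have been Objective-C), and the CISepiaTone filter is chosen purely as an illustration.

```swift
import CoreImage
import UIKit

// A minimal sketch: apply a sepia-tone Core Image filter to a UIImage.
func applySepia(to image: UIImage, intensity: Double = 0.8) -> UIImage? {
    guard let inputImage = CIImage(image: image),
          let filter = CIFilter(name: "CISepiaTone") else { return nil }

    // Filters are configured through key-value coding.
    filter.setValue(inputImage, forKey: kCIInputImageKey)
    filter.setValue(intensity, forKey: kCIInputIntensityKey)

    guard let output = filter.outputImage else { return nil }

    // Render the filtered result back into a UIImage.
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```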

Combined with the iPhone and iPad cameras, you can create some very interesting image filters very quickly.

Using the filters on live video does require some insight, and I’ve found an interesting example showing how to take a live video feed and apply filters in real time.

The tutorial is from Jake Gundersen. In it, he takes a live feed, filters it through a randomly placed collection of circular windows, and applies color filters to each individual window (there’s actually more going on, but unless you have an iPad 2 handy the app lags a little).
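
For context on the general approach, here's a rough sketch of how per-frame filtering of a camera feed can be wired up: receive frames through an AVCaptureVideoDataOutput delegate and push each one through a Core Image filter. This is only an illustration of the technique, not Jake's implementation; the capture-session setup and the display of the rendered frame are omitted, and the CIColorMonochrome filter is an arbitrary example.

```swift
import AVFoundation
import CoreImage

// Sketch of per-frame filtering: each camera frame is wrapped in a CIImage,
// run through a filter, and rendered to a CGImage for display.
final class FilteredCameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()                        // reuse one context; creating one per frame is expensive
    private let filter = CIFilter(name: "CIColorMonochrome") // illustrative filter choice

    // Called by AVFoundation for every frame the camera delivers.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Wrap the raw pixel buffer and run it through the filter.
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
        filter?.setValue(cameraImage, forKey: kCIInputImageKey)
        filter?.setValue(CIColor(red: 0.8, green: 0.6, blue: 0.4), forKey: kCIInputColorKey)

        guard let filtered = filter?.outputImage,
              let cgImage = context.createCGImage(filtered, from: filtered.extent) else { return }

        // Hand the rendered CGImage off for display (e.g. to a UIImageView on
        // the main queue); omitted in this sketch.
        _ = cgImage
    }
}
```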

You can find the tutorial on Jake’s website here.

The tutorial has been added to the iOS 5 developer tutorials page (also added today was an iOS 5 JSON tutorial).
