While professional photographers may turn up their noses at them, the simple fact is that the camera in practically every iteration of the iPhone has been comparable to what was available in prosumer DSLRs. The camera can be used to take pictures (duh), and it focuses quickly, which makes it well suited to reading barcodes and QR codes. One of the great features of iOS is the ability to take a picture and then perform an operation on it, such as running a filter or detecting faces. This functionality is provided by the Core Image framework.
Filters can be used to create a variety of effects on images. You might be thinking, "So what? I’ve been able to do that for quite some time with Instagram." Filters are only part of the story, though: Core Image can also detect faces. Once faces have been located automatically, an image can easily be cropped down to just the facial information.
In this article, I’ll look at how to apply filters to images and how to detect faces.
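As a preview, here is a minimal sketch of both operations in Swift. It assumes you already have a `UIImage` called `photo` (the function and variable names are just for illustration); the rest of the article walks through each step in more detail.

```swift
import UIKit
import CoreImage

// Assumes `photo` is a UIImage obtained from the camera or photo library.
func applySepiaAndFindFaces(in photo: UIImage) {
    guard let ciImage = CIImage(image: photo) else { return }
    let context = CIContext()

    // 1. Apply a filter: CISepiaTone is one of Core Image's built-in filters.
    if let sepia = CIFilter(name: "CISepiaTone") {
        sepia.setValue(ciImage, forKey: kCIInputImageKey)
        sepia.setValue(0.8, forKey: kCIInputIntensityKey)
        if let output = sepia.outputImage,
           let cgImage = context.createCGImage(output, from: output.extent) {
            let filtered = UIImage(cgImage: cgImage)
            print("Produced filtered image of size \(filtered.size)")
        }
    }

    // 2. Detect faces with CIDetector and report where they were found.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: context,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    let faces = detector?.features(in: ciImage) as? [CIFaceFeature] ?? []
    for face in faces {
        // face.bounds could be used to crop the image down to just the face.
        print("Found a face at \(face.bounds)")
    }
}
```

The same two pieces, `CIFilter` for effects and `CIDetector` for faces, are what we'll build on in the sections that follow.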