Using the Vision framework you can do many things, such as detecting faces and facial features, tracking objects, and more; a short Vision-based sketch appears at the end of this post.
In this blog, we will take a look at how to detect a face in an image. Here are the steps to perform face detection:
- First, you need an image in which to detect a face. It can come from the camera or from the Photos library on your iPhone; you can use `UIImagePickerController` for this. (Note that accessing the camera requires an `NSCameraUsageDescription` entry in your app's Info.plist.)
```swift
let picker = UIImagePickerController()
picker.delegate = self

let alert = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)

if UIImagePickerController.isSourceTypeAvailable(.camera) {
    alert.addAction(UIAlertAction(title: "Camera", style: .default, handler: { action in
        picker.sourceType = .camera
        self.present(picker, animated: true, completion: nil)
    }))
}

alert.addAction(UIAlertAction(title: "Photo Library", style: .default, handler: { action in
    picker.sourceType = .photoLibrary
    self.present(picker, animated: true, completion: nil)
}))

alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))

// Anchor the action sheet for iPad popover presentation.
alert.popoverPresentationController?.sourceView = self.view
alert.popoverPresentationController?.sourceRect = self.view.frame
self.present(alert, animated: true, completion: nil)
```
Then implement these delegate methods to receive the picked image:
```swift
extension ViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        dismiss(animated: true, completion: nil)
        // Prefer optional binding over a force cast.
        if let pickedImage = info[.originalImage] as? UIImage {
            self.image = pickedImage
        }
    }

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
        dismiss(animated: true, completion: nil)
    }
}
```

- After fetching the image, pass it to a `CIDetector` to detect a face; see the example below.
```swift
// EXIF-style orientation value 5, carried over from the original example;
// use the value that matches how your image was captured.
let imageOptions: [String: Any] = [CIDetectorImageOrientation: 5]
let personciImage = CIImage(cgImage: image.cgImage!)

let accuracy = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: accuracy)
let faces = faceDetector?.features(in: personciImage, options: imageOptions)

if let face = faces?.first as? CIFaceFeature {
    print("found bounds are \(face.bounds)")

    let alert = UIAlertController(title: "Say Cheese!",
                                  message: "We detected a face!",
                                  preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
    self.present(alert, animated: true, completion: nil)
} else {
    let alert = UIAlertController(title: "No Face!",
                                  message: "No face was detected",
                                  preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
    self.present(alert, animated: true, completion: nil)
}
```

- With the help of `CIFaceFeature` you can also check whether the face is smiling, find the positions of its eyes, and more.
```swift
if face.hasSmile {
    print("face is smiling")
}

if face.hasLeftEyePosition {
    // leftEyePosition is a CGPoint, not a rect
    print("Left eye position is \(face.leftEyePosition)")
}

if face.hasRightEyePosition {
    print("Right eye position is \(face.rightEyePosition)")
}
```
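One detail worth noting: `hasSmile` and the eye-blink data are only populated when you explicitly request them from the detector. Below is a small sketch, reusing `faceDetector` and `personciImage` from the detection step, that asks for them via the `CIDetectorSmile` and `CIDetectorEyeBlink` option keys:

```swift
// Request smile and eye-blink data when extracting features;
// without these keys, hasSmile and the eye-closed flags stay false.
let featureOptions: [String: Any] = [
    CIDetectorSmile: true,
    CIDetectorEyeBlink: true,
    CIDetectorImageOrientation: 5
]
let features = faceDetector?.features(in: personciImage, options: featureOptions)

if let face = features?.first as? CIFaceFeature {
    print("smiling: \(face.hasSmile), left eye closed: \(face.leftEyeClosed)")
}
```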
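Also keep in mind that the `bounds` printed earlier are in Core Image coordinates, whose origin sits at the bottom-left corner, while UIKit measures from the top-left. Here is a minimal sketch of converting them so you can draw a box over the face; the `highlight` helper and its `imageView` parameter are my own assumptions, and the math assumes the view shows the image at its full view size (no aspect-fit letterboxing), so adjust the scaling for your own layout:

```swift
func highlight(face: CIFaceFeature, in imageView: UIImageView, imageSize: CGSize) {
    // Flip the Y axis: Core Image's origin is bottom-left, UIKit's is top-left.
    var transform = CGAffineTransform(scaleX: 1, y: -1)
    transform = transform.translatedBy(x: 0, y: -imageSize.height)
    var faceBounds = face.bounds.applying(transform)

    // Scale from image-pixel coordinates to view coordinates.
    let scaleX = imageView.bounds.width / imageSize.width
    let scaleY = imageView.bounds.height / imageSize.height
    faceBounds = faceBounds.applying(CGAffineTransform(scaleX: scaleX, y: scaleY))

    // Draw a simple red outline around the detected face.
    let box = UIView(frame: faceBounds)
    box.layer.borderWidth = 2
    box.layer.borderColor = UIColor.red.cgColor
    box.backgroundColor = .clear
    imageView.addSubview(box)
}
```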
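Finally, since this post opened by mentioning the Vision framework: on iOS 11 and later you can perform the same face detection with `VNDetectFaceRectanglesRequest`. This is only a minimal sketch for comparison, assuming `image` is the `UIImage` picked earlier:

```swift
import Vision

func detectFaceWithVision(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectFaceRectanglesRequest { request, error in
        let faces = request.results as? [VNFaceObservation] ?? []
        for face in faces {
            // boundingBox is normalized to 0...1 with a bottom-left origin,
            // so it needs converting before you draw with it in UIKit.
            print("found face at \(face.boundingBox)")
        }
    }

    // Vision requests can be slow; keep them off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }
}
```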
I hope you enjoyed this post. Please let me know in the comments how it went and whether there is anything that could be improved. Thanks for tuning in once again!