Updated 6 July 2023
With Apple's frameworks you can do many things like detecting faces and facial features, tracking objects, and more. In this blog we will take a look at how to detect a face in an image using Core Image's CIDetector. Following are the steps to perform face detection:
First, let the user pick a photo, offering the camera when it is available:

```swift
let picker = UIImagePickerController()
picker.delegate = self

let alert = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)
if UIImagePickerController.isSourceTypeAvailable(.camera) {
    alert.addAction(UIAlertAction(title: "Camera", style: .default, handler: { action in
        picker.sourceType = .camera
        self.present(picker, animated: true, completion: nil)
    }))
}
alert.addAction(UIAlertAction(title: "Photo Library", style: .default, handler: { action in
    picker.sourceType = .photoLibrary
    self.present(picker, animated: true, completion: nil)
}))
alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
alert.popoverPresentationController?.sourceRect = self.view.frame
self.present(alert, animated: true, completion: nil)
```
Next, implement the picker delegate to grab the chosen image:

```swift
extension ViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
        dismiss(animated: true, completion: nil)
        if let pickedImage = info[.originalImage] as? UIImage {
            self.image = pickedImage
        }
    }
}
```
With the image in hand, run Core Image's face detector over it and show the result:

```swift
let imageOptions = [CIDetectorImageOrientation: NSNumber(value: 5)]
let personciImage = CIImage(cgImage: image.cgImage!)
let accuracy = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: accuracy)
let faces = faceDetector?.features(in: personciImage, options: imageOptions)

if let face = faces?.first as? CIFaceFeature {
    print("found bounds are \(face.bounds)")

    let alert = UIAlertController(title: "Say Cheese!",
                                  message: "We detected a face!",
                                  preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
    self.present(alert, animated: true, completion: nil)
} else {
    let alert = UIAlertController(title: "No Face!",
                                  message: "No face was detected",
                                  preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
    self.present(alert, animated: true, completion: nil)
}
```
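One gotcha worth noting: `CIFaceFeature.bounds` is reported in Core Image's coordinate space, whose origin is the bottom-left corner, while UIKit measures from the top-left. If you want to draw a box over the detected face, you need to flip the rect first. Here is a minimal sketch of that conversion; `convertToUIKit` is a hypothetical helper name, and it assumes the rect and the image share the same pixel space (scaling for a resized image view is left out):

```swift
import Foundation

// Flip a Core Image rect (origin at bottom-left) into UIKit
// coordinates (origin at top-left) for an image of the given height.
func convertToUIKit(_ rect: CGRect, imageHeight: CGFloat) -> CGRect {
    var converted = rect
    // Mirror the y origin around the image height, accounting for the rect's own height.
    converted.origin.y = imageHeight - rect.origin.y - rect.height
    return converted
}
```

You would pass `face.bounds` and the image's pixel height, then apply any view scaling on top of the result.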
Beyond the bounding box, `CIFaceFeature` also reports individual facial features:

```swift
if face.hasSmile {
    print("face is smiling")
}
if face.hasLeftEyePosition {
    print("Left eye position is \(face.leftEyePosition)")
}
if face.hasRightEyePosition {
    print("Right eye position is \(face.rightEyePosition)")
}
```
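On iOS 11 and later, the same detection can also be done with the Vision framework via `VNDetectFaceRectanglesRequest`. This is a sketch of the Vision equivalent, not the code used above; `detectFaces` is a hypothetical helper name:

```swift
import UIKit
import Vision

// Runs Vision face detection on a UIImage and calls back with any observations.
func detectFaces(in image: UIImage, completion: @escaping ([VNFaceObservation]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }
    let request = VNDetectFaceRectanglesRequest { request, error in
        completion(request.results as? [VNFaceObservation] ?? [])
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    // Vision requests can be expensive, so run them off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```

Note that `VNFaceObservation.boundingBox` is normalized to 0...1 relative to the image, so you multiply by the image's dimensions before drawing.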
I hope you enjoyed this. Please let me know in the comments how it went and whether there is anything that could be improved. Thanks for tuning in once again!