AVFoundation tap to focus feedback rectangle

I am developing an iPhone application where I use AVFoundation directly to capture video via the camera.

I've implemented a tap-to-focus feature for the user:

- (void)focus:(CGPoint)aPoint
{
#if HAS_AVFF
    Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
    if (captureDeviceClass != nil) {        
        AVCaptureDevice *device = [captureDeviceClass defaultDeviceWithMediaType:AVMediaTypeVideo];
        if([device isFocusPointOfInterestSupported] &&
           [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            double screenWidth = screenRect.size.width;
            double screenHeight = screenRect.size.height;
            double focus_x = aPoint.x/screenWidth;
            double focus_y = aPoint.y/screenHeight;
            if([device lockForConfiguration:nil]) {
                [device setFocusPointOfInterest:CGPointMake(focus_x,focus_y)];
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
                if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose]){
                    [device setExposureMode:AVCaptureExposureModeAutoExpose];
                }
                [device unlockForConfiguration];
            }
        }
    }
#endif
}

So far so good, but I am missing the feedback rectangle like in the Photos app. Is there any way to tell AVFoundation to show this feedback rectangle, or do I have to implement it myself?

Here's what I did: this is the class that creates the square shown when the user taps on the camera overlay.

CameraFocusSquare.h

#import <UIKit/UIKit.h>
@interface CameraFocusSquare : UIView
@end


CameraFocusSquare.m

#import "CameraFocusSquare.h"
#import <QuartzCore/QuartzCore.h>

const float squareLength = 80.0f;
@implementation CameraFocusSquare

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code

        [self setBackgroundColor:[UIColor clearColor]];
        [self.layer setBorderWidth:2.0];
        [self.layer setCornerRadius:4.0];
        [self.layer setBorderColor:[UIColor whiteColor].CGColor];

        CABasicAnimation* selectionAnimation = [CABasicAnimation
                                                animationWithKeyPath:@"borderColor"];
        selectionAnimation.toValue = (id)[UIColor blueColor].CGColor;
        selectionAnimation.repeatCount = 8;
        [self.layer addAnimation:selectionAnimation
                          forKey:@"selectionAnimation"];

    }
    return self;
}
@end

And in the view where you receive your taps, do the following:

- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchPoint = [touch locationInView:touch.view];
    [self focus:touchPoint];

    if (camFocus)
    {
        [camFocus removeFromSuperview];
    }
    if ([[touch view] isKindOfClass:[FBKVideoRecorderView class]])
    {
        camFocus = [[CameraFocusSquare alloc]initWithFrame:CGRectMake(touchPoint.x-40, touchPoint.y-40, 80, 80)];
        [camFocus setBackgroundColor:[UIColor clearColor]];
        [self addSubview:camFocus];
        [camFocus setNeedsDisplay];

        [UIView beginAnimations:nil context:NULL];
        [UIView setAnimationDuration:1.5];
        [camFocus setAlpha:0.0];
        [UIView commitAnimations];
    }
}

- (void)focus:(CGPoint)aPoint
{
    Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
    if (captureDeviceClass != nil) {
        AVCaptureDevice *device = [captureDeviceClass defaultDeviceWithMediaType:AVMediaTypeVideo];
        if([device isFocusPointOfInterestSupported] &&
           [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            double screenWidth = screenRect.size.width;
            double screenHeight = screenRect.size.height;
            double focus_x = aPoint.x/screenWidth;
            double focus_y = aPoint.y/screenHeight;
            if([device lockForConfiguration:nil]) {
                [device setFocusPointOfInterest:CGPointMake(focus_x,focus_y)];
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
                if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose]){
                    [device setExposureMode:AVCaptureExposureModeAutoExpose];
                }
                [device unlockForConfiguration];
            }
        }
    }
}

Adding to Anil's brilliant answer: instead of doing the calculations yourself, have a look at AVCaptureVideoPreviewLayer's captureDevicePointOfInterestForPoint:. It will give you a much more consistent focus point (available from iOS 6 onward).

- (void)focus:(CGPoint)aPoint
{
    Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
    if (captureDeviceClass != nil) {
        AVCaptureDevice *device = [captureDeviceClass defaultDeviceWithMediaType:AVMediaTypeVideo];
        if([device isFocusPointOfInterestSupported] &&
           [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {

            CGPoint focusPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:aPoint];
            if([device lockForConfiguration:nil]) {
                [device setFocusPointOfInterest:CGPointMake(focusPoint.x,focusPoint.y)];
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
                if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose]){
                    [device setExposureMode:AVCaptureExposureModeAutoExpose];
                }
                [device unlockForConfiguration];
            }
        }
    }
}

The documentation is available here: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureVideoPreviewLayer_Class/index.html#//apple_ref/occ/instm/AVCaptureVideoPreviewLayer/captureDevicePointOfInterestForPoint:
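
In Swift, the same conversion is exposed as captureDevicePointConverted(fromLayerPoint:) on AVCaptureVideoPreviewLayer. A minimal sketch, assuming previewLayer is your AVCaptureVideoPreviewLayer and device is the active AVCaptureDevice (both names are placeholders for your own properties):

import AVFoundation
import UIKit

// Convert a tap in the preview layer's coordinate space to the normalized
// (0...1, 0...1) point-of-interest space of the capture device. Unlike dividing
// by screen dimensions, this respects the layer's frame, videoGravity and orientation.
func focus(at layerPoint: CGPoint,
           previewLayer: AVCaptureVideoPreviewLayer,
           device: AVCaptureDevice) {
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)

    guard device.isFocusPointOfInterestSupported,
          device.isFocusModeSupported(.autoFocus) else { return }

    do {
        try device.lockForConfiguration()
        device.focusPointOfInterest = devicePoint
        device.focusMode = .autoFocus
        if device.isExposurePointOfInterestSupported,
           device.isExposureModeSupported(.autoExpose) {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}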

Swift implementation:

CameraFocusSquare view:

class CameraFocusSquare: UIView, CAAnimationDelegate {

    internal let kSelectionAnimation:String = "selectionAnimation"

    fileprivate var _selectionBlink: CABasicAnimation?

    convenience init(touchPoint: CGPoint) {
        self.init()
        self.updatePoint(touchPoint)
        self.backgroundColor = UIColor.clear
        self.layer.borderWidth = 2.0
        self.layer.borderColor = UIColor.orange.cgColor
        initBlink()
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
    }

    fileprivate func initBlink() {
        // create the blink animation
        self._selectionBlink = CABasicAnimation(keyPath: "borderColor")
        self._selectionBlink!.toValue = (UIColor.white.cgColor as AnyObject)
        self._selectionBlink!.repeatCount = 3
        // number of blinks
        self._selectionBlink!.duration = 0.4
        // this is duration per blink
        self._selectionBlink!.delegate = self
    }



    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    /**
     Updates the location of the view based on the incoming touchPoint.
     */

    func updatePoint(_ touchPoint: CGPoint) {
        let squareWidth: CGFloat = 100
        let frame: CGRect = CGRect(x: touchPoint.x - squareWidth / 2, y: touchPoint.y - squareWidth / 2, width: squareWidth, height: squareWidth)
        self.frame = frame
    }
    /**
     This unhides the view and initiates the animation by adding it to the layer.
     */

    func animateFocusingAction() {

        if let blink = _selectionBlink {
            // make the view visible
            self.alpha = 1.0
            self.isHidden = false
            // initiate the animation
            self.layer.add(blink, forKey: kSelectionAnimation)
        }

    }
    /**
     Hides the view after the animation stops. Since the animation is automatically removed, we don't need to do anything else here.
     */

    public func animationDidStop(_ anim: CAAnimation, finished flag: Bool){
        if flag{
            // hide the view
            self.alpha = 0.0
            self.isHidden = true
        }
    }

}

Gesture action:

open func tapToFocus(_ gesture: UILongPressGestureRecognizer) {

    if (gesture.state == UIGestureRecognizerState.began) {

        let touchPoint:CGPoint = gesture.location(in: self.previewView)

        if let fsquare = self.focusSquare {
            fsquare.updatePoint(touchPoint)
        }else{
            self.focusSquare = CameraFocusSquare(touchPoint: touchPoint)
            self.previewView.addSubview(self.focusSquare!)
            self.focusSquare?.setNeedsDisplay()
        }

        self.focusSquare?.animateFocusingAction()

        let convertedPoint:CGPoint = self.previewLayer!.captureDevicePointOfInterest(for: touchPoint)

        let currentDevice:AVCaptureDevice = self.videoDeviceInput!.device

        if currentDevice.isFocusPointOfInterestSupported && currentDevice.isFocusModeSupported(AVCaptureFocusMode.autoFocus){

            do {

                try currentDevice.lockForConfiguration()
                currentDevice.focusPointOfInterest = convertedPoint
                currentDevice.focusMode = AVCaptureFocusMode.autoFocus

                if currentDevice.isExposureModeSupported(AVCaptureExposureMode.continuousAutoExposure){
                    currentDevice.exposureMode = AVCaptureExposureMode.continuousAutoExposure
                }
                currentDevice.isSubjectAreaChangeMonitoringEnabled = true
                currentDevice.unlockForConfiguration()

            } catch {

            }
        }
    }
}
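
To wire this up, attach a long-press recognizer to the preview view. A sketch under the assumption that previewView is the view hosting the preview layer (in Swift 4 and later the handler above also needs to be marked @objc for #selector to resolve):

// In the camera view controller's setup, attach a long-press recognizer
// that drives tapToFocus(_:) above.
override func viewDidLoad() {
    super.viewDidLoad()
    let pressRecognizer = UILongPressGestureRecognizer(target: self,
                                                       action: #selector(tapToFocus(_:)))
    pressRecognizer.minimumPressDuration = 0.1   // fires almost immediately, like a tap
    previewView.addGestureRecognizer(pressRecognizer)
}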

@Anil's answer is a great start, but it didn't work for me. I wanted the user to be able to keep selecting a focus point, rather than doing so only once (which is what his solution does). Thanks to @Anil for pointing me in the right direction.

There are some differences with my solution.

  1. I wanted to be able to reuse the focus square and animation, rather than creating them for a single use.
  2. I wanted the animation to disappear after it completed (something I couldn't get @Anil's solution to do).
  3. Instead of using initWithFrame:, I implemented my own initWithTouchPoint:.
  4. I have a method for specifically animating the focus action.
  5. I also have a method for updating the location of the frame.
  6. The size of the frame is within CameraFocusSquare, which means that it's easier to find and update the size as needed.

CameraFocusSquare.h

@import UIKit;

@interface CameraFocusSquare : UIView

- (instancetype)initWithTouchPoint:(CGPoint)touchPoint;
- (void)updatePoint:(CGPoint)touchPoint;
- (void)animateFocusingAction;

@end

CameraFocusSquare.m

#import "CameraFocusSquare.h"

@implementation CameraFocusSquare {
    CABasicAnimation *_selectionBlink;
}

/**
 This is the init method for the square. It sets the frame for the view and sets border parameters. It also creates the blink animation.
 */
- (instancetype)initWithTouchPoint:(CGPoint)touchPoint {
    self = [self init];
    if (self) {
        [self updatePoint:touchPoint];
        self.backgroundColor = [UIColor clearColor];
        self.layer.borderWidth = 2.0f;
        self.layer.borderColor = [UIColor orangeColor].CGColor;

        // create the blink animation
        _selectionBlink = [CABasicAnimation
                animationWithKeyPath:@"borderColor"];
        _selectionBlink.toValue = (id)[UIColor whiteColor].CGColor;
        _selectionBlink.repeatCount = 3;  // number of blinks
        _selectionBlink.duration = 0.4;  // this is duration per blink
        _selectionBlink.delegate = self;
    }
    return self;
}

/**
 Updates the location of the view based on the incoming touchPoint.
 */
- (void)updatePoint:(CGPoint)touchPoint {
    CGFloat squareWidth = 50;
    CGRect frame = CGRectMake(touchPoint.x - squareWidth/2, touchPoint.y - squareWidth/2, squareWidth, squareWidth);
    self.frame = frame;
}

/**
 This unhides the view and initiates the animation by adding it to the layer.
 */
- (void)animateFocusingAction {
    // make the view visible
    self.alpha = 1.0f;
    self.hidden = NO;
    // initiate the animation
    [self.layer addAnimation:_selectionBlink forKey:@"selectionAnimation"];
}

/**
 Hides the view after the animation stops. Since the animation is automatically removed, we don't need to do anything else here.
 */
- (void)animationDidStop:(CAAnimation *)animation finished:(BOOL)flag {
    // hide the view
    self.alpha = 0.0f;
    self.hidden = YES;
}

@end

I initiate all of this on top of a view. This allows me greater flexibility and separates my UI code from my controller code (think MVC).

PreviewView.h

@import UIKit;

@interface PreviewView : UIView

- (IBAction)tapToFocus:(UITapGestureRecognizer *)gestureRecognizer;

@end

PreviewView.m

#import "PreviewView.h"
#import "CameraFocusSquare.h"

@implementation PreviewView {
    CameraFocusSquare *_focusSquare;
}

- (IBAction)tapToFocus:(UITapGestureRecognizer *)gestureRecognizer {
    CGPoint touchPoint = [gestureRecognizer locationOfTouch:0 inView:self];
    if (!_focusSquare) {
        _focusSquare = [[CameraFocusSquare alloc] initWithTouchPoint:touchPoint];
        [self addSubview:_focusSquare];
        [_focusSquare setNeedsDisplay];
    }
    else {
        [_focusSquare updatePoint:touchPoint];
    }
    [_focusSquare animateFocusingAction];
}

@end

Finally, in my UIViewController subclass, I have my UITapGestureRecognizer created and attached to the view. I also implement my tap-to-focus code here.

CameraViewController.m

- (void)viewDidLoad {
    [super viewDidLoad];
    // do other initialization stuff here

    // create the tap-to-focus gesture
    UITapGestureRecognizer *tapToFocusRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapToFocus:)];
    tapToFocusRecognizer.numberOfTapsRequired = 1;
    tapToFocusRecognizer.numberOfTouchesRequired = 1;
    [self.previewView addGestureRecognizer:tapToFocusRecognizer];
}

- (IBAction)tapToFocus:(UITapGestureRecognizer *)tapGestureRecognizer {
    if (!_captureDevice) {
        return;
    }
    if (![_captureDevice isFocusPointOfInterestSupported]) {
        return;
    }
    if (![_captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        return;
    }
    [self.previewView tapToFocus:tapGestureRecognizer];
    NSError *error = nil;
    if (![_captureDevice lockForConfiguration:&error]) {
        NSLog(@"Error trying to lock configuration of camera. %@", [error localizedDescription]);
        return;
    }
    CGPoint touchPoint = [tapGestureRecognizer locationOfTouch:0 inView:self.previewView];
    // focusPointOfInterest expects normalized coordinates in the range (0,0) to (1,1)
    CGFloat touchX = touchPoint.x / self.previewView.frame.size.width;
    CGFloat touchY = touchPoint.y / self.previewView.frame.size.height;

    // Set the point of interest before the mode so the focus/exposure operation runs at the new point.
    _captureDevice.focusPointOfInterest = CGPointMake(touchX, touchY);
    _captureDevice.focusMode = AVCaptureFocusModeAutoFocus;
    if ([_captureDevice isExposurePointOfInterestSupported]) {
        _captureDevice.exposurePointOfInterest = CGPointMake(touchX, touchY);
    }
    if ([_captureDevice isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
        _captureDevice.exposureMode = AVCaptureExposureModeAutoExpose;
    }
    [_captureDevice unlockForConfiguration];
}

Hope this helps people so they can move onto more important code!

Here's a basic Swift view that will show an animated focus square. Just add it to your camera view and hook it up to the focus callback from a tap gesture recognizer.

@objc func didTapToFocus(gesture: UITapGestureRecognizer) {
    let pointInViewCoordinates = gesture.location(in: gesture.view)
    let pointInCameraCoordinates = cameraView.videoPreviewLayer.captureDevicePointConverted(fromLayerPoint: pointInViewCoordinates)
    camera.focusOn(pointInCameraCoordinates: pointInCameraCoordinates)
    cameraView.showFocusBox(at: pointInViewCoordinates)
}

Focus view:

final class CameraFocusBoxView: UIView {

    // MARK: - Instantiation

    init() {
        super.init(frame: .zero)

        backgroundColor = .clear
        layer.addSublayer(focusBoxLayer)
    }

    // MARK: - API

    /// This zooms/fades in a focus square and blinks it a few times, then slowly fades it out
    func showBox(at point: CGPoint) {
        focusBoxLayer.removeAllAnimations()
        let scaleKey = "zoom in focus box"
        let fadeInKey = "fade in focus box"
        let pulseKey = "pulse focus box"
        let fadeOutKey = "fade out focus box"
        guard focusBoxLayer.animation(forKey: scaleKey) == nil,
              focusBoxLayer.animation(forKey: fadeInKey) == nil,
              focusBoxLayer.animation(forKey: pulseKey) == nil,
              focusBoxLayer.animation(forKey: fadeOutKey) == nil
            else { return }

        CATransaction.begin()
        CATransaction.setDisableActions(true)
        focusBoxLayer.position = point
        CATransaction.commit()

        let scale = CABasicAnimation(keyPath: "transform.scale")
        scale.fromValue = 1
        scale.toValue = 0.375
        scale.duration = 0.3
        scale.isRemovedOnCompletion = false
        scale.fillMode = .forwards

        let opacityFadeIn = CABasicAnimation(keyPath: "opacity")
        opacityFadeIn.fromValue = 0
        opacityFadeIn.toValue = 1
        opacityFadeIn.duration = 0.3
        opacityFadeIn.isRemovedOnCompletion = false
        opacityFadeIn.fillMode = .forwards

        let pulsing = CABasicAnimation(keyPath: "borderColor")
        pulsing.toValue = UIColor(white: 1, alpha: 0.5).cgColor
        pulsing.repeatCount = 2
        pulsing.duration = 0.2
        pulsing.beginTime = CACurrentMediaTime() + 0.3 // wait for the fade in to occur

        let opacityFadeOut = CABasicAnimation(keyPath: "opacity")
        opacityFadeOut.fromValue = 1
        opacityFadeOut.toValue = 0
        opacityFadeOut.duration = 0.5
        opacityFadeOut.beginTime = CACurrentMediaTime() + 2 // seconds
        opacityFadeOut.isRemovedOnCompletion = false
        opacityFadeOut.fillMode = .forwards

        focusBoxLayer.add(scale, forKey: scaleKey)
        focusBoxLayer.add(opacityFadeIn, forKey: fadeInKey)
        focusBoxLayer.add(pulsing, forKey: pulseKey)
        focusBoxLayer.add(opacityFadeOut, forKey: fadeOutKey)
    }

    // MARK: - Private Properties

    private lazy var focusBoxLayer: CALayer = {
        let box = CALayer()
        box.bounds = CGRect(x: 0, y: 0, width: 200, height: 200)
        box.borderWidth = 2
        box.borderColor = UIColor.white.cgColor
        box.opacity = 0
        return box
    }()

    // MARK: - Unsupported Initializers

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) has not been implemented") }

}
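
The cameraView used in the gesture handler above isn't part of the answer; here is a minimal sketch of how such a container might own both the preview layer and the focus box view (the class name, the layerClass backing, and the property names are assumptions, not part of the original):

import AVFoundation
import UIKit

// Hypothetical container view: hosts the AVCaptureVideoPreviewLayer and the
// CameraFocusBoxView, and forwards showFocusBox(at:) to the box view.
final class CameraView: UIView {

    // Back this view with an AVCaptureVideoPreviewLayer so it resizes with the view.
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }

    private let focusBoxView = CameraFocusBoxView()

    override init(frame: CGRect) {
        super.init(frame: frame)
        focusBoxView.frame = bounds
        focusBoxView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        focusBoxView.isUserInteractionEnabled = false   // overlay only, let touches pass through
        addSubview(focusBoxView)
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // Forwards the tap location (already in this view's coordinate space) to the box view.
    func showFocusBox(at point: CGPoint) {
        focusBoxView.showBox(at: point)
    }
}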

The coordinate system for focus/exposure points of interest (and for capture metadata) ranges from (0,0) at the top-left to (1,1) at the bottom-right, relative to the unrotated sensor regardless of device orientation, which is another reason to convert taps through the preview layer rather than dividing by screen dimensions.

Comments
  • Unrelated to your problem, but you should be using if (captureDeviceClass != Nil)
  • Thanks for the hint. I hadn't really known about the difference until now, so I looked it up. Since I am referencing a class, you're absolutely right.
  • @Alexander, I am facing the same problem. Did you solve it?
  • Unfortunately I haven't solved this issue yet. I think there is only one way: a custom implementation. I've found no built-in solution.
  • @iqueqiorio camFocus is a variable of type CameraFocusSquare, the custom class (first snippet of code).
  • Okay, so how do I declare that? I am trying CameraFocusSquare *camFocus = [[CameraFocusSquare alloc] init]; but this is giving me an error.
  • What does it say? Did you import CameraFocusSquare.h?
  • I think you can simply say [device setFocusPointOfInterest:focusPoint]
  • I haven't tested it yet, but from looking at it I can only say that it's PERFECT!