ARKit – Get current position of ARCamera in a scene

I'm in the process of learning both ARKit and SceneKit concurrently, and it's been a bit of a challenge.

With an ARWorldTrackingSessionConfiguration session created, I was wondering if anyone knew of a way to get the position of the user's 'camera' in the scene. The idea is that I want to animate an object towards the user's current position.

let reaperScene = SCNScene(named: "reaper.dae")!
let reaperNode = reaperScene.rootNode.childNode(withName: "reaper", recursively: true)!
reaperNode.position = SCNVector3Make(0, 0, -1)
let scene = SCNScene()
scene.rootNode.addChildNode(reaperNode)

// some unknown amount of time later   
let currentCameraPosition = sceneView.pointOfView?.position
let moveAction = SCNAction.move(to: currentCameraPosition!, duration: 1.0)
reaperNode.runAction(moveAction)

However, it seems that currentCameraPosition is always [0, 0, 0], even though I am moving the camera around. Any idea what I'm doing wrong? Eventually, the idea is that I would rotate the object around an invisible sphere until it is in front of the camera and then animate it in, doing something similar to this: Rotate SCNCamera node looking at an object around an imaginary sphere (that way the user sees the object animate towards them).

Thanks for any help.

Set yourself as the ARSession.delegate. Then you can implement session(_:didUpdate:), which will give you an ARFrame for every frame processed in your session. The frame has a camera property that holds information on the camera's transform, rotation, and position.

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Do something with the new transform
    let currentTransform = frame.camera.transform
    doSomething(with: currentTransform)
}
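
The transform here is a 4×4 matrix (a simd_float4x4) whose fourth column holds the translation, i.e. the camera's position in world space. A minimal sketch of pulling the position out of it (the SCNVector3 conversion is just one convenient choice for SceneKit code):

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // The fourth column of the camera transform is its position in world space
    let translation = frame.camera.transform.columns.3
    let cameraPosition = SCNVector3(translation.x, translation.y, translation.z)
    print(cameraPosition)
}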

As rickster pointed out, you can always get the current ARFrame, and the camera position through it, by calling session.currentFrame. This is useful if you need the position just once, e.g. to move a node to where the camera has been, but you should use the delegate method if you want continuous updates on the camera's position.
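
For the one-shot case, a sketch along those lines could look like this (sceneView and reaperNode are borrowed from the question; note that currentFrame is nil until the session has produced its first frame):

if let transform = sceneView.session.currentFrame?.camera.transform {
    // Move the node to the camera's current position
    reaperNode.position = SCNVector3(transform.columns.3.x, transform.columns.3.y, transform.columns.3.z)
}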

I know it has already been solved, but I have a neat little solution for it. I would prefer adding a renderer delegate method; it's a method in ARSCNViewDelegate:

func renderer(_ renderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) {
    guard let pointOfView = sceneView.pointOfView else { return }
    let transform = pointOfView.transform
    let orientation = SCNVector3(-transform.m31, -transform.m32, transform.m33)
    let location = SCNVector3(transform.m41, transform.m42, transform.m43)
    let currentPositionOfCamera = orientation + location
    print(currentPositionOfCamera)
}

Of course, you can't add two SCNVector3 values together out of the box, so you need to add the following operator outside the class:

func +(lhv: SCNVector3, rhv: SCNVector3) -> SCNVector3 {
    return SCNVector3(lhv.x + rhv.x, lhv.y + rhv.y, lhv.z + rhv.z)
}
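
If it is only the camera's position you are after (rather than the orientation + location sum above), the comment thread below points towards reading it off the point of view directly; a minimal variant of the same callback (worldPosition requires iOS 11, which ARKit implies anyway):

func renderer(_ renderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) {
    guard let pointOfView = sceneView.pointOfView else { return }
    // presentation reflects the node's state as currently rendered on screen
    print(pointOfView.presentation.worldPosition)
}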

For your convenience, you can create a ViewController extension with an instance method session(_:didUpdate:). Here's the code:

import UIKit
import ARKit

extension ViewController: ARSessionDelegate {

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let transform = frame.camera.transform
        print("Updates: \(transform.columns.3)")      // UPDATING
    }
}

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self                     // ARVIEW DELEGATE
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
        sceneView.session.delegate = self             // ARSESSION DELEGATE
    }
}
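
For completeness, you would typically also pause the session when the view goes away; a small standard addition to the same class:

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }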

Comments
  • This works, but you can also get frames without being the session delegate by getting the session's currentFrame property.
  • @rickster you are right, and I have edited the answer; but usually you want other nodes to react to you changing your position, e.g. by following you around. Then you would want constant updates on new camera poses anyway.
  • True. Point is, depending on how your app is otherwise set up, you could either have ARKit delivering you frames at 60 fps and you rendering as you get them, or you could be running your own render loop at (up to) 60 fps and ask ARKit for a frame each time you need one. It supports both ways, so use whichever you like.
  • Great, thanks for your help! I'm using frame.camera.transform.columns.3.x ... any reason why that shouldn't work? I'm not clear on what the difference between the columns is, but 3 seems to do it for me.
  • This is not working consistently for me. I'm frequently getting (0,0,0) as xyz coordinates.
  • Why are you just taking the diagonal entries of the rotation matrix? And why is "position" equal to "orientation + location". The position is the location, i.e. the last column of the transform. The upper left 3x3 matrix contains the orientation (and possibly scale and shear). There exist different methods to extract pitch, yaw and roll from it, but the diagonal is not one of them.
  • @kjyv Where did you see that the location is the last column of the transform? Thanks for sharing!
  • @kjyv is transform.m33 for orientation supposed to be -transform.m33?
  • For orientation it's probably better to just use sceneView.pointOfView.presentation.worldOrientation. If you really need to get it from the transform, have a look e.g. here gamedev.stackexchange.com/questions/50963/… You should prefer a quaternion over ambiguous euler angles.
  • This is wrong on so many levels - the transform's orientation is contained in the full 3x3 rotation matrix, not just its diagonal entries, and the position is the location alone, i.e. the last column of the transform.