## Can I get the measurements for eyes using ARKit 2

I am working on a project which requires the size of each eye and the distance between the eyes, using ARKit and the TrueDepth camera on an iPhone X with iOS 12 beta 4 and Xcode 10 beta 4.

As I am new to ARKit, I have been searching for a solution online with only a limited understanding. I came across demo code on the Apple developer portal, and in that code I tried to fetch the required values.

I also referred to the answer by @rickster.

I worked with `ARSCNFaceGeometry` and `ARFaceAnchor` to fetch a few values.

```swift
func update(withFaceAnchor anchor: ARFaceAnchor) {
    let faceGeometry = geometry as! ARSCNFaceGeometry
    faceGeometry.update(from: anchor.geometry)
    print("\nLook at Point :", anchor.lookAtPoint, "\nLeft :", anchor.leftEyeTransform, "\nRight :", anchor.rightEyeTransform)
}
```

Output:

Look at Point : float3(0.14502259, 0.2905016, 0.97809345)

Left : simd_float4x4([[0.99312633, -0.030915117, -0.11289084, 0.0], [0.0, 0.9644885, -0.26412484, 0.0], [0.11704737, 0.26230934, 0.9578589, 0.0], [0.03311049, 0.028666705, 0.028539032, 1.0]])

Right : simd_float4x4([[0.9830616, -0.047965277, -0.17688738, 0.0], [0.0, 0.96514606, -0.26171172, 0.0], [0.18327524, 0.25727874, 0.94879806, 0.0], [-0.032072492, 0.028482603, 0.028181288, 1.0]])

Questions

1. How can I translate these values into usable coordinates?

2. Will this code give me the size of each eye and the distance between the eyes? If not, what can I do to get the measurements?
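To make sense of those raw values: a `simd_float4x4` prints its four columns, and the fourth column is the translation, i.e. the eye's position (in meters) relative to the face anchor. A minimal sketch using the left-eye numbers from the output above, modeled as plain arrays so it runs without ARKit:

```swift
// The printed simd_float4x4 lists its four COLUMNS. Modeled here as a
// plain nested array of Floats (no ARKit needed), using the left-eye
// values from the output above.
let leftEyeColumns: [[Float]] = [
    [0.99312633, -0.030915117, -0.11289084,  0.0],
    [0.0,         0.9644885,   -0.26412484,  0.0],
    [0.11704737,  0.26230934,   0.9578589,   0.0],
    [0.03311049,  0.028666705,  0.028539032, 1.0]  // translation column
]

// The fourth column (index 3) holds the eye's position in face-anchor
// space, in meters: here about 3.3 cm to one side of and 2.9 cm above
// the anchor's origin.
let position = leftEyeColumns[3]
print("x: \(position[0]) m, y: \(position[1]) m, z: \(position[2]) m")
```

On device you would read `anchor.leftEyeTransform.columns.3` directly instead of a hand-copied array.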

In ARKit 2 / Xcode 10 beta 6 / iOS 12, `lookAtPoint` comes back zeroed, but I am able to get `leftEyeTransform` and `rightEyeTransform` and can then calculate the inter-eye distance using the code below (tested on two people at varying distances and validated with a ruler!):

```swift
func update(withFaceAnchor anchor: ARFaceAnchor) {
    let faceGeometry = occlusionNode.geometry as! ARSCNFaceGeometry
    faceGeometry.update(from: anchor.geometry)
    print("\nLook at Point:", anchor.lookAtPoint, "\nLeft:", anchor.leftEyeTransform, "\nRight:", anchor.rightEyeTransform)

    // The fourth column of each transform is the eye's position in face-anchor space.
    let leftEyePosition = simd_float3(anchor.leftEyeTransform.columns.3.x, anchor.leftEyeTransform.columns.3.y, anchor.leftEyeTransform.columns.3.z)
    let rightEyePosition = simd_float3(anchor.rightEyeTransform.columns.3.x, anchor.rightEyeTransform.columns.3.y, anchor.rightEyeTransform.columns.3.z)
    let d = simd_distance(leftEyePosition, rightEyePosition)
    print("\nInter-eye distance in centimeters: ", d * 100) // the distance itself is in meters
}
```

Some sample output:

```
Look at Point: float3(0.0, 0.0, 0.0)
Left: simd_float4x4([[0.9997796, -0.0031222918, 0.020761598, 0.0], [0.0, 0.98888004, 0.14871553, 0.0], [-0.020995062, -0.14868274, 0.98866206, 0.0], [0.031608228, 0.031428084, 0.026632207, 1.0]])
Right: simd_float4x4([[0.99887705, 0.007054036, -0.04684897, 0.0], [0.0, 0.9888536, 0.14889139, 0.0], [0.047377057, -0.1487242, 0.98774314, 0.0], [-0.031623494, 0.03171173, 0.02669965, 1.0]])

Inter-eye distance in centimeters:  6.32324
```


You can get an `SCNVector3` from each transform:

```swift
let leftEyePosition = SCNVector3(leftTransform.columns.3.x, leftTransform.columns.3.y, leftTransform.columns.3.z)
let rightEyePosition = SCNVector3(rightTransform.columns.3.x, rightTransform.columns.3.y, rightTransform.columns.3.z)
```

Then you can get the distance between the two `SCNVector3` values. There is no built-in conversion from `SCNVector3` to `simd_float3`, so pass the components through:

```swift
let d = simd_distance(simd_float3(leftEyePosition.x, leftEyePosition.y, leftEyePosition.z),
                      simd_float3(rightEyePosition.x, rightEyePosition.y, rightEyePosition.z))
```

The result is in meters, so multiply `d` by 100 to get centimeters.
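Since the transforms are in meters, the conversion is a multiplication. A quick self-contained check of the distance math, using rounded translation values like those printed earlier (the positions here are hypothetical stand-ins):

```swift
import Foundation

// Hypothetical eye positions (meters, face-anchor space), rounded from
// the translation columns of the two transforms printed earlier.
let left:  (x: Float, y: Float, z: Float) = ( 0.0316, 0.0314, 0.0266)
let right: (x: Float, y: Float, z: Float) = (-0.0316, 0.0317, 0.0267)

// Euclidean distance, written out so no ARKit/simd import is needed.
let dx = left.x - right.x, dy = left.y - right.y, dz = left.z - right.z
let meters = (dx * dx + dy * dy + dz * dz).squareRoot()
let centimeters = meters * 100   // multiply, not divide: 1 m = 100 cm
print(String(format: "%.2f cm", centimeters))  // prints "6.32 cm"
```

The ~6.3 cm result matches the sample output above and is in the normal range for human inter-pupillary distance.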


A one-line solution for pupil distance:

```swift
simd_distance(faceAnchor.rightEyeTransform.columns.3, faceAnchor.leftEyeTransform.columns.3)
```

Multiply it by 100 to get the answer in centimeters. As mentioned above, there is no simple way to measure iris or pupil size; for that you need to develop an algorithm that does the following:

1) Find circles in the camera image and filter for the one that describes an iris.

2) Use ARKit to calculate a pixel-to-millimeter scale and use it to calculate the iris size.
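Step 2 can be sketched without any ARKit calls: given the inter-eye distance ARKit reports and the pixel distance between the detected pupil centers, you get a millimeter-per-pixel scale and can size the detected iris circle. All pixel values below are hypothetical stand-ins for what a circle detector in step 1 might return:

```swift
import Foundation

// Sketch of step 2: derive a pixel-to-millimeter scale from ARKit's
// inter-eye distance, then size a detected iris. The pixel coordinates
// and radius below are hypothetical detector output.
let interEyeDistanceMeters: Float = 0.0632            // from the eye transforms
let leftPupilPx:  (x: Float, y: Float) = (412, 630)   // hypothetical
let rightPupilPx: (x: Float, y: Float) = (812, 628)   // hypothetical
let irisRadiusPx: Float = 38                          // hypothetical

// Pixel distance between the two pupil centers.
let dx = leftPupilPx.x - rightPupilPx.x
let dy = leftPupilPx.y - rightPupilPx.y
let interEyeDistancePx = (dx * dx + dy * dy).squareRoot()

// Millimeters per pixel, assuming both eyes lie at roughly the same
// depth from the camera (a reasonable approximation for a frontal face).
let mmPerPx = (interEyeDistanceMeters * 1000) / interEyeDistancePx
let irisDiameterMm = 2 * irisRadiusPx * mmPerPx
print(String(format: "iris diameter ≈ %.1f mm", irisDiameterMm))
```

With these made-up numbers the sketch lands near 12 mm, close to the ~11.7 mm average human iris diameter, which is a useful sanity check for a real detector.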


There is no simple way of actually getting the size of the pupils themselves. See this Stack Overflow thread and the answer provided by rickster for more info: Getting The Sizes Of The Eyes In ARKit

Getting the estimated distance between the eyes, however, is possible using `leftEyeTransform` and `rightEyeTransform`, which are simply:

transform matrices indicating the position and orientation of the face's left and right eyes.

What you are interested in is `columns.3` of each transform (the fourth, translation column), which contains the position data we need.

A simple way to calculate the distance is to use the GLK helpers, which don't require any additional extensions.

As such, an example might look something like this:

```swift
//----------------------
//MARK: ARSCNViewDelegate
//----------------------

extension ViewController: ARSCNViewDelegate {

    /// Creates A GLKVector3 From A simd_float4
    ///
    /// - Parameter transform: simd_float4
    /// - Returns: GLKVector3
    func glkVector3FromARFaceAnchorTransform(_ transform: simd_float4) -> GLKVector3 {
        return GLKVector3Make(transform.x, transform.y, transform.z)
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARFaceAnchor
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        //2. Get The Position Of The Left & Right Eyes
        let leftEyePosition = glkVector3FromARFaceAnchorTransform(faceAnchor.leftEyeTransform.columns.3)
        let rightEyePosition = glkVector3FromARFaceAnchorTransform(faceAnchor.rightEyeTransform.columns.3)

        //3. Calculate The Distance Between Them (In Meters), Then Convert To Centimeters
        let distanceBetweenEyesInMetres = GLKVector3Distance(leftEyePosition, rightEyePosition)
        let distanceBetweenEyesInCM = distanceBetweenEyesInMetres * 100
        print("The Distance Between The Eyes Is Approximately \(distanceBetweenEyesInCM)cm")
    }

}
```

Hope it helps...
