
ios - Swift: Get the TrueDepth camera parameters for face tracking in ARKit

My goal:

I am trying to get the TrueDepth camera parameters (such as the intrinsics, extrinsics, lens distortion, etc.) while I am doing face tracking. I have read that there are examples of doing this with OpenCV, and that it is possible; I am just wondering how one should achieve similar goals in Swift.

What I have read and tried:

I have read the Apple documentation about ARCamera's intrinsics and AVCameraCalibrationData's extrinsicMatrix and intrinsicMatrix.

However, all I found were the declarations for both AVCameraCalibrationData and ARCamera:


For AVCameraCalibrationData:

For intrinsicMatrix:

var intrinsicMatrix: matrix_float3x3 { get }

For extrinsicMatrix:

var extrinsicMatrix: matrix_float4x3 { get }

I also read this post, "get Camera Calibration Data on iOS", and tried Bourne's suggestion:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    let ex = photo.depthData?.cameraCalibrationData?.extrinsicMatrix
    //let ex = photo.cameraCalibrationData?.extrinsicMatrix
    let int = photo.cameraCalibrationData?.intrinsicMatrix
    _ = photo.depthData?.cameraCalibrationData?.lensDistortionCenter
    print("ExtrinsicM: \(String(describing: ex))")
    print("isCameraCalibrationDataDeliverySupported: \(output.isCameraCalibrationDataDeliverySupported)")
}

But it does not print the matrices at all.


For ARCamera, I have read Andy Fedoroff's answer "Focal Length of the camera used in RealityKit":

var intrinsics: simd_float3x3 { get }

func inst() {
    DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
        print("Focal Length: \(String(describing: self.sceneView.pointOfView?.camera?.focalLength))")
        print("Sensor Height: \(String(describing: self.sceneView.pointOfView?.camera?.sensorHeight))")
        // SENSOR HEIGHT IN mm
        let frame = self.sceneView.session.currentFrame
        // INTRINSICS MATRIX
        print("Intrinsics fx: \(String(describing: frame?.camera.intrinsics.columns.0.x))")
        print("Intrinsics fy: \(String(describing: frame?.camera.intrinsics.columns.1.y))")
        print("Intrinsics ox: \(String(describing: frame?.camera.intrinsics.columns.2.x))")
        print("Intrinsics oy: \(String(describing: frame?.camera.intrinsics.columns.2.y))")
    }
}

It shows the render camera parameters:

Focal Length: Optional(20.784610748291016)
Sensor Height: Optional(24.0)
Intrinsics fx: Optional(1277.3052)
Intrinsics fy: Optional(1277.3052)
Intrinsics ox: Optional(720.29443)
Intrinsics oy: Optional(539.8974)

However, this only shows the render camera's parameters instead of the TrueDepth camera's, which is what I am using for face tracking.


So can anyone help me get started with getting the TrueDepth camera parameters? The documentation does not really show any examples beyond the declarations.

Thank you so much!


1 Reply


The reason why you cannot print the intrinsics is probably that you got nil in the optional chaining. You should have a look at Apple's remarks on AVCapturePhoto's cameraCalibrationData and depthData properties:

Camera calibration data is present only if you specified the isCameraCalibrationDataDeliveryEnabled and isDualCameraDualPhotoDeliveryEnabled settings when requesting capture. For camera calibration data in a capture that includes depth data, see the AVDepthData cameraCalibrationData property.

To request capture of depth data alongside a photo (on supported devices), set the isDepthDataDeliveryEnabled property of your photo settings object to true when requesting photo capture. If you did not request depth data delivery, this property's value is nil.

So if you want to get the intrinsicMatrix and extrinsicMatrix of the TrueDepth camera, you should use builtInTrueDepthCamera as the input device, set isDepthDataDeliveryEnabled to true on the pipeline's photo output, and set isDepthDataDeliveryEnabled to true in the photo settings when you capture the photo. Then you can access the intrinsic and extrinsic matrices in the photoOutput(_:didFinishProcessingPhoto:error:) callback through the depthData.cameraCalibrationData property of the photo argument.

Here's a minimal sketch of how such a pipeline could be set up (class and method names are illustrative; error handling, camera permissions, and preview are omitted):

