UPDATE: There seems to be a problem with LegoCV itself; it cannot even create an OCVMat from a simple UIImage:
let image = UIImage(named: "myImage")
let mat = OCVMat(image: image!)
I am trying to convert a CVPixelBuffer (from the camera's video output) into an OpenCV Mat (OCVMat).
I am adding OpenCV to my iOS Swift project with the following framework: https://github.com/Legoless/LegoCV. It wraps OpenCV native C++ classes into lightweight Objective-C classes, which are then natively bridged to Swift.
I implemented AVCaptureVideoDataOutputSampleBufferDelegate on my camera class to receive the camera frame buffers.
I set up the required video input and started the session (a sketch of that part of the setup follows the output-configuration code below). The camera works fine and frames are arriving, but when I try to create an OCVMat from the CVPixelBuffer (OCVMat(pixelBuffer: imageBuffer)), the app crashes with the following error:
opencv(1934,0x16f087000) malloc: *** error for object 0x16f086108: pointer being freed was not allocated
*** set a breakpoint in malloc_error_break to debug
The relevant Objective-C class: https://github.com/Legoless/LegoCV/blob/master/LegoCV/LegoCV/Wrapper/Core/Mat/OCVMatDataAllocator.mm
Some of the Swift code:
fileprivate func configureVideoOutput() {
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "sample buffer"))
    if self.session.canAddOutput(videoOutput) {
        print("canAddOutput yes")
        self.session.addOutput(videoOutput)
        print("canAddOutput yes added")
    } else {
        print("canAddOutput no")
    }
}
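The video-input and session-start part mentioned above is not shown in the post; a minimal sketch (mine, not from the question) might look like this, assuming the same class with the session property used in configureVideoOutput and with AVFoundation imported; the device type and preset are assumptions:

fileprivate func configureVideoInput() {
    // Back wide-angle camera; device choice and preset are assumptions.
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
          let input = try? AVCaptureDeviceInput(device: camera) else {
        print("could not create camera input")
        return
    }
    if self.session.canAddInput(input) {
        self.session.addInput(input)
    }
    self.session.sessionPreset = .high
    self.session.startRunning()
}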
private func matFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> OCVMat? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    // This is the call that crashes with the malloc error shown above.
    return OCVMat(pixelBuffer: imageBuffer)
}
public func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    count = count + 1
    print("Got a frame # \(count)")
    guard let mat = matFromSampleBuffer(sampleBuffer: sampleBuffer) else { return }
    // mat would be processed here
}
Best Answer (recommended)
Solution: don't use the wrapper. It is better to write the OpenCV C++ code yourself in a .mm (Objective-C++) file, mixing it with Objective-C.
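For illustration only, a minimal sketch of such a .mm helper could look like the following. It is not from the original answer: the file name and function name are hypothetical, and it assumes the video output is configured to deliver kCVPixelFormatType_32BGRA frames (4 bytes per pixel):

// MatConverter.mm -- hypothetical Objective-C++ helper
#include <opencv2/core.hpp>
#import <CoreVideo/CoreVideo.h>

// Deep-copies a BGRA CVPixelBuffer into a cv::Mat so the Mat still owns
// valid memory after the buffer is unlocked.
cv::Mat matFromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // Wrap the pixel data without copying, then clone so the result owns its memory.
    cv::Mat wrapped((int)height, (int)width, CV_8UC4, baseAddress, bytesPerRow);
    cv::Mat copy = wrapped.clone();

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return copy;
}

Since Swift cannot call C++ directly, a function like this is usually exposed through a small Objective-C wrapper class (declared in a header added to the bridging header) that takes the CVPixelBuffer and returns whatever Swift-friendly result the app needs.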
For this question about converting a CVPixelBuffer to a Mat (OpenCV) on iOS, there is a similar question on Stack Overflow: https://stackoverflow.com/questions/48540004/