I'm using AVCaptureVideoDataOutput and want to convert a CMSampleBufferRef to a UIImage. Many of the answers are the same, for example:
UIImage created from CMSampleBufferRef not displayed in UIImageView? and
AVCaptureSession with multiple previews
It works fine if I set the VideoDataOutput color space to BGRA (credit to this answer: CGBitmapContextCreateImage error):
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [dataOutput setVideoSettings:videoSettings];
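For reference, here is a minimal sketch of how that snippet fits into the capture setup; the session, queue name, and delegate are assumptions about the usual wiring, not something from the answers above:

    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setVideoSettings:videoSettings];
    // Deliver frames on a dedicated serial queue to an object implementing
    // AVCaptureVideoDataOutputSampleBufferDelegate
    dispatch_queue_t queue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
    [dataOutput setSampleBufferDelegate:self queue:queue];
    if ([session canAddOutput:dataOutput]) {
        [session addOutput:dataOutput];
    }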
Without the video settings above, I receive the following error:
    CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
    <Error>: CGBitmapContextCreateImage: invalid context 0x0
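The error makes sense once you check what the buffer actually carries: without those settings the frames arrive as biplanar YUV, not packed 32-bit BGRA, so the bytes-per-row and component layout don't match what CGBitmapContextCreate expects. A quick diagnostic sketch for confirming the format of a buffer (prints its FourCC):

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    OSType format = CVPixelBufferGetPixelFormatType(imageBuffer);
    // Prints e.g. '420v'/'420f' for the biplanar YCbCr formats, 'BGRA' for 32BGRA
    NSLog(@"pixel format: %c%c%c%c",
          (char)((format >> 24) & 0xFF), (char)((format >> 16) & 0xFF),
          (char)((format >> 8) & 0xFF), (char)(format & 0xFF));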
But using BGRA is not a good choice, because of the conversion overhead from YUV (the default AVCaptureSession color space) to BGRA, as Brad and Codo explain in How to get the Y component from CMSampleBuffer resulted from the AVCaptureSession?
So: is there a way to convert a CMSampleBufferRef to a UIImage while staying in the YUV color space?
Solution
After a lot of research, and reading Apple's documentation and Wikipedia, I figured out the answer, and it works perfectly for me. So for future readers, I'm sharing the code to convert a CMSampleBufferRef to a UIImage when the video pixel type is set to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange.
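First the output has to actually request that format; a minimal sketch, mirroring the BGRA snippet from the question (dataOutput is assumed to be your AVCaptureVideoDataOutput):

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [dataOutput setVideoSettings:videoSettings];

With that in place, here is the conversion method. Note that it draws only the Y (luma) plane, so the resulting UIImage is grayscale: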
    // Create a UIImage from sample buffer data
    // Works only if pixel format is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    - (UIImage *)imageFromSamplePlanerPixelBuffer:(CMSampleBufferRef)sampleBuffer {
        @autoreleasepool {
            // Get a CMSampleBuffer's Core Video image buffer for the media data
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            // Lock the base address of the pixel buffer
            CVPixelBufferLockBaseAddress(imageBuffer, 0);

            // Get the base address of the Y (luma) plane
            void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
            // Get the number of bytes per row for the Y plane
            size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
            // Get the pixel buffer width and height
            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);

            // Create a device-dependent gray color space
            CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
            // Create a bitmap graphics context with the Y-plane data
            CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                         bytesPerRow, colorSpace, kCGImageAlphaNone);
            // Create a Quartz image from the pixel data in the bitmap graphics context
            CGImageRef quartzImage = CGBitmapContextCreateImage(context);
            // Unlock the pixel buffer
            CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

            // Free up the context and color space
            CGContextRelease(context);
            CGColorSpaceRelease(colorSpace);

            // Create an image object from the Quartz image
            UIImage *image = [UIImage imageWithCGImage:quartzImage];
            // Release the Quartz image
            CGImageRelease(quartzImage);

            return image;
        }
    }
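For completeness, a sketch of calling the method from the sample buffer delegate callback; self.imageView is an assumed UIImageView, and the dispatch to the main queue is needed because the callback arrives on the video queue:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *image = [self imageFromSamplePlanerPixelBuffer:sampleBuffer];
        // UIKit must only be touched on the main thread;
        // self.imageView is an assumed outlet, not part of the original answer
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = image;
        });
    }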