How to get the current picture from WebRTC as an image?

Problem description

I am using WebRTC to capture video from the user's camera. At certain points I want to save the current frame to the photo library as a UIImage. I use localVideoView to display the local camera feed, but when I try to take a screenshot of that view, it comes out empty (just a blue background).

Here is the code I use to take the screenshot:

func screenShotMethod() {
    DispatchQueue.main.async {
        // Render the view hierarchy into a UIImage
        UIGraphicsBeginImageContext(self.localVideoView!.frame.size)
        self.localVideoView?.layer.render(in: UIGraphicsGetCurrentContext()!)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        // Save it to the camera roll
        UIImageWriteToSavedPhotosAlbum(image!, nil, nil, nil)
    }
}
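For context, `layer.render(in:)` comes back blank here because WebRTC draws the video into a Metal/OpenGL surface that is not part of the normal Core Animation render pass. If the goal is the actual video frame rather than a second camera session, one option is to attach a renderer to the local video track and convert the most recent frame yourself. The following is a sketch, assuming the GoogleWebRTC iOS SDK (`RTCVideoRenderer`, `RTCCVPixelBuffer`) and a hypothetical `localVideoTrack` property:

```swift
import WebRTC
import UIKit

// Sketch: grabs a single frame from an RTCVideoTrack and hands it back as a UIImage.
// Assumes the frame buffer is an RTCCVPixelBuffer, which holds for camera capture.
final class FrameGrabber: NSObject, RTCVideoRenderer {
    var onImage: ((UIImage) -> Void)?

    func setSize(_ size: CGSize) {}

    func renderFrame(_ frame: RTCVideoFrame?) {
        guard let onImage = onImage,
              let pixelBuffer = (frame?.buffer as? RTCCVPixelBuffer)?.pixelBuffer else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return }
        self.onImage = nil  // deliver one frame only
        DispatchQueue.main.async {
            onImage(UIImage(cgImage: cgImage))
        }
    }
}

// Usage (hypothetical track property):
// let grabber = FrameGrabber()
// grabber.onImage = { image in UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil) }
// localVideoTrack.add(grabber)        // detach later with localVideoTrack.remove(grabber)
```

This avoids opening a second `AVCaptureSession` and captures exactly what WebRTC is sending.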

Solution

Here is sample code that takes a picture during a WebRTC video call without showing a camera preview. The TakePicture() method must be triggered by a signal or a button tap.

import AVFoundation
import Foundation
import UIKit
import WebRTC

class CallViewController: UIViewController {
    var captureSession: AVCaptureSession?

    func TakePicture() {
        DispatchQueue.main.async { [self] in
            captureSession = AVCaptureSession()
            captureSession!.beginConfiguration()

            let photoOutput = AVCapturePhotoOutput()
            photoOutput.isHighResolutionCaptureEnabled = true
            photoOutput.isLivePhotoCaptureEnabled = false

            if let captureDevice = AVCaptureDevice.default(for: .video) {
                do {
                    let input = try AVCaptureDeviceInput(device: captureDevice)
                    if captureSession!.canAddInput(input) {
                        captureSession!.addInput(input)
                    }
                } catch {
                    print("Failed to create camera input: \(error)")
                    return
                }

                if captureSession!.canAddOutput(photoOutput) {
                    captureSession!.addOutput(photoOutput)
                }

                // A preview layer is not required for capture; it is attached to
                // the session here but never added to the view hierarchy.
                let cameraLayer = AVCaptureVideoPreviewLayer()
                cameraLayer.session = captureSession

                captureSession!.commitConfiguration()
                captureSession!.startRunning()

                let photoSettings = AVCapturePhotoSettings()
                //photoSettings.flashMode = .auto // check device capabilities before enabling flash
                photoSettings.photoQualityPrioritization = .balanced
                photoOutput.capturePhoto(with: photoSettings, delegate: self)
            }
        }
    }
}

extension CallViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        captureSession?.stopRunning()
        captureSession = nil
        let imageData = photo.fileDataRepresentation()
        // Do the rest with the image bytes
    }
}
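To close the loop on the original goal (saving to the photo library), the delegate's image bytes can be converted to a UIImage and written out. A minimal sketch, using a hypothetical helper name; note that saving requires the NSPhotoLibraryAddUsageDescription key in Info.plist:

```swift
import AVFoundation
import UIKit

// Hypothetical helper: turn the captured photo's bytes into a UIImage
// and save it to the user's photo library.
func savePhotoToLibrary(_ photo: AVCapturePhoto) {
    guard let data = photo.fileDataRepresentation(),
          let image = UIImage(data: data) else { return }
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```

Call it from the delegate body above in place of the "Do the rest with the image bytes" comment.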