This repository has been archived by the owner on Dec 22, 2020. It is now read-only.

App crashing while converting video frames to Images iOS #216

Open
rohitphogat19 opened this issue Aug 20, 2020 · 7 comments

@rohitphogat19

I am using the Agora-Plugin-Raw-Data-API-Objective-C sample to record and save local video during a video call, but the app crashes when converting the stored video frames to images. I used and modified the image-creation function from AgoraMediaDataPlugin.mm in the sample. The crash occurs when I press the stop-recording button and start converting all stored frames to images.

Swift Code:

func mediaDataPlugin(_ mediaDataPlugin: AgoraMediaDataPlugin, didCapturedVideoRawData videoRawData: AgoraVideoRawData) -> AgoraVideoRawData {
        self.videoFrameDatas.append(videoRawData)
        return videoRawData
    }

@objc private func didPressStopRecordingButton() {
        self.agoraKit?.leaveChannel(nil)
        
        for frame in videoFrameDatas {
            if let image = self.agoraMediaDataPlugin?.yuvToUIImage(with: frame) {
                self.recordedImages.append(image)
            }

        }
    }

AgoraMediaDataPlugin.mm code

- (AGImage *)yuvToUIImageWithVideoRawData:(AgoraVideoRawData *)data {
    size_t width = data.width;
    size_t height = data.height;
    size_t yStride = data.yStride;
    size_t uvStride = data.uStride;
    
    char* yBuffer = data.yBuffer;
    char* uBuffer = data.uBuffer;
    char* vBuffer = data.vBuffer;
    
    size_t uvBufferLength = height * uvStride;
    char* uvBuffer = (char *)malloc(uvBufferLength);
    for (size_t uv = 0, u = 0; uv < uvBufferLength; uv += 2, u++) {
        // swap the positions of U and V to produce NV12 layout
        uvBuffer[uv] = uBuffer[u];
        uvBuffer[uv+1] = vBuffer[u];
    }
    
    @autoreleasepool {
        void * planeBaseAddress[2] = {yBuffer, uvBuffer};
        size_t planeWidth[2] = {width, width / 2};
        size_t planeHeight[2] = {height, height / 2};
        size_t planeBytesPerRow[2] = {yStride, uvStride * 2};
        
        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn result = CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault,
                                                             width, height,
                                                             kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                                             NULL, 0,
                                                             2, planeBaseAddress, planeWidth, planeHeight, planeBytesPerRow,
                                                             NULL, NULL, NULL,
                                                             &pixelBuffer);
        if (result != kCVReturnSuccess) {
            NSLog(@"Unable to create cvpixelbuffer %d", result);
        }
        
        AGImage *image = [self CVPixelBufferToImage:pixelBuffer rotation:data.rotation];
        CVPixelBufferRelease(pixelBuffer);
        if(uvBuffer != NULL) {
            free(uvBuffer);
            uvBuffer = NULL;
        }
        return image;
        
    }
}

Crash:

Also, are there any alternatives for saving the local video other than using the video raw data?

@rohitphogat19 added the bug label Aug 20, 2020
@plutoless added and removed the bug label Aug 25, 2020
@plutoless
Contributor

@rohitphogat19 This is not a bug on our end; the way you are saving the video frames is wrong.
According to your code, you are trying to keep every videoFrame, but the underlying data is freed as soon as the didCapturedVideoRawData callback returns. You need to open a write buffer to local storage and write each frame to it as it arrives.
A more robust approach is to prepare your own cache buffer and memcpy the data from each videoFrame into it, while a second thread reads from that buffer and writes to local storage. The advantage of this approach is that the write operation will not block the SDK's main thread.
Hope this helps.
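The copy-on-arrival approach described above could look roughly like this. This is a minimal sketch, not part of the sample: `CopiedFrame` is a hypothetical type, and it assumes `AgoraVideoRawData` exposes the optional `Int8` buffer pointers used elsewhere in this thread and that the frame height is even.

```swift
import Foundation

// Hypothetical value type holding a deep copy of one frame. The SDK frees
// the original planes as soon as the callback returns, so the Data copies
// must be taken while the pointers are still valid.
struct CopiedFrame {
    let width: Int, height: Int
    let yStride: Int, uStride: Int, vStride: Int
    let y: Data, u: Data, v: Data
    let renderTimeMs: Int64
}

func deepCopy(_ raw: AgoraVideoRawData) -> CopiedFrame? {
    guard let yBuf = raw.yBuffer, let uBuf = raw.uBuffer, let vBuf = raw.vBuffer else {
        return nil
    }
    let h = Int(raw.height)
    // I420: the chroma planes are half the height of the luma plane.
    return CopiedFrame(
        width: Int(raw.width), height: h,
        yStride: Int(raw.yStride), uStride: Int(raw.uStride), vStride: Int(raw.vStride),
        y: Data(bytes: yBuf, count: Int(raw.yStride) * h),
        u: Data(bytes: uBuf, count: Int(raw.uStride) * h / 2),
        v: Data(bytes: vBuf, count: Int(raw.vStride) * h / 2),
        renderTimeMs: raw.renderTimeMs)
}
```

The copied frames can then be handed to a background queue that converts and writes them, keeping the SDK callback cheap.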

@rohitphogat19
Author

rohitphogat19 commented Oct 17, 2020

@plutoless I was able to convert the frames to images. But how do I feed these frames to an AVAssetWriter to save them as a video file? I can convert AgoraVideoRawData to a CVPixelBuffer in Swift, and the resulting images look fine. My code is:

func mediaDataPlugin(_ mediaDataPlugin: AgoraMediaDataPlugin, didCapturedVideoRawData videoRawData: AgoraVideoRawData) -> AgoraVideoRawData {
        //planeBaseAddress: UnsafeMutablePointer<UnsafeMutableRawPointer?>
        
        let imageHeight = Int(videoRawData.height)
        let imageWidth = Int(videoRawData.width)
        let yStrideValue = Int(videoRawData.yStride)
        let uvStrideValue = Int(videoRawData.uStride)
        let uvBufferLength = imageHeight * uvStrideValue
        
        
        // Buffers
        let uBuffer = videoRawData.uBuffer // UnsafeMutablePointer<Int8>
        let vBuffer = videoRawData.vBuffer // UnsafeMutablePointer<Int8>
        let yBuffer = videoRawData.yBuffer // UnsafeMutablePointer<Int8>
        
        let uvBuffer = UnsafeMutablePointer<Int8>.allocate(capacity: uvBufferLength)
        var uv = 0, u = 0
        while uv < uvBufferLength {
            // swap the positions of U and V to produce NV12 layout
            uvBuffer[uv] = uBuffer![u]
            uvBuffer[uv + 1] = vBuffer![u]
            uv += 2
            u += 1
        }

        var planeBaseAddressValues = [UnsafeMutableRawPointer(yBuffer), UnsafeMutableRawPointer(uvBuffer)]
        let planeBaseAddress = UnsafeMutablePointer<UnsafeMutableRawPointer?>.allocate(capacity: 2)
        planeBaseAddress.initialize(from: &planeBaseAddressValues, count: 2)
    
        var planewidthValues: [Int] = [imageWidth, imageWidth/2]
        let planeWidth = UnsafeMutablePointer<Int>.allocate(capacity: 2)
        planeWidth.initialize(from: &planewidthValues, count: 2)
        
        var planeHeightValues: [Int] = [imageHeight, imageHeight/2]
        let planeHeight = UnsafeMutablePointer<Int>.allocate(capacity: 2)
        planeHeight.initialize(from: &planeHeightValues, count: 2)
        
        let planeBytesPerRowValues: [Int] = [yStrideValue, uvStrideValue * 2]
        let planeBytesPerRow = UnsafeMutablePointer<Int>.allocate(capacity: 2)
        planeBytesPerRow.initialize(from: planeBytesPerRowValues, count: 2)
        
        var pixelBuffer: CVPixelBuffer? = nil
        let result = CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault, imageWidth, imageHeight, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, nil, 0, 2, planeBaseAddress, planeWidth, planeHeight, planeBytesPerRow, nil, nil, nil, &pixelBuffer)
        
        if result != kCVReturnSuccess {
            print("Unable to create cvpixelbuffer \(result)")
        }
        
        guard let buffer = pixelBuffer else {
            
            return videoRawData
        }
        self.frameCounts += 1
        self.handleAndWriteBuffer(buffer: buffer, renderTime: videoRawData.renderTimeMs, frameCount: self.frameCounts)
        planeHeight.deinitialize(count: 2)
        planeWidth.deinitialize(count: 2)
        planeBytesPerRow.deinitialize(count: 2)
        planeBaseAddress.deinitialize(count: 2)
        uvBuffer.deinitialize(count: uvBufferLength)
        planeBaseAddress.deallocate()
        planeWidth.deallocate()
        planeHeight.deallocate()
        planeBytesPerRow.deallocate()
        uvBuffer.deallocate()
        return videoRawData
    }

private func handleAndWriteBuffer(buffer: CVPixelBuffer, renderTime: Int64, frameCount: Int) {
        if frameCount == 1 {
            let videoPath = self.getFileFinalPath(directory: "Recordings", fileName: self.fileName)
            let writer = try! AVAssetWriter(outputURL: videoPath!, fileType: .mov)
            var videoSettings = [String:Any]()
            videoSettings.updateValue(1280, forKey: AVVideoHeightKey)
            videoSettings.updateValue(720, forKey: AVVideoWidthKey)
            videoSettings.updateValue(AVVideoCodecType.h264, forKey: AVVideoCodecKey)
            
            let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
            input.expectsMediaDataInRealTime = false
            let adapter = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input, sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String : NSNumber(value: Int32(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange))])
            if writer.canAdd(input) {
                writer.add(input)
            }
            writer.startWriting()
            writer.startSession(atSourceTime: CMTime(value: renderTime, timescale: 1000))
            self.assetWriter = writer
            self.videoWriterInput = input
            self.videoAdapter = adapter
        } else {
            print(self.assetWriter.status.rawValue)
            renderQueue.async {
                if let sampleBuffer = self.getSampleBuffer(buffer: buffer, renderTime: renderTime) {
                    let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
                    print(time)
                    print(time.seconds)
                    self.videoWriterInput.append(sampleBuffer)
                }
                }
            }
        }
    }

The asset writer's status changes from writing to failed after a few frames.

@plutoless
Contributor

@rohitphogat19 any failure reasons?

@rohitphogat19
Author

@plutoless Just got this

assetWriter.error: Optional(Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x281cffa50 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}})

@plutoless
Contributor

input.expectsMediaDataInRealTime = false
Why are you doing this?
Please take a look at https://developer.apple.com/documentation/avfoundation/avassetwriterinput, especially the bullet points in the Overview.
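For live capture, the linked documentation recommends marking the input as real-time and only appending when the input is ready. A rough sketch, reusing the adapter created in the code above; `appendLiveFrame` is a hypothetical helper, not part of the sample:

```swift
import AVFoundation
import CoreVideo

// Sketch: append one live frame through the pixel-buffer adaptor.
// Appending while the input is not ready for more media data is a
// documented way to drive the writer into the .failed state.
func appendLiveFrame(_ pixelBuffer: CVPixelBuffer,
                     at time: CMTime,
                     through adapter: AVAssetWriterInputPixelBufferAdaptor) {
    let input = adapter.assetWriterInput
    // In practice this is set once, before startWriting(); shown here
    // for completeness.
    input.expectsMediaDataInRealTime = true
    if input.isReadyForMoreMediaData {
        adapter.append(pixelBuffer, withPresentationTime: time)
    }
}
```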

@rohitphogat19
Author

@plutoless I have tried that as well, with no success. Also, the render time in AgoraVideoRawData is a value like 131252828, whereas with AVCaptureSession the presentation time looks like this:

CMTime(value: 131765228549125, timescale: 1000000000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0)
131765.228549125

Maybe that is why the assetWriter fails. How can we convert the AgoraVideoRawData render time to a CMTime for writing to AVAssetWriter? The same goes for AgoraAudioRawData: how can we write that to an AVAssetWriter?

@plutoless
Contributor

plutoless commented Nov 2, 2020

If you believe this is caused by CMTime, try using
let time = CMTime(seconds: CACurrentMediaTime(), preferredTimescale: 1000)
instead.
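Either clock can work, as long as the time passed to startSession(atSourceTime:) and the timestamps of the appended samples come from the same clock, with the session start no later than the first sample. A sketch of both options; `renderTimeMs` stands in for the SDK's millisecond render time field, and the sample value is illustrative:

```swift
import CoreMedia
import QuartzCore

// Option A: take the timestamp from the host clock, as suggested above.
let hostTime = CMTime(seconds: CACurrentMediaTime(), preferredTimescale: 1000)

// Option B: map the SDK's millisecond render time directly onto CMTime.
// A timescale of 1000 makes the Int64 millisecond value the CMTime value.
let renderTimeMs: Int64 = 131_252_828
let sdkTime = CMTime(value: renderTimeMs, timescale: 1000)
```

Whichever option is used, call startWriting() and startSession(atSourceTime:) before the first append, using the same clock for the start time and every subsequent frame.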
