
iOS - Changing AVCaptureDeviceInput leads to AVAssetWriterStatusFailed

I am trying to switch the camera between the front and back positions, and the flip itself works. If a video is recorded with the pause/record option but without flipping, it saves fine. However, once the camera view is flipped, any further recording is not saved and the writer ends up in AVAssetWriterStatusFailed ("The operation could not be completed"). Can anybody help me find where I have gone wrong? Below is my code.

Camera.m

- (void)flipCamera
{
    NSArray *inputs = _session.inputs;
    for (AVCaptureDeviceInput *input in inputs) {
        AVCaptureDevice *device = input.device;
        if ([device hasMediaType:AVMediaTypeVideo]) {
            AVCaptureDevicePosition position = device.position;
            AVCaptureDevice *newCamera = nil;
            AVCaptureDeviceInput *newInput = nil;
            if (position == AVCaptureDevicePositionFront)
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
            else
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
            newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:nil];
            // beginConfiguration ensures that pending changes are not applied immediately
            [_session beginConfiguration];
            [_session removeInput:input];
            [_session addInput:newInput];
            // Changes take effect once the outermost commitConfiguration is invoked.
            [_session commitConfiguration];
            break;
        }
    }
    for (AVCaptureDeviceInput *input in inputs) {
        AVCaptureDevice *device = input.device;
        if ([device hasMediaType:AVMediaTypeAudio]) {
            // audio input from default mic
            AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
            AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
            // beginConfiguration ensures that pending changes are not applied immediately
            [_session beginConfiguration];
            [_session removeInput:input];
            [_session addInput:newInput];
            // Changes take effect once the outermost commitConfiguration is invoked.
            [_session commitConfiguration];
            break;
        }
    }
}
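
For reference, the input swap can also be written defensively: check the NSError returned by deviceInputWithDevice:error: and ask the session canAddInput: before adding, so a failed swap does not leave the session without a video input. Here is a rough sketch of the video branch only (variable names follow the code above; the rollback is merely illustrative, not something from the original code):

NSError *inputError = nil;
AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:&inputError];
if (newInput == nil) {
    // Creating the input failed; keep the current configuration
    NSLog(@"Could not create input for %@: %@", newCamera.localizedName, inputError);
    return;
}
[_session beginConfiguration];
[_session removeInput:input];
if ([_session canAddInput:newInput]) {
    [_session addInput:newInput];
} else {
    // Put the old input back so the session keeps a video source
    [_session addInput:input];
}
[_session commitConfiguration];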

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position)
            return device;
    }
    return nil;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL bVideo = YES;
    @synchronized(self)
    {
        if (!self.isCapturing || self.isPaused)
        {
            return;
        }
        if (connection != _videoConnection)
        {
            bVideo = NO;
        }
        if ((_encoder == nil) && !bVideo)
        {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString *filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
            NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
            _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        if (_discont)
        {
            if (bVideo)
            {
                return;
            }
            _discont = NO;
            // calc adjustment
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = bVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid)
            {
                if (_timeOffset.flags & kCMTimeFlags_Valid)
                {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                NSLog(@"Setting offset from %s", bVideo ? "video" : "audio");
                NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale));
                // this stops us having to set a scale for _timeOffset before we see the first video time
                if (_timeOffset.value == 0)
                {
                    _timeOffset = offset;
                }
                else
                {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }
        // retain so that we can release either this or modified one
        CFRetain(sampleBuffer);
        if (_timeOffset.value > 0)
        {
            CFRelease(sampleBuffer);
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }
        // record most recent time so we know the length of the pause
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0)
        {
            pts = CMTimeAdd(pts, dur);
        }
        if (bVideo)
        {
            _lastVideo = pts;
        }
        else
        {
            _lastAudio = pts;
        }
    }
    // pass frame to encoder
    [_encoder encodeFrame:sampleBuffer isVideo:bVideo];
    CFRelease(sampleBuffer);
}

Encoder.m

- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)bVideo
{
    if (CMSampleBufferDataIsReady(sampleBuffer))
    {
        if (_writer.status == AVAssetWriterStatusUnknown)
        {
            CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            [_writer startWriting];
            [_writer startSessionAtSourceTime:startTime];
        }
        if (_writer.status == AVAssetWriterStatusFailed)
        {
            // After the camera view is flipped, execution ends up in this branch:
            // writer error "The operation could not be completed"
            NSLog(@"writer error %@", _writer.error.localizedDescription);
            return NO;
        }
        if (bVideo)
        {
            if (_videoInput.readyForMoreMediaData == YES)
            {
                [_videoInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
        else
        {
            if (_audioInput.readyForMoreMediaData)
            {
                [_audioInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
    }
    return NO;
}
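
As a debugging aside, when the writer does fail it can help to log the full NSError rather than just localizedDescription; the domain, code, and any underlying error usually narrow down the cause. A minimal sketch of what that branch could log instead (same _writer ivar as above):

if (_writer.status == AVAssetWriterStatusFailed)
{
    NSError *error = _writer.error;
    // Domain, code, and the underlying error (if any) are far more specific
    // than "The operation could not be completed".
    NSLog(@"writer failed: domain=%@ code=%ld underlying=%@",
          error.domain, (long)error.code, error.userInfo[NSUnderlyingErrorKey]);
    return NO;
}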

Thanks in Advance.


1 Reply

The problem is this check:

if (connection != _videoConnection)
{
    bVideo = NO;
}

When you change the camera a new video connection is created; I don't know exactly where or how. But if you change the check as shown below, it works:

//if (connection != _videoConnection)
if ([connection.output connectionWithMediaType:AVMediaTypeVideo] == nil)
{
    bVideo = NO;
}
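
An alternative (or complementary) approach is to re-resolve the cached connection right after the input swap, so the original pointer comparison in captureOutput keeps working. Assuming the video data output is kept in an ivar such as _videoOutput (not shown in the question; the name is only illustrative), the end of flipCamera could refresh it like this sketch:

// After [_session commitConfiguration] in flipCamera:
// Re-fetch the connection, because removing and re-adding the input
// can invalidate the one cached in _videoConnection.
_videoConnection = [_videoOutput connectionWithMediaType:AVMediaTypeVideo];

That keeps the test in captureOutput unchanged and simply makes sure _videoConnection points at the connection the session is currently using.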
