
objective-c - iOS AVPlayer can't play 240 fps video

I recorded a 240 fps video after changing the AVCaptureDeviceFormat. If I save that video to the photo library, the slow-mo effect is there. But if I play the file from the documents directory using an AVPlayer, I can't see the slow-mo effect.
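For context, the recording side selects a high-frame-rate capture format, roughly like this Swift sketch (assumptions: the device actually advertises a 240 fps range, and the function name is illustrative):

import AVFoundation

// Sketch of the "changing the AVCaptureDeviceFormat" step mentioned above.
func configure240fps(on device: AVCaptureDevice) throws {
    // Pick the first format whose supported frame-rate ranges reach 240 fps.
    guard let format = device.formats.first(where: { format in
        format.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 240 }
    }) else { return }

    try device.lockForConfiguration()
    device.activeFormat = format
    // Lock the frame duration to 1/240 s.
    let frameDuration = CMTime(value: 1, timescale: 240)
    device.activeVideoMinFrameDuration = frameDuration
    device.activeVideoMaxFrameDuration = frameDuration
    device.unlockForConfiguration()
}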

Code to play the video:

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[AVAsset assetWithURL:[NSURL fileURLWithPath:fullPath]]];
    AVPlayer *feedVideoPlayer = [AVPlayer playerWithPlayerItem:playerItem];

    AVPlayerViewController *playerController = [[AVPlayerViewController alloc] init];
    playerController.view.frame = CGRectMake(0, 0, videoPreviewView.frame.size.width, videoPreviewView.frame.size.height);
    playerController.player = feedVideoPlayer;

1 Reply


It's a bit annoying, but I believe you'll need to re-create the video in an AVComposition if you don't want to lose quality. I'd love to know if there is another way, but this is what I've come up with. You can technically export the video via AVAssetExportSession, but using a PassThrough preset will produce the same video file, which won't be slow motion: you'll need to transcode it, which loses quality (AFAIK; see "Issue playing slow-mo AVAsset in AVPlayer" for that solution, sketched just below).
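For reference, that lossy transcoding route looks roughly like this (a sketch: the preset, file type, and output handling are illustrative, and avAsset is the current-version asset fetched from the photo library as shown further down):

import AVFoundation

// Lossy alternative: render the slow-motion composition into a flat file.
// A passthrough preset would just copy the original samples and drop the ramp.
func exportFlattened(_ avAsset: AVAsset, to outputURL: URL) {
    guard let export = AVAssetExportSession(asset: avAsset,
                                            presetName: AVAssetExportPresetHighestQuality) else { return }
    export.outputURL = outputURL
    export.outputFileType = AVFileTypeQuickTimeMovie
    export.exportAsynchronously {
        if export.status == .completed {
            print("flattened slow-mo written to \(outputURL.lastPathComponent)")
        }
    }
}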


The first thing you'll need to do is grab the source media's original time mapping objects. You can do that like so:

import AVFoundation
import Photos

let options = PHVideoRequestOptions()
options.version = PHVideoRequestOptionsVersion.current
options.deliveryMode = .highQualityFormat

// The result handler runs asynchronously, so hand the mappings off from
// inside the closure rather than reading them right after this call returns.
PHImageManager().requestAVAsset(forVideo: phAsset, options: options, resultHandler: { (avAsset, mix, info) in

    guard let avAsset = avAsset else { return }

    // Each track segment carries a CMTimeMapping; for slow-mo media the
    // target ranges are longer than their source ranges.
    let originalTimeMaps = avAsset.tracks(withMediaType: AVMediaTypeVideo)
        .first?
        .segments
        .map { $0.timeMapping } ?? []

})
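For intuition, each CMTimeMapping pairs a source range in the media file with a target range on the playback timeline, and slow motion is simply a target longer than its source. A hypothetical 4x segment (illustrative numbers only):

import CoreMedia

// One second of source footage stretched over four seconds of presentation
// time, i.e. 4x slow motion starting two seconds in.
let source = CMTimeRange(start: CMTime(seconds: 2.0, preferredTimescale: 600),
                         duration: CMTime(seconds: 1.0, preferredTimescale: 600))
let target = CMTimeRange(start: CMTime(seconds: 2.0, preferredTimescale: 600),
                         duration: CMTime(seconds: 4.0, preferredTimescale: 600))
let slowMoMapping = CMTimeMapping(source: source, target: target)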

Once you have the timeMappings of the original media (the one sitting in your documents directory), you can pass in the URL of that media along with the original CMTimeMapping objects you'd like to recreate, and build a new AVComposition that is ready to play in an AVPlayer. You'll need a class similar to this:

class CompositionMapper {

    let url: URL
    let timeMappings: [CMTimeMapping]

    init(for url: URL, with timeMappings: [CMTimeMapping]) {
        self.url = url
        self.timeMappings = timeMappings
    }

    init(with asset: AVAsset, and timeMappings: [CMTimeMapping]) {
        guard let asset = asset as? AVURLAsset else {
            print("cannot get a base URL from this asset.")
            fatalError()
        }

        self.timeMappings = timeMappings
        self.url = asset.url
    }

    func compose() -> AVComposition {
        let composition = AVMutableComposition(urlAssetInitializationOptions: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

        let videoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

        let asset = AVAsset(url: url)
        guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return composition }

        // Rebuild each original CMTimeMapping as a composition track segment,
        // so playback stretches the source ranges into the (longer) target ranges.
        var segments: [AVCompositionTrackSegment] = []
        for map in timeMappings {
            let segment = AVCompositionTrackSegment(url: url, trackID: kCMPersistentTrackID_Invalid, sourceTimeRange: map.source, targetTimeRange: map.target)
            segments.append(segment)
        }

        videoTrack.preferredTransform = videoAssetTrack.preferredTransform
        videoTrack.segments = segments

        // Apply the same time mappings to the audio track, if the asset has audio.
        if asset.tracks(withMediaType: AVMediaTypeAudio).first != nil {
            audioTrack.segments = segments
        }

        return composition.copy() as! AVComposition
    }
}

You can then use the compose() function of your CompositionMapper to get an AVComposition that is ready to play in an AVPlayer and that respects the CMTimeMapping objects you've passed in:

let compositionMapper = CompositionMapper(for: someAVAssetURL, with: originalTimeMaps)
let mappedComposition = compositionMapper.compose()

let playerItem = AVPlayerItem(asset: mappedComposition)
let player = AVPlayer(playerItem: playerItem)

// Varispeed keeps the audio in sync through the rate changes without
// applying pitch correction.
playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
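To mirror the question's playback setup in Swift (presentation details are illustrative; videoPreviewView is the container view from the question):

import AVKit

let playerController = AVPlayerViewController()
playerController.player = player
playerController.view.frame = videoPreviewView.bounds
// In a real view controller, also add playerController as a child and retain it.
videoPreviewView.addSubview(playerController.view)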

Let me know if you need help converting this to Objective-C, but it should be relatively straightforward.

