Video rotated after applying AVVideoComposition

After applying an AVVideoComposition to my AVPlayerItem, the filter I apply works, but the video is rotated in the AVPlayerLayer.

I know for a fact that the problem is not with the filtered frame, because if I display the frame in a UIImageView, it renders 100% correctly.

The video displays correctly until I apply a videoComposition. Setting the videoGravity on the AVPlayerLayer does not help.

The video is rotated 90º clockwise and stretched in the layer.

Basically, the video displays perfectly in the AVPlayerLayer until the AVPlayerItem is fed through the AVMutableVideoComposition. Once that happens, the video is rotated -90º and then scaled to fit the same dimensions as the unfiltered video. This suggests to me that it doesn't realize its transform is already correct, so it applies the transform to itself again.

Why does this happen, and how can I fix it?

Here is some code:

    private func filterVideo(with filter: Filter?) {
        if let player = player, let playerItem = player.currentItem {
            let composition = AVMutableComposition()
            let videoAssetTrack = playerItem.asset.tracks(withMediaType: .video).first
            let videoCompositionTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)

            try? videoCompositionTrack?.insertTimeRange(CMTimeRange(start: kCMTimeZero, duration: playerItem.asset.duration), of: videoAssetTrack!, at: kCMTimeZero)
            videoCompositionTrack?.preferredTransform = videoAssetTrack!.preferredTransform

            let videoComposition = AVMutableVideoComposition(asset: composition, applyingCIFiltersWithHandler: { (request) in
                let filteredImage = <...>
                request.finish(with: filteredImage, context: nil)
            })

            playerItem.videoComposition = videoComposition
        }
    }

The problem is with the renderSize of the AVVideoComposition: you need to apply a transform in an AVMutableVideoCompositionInstruction (that is, a rotate and translate transform).

I did this in Objective-C. I'm posting my code; you can convert the syntax to Swift.

Objective-C

    //------------------------------------
    // FIXING ORIENTATION
    //------------------------------------
    AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstAsset.duration, secondAsset.duration));

    AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack]; // second
    AVAssetTrack *FirstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
    BOOL isFirstAssetPortrait_ = NO;
    CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;
    if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) { FirstAssetOrientation_ = UIImageOrientationRight; isFirstAssetPortrait_ = YES; }
    if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) { FirstAssetOrientation_ = UIImageOrientationLeft; isFirstAssetPortrait_ = YES; }
    if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) { FirstAssetOrientation_ = UIImageOrientationUp; }
    if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) { FirstAssetOrientation_ = UIImageOrientationDown; }
    CGFloat FirstAssetScaleToFitRatio = 320.0 / FirstAssetTrack.naturalSize.width;
    if (isFirstAssetPortrait_) {
        FirstAssetScaleToFitRatio = 320.0 / FirstAssetTrack.naturalSize.height;
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
        [FirstlayerInstruction setTransform:CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
    } else {
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
        [FirstlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor), CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
    }
    [FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];

    AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    AVAssetTrack *SecondAssetTrack = [[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation SecondAssetOrientation_ = UIImageOrientationUp;
    BOOL isSecondAssetPortrait_ = NO;
    CGAffineTransform secondTransform = SecondAssetTrack.preferredTransform;
    if (secondTransform.a == 0 && secondTransform.b == 1.0 && secondTransform.c == -1.0 && secondTransform.d == 0) { SecondAssetOrientation_ = UIImageOrientationRight; isSecondAssetPortrait_ = YES; }
    if (secondTransform.a == 0 && secondTransform.b == -1.0 && secondTransform.c == 1.0 && secondTransform.d == 0) { SecondAssetOrientation_ = UIImageOrientationLeft; isSecondAssetPortrait_ = YES; }
    if (secondTransform.a == 1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == 1.0) { SecondAssetOrientation_ = UIImageOrientationUp; }
    if (secondTransform.a == -1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == -1.0) { SecondAssetOrientation_ = UIImageOrientationDown; }
    CGFloat SecondAssetScaleToFitRatio = 320.0 / SecondAssetTrack.naturalSize.width;
    if (isSecondAssetPortrait_) {
        SecondAssetScaleToFitRatio = 320.0 / SecondAssetTrack.naturalSize.height;
        CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatio, SecondAssetScaleToFitRatio);
        [SecondlayerInstruction setTransform:CGAffineTransformConcat(SecondAssetTrack.preferredTransform, SecondAssetScaleFactor) atTime:firstAsset.duration];
    } else {
        CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatio, SecondAssetScaleToFitRatio);
        [SecondlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(SecondAssetTrack.preferredTransform, SecondAssetScaleFactor), CGAffineTransformMakeTranslation(0, 160)) atTime:secondAsset.duration];
    }
    MainInstruction.layerInstructions = [NSArray arrayWithObjects:SecondlayerInstruction, nil];

    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);
    MainCompositionInst.renderSize = CGSizeMake(320.0, 480.0);

    // Now you have an orientation-fixed instruction layer.
    // Add this composition to your video.
    // If you want to export the video, you can do it like below.
    NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"final_merged_video-%d.mp4", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset640x480];
    exporter.outputURL = url;
    exporter.videoComposition = MainCompositionInst;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [[AppDelegate Getdelegate] hideIndicator];
            [self exportDidFinish:exporter];
        });
    }];

For Swift, see this answer: click here.
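In case it helps, here is a minimal Swift sketch of the same single-track idea. It is only illustrative: the function name, the 320x480 render size, and the simplified orientation check are assumptions lifted from the Objective-C code above, not a drop-in implementation.

    import AVFoundation
    import UIKit

    // Minimal sketch: build a video composition whose layer instruction keeps the
    // track's preferredTransform, so the recorded orientation is preserved.
    func makeOrientationFixedComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
        guard let track = asset.tracks(withMediaType: .video).first else { return nil }

        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: asset.duration)

        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)

        // Detect a portrait recording from the preferred transform, as in the Objective-C answer.
        let t = track.preferredTransform
        let isPortrait = (t.a == 0 && abs(t.b) == 1.0 && abs(t.c) == 1.0 && t.d == 0)

        // Scale to fit a 320-point-wide render area while concatenating with
        // preferredTransform so the rotation stays correct. (The Objective-C
        // version also translates the landscape case by (0, 160) to center it.)
        let ratio = 320.0 / (isPortrait ? track.naturalSize.height : track.naturalSize.width)
        let scale = CGAffineTransform(scaleX: ratio, y: ratio)
        layerInstruction.setTransform(track.preferredTransform.concatenating(scale), at: kCMTimeZero)

        instruction.layerInstructions = [layerInstruction]

        let videoComposition = AVMutableVideoComposition()
        videoComposition.instructions = [instruction]
        videoComposition.frameDuration = CMTimeMake(1, 30)
        videoComposition.renderSize = CGSize(width: 320.0, height: 480.0)
        return videoComposition
    }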

Also, you can try rotating your video layer by applying a rotation transform to it.

    #define degreeToRadian(x) (M_PI * x / 180.0)
    [_playerLayer setAffineTransform:CGAffineTransformMakeRotation(degreeToRadian(degree))];
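A rough Swift equivalent of that snippet might look like this (assuming playerLayer is your AVPlayerLayer and degrees holds the rotation you want, e.g. 90):

    // Rotate the player layer itself instead of touching the composition.
    let radians = CGFloat(degrees) * .pi / 180.0
    playerLayer.setAffineTransform(CGAffineTransform(rotationAngle: radians))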

Instead of assuming the image will be filtered, first check whether filteredImage is nil. If it is not, then call request.finish(with: filteredImage, context: nil).

However, if it is nil, you should call request.finish(with: someError).

This is per the documentation.
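For illustration, a sketch of that check inside the filtering handler could look like the following; here `composition` and `filter` stand in for whatever asset and CIFilter you are actually using, and the error is just a placeholder:

    // Only finish with an image when the filter actually produced one;
    // otherwise report a failure instead of force-unwrapping.
    let videoComposition = AVVideoComposition(asset: composition, applyingCIFiltersWithHandler: { request in
        filter?.setValue(request.sourceImage, forKey: kCIInputImageKey)
        if let filteredImage = filter?.outputImage {
            request.finish(with: filteredImage, context: nil)
        } else {
            request.finish(with: NSError(domain: "VideoFiltering", code: -1, userInfo: nil))
        }
    })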

If you are trying to play an AVMutableComposition, you should set the preferredTransform of the AVMutableCompositionTrack to the preferredTransform of the AVAssetTrack.

    let asset = AVAsset(url: url!)
    let composition = AVMutableComposition()
    let compositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let videoTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first

    try? compositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, asset.duration), of: videoTrack!, at: kCMTimeZero)
    compositionTrack.preferredTransform = (videoTrack?.preferredTransform)!

    let playerItem = AVPlayerItem(asset: composition)
    let filter = CIFilter(name: "CIColorInvert")
    playerItem.videoComposition = AVVideoComposition(asset: composition, applyingCIFiltersWithHandler: { (request: AVAsynchronousCIImageFilteringRequest) in
        filter?.setValue(request.sourceImage, forKey: kCIInputImageKey)
        request.finish(with: (filter?.outputImage)!, context: nil)
    })
    // ... the rest of the code
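The answer stops at "the rest of the code"; presumably the continuation just hands the item to a player and layer, roughly like this sketch (names such as `view` are assumptions):

    // Attach the composed item to a player and display it in a layer.
    let player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect
    playerLayer.frame = view.bounds
    view.layer.addSublayer(playerLayer)
    player.play()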