Cropped video export: different results for the front / back camera

I'm trying to take a video from the camera and export it as a square. I'm testing it on an iPad Air with both the front and back cameras.

Everything works fine when the video is captured with the back camera: the video is cropped exactly the way I want. Unfortunately, it gets cropped incorrectly when I try to export videos taken with the front camera.

The translation (of the transform) seems to be wrong, because I get large black bars at the bottom of the video. Does anyone have an idea of what I'm doing wrong?
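To see what differs between the two cameras, here is a minimal diagnostic sketch (my own addition, not part of the export path, reusing the same self.media.videoURL as the code below) that logs the source track's naturalSize and preferredTransform so the front- and back-camera values can be compared side by side:

    // Diagnostic only: log the source track's naturalSize and preferredTransform.
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:self.media.videoURL options:nil];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    NSLog(@"naturalSize: %@ preferredTransform: %@",
          NSStringFromCGSize(videoTrack.naturalSize),                    // UIKit helper
          NSStringFromCGAffineTransform(videoTrack.preferredTransform)); // UIKit helper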

Note: I'm testing this on iOS 9; I'm not sure whether that could be the source of the problem.

    - (AVComposition *)trimmedAndCroppedVideoComposition
    {
        AVMutableComposition *composition = [AVMutableComposition composition];
        AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:self.media.videoURL
                                                          options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @(YES)}];
        CMTimeRange timeRange = self.media.trimmedVideoRange;
        [composition insertTimeRange:timeRange ofAsset:sourceAsset atTime:kCMTimeZero error:nil];

        AVAssetTrack *track = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        AVMutableCompositionTrack *compositionTrack = [[composition tracksWithMediaType:AVMediaTypeVideo] firstObject];

        // Oriented size of the source video, from the track's preferred transform
        CGSize videoSize = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
        videoSize = CGSizeMake(fabs(videoSize.width), fabs(videoSize.height));
        CGFloat fillScale = MAX(self.renderSize.width / videoSize.width,
                                self.renderSize.height / videoSize.height);

        CGAffineTransform orientationTransform = track.preferredTransform;
        if (orientationTransform.tx == videoSize.width || orientationTransform.tx == videoSize.height) {
            orientationTransform.tx = self.renderSize.width;
        }
        if (orientationTransform.ty == videoSize.width || orientationTransform.ty == videoSize.height) {
            orientationTransform.ty = self.renderSize.width;
        }

        CGAffineTransform t1 = CGAffineTransformScale(CGAffineTransformIdentity, fillScale, fillScale);
        CGAffineTransform t2 = CGAffineTransformConcat(t1, orientationTransform);

        // Crop rect in normalized coordinates
        CGRect cropRect = CGRectMake(0, 0.5, 1, 0.5);
        CGAffineTransform t3 = CGAffineTransformConcat(t2,
            CGAffineTransformMakeTranslation(-cropRect.origin.x * videoSize.width * fillScale,
                                             -cropRect.origin.y * videoSize.height * fillScale));

        compositionTrack.preferredTransform = t3;
        return [composition copy];
    }

    - (void)_exportVideo:(void (^)(void))completion
    {
        // Trimmed and cropped asset
        AVComposition *trimmedAsset = [self trimmedAndCroppedVideoComposition];

        // Input clip
        AVAssetTrack *clipVideoTrack = [[trimmedAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, trimmedAsset.duration);

        // Apply transform
        AVMutableVideoCompositionLayerInstruction *transformer =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
        CGAffineTransform finalTransform = clipVideoTrack.preferredTransform;
        [transformer setTransform:finalTransform atTime:kCMTimeZero];
        instruction.layerInstructions = [NSArray arrayWithObject:transformer];

        // Make it square
        AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
        videoComposition.renderSize = CGSizeMake(self.renderSize.width, self.renderSize.height);
        videoComposition.frameDuration = CMTimeMake(1, 30);
        videoComposition.instructions = [NSArray arrayWithObject:instruction];

        // Export
        self.exporter = [[AVAssetExportSession alloc] initWithAsset:trimmedAsset
                                                         presetName:AVAssetExportPresetMediumQuality];
        self.exporter.videoComposition = videoComposition;
        self.exporter.outputURL = [NSURL fileURLWithPath:outputPath];
        self.exporter.outputFileType = AVFileTypeQuickTimeMovie;
        [self.exporter exportAsynchronouslyWithCompletionHandler:^(void){
            ...
        }];
    }
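In case it helps narrow this down, here is the check I've been running inside trimmedAndCroppedVideoComposition, just before the return, to see where the final transform sends the source frame in render space (videoSize and t3 are the locals computed above):

    // Debugging aid: map the oriented source frame through the final transform
    // to see where it lands relative to the square render area.
    CGRect sourceRect = CGRectMake(0, 0, videoSize.width, videoSize.height);
    CGRect mapped = CGRectApplyAffineTransform(sourceRect, t3);
    NSLog(@"frame in render space: %@ (renderSize: %@)",
          NSStringFromCGRect(mapped),
          NSStringFromCGSize(self.renderSize));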