iOS: Audio missing from the exported video

I am trying to export a recorded video. I succeeded in that, but the audio is missing from the final exported video. So I searched around and added the following code for the audio:

    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:nil];
    }

But after adding the code above I can no longer save the video. I am getting this error:

"session.status 4 error Error Domain = AVFoundationErrorDomain Code = -11841" Operación detenida "UserInfo = 0x17027e140 {NSLocalizedDescription = Operación detenida, NSLocalizedFailureReason = No se pudo componer el video.}"

    - (void)exportDidFinish:(AVAssetExportSession *)session {
        NSLog(@"session.status %ld error %@", (long)session.status, session.error);
    }

Here is the code I use to export the video. Does anyone have any idea how I can achieve my goal of exporting the video with its audio? Thanks!!

    - (void)getVideoOutput {
        exportInProgress = YES;
        NSLog(@"videoOutputFileUrl %@", videoOutputFileUrl);
        AVAsset *videoAsset = [AVAsset assetWithURL:videoOutputFileUrl];
        NSLog(@"videoAsset %@", videoAsset);
        // 1 - Early exit if there's no video file selected
        if (!videoAsset) {
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                            message:@"Please Load a Video Asset First"
                                                           delegate:nil
                                                  cancelButtonTitle:@"OK"
                                                  otherButtonTitles:nil];
            [alert show];
            return;
        }
        // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
        // 3 - Video track
        AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:nil];
        /* getting an error AVAssetExportSessionStatusFailed
        if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
            [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                 atTime:kCMTimeZero
                                  error:nil];
        } */
        // 3.1 - Create AVMutableVideoCompositionInstruction
        AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
        // 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
        AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
        BOOL isVideoAssetPortrait_ = NO;
        CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
        if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
            videoAssetOrientation_ = UIImageOrientationRight;
            isVideoAssetPortrait_ = YES;
        }
        if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
            videoAssetOrientation_ = UIImageOrientationLeft;
            isVideoAssetPortrait_ = YES;
        }
        if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
            videoAssetOrientation_ = UIImageOrientationUp;
        }
        if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
            videoAssetOrientation_ = UIImageOrientationDown;
        }
        [videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
        [videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];
        // 3.3 - Add instructions
        mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];
        AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
        CGSize naturalSize;
        if (isVideoAssetPortrait_) {
            naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
        } else {
            naturalSize = videoAssetTrack.naturalSize;
        }
        float renderWidth, renderHeight;
        renderWidth = naturalSize.width;
        renderHeight = naturalSize.height;
        mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
        mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
        mainCompositionInst.frameDuration = CMTimeMake(1, 30);
        int totalSeconds = (int)CMTimeGetSeconds(videoAsset.duration);
        [self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize videoDuration:totalSeconds];
        // 4 - Get path
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                                [NSString stringWithFormat:@"FinalVideo-%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];
        // 5 - Create exporter
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
        exporter.videoComposition = mainCompositionInst;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            //dispatch_async(dispatch_get_main_queue(), ^{
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                [self exportDidFinish:exporter];
            });
        }];
    }

I am not sure if it helps, but here is how I did it in a project:

  1. Prepare the final composition

     AVMutableComposition *composition = [[AVMutableComposition alloc] init]; 
  2. Prepare the video track

     AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
  3. Prepare the audio track

     AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
  4. Insert the video data from the asset into the video track

     AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
     [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error];
  5. Insert the audio data from the asset into the audio track

     AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
     [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error];
  6. Then you can add some instructions to process your video and/or audio data

  7. Finally, you should be able to export using the following (see the sketch after this list for the steps put together against your code):

     AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
     [exporter exportAsynchronouslyWithCompletionHandler:^{
         /* code when the export is complete */
     }];
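Applied to the code in your question, the key point is that the audio segment needs its own AVMediaTypeAudio composition track; inserting it into videoTrack is most likely what makes the composition invalid and triggers the -11841 error. A minimal sketch of steps 1-5, assuming videoAsset is loaded exactly as in your getVideoOutput method:

    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // Video samples go into a video composition track...
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:nil];

    // ...and audio samples into a separate audio composition track.
    // (The question's code inserted them into videoTrack instead.)
    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:nil];
    }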

Also, check whether the audio was actually recorded.
The first time the camera is used, iOS should ask whether you want to allow microphone access. Check your device's Settings to see whether it is allowed.
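
You can also check (and request) that permission in code. A small sketch using AVAudioSession's requestRecordPermission: (available since iOS 7); the log message is illustrative only:

    #import <AVFoundation/AVFoundation.h>

    // Ask for microphone access; if the user has already decided,
    // the block is called back immediately with the stored answer.
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        if (!granted) {
            NSLog(@"Microphone access denied - recordings will have no audio track");
        }
    }];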

Another option: you can retrieve your raw resource using the Window > Devices window in Xcode.
Select your device and export the app's data to your computer. Then find a recorded asset and open it with VLC, for example. Inspect the streams with Cmd+I to check whether there is both an audio and a video track.
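
The same inspection can be done in code before you export. A quick sketch, assuming videoOutputFileUrl points at the recording as in the question:

    AVAsset *recorded = [AVAsset assetWithURL:videoOutputFileUrl];
    NSLog(@"video tracks: %lu, audio tracks: %lu",
          (unsigned long)[[recorded tracksWithMediaType:AVMediaTypeVideo] count],
          (unsigned long)[[recorded tracksWithMediaType:AVMediaTypeAudio] count]);
    // If the audio track count is already 0 here, the problem is in the
    // recording (e.g. microphone permission), not in the composition/export.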