Overlaying an image / text on a video in Swift

I am working on overlaying an image onto a video to get a watermark effect, using Swift. I am using AVFoundation for this, but somehow I am not succeeding.

Below is my code for the image / text overlay:

    let path = NSBundle.mainBundle().pathForResource("sample_movie", ofType: "mp4")
    let fileURL = NSURL(fileURLWithPath: path!)

    let composition = AVMutableComposition()
    var vidAsset = AVURLAsset(URL: fileURL, options: nil)

    // get video track
    let vtrack = vidAsset.tracksWithMediaType(AVMediaTypeVideo)
    let videoTrack: AVAssetTrack = vtrack[0] as! AVAssetTrack
    let vid_duration = videoTrack.timeRange.duration
    let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)

    var error: NSError?
    let compositionvideoTrack: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
    compositionvideoTrack.insertTimeRange(vid_timerange, ofTrack: videoTrack, atTime: kCMTimeZero, error: &error)
    compositionvideoTrack.preferredTransform = videoTrack.preferredTransform

    // watermark effect
    let size = videoTrack.naturalSize

    let imglogo = UIImage(named: "image.png")
    let imglayer = CALayer()
    imglayer.contents = imglogo?.CGImage
    imglayer.frame = CGRectMake(5, 5, 100, 100)
    imglayer.opacity = 0.6

    // create text layer
    let titleLayer = CATextLayer()
    titleLayer.backgroundColor = UIColor.whiteColor().CGColor
    titleLayer.string = "Dummy text"
    titleLayer.font = UIFont(name: "Helvetica", size: 28)
    titleLayer.shadowOpacity = 0.5
    titleLayer.alignmentMode = kCAAlignmentCenter
    titleLayer.frame = CGRectMake(0, 50, size.width, size.height / 6)

    let videolayer = CALayer()
    videolayer.frame = CGRectMake(0, 0, size.width, size.height)

    let parentlayer = CALayer()
    parentlayer.frame = CGRectMake(0, 0, size.width, size.height)
    parentlayer.addSublayer(videolayer)
    parentlayer.addSublayer(imglayer)
    parentlayer.addSublayer(titleLayer)

    let layercomposition = AVMutableVideoComposition()
    layercomposition.frameDuration = CMTimeMake(1, 30)
    layercomposition.renderSize = size
    layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videolayer, inLayer: parentlayer)

    // instruction for watermark
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
    let videotrack = composition.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
    let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)
    instruction.layerInstructions = NSArray(object: layerinstruction) as [AnyObject]
    layercomposition.instructions = NSArray(object: instruction) as [AnyObject]

    // create new file to receive data
    let dirPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let docsDir: AnyObject = dirPaths[0]
    let movieFilePath = docsDir.stringByAppendingPathComponent("result.mov")
    let movieDestinationUrl = NSURL(fileURLWithPath: movieFilePath)

    // use AVAssetExportSession to export the video
    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
    assetExport.outputFileType = AVFileTypeQuickTimeMovie
    assetExport.outputURL = movieDestinationUrl
    assetExport.exportAsynchronouslyWithCompletionHandler({
        switch assetExport.status {
        case AVAssetExportSessionStatus.Failed:
            println("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            println("cancelled \(assetExport.error)")
        default:
            println("Movie complete")
            // play video
            NSOperationQueue.mainQueue().addOperationWithBlock({ () -> Void in
                self.playVideo(movieDestinationUrl!)
            })
        }
    })

With this code, the overlay never appears in the exported video. I don't know what I'm doing wrong.

Questions:

  • Is anything missing from this code, or is there a problem with it?
  • Does this code work only with recorded videos, or with all videos, including ones picked from the gallery?

The code provided by @El Capitán would work. The only thing missing is:

  assetExport.videoComposition = layercomposition 

You can add this right after instantiating the AVAssetExportSession.
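Putting the fix in context, a minimal sketch of the export setup (reusing the `composition`, `layercomposition` and `movieDestinationUrl` built in the question's code) would look like this:

```swift
// use AVAssetExportSession to export the video
let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
assetExport.videoComposition = layercomposition  // the missing line: attach the layer composition
assetExport.outputFileType = AVFileTypeQuickTimeMovie
assetExport.outputURL = movieDestinationUrl
assetExport.exportAsynchronouslyWithCompletionHandler({
    // check assetExport.status here as in the original code
})
```

Without `videoComposition` set, the export session simply copies the source video track, so the Core Animation layers are never rendered into the output.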

NOTE: the code originally provided only exports the video track, not the audio track. If you need the audio track, you can add something like this after configuring the video track composition:

    // audioTracks = vidAsset.tracksWithMediaType(AVMediaTypeAudio)
    let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
    for audioTrack in audioTracks {
        try! compositionAudioTrack.insertTimeRange(audioTrack.timeRange, ofTrack: audioTrack, atTime: kCMTimeZero)
    }

From what I can see in your code, you are not adding parentlayer to the screen.

You create a CALayer() and add videolayer, imglayer and titleLayer to that new layer, but you never add it to the screen:

 yourView.layer.addSublayer(parentlayer) 

Hope this helps.

@Rey Hernandez this helped me a lot! In case anyone wants further clarification on how to add an audio asset along with the video, here is the code to combine them:

    let vtrack = vidAsset.tracksWithMediaType(AVMediaTypeVideo)
    let videoTrack: AVAssetTrack = vtrack[0]
    let vid_duration = videoTrack.timeRange.duration
    let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)

    let atrack = vidAsset.tracksWithMediaType(AVMediaTypeAudio)
    let audioTrack: AVAssetTrack = atrack[0]
    let audio_duration = audioTrack.timeRange.duration
    let audio_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)

    do {
        let compositionvideoTrack: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
        try compositionvideoTrack.insertTimeRange(vid_timerange, ofTrack: videoTrack, atTime: kCMTimeZero)
        compositionvideoTrack.preferredTransform = videoTrack.preferredTransform

        let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
        try compositionAudioTrack.insertTimeRange(audio_timerange, ofTrack: audioTrack, atTime: kCMTimeZero)
    } catch {
        print(error)
    }