Thursday, December 20, 2012

Video Manipulation Tutorial on iPhone

“Video overlapping means merging two or more video clips. We can also apply special effects and control the position, duration, etc. of each clip as needed.”

For this kind of video manipulation/editing we can use the AVFoundation framework available in iOS.

There are various ways to manipulate videos:
1. Video trimming
2. Merging a number of videos into one video
3. Adding special effects and transitions to an existing video
4. Manipulating text, images and audio in the frames of a video
5. Mixing sound into a video

In this tutorial we will focus on the AVFoundation framework, so that you can manipulate videos on your own.

What is AVFoundation?
AVFoundation is a core framework. In the iOS architecture, AVFoundation sits between the application layer (the UIKit framework) and the Core Media services layer.

This framework offers the capabilities mentioned below.

Inspect – if you need to know the details of an asset
Play – in more sophisticated ways than are possible with, e.g., MPMoviePlayerController
Compose/edit – use the media you want, where and when you want it
Re-encode/export – create new assets
Enjoy full access to the camera – obtain data from input devices, with the option to write it to storage

Because of these features, AVFoundation is well suited for video manipulation.

This framework can be used for:
1. Media asset management
2. Media editing
3. Audio and video recording
4. Playback
5. Metadata management for media items
6. Track management

Let’s discuss the AVFoundation framework classes we are going to use.

AVAsset – AVAsset is an abstract class in AVFoundation that represents timed media of any kind, such as audio or video. Each asset is a collection of tracks.

AVAssetExportSession – This class transcodes an asset and writes the output in a form specified by a preset, such as AVAssetExportPresetLowQuality, AVAssetExportPresetMediumQuality or AVAssetExportPresetHighestQuality.

AVMutableComposition – AVComposition is used for temporal editing. An AVComposition is a collection of tracks, each presenting media of a specific type (e.g. audio or video) according to its own timeline. AVMutableComposition is the mutable subclass of AVComposition; we can use it to create a new composition from existing assets.
Tracks are fixed in AVComposition, so for track manipulation AVMutableComposition is used.

AVMutableCompositionTrack – Each track is represented by AVCompositionTrack, which includes a media type, a track identifier, etc. AVMutableCompositionTrack is the mutable subclass of AVCompositionTrack; it lets us insert, remove and scale track segments without affecting their lower-level representation.

AVMutableVideoComposition – This class carries the composition instructions; it contains an array of AVMutableVideoCompositionInstruction objects.

AVMutableVideoCompositionInstruction – An AVMutableVideoComposition maintains an array of these instruction objects to perform its composition.

AVMutableVideoCompositionLayerInstruction – Each AVVideoCompositionInstruction in turn contains an array of AVVideoCompositionLayerInstruction objects, one defined for each layer.
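Before we start, here is a minimal sketch of the “inspect” capability showing how AVAsset and AVAssetExportSession fit together; ‘sampleURL’ is a hypothetical local media file URL, not part of the demo project:

// Minimal asset-inspection sketch; 'sampleURL' is a hypothetical media file URL.
AVAsset *asset = [AVAsset assetWithURL:sampleURL];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObjects:@"duration", @"tracks", nil]
                     completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        NSLog(@"Duration: %f s", CMTimeGetSeconds(asset.duration));
        NSLog(@"Video tracks: %lu", (unsigned long)[[asset tracksWithMediaType:AVMediaTypeVideo] count]);
        // Export presets usable with this asset (e.g. AVAssetExportPresetHighestQuality)
        NSLog(@"Presets: %@", [AVAssetExportSession exportPresetsCompatibleWithAsset:asset]);
    }
}];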

Besides AVFoundation, we use the frameworks mentioned below.

MediaPlayer framework -
This framework sits in the application layer (above the UIKit framework) and is used to play audio, video, music, movies, etc.
It provides classes such as MPMoviePlayerController and MPMoviePlayerViewController, which are used to play video content, including streamed video.
In our tutorial we are going to use it to play the overlapped video.
Alternatively, you can use AVPlayer to play the overlapped video if you want a custom player.

AssetsLibrary framework -
This framework is used to access and store the videos and pictures of the Photos application.
We are going to save our overlapped video to the Photos application.
For that we use the ALAssetsLibrary class.

CoreMedia framework -
This is a low-level C interface.
It sits in the Core Services layer, below the application layer (and below the AVFoundation framework).
It provides many of the primitives used in AVFoundation, such as time-related data structures and objects that describe and carry media data.
Here we are going to use the time-related data structures:

CMTime – for arithmetic and comparisons.
CMTimeRange – for containment, unions and intersections.

We can also use other classes from this framework as our requirements vary.
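As a quick illustration, the CoreMedia time operations we rely on look like this (the values here are illustrative only):

// CMTime/CMTimeRange basics; the values are arbitrary examples.
CMTime fiveSeconds = CMTimeMakeWithSeconds(5.0, 600); // 5 s at a timescale of 600
CMTime twoSeconds = CMTimeMakeWithSeconds(2.0, 600);
CMTime sum = CMTimeAdd(fiveSeconds, twoSeconds); // 7 s
if (CMTimeCompare(fiveSeconds, twoSeconds) > 0) {
    NSLog(@"first is longer; total = %f s", CMTimeGetSeconds(sum));
}
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, fiveSeconds); // starts at 0 s, lasts 5 s
NSLog(@"contained: %d", CMTimeRangeContainsTime(range, twoSeconds)); // 1 (YES)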

MobileCoreServices framework -
This framework is also in the Core Services layer, below the application layer (and below the AVFoundation framework).
It defines the low-level types used in Uniform Type Identifiers (UTIs).
Here we use the following:

UTCoreTypes – defines a number of constants; of these we use kUTTypeMovie when selecting media.

I am going to take the merging of two videos as the example for this tutorial.

1. Create a new view-based application project named ‘VideoOverlappingDemo’.
2. Include the frameworks mentioned above.

Common setup -

1. Add three buttons to ViewController.xib as in the figure below.

2. Add this code to the ViewController.h file:

#import <UIKit/UIKit.h>
#import <MobileCoreServices/UTCoreTypes.h>

@interface ViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate>

@property (nonatomic, strong) IBOutlet UIButton *videoOneButton;
@property (nonatomic, strong) IBOutlet UIButton *videoTwoButton;
@property (nonatomic, strong) IBOutlet UIButton *overlapVideoButton;
@property (nonatomic, strong) NSURL *videoOneURL;
@property (nonatomic, strong) NSURL *videoTwoURL;
@property (nonatomic, strong) UIImagePickerController *picker;

- (IBAction)videoOneButtonTapped:(id)sender;
- (IBAction)videoTwoButtonTapped:(id)sender;
- (IBAction)overlapVideoButtonTapped:(id)sender;

@end

3. Implement UIImagePickerControllerDelegate for UIImagePickerController; the header above already declares conformance (UIImagePickerController’s delegate must also conform to UINavigationControllerDelegate).

4. Make connections in the xib for all buttons and button actions.

The tutorial proceeds in three steps:
Step 1 – Get Video From Media Library
Step 2 – Manipulate the Videos and Save the Result
Step 3 – Play the Saved Video

Step 1 – Get Video From Media Library

1. Add the code below to ViewController.m:

#pragma mark - Event Handlers

// isVideoOne is the BOOL declared in the ViewController class extension (see Step 2)
- (IBAction)videoOneButtonTapped:(id)sender {
    isVideoOne = YES;
    [self browseMediaLibraryFromViewController:self withDelegate:self];
}

- (IBAction)videoTwoButtonTapped:(id)sender {
    isVideoOne = NO;
    [self browseMediaLibraryFromViewController:self withDelegate:self];
}

These methods are called when videoOneButton and videoTwoButton are tapped, respectively.

2. Define the method browseMediaLibraryFromViewController:withDelegate: as below:

#pragma mark - Private Methods

- (BOOL)browseMediaLibraryFromViewController:(UIViewController *)controller withDelegate:(id)delegate {
    // Validation
    if (([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum] == NO)
        || (delegate == nil)
        || (controller == nil)) {
        return NO;
    }
    // Create the image picker
    _picker = [[UIImagePickerController alloc] init];
    self.picker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    self.picker.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
    self.picker.allowsEditing = YES;
    self.picker.delegate = self;
    [self presentViewController:self.picker animated:YES completion:nil];
    return YES;
}

In this method we present a UIImagePickerController for the saved photos album with media type movie (kUTTypeMovie).

3. Add the delegate methods for UIImagePickerController as below:

#pragma mark - ImagePicker Delegate Methods

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Dismiss the image picker controller
    [self.picker dismissViewControllerAnimated:YES completion:nil];
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    // Handle movie selection
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        NSLog(@"Matching Success");
        if (isVideoOne) {
            NSLog(@"Video One Loaded");
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video One Loaded"
                                                            message:@"Video One Loaded"
                                                           delegate:nil
                                                  cancelButtonTitle:@"OK"
                                                  otherButtonTitles:nil];
            [alert show];
            // Capture the selected videoOneURL
            self.videoOneURL = [info objectForKey:UIImagePickerControllerMediaURL];
            NSLog(@"videoOneURL = %@", self.videoOneURL);
        } else {
            NSLog(@"Video Two Loaded");
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video Two Loaded"
                                                            message:@"Video Two Loaded"
                                                           delegate:nil
                                                  cancelButtonTitle:@"OK"
                                                  otherButtonTitles:nil];
            [alert show];
            // Capture the selected videoTwoURL
            self.videoTwoURL = [info objectForKey:UIImagePickerControllerMediaURL];
            NSLog(@"videoTwoURL = %@", self.videoTwoURL);
        }
    }
}

This method dismisses the UIImagePickerController and handles the selected movie.

           [info objectForKey:UIImagePickerControllerMediaURL]; 

This line captures the selected video’s URL.

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    // Dismiss the image picker controller
    [self.picker dismissViewControllerAnimated:YES completion:nil];
}

This delegate method dismisses the UIImagePickerController when the picker’s cancel button is tapped.

Step 2 – Manipulate the Videos and Save the Result

1. Add a new file CustomVideoOverlapper, subclassing NSObject, to the project.

2. Add the code below to the CustomVideoOverlapper.h file:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <AssetsLibrary/AssetsLibrary.h>

@interface CustomVideoOverlapper : NSObject {
}

@property (nonatomic, strong) UIViewController *controller;

- (id)initWithController:(UIViewController *)controller;
- (void)videoOverlappingMethodWithVideoOneURL:(NSURL *)videoOneURL andVideoTwoURL:(NSURL *)videoTwoURL;

@end

3. Add this code to CustomVideoOverlapper.m (the @synthesize and init go inside the @implementation; the class extension with the instance variables used later sits above it):

#import "CustomVideoOverlapper.h"
#import "MBProgressHUD.h" // third-party progress HUD shown while exporting

// Instance variables used by the overlapping method below
@interface CustomVideoOverlapper () {
    AVAsset *firstAsset, *secondAsset;
    AVMutableComposition *mixComposition;
    AVMutableVideoComposition *mainComposition;
    NSURL *url;
}
@end

@synthesize controller = _controller;

#pragma mark - Initialisation

- (id)initWithController:(UIViewController *)controller {
    self = [super init];
    if (self) {
        self.controller = controller;
    }
    return self;
}

4. We also need to add the class extension below at the top of the ViewController.m file:

#import "CustomVideoOverlapper.h"@interface ViewController () {    CustomVideoOverlapper *videoOverlapper;    BOOL isVideoOne;}

5. And add the event handler below to the ViewController.m file:

- (IBAction)overlapVideoButtonTapped:(id)sender {
    videoOverlapper = [[CustomVideoOverlapper alloc] initWithController:self];
    // Pass the two video URLs we saved previously to the merging process
    [videoOverlapper videoOverlappingMethodWithVideoOneURL:self.videoOneURL andVideoTwoURL:self.videoTwoURL];
}

This method is called when the overlap button is tapped.

6. Define the method videoOverlappingMethodWithVideoOneURL:andVideoTwoURL: in the CustomVideoOverlapper.m file. The complete video overlapping method is here:

- (void)videoOverlappingMethodWithVideoOneURL:(NSURL *)videoOneURL andVideoTwoURL:(NSURL *)videoTwoURL {
    // Add the progress HUD
    [MBProgressHUD showHUDAddedTo:self.controller.view animated:YES];
    if (videoOneURL != nil && videoTwoURL != nil) {
        // First video
        //firstAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"sample1" ofType:@"mp4"]] options:nil];
        firstAsset = [AVAsset assetWithURL:videoOneURL];
        // Second video
        //secondAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"sample2" ofType:@"mp4"]] options:nil];
        secondAsset = [AVAsset assetWithURL:videoTwoURL];
    }
    if (firstAsset != nil && secondAsset != nil) {
        // Create the AVMutableComposition object
        mixComposition = [[AVMutableComposition alloc] init];
        // Create the first track
        AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                            ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:nil];
        // Create the second track
        AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                             ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                              atTime:kCMTimeZero
                               error:nil];
        // Main instruction layer:
        // create the main instruction object and set its time range
        AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        // The duration of the final video should be the longer of the two videos
        CMTime finalDuration;
        CMTime result; // difference between the two durations (not used further here)
        NSLog(@"values = %f and %f", CMTimeGetSeconds(firstAsset.duration), CMTimeGetSeconds(secondAsset.duration));
        if (CMTimeGetSeconds(firstAsset.duration) == CMTimeGetSeconds(secondAsset.duration)) {
            finalDuration = firstAsset.duration;
        } else if (CMTimeGetSeconds(firstAsset.duration) > CMTimeGetSeconds(secondAsset.duration)) {
            finalDuration = firstAsset.duration;
            result = CMTimeSubtract(firstAsset.duration, secondAsset.duration);
        } else {
            finalDuration = secondAsset.duration;
            result = CMTimeSubtract(secondAsset.duration, firstAsset.duration);
        }
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, finalDuration);
        // Instruction layer for the first track
        AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
        CGAffineTransform scale = CGAffineTransformMakeScale(0.5f, 0.5f);
        CGAffineTransform move = CGAffineTransformMakeTranslation(320, 0);
        [firstlayerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:kCMTimeZero];
        // Instruction layer for the second track
        AVMutableVideoCompositionLayerInstruction *secondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
        CGAffineTransform secondScale = CGAffineTransformMakeScale(1.6f, 1.6f);
        CGAffineTransform secondMove = CGAffineTransformMakeTranslation(0, 0);
        //[secondlayerInstruction setOpacity:0.5 atTime:kCMTimeZero];
        [secondlayerInstruction setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(2.9, 60), CMTimeMakeWithSeconds(8.0, 60))];
        [secondlayerInstruction setTransform:CGAffineTransformConcat(secondScale, secondMove) atTime:kCMTimeZero];
        // Now add firstlayerInstruction and secondlayerInstruction to mainInstruction
        mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction, secondlayerInstruction, nil];
        // Attach the main instruction to the video composition;
        // we can attach multiple instructions to it
        mainComposition = [AVMutableVideoComposition videoComposition];
        mainComposition.instructions = [NSArray arrayWithObjects:mainInstruction, nil];
        mainComposition.frameDuration = CMTimeMake(1, 30);
        mainComposition.renderSize = CGSizeMake(640, 480);
        // Get a path to save the merged video
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:@"mergeVideo.mov"];
        NSFileManager *fileManager = [NSFileManager defaultManager];
        [fileManager removeItemAtPath:myPathDocs error:NULL];
        url = [NSURL fileURLWithPath:myPathDocs];
        NSLog(@"URL:- %@", [url description]);
        // Create the exporter
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
        exporter.videoComposition = mainComposition;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                [self exportDidFinish:exporter];
            });
        }];
    } else {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error!"
                                                        message:@"Videos not selected. Please select videos to merge."
                                                       delegate:self
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
    }
}

In the above code we do the following:

1. Convert the selected video URLs into assets:

AVAsset *firstAsset = [AVAsset assetWithURL:videoOneURL];
AVAsset *secondAsset = [AVAsset assetWithURL:videoTwoURL];

This code turns the two video URLs we obtained when selecting videos from the media library into two separate AVAsset objects.

2. Create AVMutableComposition object.

AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

3. Create an AVMutableCompositionTrack object for each of firstAsset and secondAsset:

AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                    ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:nil];

Set the time range for the track using the method insertTimeRange:ofTrack:atTime:error:. By default an inserted segment keeps its natural duration and playback rate. We can control where each video starts via the atTime: parameter of this method, or we can scale a segment to a different duration using scaleTimeRange:toDuration:, as sketched below.
Do the same for secondAsset.
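If we wanted to force a segment to a different duration, scaleTimeRange:toDuration: changes its playback rate; a hedged sketch (the 10-second target duration is an arbitrary illustration, not part of this demo):

// Compress (or stretch) the inserted segment to exactly 10 s; this changes its playback rate.
[firstTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                toDuration:CMTimeMakeWithSeconds(10.0, 600)];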

4. Now we can add instructions to the composition, such as video size, transitions, effects, etc. Without these we cannot play both videos simultaneously. For this the AVVideoComposition class is provided. An AVMutableVideoComposition maintains an array of instructions, i.e. AVMutableVideoCompositionInstruction objects, to perform its composition, and each AVVideoCompositionInstruction in turn contains an array of AVVideoCompositionLayerInstruction objects defined per layer.

// Main instruction layer:
// create the main instruction object and set its time range
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Instruction layer for the first track
AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
CGAffineTransform scale = CGAffineTransformMakeScale(0.5f, 0.5f);
CGAffineTransform move = CGAffineTransformMakeTranslation(320, 0);
[firstlayerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:kCMTimeZero];
// Instruction layer for the second track
AVMutableVideoCompositionLayerInstruction *secondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
CGAffineTransform secondScale = CGAffineTransformMakeScale(1.6f, 1.6f);
CGAffineTransform secondMove = CGAffineTransformMakeTranslation(0, 0);
//[secondlayerInstruction setOpacity:0.5 atTime:kCMTimeZero];
[secondlayerInstruction setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(2.9, 60), CMTimeMakeWithSeconds(8.0, 60))];
[secondlayerInstruction setTransform:CGAffineTransformConcat(secondScale, secondMove) atTime:kCMTimeZero];
// Now add firstlayerInstruction and secondlayerInstruction to mainInstruction
mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction, secondlayerInstruction, nil];

We created an AVMutableVideoCompositionInstruction object named “mainInstruction”, then created AVMutableVideoCompositionLayerInstruction objects for firstTrack and secondTrack, named “firstlayerInstruction” and “secondlayerInstruction”, as above.
Using CGAffineTransform we assign a translation and a scale to both videos, according to where we want each video placed and how large we want it. The setTransform:atTime: method provided by AVMutableVideoCompositionLayerInstruction sets the CGAffineTransform. In our example we chose the transform values so that the second video occupies the whole screen and the first video sits at the top-right corner.
We then add the AVMutableVideoCompositionLayerInstruction objects, firstlayerInstruction and secondlayerInstruction, to the instruction’s layerInstructions array property.

5. Set the time range of mainInstruction:

// The duration of the final video should be the longer of the two videos
CMTime finalDuration;
CMTime result;
NSLog(@"values = %f and %f", CMTimeGetSeconds(firstAsset.duration), CMTimeGetSeconds(secondAsset.duration));
if (CMTimeGetSeconds(firstAsset.duration) == CMTimeGetSeconds(secondAsset.duration)) {
    finalDuration = firstAsset.duration;
} else if (CMTimeGetSeconds(firstAsset.duration) > CMTimeGetSeconds(secondAsset.duration)) {
    finalDuration = firstAsset.duration;
    result = CMTimeSubtract(firstAsset.duration, secondAsset.duration);
} else {
    finalDuration = secondAsset.duration;
    result = CMTimeSubtract(secondAsset.duration, firstAsset.duration);
}
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, finalDuration);

Set the time range of mainInstruction to the maximum of the two video durations, i.e. max(firstAsset.duration, secondAsset.duration). Since duration is a CMTime structure, we first convert it to seconds for comparison with the CMTimeGetSeconds function from CoreMedia, then branch accordingly: the final mainInstruction duration is the longer of the two assets (if both are equal, either one works).
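As an aside, CoreMedia’s CMTimeMaximum function could replace the manual comparison above; a minimal equivalent sketch (not what the demo code uses):

// Equivalent, shorter form using CMTimeMaximum from CoreMedia.
CMTime finalDuration = CMTimeMaximum(firstAsset.duration, secondAsset.duration);
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, finalDuration);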

6. Create an AVMutableVideoComposition object named “mainComposition” with the videoComposition factory method:

// Attach the main instruction to the video composition;
// we can attach multiple instructions to it
AVMutableVideoComposition *mainComposition = [AVMutableVideoComposition videoComposition];
mainComposition.instructions = [NSArray arrayWithObjects:mainInstruction, nil];
mainComposition.frameDuration = CMTimeMake(1, 30);
mainComposition.renderSize = CGSizeMake(640, 480);

Add the AVMutableVideoCompositionInstruction to the composition’s instructions array property, and also set the frameDuration and renderSize of mainComposition.
We can add multiple AVMutableVideoCompositionInstruction objects to an AVMutableVideoComposition; in our example we added only one. Multiple instruction objects can layer effects such as transitions and fades, but take care that their time ranges do not overlap, as in the sketch below.
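A hedged sketch of two sequential instructions with adjoining, non-overlapping time ranges; ‘instructionA’, ‘instructionB’ and ‘splitTime’ are hypothetical names, not part of the demo project:

// Two AVMutableVideoCompositionInstruction objects covering adjoining ranges.
// 'instructionA', 'instructionB' and the CMTime 'splitTime' are hypothetical.
instructionA.timeRange = CMTimeRangeMake(kCMTimeZero, splitTime);
instructionB.timeRange = CMTimeRangeMake(splitTime, CMTimeSubtract(mixComposition.duration, splitTime));
mainComposition.instructions = [NSArray arrayWithObjects:instructionA, instructionB, nil];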

7. Create a path for saving the final merged video:

// Get the path
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:@"mergeVideo.mov"];
NSFileManager *fileManager = [NSFileManager defaultManager];
[fileManager removeItemAtPath:myPathDocs error:NULL];
url = [NSURL fileURLWithPath:myPathDocs];
NSLog(@"URL:- %@", [url description]);

8. Create an AVAssetExportSession object named “exporter”:

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                   presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainComposition;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        [self exportDidFinish:exporter];
    });
}];

For the exporter, set the asset to the AVMutableComposition ‘mixComposition’ and the videoComposition to ‘mainComposition’, the AVMutableVideoComposition. Set the outputURL to the path we created in step 7, and set the exporter’s outputFileType to AVFileTypeQuickTimeMovie. We then start the export by calling exportAsynchronouslyWithCompletionHandler:.

9. exportAsynchronouslyWithCompletionHandler: – starts asynchronous execution of the export session and returns immediately. The session’s status indicates its terminal state, and if a failure occurs, its error describes the problem.
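A more defensive variant of the completion handler would branch on each terminal status; this is only a sketch, while the demo simply forwards to exportDidFinish::

[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        switch (exporter.status) {
            case AVAssetExportSessionStatusCompleted:
                [self exportDidFinish:exporter];
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", exporter.error); // 'error' describes the problem
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export cancelled");
                break;
            default:
                break;
        }
    });
}];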

The completion handler calls exportDidFinish:. The implementation of this method:

- (void)exportDidFinish:(AVAssetExportSession *)session {
    NSLog(@"exportDidFinish");
    // Remove the progress HUD
    [MBProgressHUD hideHUDForView:self.controller.view animated:YES];
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (error) {
                        NSLog(@"write video to AssetsLibrary failed: %@", error);
                        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                                        message:@"Video Saving Failed"
                                                                       delegate:nil
                                                              cancelButtonTitle:@"OK"
                                                              otherButtonTitles:nil];
                        [alert show];
                    } else {
                        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video Saved"
                                                                        message:@"Saved To Photo Album"
                                                                       delegate:self
                                                              cancelButtonTitle:@"OK"
                                                              otherButtonTitles:nil];
                        alert.tag = 1000;
                        [alert show];
                    }
                });
            }];
        } else {
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error!"
                                                            message:@"Video is not compatible with the Saved Photos album"
                                                           delegate:self
                                                  cancelButtonTitle:@"OK"
                                                  otherButtonTitles:nil];
            [alert show];
        }
    }
}

In this implementation we check the exporter’s status; if it is AVAssetExportSessionStatusCompleted, we write the video at the output URL set above to the photo album (after checking compatibility) and show an alert reporting whether the save succeeded.

When the “Video Saved” alert is dismissed, the UIAlertView delegate method below runs; it first asks whether to play the video immediately, and if we answer “Yes” it creates the custom movie player:

#pragma mark - UIAlertView Delegate Method

- (void)alertView:(UIAlertView *)alertView clickedButtonAtIndex:(NSInteger)buttonIndex {
    if (alertView.tag == 1000) {
        if (buttonIndex == 0) {
            UIAlertView *playAlert = [[UIAlertView alloc] initWithTitle:@"Play Video"
                                                                message:@"Do you want to play it immediately?"
                                                               delegate:self
                                                      cancelButtonTitle:@"Yes"
                                                      otherButtonTitles:@"Later", nil];
            playAlert.tag = 2000;
            [playAlert show];
        }
    } else if (alertView.tag == 2000) {
        if (buttonIndex == 0) {
            CustomMoviePlayer *moviePlayer = [[CustomMoviePlayer alloc] initWithController:self.controller andURLForMovie:url];
            [moviePlayer playVideo];
        }
    }
}

Now we are ready to play the merged video. We can play it with AVPlayer from AVFoundation, or with the application-layer MediaPlayer framework.
In our example we use the MPMoviePlayerViewController class of MediaPlayer to play the saved video.
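For reference, a minimal AVPlayer-based sketch of the alternative mentioned above, assuming the same controller and url variables (AVPlayer draws no playback controls of its own, so a real custom player would add its own UI):

// Requires <AVFoundation/AVFoundation.h>; plays the movie in a bare layer.
AVPlayer *player = [AVPlayer playerWithURL:url];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.controller.view.bounds;
[self.controller.view.layer addSublayer:playerLayer];
[player play];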

Step 3 – Play the Saved Video Using MPMoviePlayerViewController

1. Create a file CustomMoviePlayer and include the following code in the CustomMoviePlayer.h file:

#import <Foundation/Foundation.h>
#import <MediaPlayer/MediaPlayer.h>

@interface CustomMoviePlayer : NSObject

@property (nonatomic, strong) UIViewController *controller;
@property (nonatomic, strong) NSURL *url;

- (id)initWithController:(UIViewController *)controller andURLForMovie:(NSURL *)url;
- (void)playVideo;

@end

2. Include the code below in the CustomMoviePlayer.m file:

@synthesize url = _url;
@synthesize controller = _controller;

#pragma mark - Initialisation

- (id)initWithController:(UIViewController *)controller andURLForMovie:(NSURL *)url {
    self = [super init];
    if (self) {
        self.controller = controller;
        self.url = url;
    }
    return self;
}

3. Code for playing the video:

#pragma mark - Public Method

- (void)playVideo {
    MPMoviePlayerViewController *theMovie = [[MPMoviePlayerViewController alloc] initWithContentURL:self.url];
    [self.controller presentMoviePlayerViewControllerAnimated:theMovie];
    // Register for the playback-finished notification; it is posted by the
    // MPMoviePlayerController (theMovie.moviePlayer), not by the view controller
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(myMovieFinishedCallback:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:theMovie.moviePlayer];
}

We create an MPMoviePlayerViewController object with the URL of our saved video and present it from our own controller with presentMoviePlayerViewControllerAnimated:.
Here we also register for the playback notification, adding a callback, myMovieFinishedCallback:, that executes once the movie finishes playing. The video plays at this point.

4. The myMovieFinishedCallback: implementation:

#pragma mark - Notification

- (void)myMovieFinishedCallback:(NSNotification *)notification {
    [self.controller dismissMoviePlayerViewControllerAnimated];
    MPMoviePlayerController *moviePlayer = [notification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:moviePlayer];
}

Here we dismiss the movie player and remove the registered observer.

After merging the two videos => finalVideo

Complete source code is available here – VideoOverlappingDemo


Source: mobisoftinfotech.com
