iOS camera engine with Vine-like tap to record, animatable filters, slow motion, segments editing


SCRecorder

A Vine/Instagram-like audio/video recorder and filter framework written in Objective-C.

Here is a short list of the cool things you can do:

  • Record multiple video segments
  • Zoom/Focus easily
  • Remove any record segment that you don't want
  • Display the result in a convenient video player
  • Save the record session for later somewhere using a serializable NSDictionary (works in NSUserDefaults)
  • Add a configurable and animatable video filter using Core Image
  • Add a UIView as overlay, so you can render anything you want on top of your video
  • Merge and export the video using fine tunings that you choose

Examples for iOS are provided.

Want an easy way to create filters for this project? Check out https://github.com/rFlex/CoreImageShop

Frameworks needed:

  • CoreVideo
  • AudioToolbox
  • GLKit

Podfile

If you are using CocoaPods, you can use this project with the following Podfile:

platform :ios, '7.0'
pod 'SCRecorder'

Manual install

Drag and drop SCRecorder.xcodeproj into your project. Then, in your target's Build Phases, add libSCRecorder.a as a dependency in the "Link Binary With Libraries" section (as done in the example).

Swift

To use the project from Swift, follow either the Podfile or the manual install instructions (both work with Swift). Then, to make SCRecorder accessible from Swift, add the following line to your bridging header:

#import <SCRecorder/SCRecorder.h>

Easy and quick

SCRecorder is the main class that connects the inputs and outputs together. It processes the audio and video buffers and appends them to an SCRecordSession.

// Create the recorder
SCRecorder *recorder = [SCRecorder recorder]; // You can also use +[SCRecorder sharedRecorder]
	
// Start running the flow of buffers
if (![recorder startRunning]) {
	NSLog(@"Something wrong there: %@", recorder.error);
}

// Create a new session and set it to the recorder
recorder.session = [SCRecordSession recordSession];

// Begin appending video/audio buffers to the session
[recorder record];

// Stop appending video/audio buffers to the session
[recorder pause];

Configuring the recorder

You can configure the input device settings (frame rate of the video, whether the flash should be enabled, etc.) directly on the SCRecorder.

// Set the AVCaptureSessionPreset for the underlying AVCaptureSession.
recorder.captureSessionPreset = AVCaptureSessionPresetHigh;

// Set the video device to use
recorder.device = AVCaptureDevicePositionFront;

// Set the maximum record duration
recorder.maxRecordDuration = CMTimeMake(10, 1);

// Listen to the messages SCRecorder can send
recorder.delegate = self;

You can configure the video, audio and photo output settings through their configuration instances (SCVideoConfiguration, SCAudioConfiguration, SCPhotoConfiguration), which you can access like this:

// Get the video configuration object
SCVideoConfiguration *video = recorder.videoConfiguration;

// Whether the video should be enabled or not
video.enabled = YES;
// The bitrate of the video output
video.bitrate = 2000000; // 2Mbit/s
// Size of the video output
video.size = CGSizeMake(1280, 720);
// Scaling mode to use if the output aspect ratio is different from the input one
video.scalingMode = AVVideoScalingModeResizeAspectFill;
// The timescale ratio to use. Higher than 1 makes a slow motion, between 0 and 1 makes a timelapse effect
video.timeScale = 1;
// Whether the output video size should be inferred so it creates a square video
video.sizeAsSquare = NO;
// The filter to apply to each output video buffer (this does not affect the presentation layer)
video.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"];

// Get the audio configuration object
SCAudioConfiguration *audio = recorder.audioConfiguration;

// Whether the audio should be enabled or not
audio.enabled = YES;
// The bitrate of the audio output
audio.bitrate = 128000; // 128kbit/s
// Number of audio output channels
audio.channelsCount = 1; // Mono output
// The sample rate of the audio output
audio.sampleRate = 0; // Use the same sample rate as the input
// The format of the audio output
audio.format = kAudioFormatMPEG4AAC; // AAC

// Get the photo configuration object
SCPhotoConfiguration *photo = recorder.photoConfiguration;
photo.enabled = NO;

Playing back your recording

SCRecorder provides two easy classes to play a video/audio asset: SCPlayer and SCVideoPlayerView.

SCPlayer is a subclass of AVPlayer that adds some methods to make it easier to use. Plus, it also adds the ability to use a filter renderer, to apply a live filter on a video.

SCRecordSession *recordSession = ... // Some instance of a record session
	
// Create an instance of SCPlayer
SCPlayer *player = [SCPlayer player];
	
// Set the current playerItem using an asset representing the segments
// of an SCRecordSession
[player setItemByAsset:recordSession.assetRepresentingSegments];
	
UIView *view = ... // Some view that will get the video
	
// Create and add an AVPlayerLayer
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = view.bounds;
[view.layer addSublayer:playerLayer];

// Start playing the asset and render it into the view
[player play];
	
// Render the video directly through a filter
SCFilterImageView *filterView = [[SCFilterImageView alloc] initWithFrame:view.bounds];
filterView.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"];
	
player.SCImageView = filterView;
	
[view addSubview:filterView];

SCVideoPlayerView is a subclass of UIView that holds an SCPlayer. The video buffers are rendered directly in this view. It removes the need to handle the creation of an AVPlayerLayer and makes it really easy to play a video in your app.

SCRecordSession *recordSession = ... // Some instance of a record session
	
SCVideoPlayerView *playerView = ... // Your instance somewhere
	
// Set the current playerItem using an asset representing the segments
// of an SCRecordSession
[playerView.player setItemByAsset:recordSession.assetRepresentingSegments];
	
// Start playing the asset and render it into the view
[playerView.player play];

Editing your recording

SCRecordSession gets the video and audio buffers from the SCRecorder and appends them to an SCRecordSessionSegment. An SCRecordSessionSegment is just a continuous file, really. When calling [SCRecorder pause], the SCRecorder asks the SCRecordSession to asynchronously complete its current record segment. Once done, the segment is added to the [SCRecordSession segments] array. SCRecorder also has [SCRecorder pause:] with a completion handler, which is called once the SCRecordSession has completed and added the record segment to the segments array.
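For instance, a minimal sketch of reacting to a completed segment (this assumes, per the paragraph above, that the pause: completion handler runs after the segment has been appended, and that SCRecordSessionSegment exposes its file URL through a url property):

```objectivec
// Pause and wait until the current record segment has been
// completed and appended to the session's segments array.
[recorder pause:^{
    SCRecordSessionSegment *segment = [recorder.session.segments lastObject];
    NSLog(@"Completed segment %@ (%.2f seconds)",
          segment.url, CMTimeGetSeconds(segment.duration));
}];
```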

You can add/remove segments easily in a SCRecordSession. You can also merge all the segments into one file.

SCRecordSession *recordSession = ... // An SCRecordSession instance

// Remove the last segment
[recordSession removeLastSegment];

// Add a segment at the end
[recordSession addSegment:[SCRecordSessionSegment segmentWithURL:anURL info:nil]];

// Get duration of the whole record session
CMTime duration = recordSession.duration;

// Get a playable asset representing all the record segments
AVAsset *asset = recordSession.assetRepresentingSegments;

// Get some information about a particular segment
SCRecordSessionSegment *segment = [recordSession.segments firstObject];

// Get thumbnail of this segment
UIImage *thumbnail = segment.thumbnail;

// Get duration of this segment
CMTime duration = segment.duration;

Exporting your recording

You basically have two ways of exporting an SCRecordSession.

First, you can use [SCRecordSession mergeSegmentsUsingPreset:completionHandler:]. This method takes an AVAssetExportSession preset as a parameter and uses an AVAssetExportSession under the hood. Although this is the fastest and easiest way of merging the record segments, it provides no control over the output settings.

// Merge all the segments into one file using an AVAssetExportSession
[recordSession mergeSegmentsUsingPreset:AVAssetExportPresetHighestQuality completionHandler:^(NSURL *url, NSError *error) {
	if (error == nil) {
	   	// Easily save to camera roll
		[url saveToCameraRollWithCompletion:^(NSString *path, NSError *saveError) {
		     
		}];
	} else {
		NSLog(@"Bad things happened: %@", error);
	}
}];

You can also use SCAssetExportSession, which is the SCRecorder counterpart of AVAssetExportSession. It provides many more options, such as configuring the bitrate and the output video size, adding a filter, adding a watermark... This comes at the cost of a little more configuration and more processing time. Like SCRecorder, SCAssetExportSession also holds an SCVideoConfiguration and an SCAudioConfiguration instance (ain't that amazing?).

AVAsset *asset = session.assetRepresentingSegments;
SCAssetExportSession *assetExportSession = [[SCAssetExportSession alloc] initWithAsset:asset];
assetExportSession.outputUrl = recordSession.outputUrl;
assetExportSession.outputFileType = AVFileTypeMPEG4;
assetExportSession.videoConfiguration.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"];
assetExportSession.videoConfiguration.preset = SCPresetHighestQuality;
assetExportSession.audioConfiguration.preset = SCPresetMediumQuality;
[assetExportSession exportAsynchronouslyWithCompletionHandler: ^{
	if (assetExportSession.error == nil) {
		// We have our video and/or audio file
	} else {
		// Something bad happened
	}
}];

Creating/manipulating filters

SCRecorder comes with a filter API built on top of Core Image. SCFilter is the class that wraps a CIFilter. Each filter can also have a chain of sub filters. When processing an image through a filter, all its sub filters process the image first, then the filter itself. An SCFilter can be saved directly to a file and restored from it.

SCFilter *blackAndWhite = [SCFilter filterWithCIFilterName:@"CIColorControls"];
[blackAndWhite setParameterValue:@0 forKey:@"inputSaturation"];

SCFilter *exposure = [SCFilter filterWithCIFilterName:@"CIExposureAdjust"];
[exposure setParameterValue:@0.7 forKey:@"inputEV"];

// Manually creating a filter chain
SCFilter *filter = [SCFilter emptyFilter];
[filter addSubFilter:blackAndWhite];
[filter addSubFilter:exposure];

SCVideoConfiguration *videoConfiguration = ... // A video configuration

videoConfiguration.filter = blackAndWhite; // Will render a black and white video
videoConfiguration.filter = exposure; // Will render a video with less exposure
videoConfiguration.filter = filter; // Will render a video with both black and white and less exposure

// Saving to a file
NSError *error = nil;
[filter writeToFile:[NSURL fileURLWithPath:@"some-url.cisf"] error:&error];
if (error == nil) {

}

// Restoring the filter group
SCFilter *restoredFilter = [SCFilter filterWithContentsOfUrl:[NSURL fileURLWithPath:@"some-url.cisf"]];

// Processing a UIImage through the filter
UIImage *myImage = ... // Some image
UIImage *processedImage = [restoredFilter UIImageByProcessingUIImage:myImage];

// Save it to the photo library
[processedImage saveToCameraRollWithCompletion: ^(NSError *error) {

}];

If you want to create your own filters easily, you can also check out CoreImageShop, a Mac application that generates serialized SCFilters directly usable by the filter classes in this project.

Using the filters

An SCFilter can be used either in a view, to render a filtered image in real time, or in a processing object, to render the filter to a file. You can use an SCFilter in any of the following classes: SCFilterImageView and SCSwipeableFilterView for live rendering, and SCVideoConfiguration (through its filter property, used by both SCRecorder and SCAssetExportSession) for recording and exporting.

Animating the filters

Parameters of SCFilter can be animated. You can, for instance, progressively blur your video. To do so, you need to add an animation to an SCFilter. Animations are represented by SCFilterAnimation, a model object that describes a ramp from a start value to an end value, applied at a given start time and for a given duration.

Some examples:

// Fade from completely blurred to sharp at the beginning of the video
SCFilter *blurFadeFilter = [SCFilter filterWithCIFilterName:@"CIGaussianBlur"];
[blurFadeFilter addAnimationForParameterKey:kCIInputRadiusKey startValue:@100 endValue:@0 startTime:0 duration:0.5];

// Make the video instantly become black and white at 2 seconds for 1 second
SCFilter *blackAndWhite = [SCFilter filterWithCIFilterName:@"CIColorControls"];
[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@1 endValue:@1 startTime:0 duration:2];
[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@0 endValue:@0 startTime:2 duration:1];
[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@1 endValue:@1 startTime:3 duration:1];

Some details about the other provided classes

SCRecorderToolsView

Configurable view that can hold an SCRecorder instance and handles tap-to-focus and pinch-to-zoom.
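A minimal wiring sketch, assuming the view exposes the attached SCRecorder through a recorder property (the exact property name is an assumption):

```objectivec
// Lay the tools view over the camera preview so the user
// can tap to focus and pinch to zoom.
SCRecorderToolsView *toolsView = [[SCRecorderToolsView alloc] initWithFrame:previewView.bounds];
toolsView.recorder = recorder; // assumed property name
[previewView addSubview:toolsView];
```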

SCImageView

Class that can render a CIImage through EAGL, Metal or Core Graphics. This class is intended for live rendering of CIImages. If you want to alter the rendering when subclassing, you can override renderedCIImageInRect:.

SCFilterImageView

A subclass of SCImageView that can have a filter. It renders the input CIImage using the SCFilter, if there is any.

SCSwipeableFilterView

A subclass of SCImageView that has a scroll view and a list of SCFilters. It lets the user scroll between the filters to choose one. The selected filter can be retrieved using -[SCSwipeableFilterView selectedFilter]. This basically works the same way as the Snapchat composition page.
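A short sketch of offering a few swipeable filters (the filters array property name is an assumption; selectedFilter is documented above):

```objectivec
SCSwipeableFilterView *swipeableView = [[SCSwipeableFilterView alloc] initWithFrame:view.bounds];
// The filters the user can swipe between (property name assumed).
swipeableView.filters = @[
    [SCFilter emptyFilter], // no filter
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectNoir"]
];
[view addSubview:swipeableView];

// Later, apply the user's choice to an export:
assetExportSession.videoConfiguration.filter = swipeableView.selectedFilter;
```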

SCPlayer

Player based on the Apple AVPlayer. It adds some convenience methods and the possibility to have a CIImageRenderer that will be used to render the video image buffers. You can combine this class with a CIImageRenderer to render a live filter on a video.

SCVideoPlayerView

A view that renders an SCPlayer easily. It supports tap to play/pause. By default, it holds an SCPlayer instance itself and shares the same lifecycle as that SCPlayer. You can disable this feature by calling +[SCVideoPlayerView setAutoCreatePlayerWhenNeeded:NO].

Comments
  • Cannot get watermarking to work

    Couldn't find any documentation or a working example of watermarking videos. The below is what I could piece together. Can you tell me what is wrong with this? The video gets exported, but the watermark never appears. The watermark is a valid .png file ([email protected]) in the app bundle.

    AVAsset *asset = self.recordSession.assetRepresentingSegments;
        NSURL *tempUrl = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"videomarked.mov"]];
    
        SCAssetExportSession *assetExportSession = [[SCAssetExportSession alloc] initWithAsset:asset];
        assetExportSession.outputUrl = tempUrl;
        assetExportSession.outputFileType = AVFileTypeMPEG4;
        assetExportSession.videoConfiguration.preset = SCPresetMediumQuality;
        assetExportSession.videoConfiguration.watermarkImage = [UIImage imageNamed:@"my_watermark"];
        assetExportSession.videoConfiguration.watermarkFrame = CGRectMake(0, 0, 320, 480);
        [assetExportSession exportAsynchronouslyWithCompletionHandler: ^{
            if (assetExportSession.error == nil) {
                // I get success here, but the video does not have a watermark
    
            } else {
                // Something bad happened
    
            }
        }];
    

    I've confirmed that generatedWatermarkImage is valid in SCAssetExportSession:_buildWatermarkFilterForVideoSize

    opened by jln19 17
  • Unable to allocate pixelBuffer: -6661 in SCAssetExportSession

    I am exporting videos using SCAssetExportSession and applying a filterGroup and I get this error intermittently. I am not sure what could be causing it.

    Looking up error -6661 in the docs it is: kCVReturnInvalidArgument.

    The underlying export session error is:

    Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x16372100 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x1634cb00 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780)}

    Does anyone have any idea why this could be happening?

    My code is the following:

    class func export(recordSession: SCRecordSession,
            withFilter filter: SCFilterGroup?, completion:(url: NSURL!, error : NSError?) -> Void) {
                let asset = recordSession.assetRepresentingRecordSegments()
                var exportSession = SCAssetExportSession(asset: asset)
                var exportFilter: SCFilterGroup = SCFilterGroup.filterGroupWithContentsOfURL(NSBundle.mainBundle().URLForResource("Plain", withExtension: "cisf"), groupName: "Plain")
                if let _filter = filter {
                    exportFilter = _filter
                }
                exportSession.filterGroup = exportFilter
                exportSession.sessionPreset = SCAssetExportSessionPresetHighestQuality
                exportSession.outputUrl = recordSession.outputUrl
                exportSession.outputFileType = AVFileTypeMPEG4
                exportSession.keepVideoSize = true
                exportSession.exportAsynchronouslyWithCompletionHandler{
                    completion(url: exportSession.outputUrl, error: exportSession.error)
                }
        }
    
    opened by mjgaylord 14
  • Easy SCImageView Implementation resulting in black screen Swift

    Really simple implementation using your tutorial results in a black screen.

    self.moviePlayer = SCPlayer()
    self.moviePlayer.setItemByAsset(recordSession.assetRepresentingSegments())
    var playerLayer = AVPlayerLayer(player: moviePlayer)
    playerLayer.frame.origin = CGPoint(x: 0,y: 0)
    playerLayer.frame.size = self.mainview.frame.size
    self.mainview.layer.addSublayer(playerLayer)
    moviePlayer.loopEnabled = true
    moviePlayer.play()
    

    This works perfectly fine but once I add

    var a = SCImageView(frame: CGRect(origin: CGPoint(x: 0, y: 0), size: self.mainview.frame.size))
    a.filter = SCFilter(CIFilter: CIFilter(name: "CIPhotoEffectInstant"))
    moviePlayer.CIImageRenderer = a
    self.mainview.addSubview(a)
    

    the mainview turns black. Any Idea?

    opened by jjeon5 13
  • AVAssetWriterInput error when saving to device using SVAssetExportSession

    First of all, thank you considerably for this library, including the Xamarin bindings ;)

    I am basically trying to reproduce parts of the examples you have provided in Xamarin.iOS. However, I get an error when trying to save a video with a filter to my library.

    I get the following error when using the ExportAsynchronously method of the export session:

    MonoTouch.Foundation.MonoTouchException: Objective-C exception thrown.  Name: NSInternalInconsistencyException Reason: *** -[AVAssetWriterInput requestMediaDataWhenReadyOnQueue:usingBlock:] Cannot call method when status is 0
    

    Any help would be greatly appreciated.

    opened by jp-src 13
  • Accessing the AVCaptureSession on separate queue

    There doesn't seem to be any pattern with how you access the capture session on your _dispatchQueue.

    For example, on openSession:, you do this on the current queue:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
     _beginSessionConfigurationCount = 0;
     _captureSession = session;
    

    But then later on you do:

    if (!_captureSession.isRunning) {
         dispatch_async(_dispatchQueue, ^{
              [_captureSession startRunning];
    

    Why are you accessing the capture session from different queues? At first I ignored it thinking it was harmless, but I noticed that many times when I tried using the recorder, the entire iOS media server would crash, and this notification would be posted:

    AVAudioSessionMediaServicesWereResetNotification

    I forked your repo, and moved all access of the capture session to its respective queue, and I no longer see this issue.

    On a separate note, you don't seem to be handling any of the error notifications, such as:

    AVCaptureSessionRuntimeErrorNotification
    AVCaptureSessionWasInterruptedNotification
    AVAudioSessionMediaServicesWereResetNotification
    AVAudioSessionMediaServicesWereLostNotification
    
    opened by moughxyz 13
  • Recording square video

    I'm having trouble configuring the recorder to capture and export video cropped to a square aspect ratio. Without expertise in AVFoundation I am having a hard time understanding whether this is currently possible to do. Is this currently a supported feature for iOS?

    I can see that Issue #1 relates to this question but it was closed 7 months ago and the codebase has changed since that resolution. Specifically, it seems that the functionality associated with useInputFormatTypeAsOutputType has been rewritten.

    I have tried to configure the recorder using the following methods without success:

    • camera.videoEncoder.outputVideoSize
    • camera.sessionPreset
    • [camera setActiveFormatThatSupportsFrameRate:width:andHeight:error:]

    Any information on this matter would be greatly appreciated.

    opened by sethroot 12
  • Setting SCImageView on Player stops playback.

    I've been trying to implement filters, but I am having several issues. The code below should work, but there is 0 playback.

        [_playerView setFrame:[[self containerView ] bounds]];
        [[ _playerView player] setItemByAsset:[[self recordSession] assetRepresentingSegments]];
        [[ _playerView playerLayer] setFrame:[[self containerView ] bounds]];
        [[ _playerView playerLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    
        [[ _playerView player] setDelegate:self];
        [[ _playerView player] setLoopEnabled:YES];
        [[ _playerView player] beginSendingPlayMessages];
    
        [[ _playerView player] play];
    
        SCFilterImageView *filterView = [[SCFilterImageView alloc] initWithFrame:[[self containerView ] bounds]];
        filterView.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"];
     [[_playerView player] setSCImageView:filterView];
    
    
    
    

    when I remove [[_playerView player] setSCImageView:filterView];

    The video plays...

    Anyone have any luck on gettings Filters working with SCRecorder?

    opened by MMasterson 11
  • Way to generate "edit list"-free composition

    I love how SCRecorder provides easy management of different segments. However, I think that because of how Apple generates MP4s from mutiple segments that it results in an MP4 that has edit lists that make is not play well with other video processing tools.

    We have been uploading our videos to a server and then trying to convert them to HTTP Live Streaming with ffmpeg. The problem is that it always complains about "multiple edit list entries, a/v desync might occur". I have confirmed that if the SCRecorder output is from a single segment that it doesn't have this characteristic. It only occurs when there are multiple segments.

    I realize this is more of a problem of ffmpeg and the other tools that process these videos. However, do you know if there is a way to make the AVAssetExportSession for the AVMutableComposition not generate MP4s with these edit lists, but instead somehow just makes it like a single segment?

    opened by jpswensen 11
  • Issue when saving videos

    Hey, thanks for the great library. We are using it in something really cool, and i can't wait to show you!

    So, to the problem at hand. When i'm using the exact same code as the example on iOS, i often encounter a writing error on the AVAssetWriter in SCAudioVideoRecorder.m

    *** Terminating app due to uncaught exception 
    'NSInternalInconsistencyException', reason: 
    '*** -[AVAssetWriter finishWritingWithCompletionHandler:] 
    Cannot call method when status is 1'
    

    The error occurs on line 266:

    - (void) finishWriter:(NSURL*)fileUrl {
    #if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
        [self.assetWriter finishWritingWithCompletionHandler:^ { // <-- Here
            dispatch_async(self.dispatch_queue, ^{
                [self stopInternal];
            });
        }];
    

    Hope that you can help with this!

    opened by Jensen2k 11
  • SCSwipeableFilterView with wrong contentMode and Video stretched on iPhone X

    So, I'm using SCSwipeableFilterView on my project, it works well on all devices, when using SCSwipeableFilterView. But, when I run on a iPhone X, my video gets stretched. For some reason, it does not respect the content mode applied. Does anyone know how to solve it?

    opened by gutiago 10
  • SCRecorderExamples don't save video with audio

    I tested the ObjC example of the SCRecorderExamples, using an iPhone 5c with iOS9.1. I noticed when saving video only the first 3 seconds of audio are recorded. The video is completely recorded but the audio is only recorded during the first three seconds. I tested several times, searched for a maximum audio time record and searched in the SCRecorder issues, but with no avail.

    This behaviour has been detected before? There is any configuration to limit the time of audio recording in the video? Or is this a bug?

    Best regards.

    opened by JoseMendesGouveia 10
  • Question: Can SCRecorder be used to record video in distinct mp4 segments?

    I have a requirement to be able to save video into separate video files that can be stitched back together after being uploaded to a server. Is this something that SCRecorder can help with?

    If so, any examples or suggestions?

    opened by curtisshipley 0
  • Applying Multiple Filter

    Hello , i am applying multiple filter with the help of addSubFilter method, but now i want to add the start time and end time of the animation of the filter, but issue is , some of the filters does not have any parameter key's so how we can use the addAnimation(forParameterKey) method, its working when i am passing the key but some of the filter does not have the key like:- CIPhotoEffectChrome (This does not have any key parameter , but want to use this with specific time range)

    opened by swapneshp-spaceo 0
  • Unable to export video using SCAssetExportSession

    I have used screcorder for capturing the video and I am using SCAssetExportSession for exporting the video. I am able to export video only for the first time after that I am getting following error

    some : Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12138), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x283903690 {Error Domain=NSOSStatusErrorDomain Code=-12138 "(null)"}}

    I have checked some links over stackoverflow but none of that will works for me,I have tried following things also

            self.session.deinitialize()
            self.exportSession = SCAssetExportSession()
    

    I had used following code :

    func exportSessionVideo() {

    self.Loader_show()
    
    let videoName = randomString(length: 5) + ".mp4"
    
    let exportedVideoURL = (applicationDocumentsDirectory()?.appendingPathComponent(videoName))!
    
    exportSession = SCAssetExportSession()
    exportSession = SCAssetExportSession(asset: session.assetRepresentingSegments())
    exportSession.videoConfiguration.preset = SCPresetHighestQuality
    exportSession.audioConfiguration.preset = SCPresetHighestQuality
    exportSession.videoConfiguration.maxFrameRate = 35
    exportSession.outputUrl = session.outputUrl
    exportSession.outputFileType = AVFileType.mp4.rawValue
    exportSession.delegate = self
    exportSession.contextType = .auto
    //        let audioMix: AVMutableAudioMix = AVMutableAudioMix()
    //        var audioMixParam: [AVMutableAudioMixInputParameters] = []
    //
    //        let aAudioAssetTrack: AVAssetTrack = session.assetRepresentingSegments().tracks.first!
    //        let videoParam: AVMutableAudioMixInputParameters = AVMutableAudioMixInputParameters(track: aAudioAssetTrack)
    //        videoParam.trackID = aAudioAssetTrack.trackID
    //
    //        let videoVolume : Float = 1
    //
    //        videoParam.setVolume(videoVolume, at: kCMTimeZero)
    //
    //        videoParam.setVolumeRamp(fromStartVolume: videoVolume, toEndVolume: videoVolume, timeRange: aAudioAssetTrack.timeRange)
    //
    //        audioMixParam.append(videoParam)
    //        audioMix.inputParameters = audioMixParam
    //        exportSession.audioConfiguration.audioMix = audioMix
    
    exportSession.exportAsynchronously {
        self.Loader_Hide()
        let error = self.exportSession.error
        if (self.exportSession.cancelled) {
            showMessage("Export was cancelled")
        } else if error == nil {
            print("url: \(self.exportSession.outputUrl?.absoluteString ?? "nil url")")
            let assetURL = self.exportSession.outputUrl
    
            self.recorder.session = nil
            self.recorder.previewView = nil
            self.exportSession = SCAssetExportSession()
            self.session.deinitialize()
            self.recorder.session?.deinitialize()
    
            self.dismiss(animated: true, completion: {
                DispatchQueue.main.async(execute: {
                    let vc = loadVC(strStoryboardId: SB_CAMERA, strVCId: "CreatePostVC") as! CreatePostVC
                    vc.objEnum_PostType = .Video
                    vc.strPostURL = assetURL!.absoluteString
    
                    APP_DELEGATE.appNavigation?.present(vc, animated: true, completion: nil)
                })
            })
    
        } else {
            showMessage((error?.localizedDescription)!)
        }
    }
    

    }

    opened by CearsKhush 0
Owner
Simon Corsin
Fasttt and easy camera framework for iOS with customizable filters

FastttCamera is a wrapper around AVFoundation that allows you to build your own powerful custom camera app without all the headaches of using AVFounda

IFTTT 1.8k Dec 10, 2022
Camera engine for iOS, written in Swift, above AVFoundation. :monkey:

?? The most advanced Camera framework in Swift ?? CameraEngine is an iOS camera engine library that allows easy integration of special capture feature

Remi ROBERT 575 Dec 25, 2022
Library for iOS Camera API. CameraKit helps you add reliable camera to your app quickly.

CameraKit helps you add reliable camera to your app quickly. Our open source camera platform provides consistent capture results, service that scales, and endless camera possibilities.

CameraKit 628 Dec 27, 2022
Instagram-like photo browser and a camera feature with a few line of code in Swift.

Fusuma is a Swift library that provides an Instagram-like photo browser with a camera feature using only a few lines of code.

Yuta Akizuki 2.4k Dec 31, 2022
NextLevel is a Swift camera system designed for easy integration, customized media capture, and image streaming in iOS

NextLevel is a Swift camera system designed for easy integration, customized media capture, and image streaming in iOS. Integration can optionally leverage AVFoundation or ARKit.

NextLevel 2k Jan 2, 2023
Custom camera with AVFoundation. Beautiful, light and easy to integrate with iOS projects.

Warning: This repository is DEPRECATED and not maintained anymore. Custom camera with AVFoundation. Beautiful, light and easy to integrate with iOS projects.

Tudo Gostoso Internet 1.4k Dec 16, 2022
A fully customisable and modern camera implementation for iOS made with AVFoundation.

Features Extremely simple and easy to use Controls autofocus & exposure Customizable interface Code-made UI assets that do not lose resolution quality

Gabriel Alvarado 1.3k Nov 30, 2022
A simple, customizable camera control - video recorder for iOS.

LLSimpleCamera: A simple customizable camera - video recorder control LLSimpleCamera is a library for creating a customized camera - video recorder sc

Ömer Faruk Gül 1.2k Dec 12, 2022
A light weight & simple & easy camera for iOS by Swift.

DKCamera Description A light weight & simple & easy camera for iOS by Swift. It uses CoreMotion framework to detect device orientation, so the screen-

Bannings 86 Aug 18, 2022
A Snapchat Inspired iOS Camera Framework written in Swift

Overview SwiftyCam is a simple, Snapchat-style iOS Camera framework for easy photo and video capture. SwiftyCam allows users to capture both photos and videos.

Andrew Walz 2k Dec 21, 2022
An iOS framework that uses the front camera, detects your face and takes a selfie.

TakeASelfie An iOS framework that uses the front camera, detects your face and takes a selfie. This API opens the front camera and draws a green oval

Abdullah Selek 37 Jan 3, 2023
Video and photo camera for iOS

Features: Records video, takes photos, flash on/off ⚡, front/back camera ↕️, hold to record video ✊, tap to take photo, tap to focus

André J 192 Dec 17, 2022
ALCameraViewController - A camera view controller with custom image picker and image cropping.

ALCameraViewController A camera view controller with custom image picker and image cropping. Features Front facing and rear facing camera Simple and c

Alex Littlejohn 2k Dec 6, 2022
UIView+CameraBackground - Show camera layer as a background to any UIView.

UIView+CameraBackground Show camera layer as a background to any UIView. Features Both front and back camera supported. Flash modes: auto, on, off. Co

Yonat Sharon 63 Nov 15, 2022
This plugin defines a global navigator.camera object, which provides an API for taking pictures and for choosing images from the system's image library.

cordova-plugin-camera: Take pictures with the device camera. This plugin defines a global navigator.camera

null 0 Nov 2, 2021
Simple Swift class to provide all the configurations you need to create custom camera view in your app

Camera Manager This is a simple Swift class to provide all the configurations you need to create custom camera view in your app. It follows orientatio

Imaginary Cloud 1.3k Dec 29, 2022
BarcodeScanner is a simple and beautiful wrapper around the camera with barcode capturing functionality and a great user experience.

Description BarcodeScanner is a simple and beautiful wrapper around the camera with barcode capturing functionality and a great user experience. Barco

HyperRedink 1.6k Jan 7, 2023
A camera designed in Swift for easily integrating CoreML models - as well as image streaming, QR/Barcode detection, and many other features

Would you like to use a fully-functional camera in an iOS application in seconds? Would you like to do CoreML image recognition in just a few more sec

David Okun 868 Dec 29, 2022
Mock UIImagePickerController for testing camera based UI in simulator

Mock UIImagePickerController to simulate the camera in iOS simulator.

Yonat Sharon 18 Aug 18, 2022