
iOS 7 to Support 60fps Video Recording


This article is intended for iPhone 4S and 5 users who want to shoot video at 60fps (double the standard framerate) under iOS 7, as well as for programmers wanting to support the new 60fps mode in their apps. iPad and iPhone 4 users don't need to read further since, to my knowledge, their hardware doesn't support 60fps.


Executive Summary

iOS 7 will support 60fps video recording at 720p (unlike iOS 6, which removed it). While it does have image quality problems on the currently compatible iPhones (4S and 5), at least it works.

One of the most-discussed announcements at Apple's Worldwide Developers Conference this year was 60fps video recording. The following video frame grab (annotation by me) shows how much emphasis the company put on the reintroduction of 60fps during the June keynote.



Having written several jailbreak tweaks that enabled 720p60 recording in the stock Camera client back in the iOS 5 + iPhone 4S days, I've been asked by several of my readers to elaborate on how iOS 7 reintroduces the same feature.

1. A Little Bit of History

Back in the iOS 5 days, the iPhone 4S alone could record 60-frame-per-second video at 720p (but not at Full HD, that is, 1080p). While the image quality of these recordings wasn't the best (see below), at least there were no framedrops.

On jailbroken devices, you could do this right in the stock Camera app using the framerate changer of the absolutely essential JB tweak "CameraTweak." Or, you could just edit /System/Library/PrivateFrameworks/MediaToolbox.framework/N94/AVCaptureSession.plist to make the 720p60 mode the default, as is explained HERE.

On non-jailbroken devices, you could use third-party, 60fps-capable camcorder apps like FiLMiC Pro 2 ($4.99), SloPro (free) and Better Camcorder (free). (Note that Better Camcorder is the worst of the bunch when it comes to the actual, recorded frame rate, as is also explained HERE.)

Then came iOS 6, removing all kinds of 60fps recording, both jailbroken and non-jailbroken. That is, there is currently no way to record 60fps video on an iPhone 4S (or the later 5) running iOS 6. This has caused a lot of anger in the iOS community, absolutely understandably.

Now, a year after iOS 6 (and a year spent without any kind of 60fps recording capability), Apple has reintroduced 60fps. Let's see how it compares both to the ideal case (true 720p resolution) and to the iOS 5 implementation!

2. What Can You Expect, Image Quality-Wise?

Unfortunately, I have some bad news for you all: Apple's current implementation uses pixel binning. Readers of my past 60fps articles already know what this means: yes, not only the vertical resolution (as was the case with 60fps recordings under iOS 5), but also the horizontal resolution is halved. This is a major blow: after all, the actual, effective resolution will be 1280/2 * 720/2 = 640*360 pixels only, not counting the anti-aliasing introduced by the demosaicing done by later components in the image path.

Let me show you proof of this, recorded on my brand new iPhone 5 (manufacture date: week 22 of 2013), a replacement unit from Apple. The test setup was as follows:



This is a crop of the binned 720p60 mode's true resolution (mode 12 in the AVCaptureDevice.formats list shown in Section 3.1 below):



(click for the original, full-quality, full-size resolution chart frame grab!)


This is how it looks in non-binned, that is, full-resolution 30 fps 720p mode (here, mode 10):



Again, you won't get the same quality as under iOS 5 (on the iPhone 4S only) or as from most current 60fps-capable, regular standalone cameras. However, at least you can now record at 60fps on your iPhone 4S or 5 if you really need to.

2.1 Want to Save Some Storage?

As the effective resolution of the video is 640*360 pixels only, you may ask whether you can recompress the original 1280*720 footage into quarter-sized (that is, 640*360) footage to save a lot of storage (or, say, upload bandwidth, should you later want to upload the file somewhere). 640*360 60p footage, when properly transcoded (with a quality H.264 encoder like x264, also used by the excellent encoder HandBrake), only needs a bitrate of about 600-1000 kbps, which is at least 3-4 times less than that of 720p60 footage, and at least 15 times less than the original 15 Mbps bitrate of the iPhone 5's recording. That is, you would indeed save a lot of storage.
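Should you want to do this downscaling on-device rather than on the desktop, AVAssetExportSession is the simplest route. Below is a minimal sketch; note that the 640x480 preset is merely the closest built-in one (my assumption is that it scales a 16:9 source to fit, that is, to 640*360), the paths are hypothetical, and presets offer no explicit bitrate control, so for bitrate tuning stick with HandBrake / x264:

#import <AVFoundation/AVFoundation.h>

// Downscales a recorded clip using a built-in export preset (paths are hypothetical).
void downscaleClip(NSURL *srcURL, NSURL *dstURL)
{
    AVAsset *asset = [AVAsset assetWithURL:srcURL];
    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPreset640x480];
    exporter.outputURL = dstURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (exporter.status == AVAssetExportSessionStatusCompleted)
            NSLog(@"Downscaled clip written to %@", dstURL);
        else
            NSLog(@"Export failed: %@", exporter.error);
    }];
}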

I've downsized the above frame grab (the first one) to show the effects. As you can see, there's virtually no difference between the two versions in the rendition of high-frequency components (the converging lines beyond the "6" mark, both vertically and horizontally). After all, the sensor input just couldn't cope with that kind of resolution, so there was no usable input to begin with. I've annotated these regions with red rectangles in the first crop below.

The situation is entirely different in low-frequency regions, that is, areas with multiple-pixel-thick lines, particularly before the mark 3 (in both dimensions). There, while the sensor still produced an input of effectively 640*360 pixels, the demosaicing / anti-aliasing algorithms later in the signal processing do smooth the edges. That is, the difference is clearly visible in low-frequency areas. (Blue rectangles annotate these regions below.)

Let's compare two crops: one from the above 720p frame grab and one from a downsized version (here, I used the default settings of the OS X-based Preview, exporting the result as a lossless PNG):



(non-downsized)



(downsized)

As you can see, the low-frequency areas are indeed much more pixelated in the 640*360 image, while there's almost no discernible difference in the high-frequency regions.

All in all, while downsizing doesn't have as detrimental an effect on the overall quality of 720p60 footage as the quartered pixel count would suggest, you still don't want to use it unless you really need to reduce storage usage by any means necessary.

3. Configuring iOS to Record at 60 fps

The following section is only meant for programmers. As iOS 7 requires an entirely different way of setting up 720p60 recording than iOS 5, existing 60fps-capable App Store apps (including SloPro and Better Camcorder) don't record at 60 fps, since they haven't yet been updated for iOS 7. That is, if you want to record 60fps footage right now, you must be able to compile and deploy apps on your iPhone yourself.

Below, I explain:
- how you can get a list of the available video modes,
- how the ones with 30+ fps capability can be found and, finally,
- how this 60 fps mode can be made the one to record with.

Only the above differs from the pre-iOS 7 camera recording workflow; therefore, I elaborate mostly on these three points.

3.1 Getting All Available Modes (incl. 60 fps Ones) Via AVCaptureDevice.formats

A (publicly - more on this later) new, iOS 7+-only property of AVCaptureDevice is formats. It returns an array of AVCaptureDeviceFormat objects. For the iPhone 5 running iOS 7 beta 2, the complete set (acquired by a simple NSLog(), with the leading, unnecessary info removed and each record trailed by the array index I've appended) is as follows:

'vide'/'420v'  192x 144, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @8.50)> - 0
'vide'/'420f'  192x 144, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @8.50)> - 1
'vide'/'420v'  352x 288, { 1- 30 fps}, fov:51.975, binned, max zoom:76.50 (upscales @4.25)> - 2
'vide'/'420f'  352x 288, { 1- 30 fps}, fov:51.975, binned, max zoom:76.50 (upscales @4.25)> - 3
'vide'/'420v'  480x 360, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @3.40)> - 4
'vide'/'420f'  480x 360, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @3.40)> - 5
'vide'/'420v'  640x 480, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @2.55)> - 6
'vide'/'420f'  640x 480, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @2.55)> - 7
'vide'/'420v'  960x 540, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.91)> - 8
'vide'/'420f'  960x 540, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.91)> - 9
'vide'/'420v' 1280x 720, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.18)> - 10
'vide'/'420f' 1280x 720, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.18)> - 11
'vide'/'420v' 1280x 720, { 1- 60 fps}, fov:51.940, binned, supports vis, max zoom:51.75 (upscales @1.05)> - 12
'vide'/'420f' 1280x 720, { 1- 60 fps}, fov:51.940, binned, supports vis, max zoom:51.75 (upscales @1.05)> - 13
'vide'/'420v' 1920x1080, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @1.45)> - 14
'vide'/'420f' 1920x1080, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @1.45)> - 15
'vide'/'420v' 2592x1936, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.26)> - 16
'vide'/'420f' 2592x1936, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.26)> - 17
'vide'/'420v' 3264x2448, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.00)> - 18
'vide'/'420f' 3264x2448, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.00)> - 19

(Note that, in beta 1, the first four records all had 2.55 as the maximal lossless zoom factor. This has been fixed in beta 2.)

Note that, despite their type ('vide'/'420v' or '420f') being video, the last four records (indexes 16...19) are not valid video modes. Should you try to use them, you will get a runtime exception. All the other modes work just fine (I've tested them all). Interestingly, the frame range field of those last records proves I was right when I stated, in several of my iPhone oversampling-related articles and forum posts, that the iPhone 5 can only sample the full sensor at up to 20 fps (and only under good light).
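If you enumerate the formats generically, you'll therefore want to skip these photo-only records before assigning one to activeFormat. A minimal guard, based on the above observation (the 30 fps cut-off is my own heuristic, not an official API distinction):

for (AVCaptureDeviceFormat *format in videoDevice.formats)
{
    AVFrameRateRange *range = [format.videoSupportedFrameRateRanges objectAtIndex:0];
    if (range.maxFrameRate < 30)
        continue; // the 20 fps-capped records are still-photo formats; using them throws
    // ... format is a valid video mode here ...
}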

The first field, which has the format 'vide'/'420X' YxZ, defines the following:
- X=v: video range, which means the Y (luma) component only uses the byte values from 16 to 235 (for historical reasons); X=f: full range, which uses the full range of a byte, namely 0 to 255.
- YxZ: the resolution.

Programmatically (instead of parsing the above NSString representation, which is, after all, not very reliable), you can access the media type (here, always "vide") as a simple NSString via AVCaptureDeviceFormat.mediaType, and the media subtype (here, either 420v or 420f) via the global function CMFormatDescriptionGetMediaSubType(AVCaptureDeviceFormat.formatDescription). The resolution can be accessed via CMVideoFormatDescriptionGetDimensions(AVCaptureDeviceFormat.formatDescription).width/.height.

The second field defines the valid frame range. It's 1...30 for all traditional video modes (indexes 0 through 11, plus 14 and 15). For the two 720p60 modes, it's 1...60. Programmatically, it can be accessed via ((AVFrameRateRange*)[AVCaptureDeviceFormat.videoSupportedFrameRateRanges objectAtIndex:0]).minFrameRate/.maxFrameRate.

The third field provides the field-of-view (FoV) in degrees. (Access: AVCaptureDeviceFormat.videoFieldOfView.) In the full-sensor modes (all modes with the 4:3 aspect ratio), it's the same (56.7 degrees) as in the still modes (the last four records at the bottom). 352*288, which has an aspect ratio of ~1.222:1, has a significantly narrower FoV. Finally, as was also explained in yesterday's article, all 16:9 modes have a narrower FoV; the narrowest, at 51.940 degrees, is that of modes 12 and 13, that is, 720p60, the subject of this entire article.

The fourth field, when present, shows whether the given mode is binned. (Access: AVCaptureDeviceFormat.videoBinned.) It's always on for low-resolution modes, which isn't a problem, as the resolution hit introduced by binning isn't an issue with those low-res output formats. No high-res 16:9 mode is binned, except for our 720p60. Unfortunately, as you can see, there's not a single 720p60 mode without binning; that is, at that (comparatively) high output resolution, you get a severe effective resolution decrease.

The fifth field, when present, tells you whether you can explicitly enable electronic video image stabilization (here: "vis"). (Access: AVCaptureDeviceFormat.videoStabilizationSupported.) As was explained in yesterday's IS article, you will want to make IS optional for your users; after all, not everybody needs stabilization at the expense of FoV width.

Finally, the last field shows the maximal zoom factor and the upscale threshold. Please read my dedicated article HERE for more info on these fields. Programmatic access is via AVCaptureDeviceFormat.videoMaxZoomFactor and AVCaptureDeviceFormat.videoZoomFactorUpscaleThreshold.
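To make the field-by-field description concrete, here's a compact sketch that reads all of the above accessors from a single format; it mirrors what the demo app in Section 4 does when it reconstructs the NSLog() string:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Reads every field discussed above from a single AVCaptureDeviceFormat.
static void dumpFormat(AVCaptureDeviceFormat *format)
{
    CMFormatDescriptionRef desc = format.formatDescription;
    FourCharCode sub = CMFormatDescriptionGetMediaSubType(desc); // '420v' or '420f'
    CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(desc);
    AVFrameRateRange *range = [format.videoSupportedFrameRateRanges objectAtIndex:0];
    NSLog(@"'%@'/'%c%c%c%c' %dx%d, {%.0f-%.0f fps}, fov:%.3f%@%@, max zoom:%.2f (upscales @%.2f)",
          format.mediaType,
          (char)(sub >> 24), (char)(sub >> 16), (char)(sub >> 8), (char)sub,
          dims.width, dims.height,
          range.minFrameRate, range.maxFrameRate,
          format.videoFieldOfView,
          format.isVideoBinned ? @", binned" : @"",
          format.isVideoStabilizationSupported ? @", supports vis" : @"",
          format.videoMaxZoomFactor,
          format.videoZoomFactorUpscaleThreshold);
}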

3.1.1 AVCaptureDevice.formats and iOS 6

If you compile your project in Xcode 5 with a deployment target of iOS 6.0, code accessing AVCaptureDevice.formats/activeFormat will compile just fine - and also run on iOS 6-only devices. This isn't the case with previous, iOS 6-only Xcode versions (for example, the now-current 4.6.3), these properties having been private in the iOS 6 days.

The results obtained by reading AVCaptureDevice.formats under iOS 6 only contain the recording and encoding resolutions and the available framerates, though; therefore, they'll be of limited utility.
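Consequently, if your binary also runs under iOS 6, it's safest to guard the iOS 7-only accessors at runtime. A defensive sketch (the guard itself is my suggestion, not part of the original project):

for (AVCaptureDeviceFormat *format in videoDevice.formats)
{
    // dimensions and framerates are available under iOS 6, too
    CMVideoDimensions dims =
        CMVideoFormatDescriptionGetDimensions(format.formatDescription);
    // ...but the iOS 7-only properties must be guarded to avoid crashes:
    if ([format respondsToSelector:@selector(videoFieldOfView)])
        NSLog(@"%dx%d, fov: %.3f", dims.width, dims.height, format.videoFieldOfView);
    else
        NSLog(@"%dx%d", dims.width, dims.height);
}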

For the iPhone 5 running iOS 6.1.4, the list is as follows:

'vide'/'420v' enc dims = 480x360, pres dims = 480x360 { 1 - 30 fps }> - 0
'vide'/'420f' enc dims = 480x360, pres dims = 480x360 { 1 - 30 fps }> - 1
'vide'/'420v' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 2
'vide'/'420f' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 3
'vide'/'420v' enc dims = 960x540, pres dims = 960x540 { 1 - 30 fps }> - 4
'vide'/'420f' enc dims = 960x540, pres dims = 960x540 { 1 - 30 fps }> - 5
'vide'/'420v' enc dims = 1280x720, pres dims = 1280x720 { 1 - 30 fps }> - 6
'vide'/'420f' enc dims = 1280x720, pres dims = 1280x720 { 1 - 30 fps }> - 7
'vide'/'420v' enc dims = 1920x1080, pres dims = 1920x1080 { 1 - 30 fps }> - 8
'vide'/'420f' enc dims = 1920x1080, pres dims = 1920x1080 { 1 - 30 fps }> - 9
'vide'/'420v' enc dims = 2592x1936, pres dims = 2592x1936 { 1 - 20 fps }> - 10
'vide'/'420f' enc dims = 2592x1936, pres dims = 2592x1936 { 1 - 20 fps }> - 11
'vide'/'420v' enc dims = 3264x2448, pres dims = 3264x2448 { 1 - 20 fps }> - 12
'vide'/'420f' enc dims = 3264x2448, pres dims = 3264x2448 { 1 - 20 fps }> - 13

For the iPhone 3GS on iOS 6.1.3, it's the following:

'vide'/'420v' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 0
'vide'/'420f' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 1
'vide'/'420v' enc dims = 2048x1536, pres dims = 2048x1536 { 1 - 17 fps }> - 2
'vide'/'420f' enc dims = 2048x1536, pres dims = 2048x1536 { 1 - 17 fps }> - 3

3.1.2 Source Code: Getting and Displaying AVCaptureDevice.formats

I've created a very simple Xcode 5 project to obtain the list above. As usual, I provide you with the full sources - download HERE. Basically, it's very simple: in the header of the view controller (the only place I've changed anything), I #import <AVFoundation/AVFoundation.h>. (Note that, unlike previous Xcode versions, Xcode 5 automatically resolves references to AVFoundation, which means you don't need to add it by hand under General > Linked Frameworks and Libraries.)

Then, I declare two properties:

@property (retain) AVCaptureSession *captureSession;
@property (retain) AVCaptureDevice *videoDevice;


In the implementation file, right upon loading (in viewDidLoad), I create a capture session and add the video input to it. As I don't even start streaming, I add neither an AVCaptureVideoPreviewLayer to display the camera view nor any kind of output (e.g., AVCaptureMovieFileOutput) to the session; just getting the device model-specific video modes doesn't require an active connection.
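In code, the setup just described boils down to something like this (a trimmed sketch of the project's viewDidLoad):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // 1. the session itself
    self.captureSession = [[AVCaptureSession alloc] init];
    // 2. the video input; no preview layer or output is needed just to
    //    enumerate the device's formats
    self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *videoIn =
        [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice error:&error];
    if (videoIn && [self.captureSession canAddInput:videoIn])
        [self.captureSession addInput:videoIn];
}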

Finally, I just iterate over AVCaptureDevice.formats, postfixing the record's printed representation with its actual array index:

    int idx=0;
    for (AVCaptureDeviceFormat* currdf in self.videoDevice.formats)
        NSLog(@"%@ - %i", currdf, idx++);


Note that, below, when presenting a full 720p60 recorder, I'll provide you with the sources of a much more useful demo app, which parses all the above-explained properties, selects the (currently, on the iPhone 4S / iPhone 5, single) full range + 60 fps combo and sets the current video mode to it.

3.2 What About AVCaptureSession.sessionPreset?

Before iOS 7, the only official way of selecting a predefined session preset (and, through this, the resolution and all associated parameters) was assigning an NSString to AVCaptureSession.sessionPreset. (Again, you could access and use the private AVCaptureDevice.formats/activeFormat there with some hacking.) These NSString constants are all declared in AVCaptureSession.h and all follow the naming convention AVCaptureSessionPreset*.
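For example, this is all it took to request 720p30 before iOS 7:

if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720])
    self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;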

In iOS 7, there's no new, 60 fps-related AVCaptureSessionPreset*. There's no AVCaptureSessionPreset* equivalent of the AVCaptureSessionPreset1280x720p60 dictionary entry in /System/Library/Frameworks/MediaToolbox.framework/N42/AVCaptureSession.plist. (N42 is iPhone 5-specific; some iPhone 5 models have an N41 folder instead, as is also discussed HERE, with my full iPhone 5 oversampler tweak.)

If, on a 60 fps-capable iDevice (currently, the iPhone 4S and 5), you try to load one of the 720p presets (AVCaptureSessionPreset1280x720 and AVCaptureSessionPresetiFrame1280x720, which have existed since iOS 4 and iOS 5, respectively) and then try to manually set a higher-than-30-fps framerate, you'll get a runtime exception.

All in all, there is no way to configure the recording session for 720p60 using AVCaptureSession.sessionPreset. You MUST use AVCaptureDevice.activeFormat instead. Again, while the activeFormat-based code compiles for iOS 6 too (in the iOS 7-ready Xcode 5+), under iOS 6 there's no way of configuring it to record at 720p60.
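Stripped to its essentials, the activeFormat-based configuration (shown in full in the next section) looks like this. Note that, unlike the full app below, this sketch pins the framerate with both frame duration properties, which is the most explicit way to request a constant 60 fps:

for (AVCaptureDeviceFormat *format in videoDevice.formats)
{
    AVFrameRateRange *range = [format.videoSupportedFrameRateRanges objectAtIndex:0];
    if (range.maxFrameRate >= 60)
    {
        if ([videoDevice lockForConfiguration:NULL])
        {
            videoDevice.activeFormat = format;
            // a frame duration of 1/60 s in both directions = a constant 60 fps
            videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, 60);
            videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 60);
            [videoDevice unlockForConfiguration];
        }
        break;
    }
}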

4. A Full 720p60 Recorder

I've made available the sources of a full recorder HERE. Feel free to compile and deploy it on your own iPhone.

In viewDidLoad, the initial session + source setup is pretty much the same as in pre-iOS 7 times (see, for example, THIS example).

After this, I iterate over AVCaptureDevice.formats to find the one with full range (with the media subtype 420f, not 420v) and a 60 fps maximum framerate. Upon finding it, I immediately configure the current video AVCaptureDevice to use it, inside the lockForConfiguration / unlockForConfiguration block well known from pre-iOS 7 times. Note that I read and parse all the other parameters of AVCaptureDeviceFormat as well. Here, I only use them to construct an NSString that exactly matches the string NSLog() displays for the original AVCaptureDeviceFormat instance. In a real program, however, you'll want to use each and every one of these fields to populate, for example, video source configuration lists so that your users can easily choose the resolution etc. they want to record in. (Just don't forget to add, for example, a switchable image stabilization option to the setup dialog. It's very important, as was also explained in yesterday's IS article.)

As soon as the user taps the "Start" button in the upper left corner (in Portrait), I get the current date and time to form a filename, add a suffix to it so that it won't overwrite files created in the same second, and start recording. The recorded files are stored in the Documents directory of the app. To keep the code as simple as possible (so that it's as comprehensible as possible; after all, the aim of this article is "only" to show how the new iOS 7 features should be used to properly set up a 720p60 recording session), I didn't implement, for example, Camera Roll export. Again, should you want to add extra features to the app (proper orientation sensing, a configuration dialog etc.), you'll want to check out other, previous, non-iOS 7 camcorder implementations like THIS.

The full source code is as follows. (There are no other changes anywhere else, except for "Application supports iTunes file sharing" in the plist file. The XIB file doesn't contain anything either, because I add the single Start/Stop button programmatically.) The .h file:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface iOS760fpsRecorderViewController : UIViewController  <AVCaptureFileOutputRecordingDelegate>
@property (retain) AVCaptureSession *captureSession;
@property (retain) AVCaptureMovieFileOutput* fo;
@property (retain) UIButton* startStopButton;
@end



The .m file (note that I commented out the enableIS method so that you can easily implement it, should the need arise):

#import "iOS760fpsRecorderViewController.h"

@interface iOS760fpsRecorderViewController ()
@end

@implementation iOS760fpsRecorderViewController
@synthesize captureSession;
@synthesize fo, startStopButton;

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    
    self.startStopButton = [[UIButton alloc] initWithFrame:CGRectMake(40, 40, 80, 60)];
    [startStopButton setTitle:@"Start" forState:UIControlStateNormal];
    [startStopButton addTarget:self action:@selector(buttonPressed) forControlEvents:UIControlEventTouchUpInside];
    
    // 1. session
    self.captureSession = [[AVCaptureSession alloc] init];
    
    // 2. in: video
    AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (!error) {
        if ([self.captureSession canAddInput:videoIn])
            [self.captureSession addInput:videoIn];
        else
            NSLog(@"Video input add-to-session failed");
    }
    else
        NSLog(@"Video input creation failed");

   // 2. in: audio
    AVCaptureDevice *audioDevice= [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioIn = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    [self.captureSession addInput:audioIn];

    // search for a Full Range video + 60 fps combo
    int idx=0;
    for (AVCaptureDeviceFormat* currdf in videoDevice.formats)
    {
//        NSLog(@"%@ - %i", currdf, idx++);
        NSString* compoundString = @"";
        compoundString = [compoundString stringByAppendingString:[NSString stringWithFormat:@"'%@'", currdf.mediaType]];
//      Alternative way of doing the same:
//        CMMediaType cMMediaType = CMFormatDescriptionGetMediaType(myCMFormatDescriptionRef);
//        compoundString = [compoundString stringByAppendingString:
//                          (cMMediaType==kCMMediaType_Video ? @"'vide'" : @"'UNKNOWN'")];
        
        CMFormatDescriptionRef myCMFormatDescriptionRef= currdf.formatDescription;
        FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(myCMFormatDescriptionRef);
        BOOL fullRange = NO;
        if (mediaSubType==kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
            compoundString = [compoundString stringByAppendingString:@"/'420v'"];
        else if (mediaSubType==kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
        {
            compoundString = [compoundString stringByAppendingString:@"/'420f'"];
            fullRange = YES;
        }
        else compoundString = [compoundString stringByAppendingString:@"/'UNKNOWN'"]; // the result must be assigned back
        
        CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(myCMFormatDescriptionRef);
        compoundString = [compoundString stringByAppendingString:[NSString stringWithFormat:@" %ix %i", dimensions.width, dimensions.height]];

        float maxFramerate = ((AVFrameRateRange*)[currdf.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
        compoundString = [compoundString stringByAppendingString:[NSString stringWithFormat:@", { %.0f- %.0f fps}", ((AVFrameRateRange*)[currdf.videoSupportedFrameRateRanges objectAtIndex:0]).minFrameRate,
                                                    maxFramerate]];
        
        compoundString = [compoundString stringByAppendingString:[NSString stringWithFormat:@", fov: %.3f", currdf.videoFieldOfView]];
        compoundString = [compoundString stringByAppendingString:
                          (currdf.videoBinned ? @", binned" : @"")];

        compoundString = [compoundString stringByAppendingString:
                          (currdf.videoStabilizationSupported ? @", supports vis" : @"")];
        
        compoundString = [compoundString stringByAppendingString:[NSString stringWithFormat:@", max zoom: %.2f", currdf.videoMaxZoomFactor]];
        
        compoundString = [compoundString stringByAppendingString:[NSString stringWithFormat:@" (upscales @%.2f)", currdf.videoZoomFactorUpscaleThreshold]];
//        NSLog(@"cs: %@", compoundString);
        if (fullRange && maxFramerate>59)
        {
            NSLog(@"Found 60 fps mode: %@", compoundString);
            if ([videoDevice lockForConfiguration:nil])
            {
                videoDevice.activeFormat = currdf;
                // a max frame duration of 1/60 s = a min framerate of 60 fps; combined
                // with the format's 60 fps cap, this pins the recording to 60 fps
                videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1,60);
                [videoDevice unlockForConfiguration];
            }
        }
    }
    
    // 3. out
    self.fo = [[AVCaptureMovieFileOutput alloc] init];
    [self.captureSession addOutput:self.fo];

    // 4. display preview
    AVCaptureVideoPreviewLayer * previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    previewLayer.frame = CGRectMake(0, 0, self.view.frame.size.width,self.view.frame.size.height);
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // videoGravity (not the CALayer-level contentsGravity) controls how the preview fills the layer
    [self.view.layer addSublayer:previewLayer];
    [self.view addSubview:self.startStopButton];
    [self.captureSession startRunning];
}
//-(void)enableIS
//{
//    AVCaptureConnection *videoConnection = [self.fo connectionWithMediaType:AVMediaTypeVideo];
//    if ([videoConnection isVideoStabilizationSupported])
//    {
//        NSLog(@"VideoStabilizationSupported! Curr val: %i", [videoConnection isVideoStabilizationEnabled]);
//        if (![videoConnection isVideoStabilizationEnabled])
//        {
//            NSLog(@"enabling Video Stabilization!");
//            
//            videoConnection.enablesVideoStabilizationWhenAvailable= YES;
//            NSLog(@"after: %i", [videoConnection isVideoStabilizationEnabled]);
//        }
//    }
//}
-(NSUInteger)supportedInterfaceOrientations{
    return UIInterfaceOrientationMaskPortrait;
}

-(void)startVideoRecording
{
//        NSLog(@"lowLightBoostEnabled: %i", videoDevice.lowLightBoostEnabled); // always NO, independent of the current activeFormat
    NSDateFormatter* formatter = [[NSDateFormatter alloc] init];
    [formatter setDateFormat:@"yyyy-MM-dd-HH-mm-ss"];
    NSString* dateTimePrefix = [formatter stringFromDate:[NSDate date]];

    int fileNamePostfix = 0;
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *filePath = nil;
    do
        filePath = [NSString stringWithFormat:@"%@/%@-%i.mp4", documentsDirectory, dateTimePrefix, fileNamePostfix++];
    while ([[NSFileManager defaultManager] fileExistsAtPath:filePath]);

    NSURL* fileURL = [NSURL fileURLWithPath:filePath]; // fileURLWithPath: builds a proper file URL without manual string concatenation
    [self.fo startRecordingToOutputFileURL:fileURL recordingDelegate:self];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{}

// This method is required by the AVCaptureFileOutputRecordingDelegate protocol:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error)
        NSLog(@"Recording finished with error: %@", error);
}

- (void)buttonPressed
{
    if ([self.startStopButton.titleLabel.text isEqualToString:@"Start"])
    {
        [startStopButton setTitle:@"Stop" forState:UIControlStateNormal];
        [self startVideoRecording];
    }
    else
    {
        [startStopButton setTitle:@"Start" forState:UIControlStateNormal];
        [self.fo stopRecording];
    }
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
}

@end

