
iOS Custom Camera: Parameter Adjustment, Video Speed Control, and Video Merging - 简书
The AVFoundation framework
1. AVAsset: represents a multimedia resource, i.e. a video or audio file. It is an abstract class and cannot be used directly.
2. AVURLAsset: a subclass of AVAsset that creates an asset object from a URL.
NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
3. AVCaptureSession: captures video and audio, coordinating the input and output streams.
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
// The preset name was truncated in the source; 1280x720 matches the 720 x 1280 render size used later
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}
4. AVCaptureDevice: represents an input device, such as a camera or microphone.
AVCaptureDevice *device = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];

- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}
5. AVCaptureDeviceInput: a video or audio input stream; add it to the AVCaptureSession, which manages it.
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
}
if ([captureSession canAddInput:input]) {
    [captureSession addInput:input];
}
6. AVCaptureOutput: a video or audio output stream. You usually use one of its subclasses, such as AVCaptureAudioDataOutput, AVCaptureVideoDataOutput, AVCaptureStillImageOutput, or AVCaptureFileOutput; add it to the AVCaptureSession, which manages it.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([captureSession canAddOutput:movieOutput]) {
    [captureSession addOutput:movieOutput];
}
7. AVCaptureVideoPreviewLayer: a preview layer for viewing the camera feed in real time.
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.frame = <#Set layer frame#>;
8. AVCaptureConnection: the connection between the AVCaptureSession and its input/output streams; it can be used to adjust settings such as video stabilization.
AVCaptureConnection *captureConnection = [movieOutput connectionWithMediaType:AVMediaTypeVideo];
// Turn on cinematic video stabilization
captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
9. AVCaptureDeviceFormat: the active configuration of an input device; it can be used to adjust settings such as ISO, slow motion, and stabilization.
AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
NSError *error = nil;
if ([captureDevice lockForConfiguration:&error]) {
    CGFloat minISO = captureDevice.activeFormat.minISO;
    CGFloat maxISO = captureDevice.activeFormat.maxISO;
    // Set the ISO to 70% of the supported range
    CGFloat currentISO = (maxISO - minISO) * 0.7 + minISO;
    [captureDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];
    [captureDevice unlockForConfiguration];
} else {
    // Handle the error appropriately.
}
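The "70% of the range" computation above is plain linear interpolation between the format's ISO bounds. A minimal language-neutral sketch (Python here for brevity; the example bounds 29 and 464 are made up — real values come from `activeFormat.minISO` / `maxISO`):

```python
def iso_for_fraction(fraction, min_iso, max_iso):
    """Map a 0.0-1.0 slider value onto the device's supported ISO range."""
    return (max_iso - min_iso) * fraction + min_iso

# Hypothetical bounds for illustration only:
iso = iso_for_fraction(0.7, 29.0, 464.0)  # 70% of the way from 29 to 464
```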
Initializing the camera
/// Coordinates data flow between the input and output devices
@property (nonatomic, strong) AVCaptureSession *captureSession;
/// Video input stream from an AVCaptureDevice
@property (nonatomic, strong) AVCaptureDeviceInput *captureDeviceInput;
/// Audio input stream from an AVCaptureDevice
@property (nonatomic, strong) AVCaptureDeviceInput *audioCaptureDeviceInput;
/// Movie file output stream
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureMovieFileOutput;
/// Camera preview layer
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;
... Common methods
/// Get the camera device at the given position
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}
... Creating the custom camera
// Create the AVCaptureSession
_captureSession = [[AVCaptureSession alloc] init];
if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    _captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}
// Get the camera device
AVCaptureDevice *videoCaptureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
if (!videoCaptureDevice) {
    // Handle the error appropriately.
}
// Get the video input stream
NSError *error = nil;
_captureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
if (error) {
    // Handle the error appropriately.
}
// Get the microphone device
AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
// Get the audio input stream
_audioCaptureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (error) {
    // Handle the error appropriately.
}
// Add the video and audio inputs to the AVCaptureSession
if ([_captureSession canAddInput:_captureDeviceInput] && [_captureSession canAddInput:_audioCaptureDeviceInput]) {
    [_captureSession addInput:_captureDeviceInput];
    [_captureSession addInput:_audioCaptureDeviceInput];
}
// Create the output stream
_captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
// Add the output stream to the AVCaptureSession
if ([_captureSession canAddOutput:_captureMovieFileOutput]) {
    [_captureSession addOutput:_captureMovieFileOutput];
}
// Get the connection for the output
AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
// Check whether cinematic video stabilization is supported
if ([videoCaptureDevice.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeCinematic]) {
    // Turn stabilization on if it is supported
    captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
}
// Save the default AVCaptureDeviceFormat.
// After the capture frame rate is changed for slow motion, stabilization can no longer be re-enabled;
// in my tests only this default format works, so store it here and restore it when slow motion is
// turned off so stabilization can be switched back on.
_defaultFormat = videoCaptureDevice.activeFormat;
_defaultMinFrameDuration = videoCaptureDevice.activeVideoMinFrameDuration;
_defaultMaxFrameDuration = videoCaptureDevice.activeVideoMaxFrameDuration;
// Create the preview layer
_captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
_captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
_captureVideoPreviewLayer.frame = self.bounds;
// The preview layer is a CALayer, so you can create a UIView and call addSublayer on its layer.
// This code runs in the view's init method, so it calls addSublayer on self.layer directly.
[self.layer addSublayer:_captureVideoPreviewLayer];
// Start capturing
[self.captureSession startRunning];
Building the control UI
You can addSubview the views you need directly onto the view that hosts the preview layer. My approach is to create a transparent UIView the same size as the preview layer to act as a control panel, and lay it over the preview. All gestures and button taps are handled on this control panel, which forwards the actions through a delegate so the camera view can perform the corresponding operation. That plumbing code is omitted here.
1. Switch to the back camera
#pragma mark - Switch to the back camera
- (void)cameraBackgroundDidClickChangeBack {
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionBack;
    toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
    AVCaptureDeviceInput *toChangeDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:toChangeDevice error:nil];
    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.captureDeviceInput];
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    }
    [self.captureSession commitConfiguration];
}
2. Switch to the front camera
- (void)cameraBackgroundDidClickChangeFront {
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront;
    toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
    AVCaptureDeviceInput *toChangeDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:toChangeDevice error:nil];
    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.captureDeviceInput];
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    }
    [self.captureSession commitConfiguration];
}
3. Turn on the flash (torch)
- (void)cameraBackgroundDidClickOpenFlash {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error = nil;
    if ([captureDevice lockForConfiguration:&error]) {
        if ([captureDevice isTorchModeSupported:AVCaptureTorchModeOn]) [captureDevice setTorchMode:AVCaptureTorchModeOn];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}
4. Turn off the flash (torch)
- (void)cameraBackgroundDidClickCloseFlash {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error = nil;
    if ([captureDevice lockForConfiguration:&error]) {
        if ([captureDevice isTorchModeSupported:AVCaptureTorchModeOff]) [captureDevice setTorchMode:AVCaptureTorchModeOff];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}
5. Adjust the focus (lens position)
// Lens position ranges from 0.0 to 1.0
- (void)cameraBackgroundDidChangeFocus:(CGFloat)focus {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error = nil;
    if ([captureDevice lockForConfiguration:&error]) {
        if ([captureDevice isFocusModeSupported:AVCaptureFocusModeLocked]) [captureDevice setFocusModeLockedWithLensPosition:focus completionHandler:nil];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}
6. Digital zoom
// Digital zoom, 1x to 3x
- (void)cameraBackgroundDidChangeZoom:(CGFloat)zoom {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error = nil;
    if ([captureDevice lockForConfiguration:&error]) {
        [captureDevice rampToVideoZoomFactor:zoom withRate:50];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}
7. Adjust the ISO (light sensitivity)
// ISO slider value ranges from 0.0 to 1.0
- (void)cameraBackgroundDidChangeISO:(CGFloat)iso {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error = nil;
    if ([captureDevice lockForConfiguration:&error]) {
        CGFloat minISO = captureDevice.activeFormat.minISO;
        CGFloat maxISO = captureDevice.activeFormat.maxISO;
        CGFloat currentISO = (maxISO - minISO) * iso + minISO;
        [captureDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}
8. Tap to focus
// point is the tapped location on screen
- (void)cameraBackgroundDidTap:(CGPoint)point {
    CGPoint location = point;
    CGPoint pointOfInterest = CGPointMake(0.5, 0.5);
    CGSize frameSize = self.captureVideoPreviewLayer.frame.size;
    // The front camera preview is mirrored, so flip the x coordinate
    if ([[self.captureDeviceInput device] position] == AVCaptureDevicePositionFront) location.x = frameSize.width - location.x;
    // Convert view coordinates to the device's normalized (0,0)-(1,1) space, which is rotated 90° relative to the preview
    pointOfInterest = CGPointMake(location.y / frameSize.height, 1.f - (location.x / frameSize.width));
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:pointOfInterest];
    // Observe ISO changes on the device
    [[self.captureDeviceInput device] addObserver:self forKeyPath:@"ISO" options:NSKeyValueObservingOptionNew context:NULL];
}

- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error = nil;
    if ([captureDevice lockForConfiguration:&error]) {
        if ([captureDevice isFocusModeSupported:focusMode]) [captureDevice setFocusMode:focusMode];
        if ([captureDevice isFocusPointOfInterestSupported]) [captureDevice setFocusPointOfInterest:point];
        if ([captureDevice isExposureModeSupported:exposureMode]) [captureDevice setExposureMode:exposureMode];
        if ([captureDevice isExposurePointOfInterestSupported]) [captureDevice setExposurePointOfInterest:point];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}
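The coordinate conversion in the tap handler is easy to get wrong, so here is the same mapping as a small sketch (Python, for illustration only): view coordinates go to the device's normalized space, which is rotated 90° relative to a portrait preview and mirrored for the front camera.

```python
def point_of_interest(tap_x, tap_y, width, height, front_camera=False):
    """Convert a tap in preview-view coordinates to the capture device's
    normalized (0,0)-(1,1) point-of-interest space."""
    if front_camera:
        tap_x = width - tap_x  # front camera preview is mirrored
    # Swap axes (device space is rotated 90 degrees) and normalize
    return (tap_y / height, 1.0 - tap_x / width)
```

A tap at the center of the view maps to (0.5, 0.5), the device's default point of interest.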
9. Determining the video orientation while recording
Because of how the camera works, you cannot rely on the view controller's interface orientation to know which way the device is facing: the user may have locked screen rotation. Instead, use the gravity vector from Core Motion to work out how the phone is being held.
@property (nonatomic, strong) CMMotionManager *motionManager;
@property (nonatomic, assign) UIDeviceOrientation deviceOrientation;

_motionManager = [[CMMotionManager alloc] init];
_motionManager.deviceMotionUpdateInterval = 1 / 15.0;
if (_motionManager.deviceMotionAvailable) {
    [_motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue] withHandler:^(CMDeviceMotion * _Nullable motion, NSError * _Nullable error) {
        [self performSelectorOnMainThread:@selector(handleDeviceMotion:) withObject:motion waitUntilDone:YES];
    }];
} else {
    NSLog(@"No device motion on device");
}

/// Device motion callback
- (void)handleDeviceMotion:(CMDeviceMotion *)deviceMotion {
    double x = deviceMotion.gravity.x;
    double y = deviceMotion.gravity.y;
    CGAffineTransform videoTransform;
    if (fabs(y) >= fabs(x)) {
        if (y >= 0) {
            // Upside down
            videoTransform = CGAffineTransformMakeRotation(M_PI);
            _deviceOrientation = UIDeviceOrientationPortraitUpsideDown;
        } else {
            // Portrait
            videoTransform = CGAffineTransformMakeRotation(0);
            _deviceOrientation = UIDeviceOrientationPortrait;
        }
    } else {
        if (x >= 0) {
            // Landscape, Home button on the left
            videoTransform = CGAffineTransformMakeRotation(-M_PI_2);
            _deviceOrientation = UIDeviceOrientationLandscapeRight;
        } else {
            // Landscape, Home button on the right
            videoTransform = CGAffineTransformMakeRotation(M_PI_2);
            _deviceOrientation = UIDeviceOrientationLandscapeLeft;
        }
    }
    // Tell the control panel the current orientation so its buttons can rotate with the device
    [self.backgroundView setOrientation:_deviceOrientation];
}
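The classification logic boils down to comparing the magnitudes of the gravity vector's x and y components. A compact sketch of the same decision tree (Python, illustration only):

```python
def orientation_from_gravity(gx, gy):
    """Classify device orientation from the gravity vector's x/y components.
    When the device hangs straight down in portrait, gy is about -1."""
    if abs(gy) >= abs(gx):
        # Mostly vertical: the sign of gy distinguishes upright from upside down
        return "portrait-upside-down" if gy >= 0 else "portrait"
    # Mostly horizontal: the sign of gx picks the landscape side
    return "landscape-right" if gx >= 0 else "landscape-left"
```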
11. Slow-motion capture on
- (void)cameraBackgroundDidClickOpenSlow {
    [self.captureSession stopRunning];
    CGFloat desiredFPS = 240.0;
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    AVCaptureDeviceFormat *selectedFormat = nil;
    int32_t maxWidth = 0;
    AVFrameRateRange *frameRateRange = nil;
    // Pick the widest format that supports the desired frame rate
    for (AVCaptureDeviceFormat *format in [videoDevice formats]) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            CMFormatDescriptionRef desc = format.formatDescription;
            CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(desc);
            int32_t width = dimensions.width;
            if (range.minFrameRate <= desiredFPS && desiredFPS <= range.maxFrameRate && width >= maxWidth) {
                selectedFormat = format;
                frameRateRange = range;
                maxWidth = width;
            }
        }
    }
    if (selectedFormat) {
        if ([videoDevice lockForConfiguration:nil]) {
            NSLog(@"selected format: %@", selectedFormat);
            videoDevice.activeFormat = selectedFormat;
            videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            [videoDevice unlockForConfiguration];
        }
    }
    [self.captureSession startRunning];
}
12. Slow-motion capture off
Rather than searching for a 60 fps format, restore the default format saved at setup time, so stabilization can be re-enabled afterwards:
- (void)cameraBackgroundDidClickCloseSlow {
    [self.captureSession stopRunning];
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    if ([videoDevice lockForConfiguration:nil]) {
        // Restore the default format and frame durations saved during setup
        videoDevice.activeFormat = _defaultFormat;
        videoDevice.activeVideoMinFrameDuration = _defaultMinFrameDuration;
        videoDevice.activeVideoMaxFrameDuration = _defaultMaxFrameDuration;
        [videoDevice unlockForConfiguration];
    }
    [self.captureSession startRunning];
}
13. Stabilization on
- (void)cameraBackgroundDidClickOpenAntiShake {
    AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    NSLog(@"change captureConnection: %@", captureConnection);
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    NSLog(@"set format: %@", videoDevice.activeFormat);
    if ([videoDevice.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeCinematic]) {
        captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
}
14. Stabilization off
#pragma mark - Stabilization off
- (void)cameraBackgroundDidClickCloseAntiShake {
    AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    NSLog(@"change captureConnection: %@", captureConnection);
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    if ([videoDevice.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeOff]) {
        captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
    }
}
15. Recording video
#pragma mark - Recording
- (void)cameraBackgroundDidClickPlay {
    // Get the connection for the output
    AVCaptureConnection *captureConnection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // Start recording if we are not recording already
    if (![self.captureMovieFileOutput isRecording]) {
        // Match the video orientation to the device orientation
        captureConnection.videoOrientation = (AVCaptureVideoOrientation)_deviceOrientation;
        NSString *outputFilePath = [kCachePath stringByAppendingPathComponent:[self movieName]];
        NSURL *fileURL = [NSURL fileURLWithPath:outputFilePath];
        [self.captureMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
        _currentMoviePath = outputFilePath;
    }
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    NSLog(@"Recording started");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"Recording finished");
}
16. Stop recording
[self.captureMovieFileOutput stopRecording];
17. Adjusting video speed
For slow motion you adjust the camera's capture frame rate; for fast motion it is enough to speed the video up afterwards.
A video shot in slow motion still plays back over its real capture duration, so to get the slow-motion effect we stretch the video's duration by the configured slow-down factor.
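The stretching is simple arithmetic on the track's CMTime-style duration. A sketch of the relationship (Python, illustration only; the clip length and timescale are example values):

```python
def scaled_duration_seconds(value, timescale, scale):
    """CMTime-style duration scaling: stretch (scale > 1, slow motion)
    or compress (scale < 1, fast motion) a track, returning seconds."""
    return (value * scale) / timescale

# A 10 s clip at timescale 600 (value 6000):
# 4x slow motion -> 40 s, 5x fast motion (scale 0.2) -> 2 s
```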
/// Apply the speed setting to a video
- (void)setSpeedWithVideo:(NSDictionary *)video completed:(void (^)(void))completed {
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        NSLog(@"video set thread: %@", [NSThread currentThread]);
        // Load the video
        AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:video[kMoviePath]] options:nil];
        // Composition for mixing
        AVMutableComposition *mixComposition = [AVMutableComposition composition];
        // Video track
        AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        // Audio track
        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        // Work out the video's orientation from its preferred transform
        CGAffineTransform videoTransform = [videoAsset tracksWithMediaType:AVMediaTypeVideo].lastObject.preferredTransform;
        if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
            NSLog(@"Shot in portrait");
            videoTransform = CGAffineTransformMakeRotation(M_PI_2);
        } else if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
            NSLog(@"Shot upside down");
            videoTransform = CGAffineTransformMakeRotation(-M_PI_2);
        } else if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
            NSLog(@"Shot in landscape, Home button on the right");
            videoTransform = CGAffineTransformMakeRotation(0);
        } else if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
            NSLog(@"Shot in landscape, Home button on the left");
            videoTransform = CGAffineTransformMakeRotation(M_PI);
        }
        // Apply the detected orientation to the composition's video track
        compositionVideoTrack.preferredTransform = videoTransform;
        compositionVideoTrack.naturalTimeScale = 600;
        // Insert the video track
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:kCMTimeZero error:nil];
        // Insert the audio track
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:nil];
        // Choose the speed ratio
        CGFloat scale = 1.0;
        if ([video[kMovieSpeed] isEqualToString:kMovieSpeed_Fast]) {
            scale = 0.2f; // fast: 5x speed
        } else if ([video[kMovieSpeed] isEqualToString:kMovieSpeed_Slow]) {
            scale = 4.0f; // slow: 4x slower
        }
        // Stretch or compress both tracks by the ratio
        [compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) toDuration:CMTimeMake(videoAsset.duration.value * scale, videoAsset.duration.timescale)];
        [compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) toDuration:CMTimeMake(videoAsset.duration.value * scale, videoAsset.duration.timescale)];
        // Configure the export session
        AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
        // Temporary path for the exported video
        NSString *exportPath = [kCachePath stringByAppendingPathComponent:[self movieName]];
        NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
        // Export as .MOV
        assetExport.outputFileType = AVFileTypeQuickTimeMovie;
        assetExport.outputURL = exportUrl;
        assetExport.shouldOptimizeForNetworkUse = YES;
        // Export the video
        [assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [_processedVideoPaths addObject:exportPath];
                // Save the exported video to the photo album
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                if (![library videoAtPathIsCompatibleWithSavedPhotosAlbum:[NSURL fileURLWithPath:exportPath]]) {
                    NSLog(@"cache can't write");
                    completed();
                    return;
                }
                [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:exportPath] completionBlock:^(NSURL *assetURL, NSError *error) {
                    if (error) {
                        NSLog(@"cache write error");
                    } else {
                        NSLog(@"cache write success");
                    }
                    completed();
                }];
            });
        }];
    });
}
18. Merging multiple videos into one
- (void)mergeVideosWithPaths:(NSArray *)paths completed:(void (^)(NSString *videoPath))completed {
    if (!paths.count) return;
    dispatch_async(dispatch_get_main_queue(), ^{
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        videoTrack.preferredTransform = CGAffineTransformRotate(CGAffineTransformIdentity, M_PI_2);
        CMTime totalDuration = kCMTimeZero;
        NSMutableArray<AVMutableVideoCompositionLayerInstruction *> *instructions = [NSMutableArray array];
        for (int i = 0; i < paths.count; i++) {
            AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL fileURLWithPath:paths[i]]];
            AVAssetTrack *assetAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
            AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
            NSLog(@"%lld", asset.duration.value / asset.duration.timescale);
            NSError *audioError = nil;
            BOOL audioOK = [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetAudioTrack atTime:totalDuration error:&audioError];
            NSLog(@"audio error: %@ -- %d", audioError, audioOK);
            NSError *videoError = nil;
            BOOL videoOK = [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetVideoTrack atTime:totalDuration error:&videoError];
            NSLog(@"video error: %@ -- %d", videoError, videoOK);
            AVMutableVideoCompositionLayerInstruction *instruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
            UIImageOrientation assetOrientation = UIImageOrientationUp;
            BOOL isAssetPortrait = NO;
            // Work out the clip's actual shooting orientation from its preferred transform
            CGAffineTransform videoTransform = assetVideoTrack.preferredTransform;
            if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
                NSLog(@"Shot in portrait");
                assetOrientation = UIImageOrientationRight;
                isAssetPortrait = YES;
            } else if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
                NSLog(@"Shot upside down");
                assetOrientation = UIImageOrientationLeft;
                isAssetPortrait = YES;
            } else if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
                NSLog(@"Shot in landscape, Home button on the right");
                assetOrientation = UIImageOrientationUp;
            } else if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
                NSLog(@"Shot in landscape, Home button on the left");
                assetOrientation = UIImageOrientationDown;
            }
            // Scale every clip to fit a 720-point render width
            CGFloat assetScaleToFitRatio = 720.0 / assetVideoTrack.naturalSize.width;
            if (isAssetPortrait) {
                assetScaleToFitRatio = 720.0 / assetVideoTrack.naturalSize.height;
                CGAffineTransform assetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio, assetScaleToFitRatio);
                [instruction setTransform:CGAffineTransformConcat(assetVideoTrack.preferredTransform, assetScaleFactor) atTime:totalDuration];
            } else {
                // Portrait render size: 720 x 1280
                // Landscape clip scaled size: 720 x 405
                // y offset needed to center a landscape clip: (1280 - 405) / 2 = 437.5
                CGAffineTransform assetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio, assetScaleToFitRatio);
                [instruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(assetVideoTrack.preferredTransform, assetScaleFactor), CGAffineTransformMakeTranslation(0, 437.5)) atTime:totalDuration];
            }
            // Insert the newest instruction at the front; playback follows the array order
            [instructions insertObject:instruction atIndex:0];
            totalDuration = CMTimeAdd(totalDuration, asset.duration);
            // Clear the crop rectangle once this clip ends; otherwise, if the next clip is smaller, this one would still show underneath it
            [instruction setCropRectangle:CGRectZero atTime:totalDuration];
        }
        AVMutableVideoCompositionInstruction *mixInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        mixInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalDuration);
        mixInstruction.layerInstructions = instructions;
        AVMutableVideoComposition *mixVideoComposition = [AVMutableVideoComposition videoComposition];
        mixVideoComposition.instructions = [NSArray arrayWithObject:mixInstruction];
        mixVideoComposition.frameDuration = CMTimeMake(1, 25);
        mixVideoComposition.renderSize = CGSizeMake(720.0, 1280.0);
        NSString *outPath = [kVideoPath stringByAppendingPathComponent:[self movieName]];
        NSURL *mergeFileURL = [NSURL fileURLWithPath:outPath];
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = mergeFileURL;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.videoComposition = mixVideoComposition;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                completed(outPath);
            });
        }];
    });
}
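The scale-and-center numbers quoted in the merge code (720 x 405, offset 437.5) fall out of a simple fit-to-width computation against the 720 x 1280 portrait render. A sketch of that arithmetic (Python, illustration only):

```python
def fit_and_center(natural_w, natural_h, render_w=720.0, render_h=1280.0, portrait=False):
    """Scale-to-fit ratio and vertical centering offset for compositing a clip
    into a portrait render, mirroring the article's 720 x 1280 example."""
    if portrait:
        # Portrait clips are stored rotated 90 degrees, so fit by natural height
        return render_w / natural_h, 0.0
    ratio = render_w / natural_w
    scaled_h = natural_h * ratio
    # Center the scaled landscape clip vertically in the render frame
    return ratio, (render_h - scaled_h) / 2.0

# A 1280x720 landscape clip: ratio 0.5625, scaled height 405, y offset 437.5
```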
If this article helped you, please like and follow; I post updates every week.