AVFoundation camera preview layer not working

Problem description:

So, I am trying to implement a camera using AVFoundation. I think I am doing everything right. This is what I am doing:


  1. Create a session.
  2. Get the devices of video type.
  3. Loop through the devices to find the back camera.
  4. Get a device input using the device from #3 and add it to the session.
  5. Create an output of type AVCaptureStillImageOutput.
  6. Set the output settings and add the output to the session.
  7. Get the CALayer from my View 2 (I will explain below what I mean by View 2).
  8. Create an instance of AVCaptureVideoPreviewLayer.
  9. Add it to the layer mentioned in #7.
  10. Start running the session.

So I have two views, one over the other. The one on top is View 1 and the one below is View 2. View 1 is supposed to provide the custom camera controls.
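For context, a minimal sketch of how the two views are stacked (self.controlsView is just a placeholder name for View 1; self.cameraView is View 2, the same outlet used in the code below):

// View 2 hosts the camera preview; View 1 sits on top and holds the controls.
// (self.controlsView is a placeholder name; only self.cameraView appears in the real code.)
[self.view addSubview:self.cameraView];
[self.view addSubview:self.controlsView];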

Here is the code:

self.session = [[AVCaptureSession alloc] init];
[self.session setSessionPreset:AVCaptureSessionPresetHigh];

// Find the back camera among the video devices
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
    if ([device position] == AVCaptureDevicePositionBack) {
        self.device = device;
        break;
    }
}

// Wrap the device in an input and add it to the session
NSError *error;
self.input = [[AVCaptureDeviceInput alloc] initWithDevice:self.device error:&error];
if ([self.session canAddInput:self.input]) {
    [self.session addInput:self.input];
}

// Still image output configured for JPEG
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
[self.stillImageOutput setOutputSettings:outputSettings];
[self.session addOutput:self.stillImageOutput];

// Attach the preview layer to View 2's layer
CALayer *cameraLayer = self.cameraView.layer;
self.cameraView.backgroundColor = [UIColor clearColor];

AVCaptureVideoPreviewLayer *preview = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
[cameraLayer addSublayer:preview];

[self.session startRunning];
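(For reference only, not part of the problem: a minimal sketch of how I would later use the still image output from step 5 to actually grab a photo, using the standard AVCaptureStillImageOutput API.)

AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                   completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (sampleBuffer != NULL) {
        // Convert the sample buffer to JPEG data and then to a UIImage.
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // ... use the captured image ...
    }
}];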

What I get is View 1 (it has a .png image as its background; the image has a hole in it so that the view under it, View 2, can be seen) and View 2 is visible, but I don't see what I'm supposed to. Because I changed View 2's background color to clear color, all I see is black. I am supposed to see what the camera sees.

Turns out you have to set the frame, masksToBounds, and video gravity on your preview layer for it to work correctly. This is how I did it:

CALayer *cameraLayer = self.cameraView.layer;
self.cameraView.backgroundColor = [UIColor clearColor];
[cameraLayer setMasksToBounds:YES];

AVCaptureVideoPreviewLayer *preview = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
[preview setVideoGravity:AVLayerVideoGravityResizeAspectFill];
// Without an explicit frame the preview layer has zero size, so nothing was visible before.
[preview setFrame:[cameraLayer bounds]];

[cameraLayer addSublayer:preview];
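One more thing to keep in mind (my own addition, since the frame above is only set once): if self.cameraView can change size, e.g. on rotation, the preview layer's frame needs to be updated as well. Assuming the layer is kept in a previewLayer property, that can be done from the view controller's layout pass:

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // Keep the preview layer matched to the camera view whenever layout changes.
    self.previewLayer.frame = self.cameraView.layer.bounds;
}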