Why does AVCaptureSession output a wrong orientation?
Answers

AVCaptureSession.h defines an enum called AVCaptureVideoOrientation that lists the possible video orientations. The AVCaptureConnection object has a property called videoOrientation, which is an AVCaptureVideoOrientation; setting it should change the orientation of the video. You probably want one of the landscape values, AVCaptureVideoOrientationLandscapeRight or AVCaptureVideoOrientationLandscapeLeft.

You can find the session's AVCaptureConnections by inspecting the session's outputs. Each output has a connections property, which is an array of that output's connections.
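
As a minimal sketch of that lookup in Swift (the function name and the chosen landscape value are illustrative assumptions, not part of the original answer):

import AVFoundation

func applyLandscapeOrientation(to session: AVCaptureSession) {
    // Walk every output the session currently has...
    for output in session.outputs {
        // ...and every connection that output exposes.
        for connection in output.connections where connection.isVideoOrientationSupported {
            // Pick whichever AVCaptureVideoOrientation suits your UI.
            connection.videoOrientation = .landscapeRight
        }
    }
}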

All of this makes it more difficult than it needs to be.

In DidOutputSampleBuffer, simply change the orientation before you grab the image. This example is in MonoTouch (C#):

    public class OutputRecorder : AVCaptureVideoDataOutputSampleBufferDelegate {
        public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
        {
            try {
                // Set the orientation before grabbing the image from the buffer.
                connection.VideoOrientation = AVCaptureVideoOrientation.LandscapeLeft;
                // ... grab and process the image here ...
            } catch { /* handle or log errors */ }
        }
    }

In Objective-C, the corresponding delegate method is:

- ( void ) captureOutput: ( AVCaptureOutput * ) captureOutput
   didOutputSampleBuffer: ( CMSampleBufferRef ) sampleBuffer
      fromConnection: ( AVCaptureConnection * ) connection

I made a simple one-line change to imageFromSampleBuffer to correct the orientation problem (see my comment in the code below that starts with "I modified this line"). Hopefully it helps someone, because I spent far too much time on this.

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer  {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context1 = CGBitmapContextCreate(baseAddress, width, height, 8, 
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context1); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context1); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    //I modified this line: [UIImage imageWithCGImage:quartzImage]; to the following to correct the orientation:
    UIImage *image =  [UIImage imageWithCGImage:quartzImage scale:1.0 orientation:UIImageOrientationRight]; 

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Here is the correct order:

self.videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];

if([self.captureSession canAddOutput:self.videoCaptureOutput]){
    [self.captureSession addOutput:self.videoCaptureOutput];
}else{
    NSLog(@"cantAddOutput");
}

// set portrait orientation
AVCaptureConnection *conn = [self.videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
[conn setVideoOrientation:AVCaptureVideoOrientationPortrait];

For those who need to work with a CIImage from the buffer and find its orientation wrong, I used this correction.

It is as easy as that. By the way, the values 3, 1, 6, 8 come from https://developer.apple.com/documentation/imageio/kcgimagepropertyorientation

Don't ask me why 3, 1, 6, 8 is the right combination; I found it by brute force. If you know why, please explain in a comment.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
{

    // common way to get CIImage

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);

    CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
                                                      options:(__bridge NSDictionary *)attachments];

    if (attachments) {
       CFRelease(attachments);
    }

    // fixing the orientation of the CIImage

    UIInterfaceOrientation curOrientation = [[UIApplication sharedApplication] statusBarOrientation];

    if (curOrientation == UIInterfaceOrientationLandscapeLeft){
        ciImage = [ciImage imageByApplyingOrientation:3];
    } else if (curOrientation == UIInterfaceOrientationLandscapeRight){
        ciImage = [ciImage imageByApplyingOrientation:1];
    } else if (curOrientation == UIInterfaceOrientationPortrait){
        ciImage = [ciImage imageByApplyingOrientation:6];
    } else if (curOrientation == UIInterfaceOrientationPortraitUpsideDown){
        ciImage = [ciImage imageByApplyingOrientation:8];
    }



    // ....

}

If the orientation of the AVCaptureVideoPreviewLayer is correct, you can simply set the orientation on the connection before you capture the image.

AVCaptureStillImageOutput *stillImageOutput;
AVCaptureVideoPreviewLayer *previewLayer;
NSData *capturedImageData;

AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if ([videoConnection isVideoOrientationSupported]) {
    [videoConnection setVideoOrientation:previewLayer.connection.videoOrientation];
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    CFDictionaryRef exifAttachments =
            CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments) {
        // Do something with the attachments.
    }
    // TODO need to manually add GPS data to the image captured
    capturedImageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [UIImage imageWithData:capturedImageData];
}];

It should also be noted that UIImageOrientation and AVCaptureVideoOrientation are different. UIImageOrientationUp refers to landscape mode with the volume controls down toward the ground (not up, if you think of using the volume buttons as a shutter button).

Thus, portrait orientation with the power button toward the sky (AVCaptureVideoOrientationPortrait) is actually UIImageOrientationLeft.
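
A related mismatch exists between UIDeviceOrientation and AVCaptureVideoOrientation. The helper below is a Swift sketch (not taken from the answer above) showing that the two landscape cases are swapped, so neither enum can simply be cast to the other:

import AVFoundation
import UIKit

func captureOrientation(for deviceOrientation: UIDeviceOrientation) -> AVCaptureVideoOrientation? {
    switch deviceOrientation {
    case .portrait:           return .portrait
    case .portraitUpsideDown: return .portraitUpsideDown
    // Device landscapeLeft corresponds to video landscapeRight, and vice versa.
    case .landscapeLeft:      return .landscapeRight
    case .landscapeRight:     return .landscapeLeft
    default:                  return nil   // faceUp, faceDown, unknown
    }
}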

First of all, configure your video output with the following:

guard let connection = videoOutput.connection(withMediaType: AVFoundation.AVMediaTypeVideo) else { return }
guard connection.isVideoOrientationSupported else { return }
guard connection.isVideoMirroringSupported else { return }
connection.videoOrientation = .portrait
connection.isVideoMirrored = position == .front

Then make sure your target supports only portrait by leaving the landscape modes unchecked in the target's General settings.

(Source)

The orientation issue is with the front camera, so check the device type and generate a new image; it will definitely solve the orientation issue:

-(void)capture:(void(^)(UIImage *))handler{

AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in self.stillImageOutput.connections)
{
    for (AVCaptureInputPort *port in [connection inputPorts])
    {
        if ([[port mediaType] isEqual:AVMediaTypeVideo] )
        {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

    if (imageSampleBuffer != NULL) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *capturedImage = [UIImage imageWithData:imageData];
        if (self.captureDevice == [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo][1]) {
            capturedImage = [[UIImage alloc] initWithCGImage:capturedImage.CGImage scale:1.0f orientation:UIImageOrientationLeftMirrored];
        }

        handler(capturedImage);
    }
}];
}
// #1
AVCaptureVideoOrientation newOrientation = AVCaptureVideoOrientationLandscapeRight;
if (@available(iOS 13.0, *)) {
    // #2
    for (AVCaptureConnection *connection in [captureSession connections]) {
        if ([connection isVideoOrientationSupported]) {
            connection.videoOrientation = newOrientation;
            break;
        }
    } // #3
} else if ([previewLayer.connection isVideoOrientationSupported]) {
    previewLayer.connection.videoOrientation = newOrientation;
}

Once you have your AVCaptureSession working correctly, you can set a video orientation. Here is a detailed description of the code above. Remember, this code has to be executed after [captureSession startRunning]:

  1. Choose the orientation that you prefer.
  2. For iOS 13.0 and later, you have to retrieve the active connections from the captureSession. Remember: only the video connection supports videoOrientation.
  3. For iOS versions before 13.0, you can use the connection from the previewLayer.

If your view controller does not have a fixed orientation, you can set a new videoOrientation on the connection each time the device orientation changes.
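
A Swift sketch of that idea is below; the videoOutput property and the captureOrientation(for:) helper from the earlier sketch are assumptions, not part of the answer:

import AVFoundation
import UIKit

class CameraViewController: UIViewController {
    // Assumed: the data output that was added to the capture session elsewhere.
    let videoOutput = AVCaptureVideoDataOutput()

    override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        // Re-apply the video orientation once the device has rotated.
        guard let connection = videoOutput.connection(with: .video),
              connection.isVideoOrientationSupported,
              let newOrientation = captureOrientation(for: UIDevice.current.orientation) else { return }
        connection.videoOrientation = newOrientation
    }
}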

You can try:

private func startLiveVideo() {

    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = .photo
    let captureDevice = AVCaptureDevice.default(for: .video)

    let input = try! AVCaptureDeviceInput(device: captureDevice!)
    let output = AVCaptureVideoDataOutput()
    captureSession.addInput(input)
    captureSession.addOutput(output)

    output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
    output.connection(with: .video)?.videoOrientation = .portrait

    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)

    captureSession.startRunning()
}



