dispatch_barrier_async does not wait for Core Image processing to finish

Time: 2018-01-22 18:05:58

Tags: ios multithreading grand-central-dispatch core-image dispatch-async

Before I ask my question, I should say that I have already read a lot about this and tried many approaches, but none of them worked. I run dozens of Core Image processing operations on a concurrent queue, and I need to wait for them to finish by using dispatch_barrier_async, so that only then do I perform the final render and move on to the next view controller. Ironically, dispatch_barrier_async doesn't wait for my concurrent queue to finish. Why? Is it because I'm doing the Core Image processing on the wrong thread?

//This is my concurrent queue.

dispatch_queue_t concurrentQueue =
    dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0);

It is used here to apply the effects:

-(void)setupEffects{
//One of my effects, shown as an example; it renders a preview of the effect.

case Effect4:{
              dispatch_async(concurrentQueue, ^{
                  //BG
                  self.firstCIFilter = [CIFilter filterWithName:@"CIHexagonalPixellate"
                                            withInputParameters:@{@"inputImage": [self getFirstCIImage],@"inputScale":@26}];
                  self.lastSelectedInputImgforBG =[self applyCIEffectWithCrop];

                  //FG
                  self.firstCIFilter = [CIFilter filterWithName:@"CIPhotoEffectProcess"
                                            withInputParameters:@{@"inputImage":[self getFirstCIImage]}];
                  self.fgImgWithEffect = [self applyCIEffect];

                  dispatch_async(dispatch_get_main_queue(), ^{
                      self.lastSelectedInputImgforFG= [self cropAndFadeAndRenderFGImage];
                      [self saveEffect];
                      [self loadEffectsWithIndex:effectIndex];
                  });
              });
 }

//Once the user is done, this renders the image once again.
-(UIImage *)applyCIEffectWithCrop{
    __weak typeof(self) weakSelf = self;
    @autoreleasepool{
        weakSelf.firstCIContext =nil;
        weakSelf.firstResultCIImage=nil;
        weakSelf.croppingCIImage=nil;

        weakSelf.firstCIContext = [CIContext contextWithOptions:nil];
        weakSelf.firstResultCIImage = [weakSelf.firstCIFilter valueForKey:kCIOutputImageKey];
        weakSelf.croppingCIImage=[weakSelf.firstResultCIImage imageByCroppingToRect:CGRectMake(0,0, weakSelf.affineClampImage1.size.width*scale , weakSelf.affineClampImage1.size.height*scale)];
        return  [UIImage imageFromCIImage:weakSelf.croppingCIImage scale:1.0 orientation:weakSelf.scaledDownInputImage.imageOrientation cropped:YES withFirstCIImage:[weakSelf getFirstCIImage]];
    }
}

Then, for my final render, this method needs to wait for setupEffects to finish before performing the segue, but it doesn't:

- (void)doneButtonAction {
    _finalRender =YES;
    CGFloat max=MAX(self.originalSizeInputImage.size.width,self.originalSizeInputImage.size.height);
    if (max<=1700){
        //Do nothing for Final Render
        self.scaledDownInputImage= self.originalSizeInputImage;
    }else{
        CGSize scaledDownSize = [self getScalingSizeForFinalRenderForImage: self.originalSizeInputImage];
        self.scaledDownInputImage = [self scaleThisImage:self.originalSizeInputImage scaledToFillSize:scaledDownSize];
    }
    imageRect = AVMakeRectWithAspectRatioInsideRect(self.scaledDownInputImage.size, self.viewWithLoadedImages.bounds);

    //Preparation for high quality render with high resolution input 
    //image.
    self.affineClampImage1 = [self affineClampImage];
    self.selectionCropAndBlurredImage = [self croppedFGtoGetBlurred];
    [self.imgData appendData:UIImagePNGRepresentation(self.scaledDownInputImage)];
    [self.effectClass getimageWithImageData:self.imgData];

    if (_effectMode) {
        //Applying effects again for the high resolution input image.
        [self setupEffects];
    }else{
        [self setupFilters];
    }

    dispatch_async(concurrentQueue, ^{
        //Rendering the high quality Images in full resolution here.
        CGRect frame = CGRectMake(0.0, 0.0,
                                  self.lastSelectedInputImgforBG.size.width  *self.lastSelectedInputImgforBG.scale,
                                  self.lastSelectedInputImgforBG.size.height *self.lastSelectedInputImgforBG.scale);
        UIGraphicsBeginImageContextWithOptions(frame.size, NO, 1.0);
        // Draw transparent images on top of each other
        [self.lastSelectedInputImgforBG drawInRect:frame];
        [self.lastSelectedInputImgforFG drawInRect:frame];
        self.tempImage=nil;
        self.tempImage = UIGraphicsGetImageFromCurrentImageContext();        
        UIGraphicsEndImageContext();
    });

    dispatch_barrier_async(concurrentQueue, ^{
        // Getting the full-resolution rendered image and going to
        // the next view controller once setupEffects and the render
        // are finished... but it doesn't wait for them to finish...
        self.finalHightqualityRenderedImage = self.tempImage;        
        [self performSegueWithIdentifier:@"showShareVC" sender:self];
    });
}

I should mention that my code works correctly without the concurrent queue, but of course it blocks the UI until it is done, which is not what I want. Any help would be greatly appreciated.

1 Answer:

Answer 0 (Score: 2)

I think the explanation is at the bottom of the documentation for dispatch_barrier_async:

"The queue you specify should be a concurrent queue that you create yourself using the dispatch_queue_create function. If the queue you pass to this function is a serial queue or one of the global concurrent queues, this function behaves like the dispatch_async function."

So rather than grabbing DISPATCH_QUEUE_PRIORITY_BACKGROUND with dispatch_get_global_queue, as in the first line of your code, create concurrentQueue yourself using dispatch_queue_create.
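
For example, something along these lines (an untested sketch; the queue label is just an example):

//Create your own concurrent queue instead of using a global one.
dispatch_queue_t concurrentQueue =
    dispatch_queue_create("com.myapp.effectsQueue", DISPATCH_QUEUE_CONCURRENT);

//Blocks submitted with dispatch_async run concurrently on this queue.
dispatch_async(concurrentQueue, ^{
    //Core Image processing / full-resolution drawing goes here.
});

//The barrier block starts only after every block submitted before it
//on this queue has finished.
dispatch_barrier_async(concurrentQueue, ^{
    dispatch_async(dispatch_get_main_queue(), ^{
        //Back on the main queue for UI work, e.g. performing the segue.
    });
});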
