How to obtain a clean disparity map and a clean depth map

Date: 2019-05-09 15:44:53

Tags: opencv camera-calibration stereo-3d disparity-mapping

Currently I am using titathink (TT522PW) IP cameras, which provide a 1280 × 720 video stream at 30 FPS, with a standard sensor (not the high-sensitivity, low-light model).

When capturing the video stream, we see a fisheye-type distortion in the frames.

Non-rectified images

  1. I first calibrated each camera individually to remove the distortion (in these calibrations I obtained an RMS error of rms_left = 0.166 for the left camera and rms_right = 0.162 for the right camera). Then, using the XML files produced by the individual calibrations, I calibrated the stereo pair; for the stereo calibration I obtained an RMS error = 0.207. A minimal sketch of this calibration/rectification pipeline is given after the images below.

  2. By displaying the rectified images, we can see that the stereo calibration works well.

The calibrated (rectified) images

Rectified image pair with horizontal epipolar lines
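
Not from the original post: a minimal sketch of the calibration/rectification step described above, assuming the stereo calibration XML stores the intrinsics and extrinsics under the keys M1, D1, M2, D2, R and T (these key names and the file names are assumptions; adapt them to your own files):

// Sketch only: rectify a stereo pair from a saved calibration
// (key names M1/D1/M2/D2/R/T and the file names are hypothetical)
#include <opencv2/opencv.hpp>

int main() {
  cv::Mat M1, D1, M2, D2, R, T;
  cv::FileStorage fs("../calibration/calib_stereo.xml", cv::FileStorage::READ);
  fs["M1"] >> M1;  fs["D1"] >> D1;   // left intrinsics / distortion
  fs["M2"] >> M2;  fs["D2"] >> D2;   // right intrinsics / distortion
  fs["R"]  >> R;   fs["T"]  >> T;    // rotation / translation between the cameras

  const cv::Size img_size(1280, 720);
  cv::Mat R1, R2, P1, P2, Q;
  cv::stereoRectify(M1, D1, M2, D2, img_size, R, T,
                    R1, R2, P1, P2, Q, cv::CALIB_ZERO_DISPARITY, 0, img_size);

  cv::Mat map1x, map1y, map2x, map2y;
  cv::initUndistortRectifyMap(M1, D1, R1, P1, img_size, CV_32FC1, map1x, map1y);
  cv::initUndistortRectifyMap(M2, D2, R2, P2, img_size, CV_32FC1, map2x, map2y);

  cv::Mat left  = cv::imread("left.png");
  cv::Mat right = cv::imread("right.png");
  cv::Mat left_rect, right_rect;
  cv::remap(left,  left_rect,  map1x, map1y, cv::INTER_LINEAR);
  cv::remap(right, right_rect, map2x, map2y, cv::INTER_LINEAR);
  return 0;
}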

  3. I reused DJI's functions for computing the disparity map and the point cloud.

Code for computing and filtering the disparity map

bool Disparity_filter::initDispParam(){
#ifdef USE_CUDA
  block_matcher_ = cv::cuda::createStereoBM(num_disp_, block_size_);
#else
  block_matcher_ = cv::StereoBM::create(num_disp_, block_size_);
#endif

#ifdef USE_OPEN_CV_CONTRIB
  // WLS post-filter built on the left matcher; lambda controls smoothness,
  // sigma_color controls how strongly edges in the left image are respected
  wls_filter_ = cv::ximgproc::createDisparityWLSFilter(block_matcher_);
  wls_filter_->setLambda(8000.0);
  wls_filter_->setSigmaColor(1.5);

  right_matcher_ = cv::ximgproc::createRightMatcher(block_matcher_);
#endif
  return true;
}


void Disparity_filter::computeDisparityMap(std::shared_ptr<Frame> framel, std::shared_ptr<Frame> framer){
  framel->raw_disparity_map_ = cv::Mat(HEIGHT, WIDTH, CV_16SC1);

#ifdef USE_CUDA
  cv::cuda::GpuMat cuda_disp_left;
  framel->cuda_crop_left.upload(framel->cpu_crop_left);
  framer->cuda_crop_right.upload(framer->cpu_crop_right);

  // the GPU implementation of StereoBM outputs uint8_t, i.e. CV_8U
  block_matcher_->compute(framel->cuda_crop_left.clone(),
                          framer->cuda_crop_right.clone(),
                          cuda_disp_left);
  cuda_disp_left.download(framel->raw_disparity_map_);

  framel->raw_disparity_map_.convertTo(framel->disparity_map_8u_, CV_8UC1, 1);

  // convert from CV_8U (whole pixels) to the CV_16S fixed-point format (x16)
  // expected by filterDisparityMap() & unprojectPtCloud()
  framel->raw_disparity_map_.convertTo(framel->raw_disparity_map_, CV_16S, 16);
#else
  // the CPU implementation of StereoBM outputs short int, i.e. CV_16S,
  // fixed point with 4 fractional bits, hence the 1/16 = 0.0625 scale below
  cv::Mat left_for_matcher, right_for_matcher;

  left_for_matcher  = framel->cpu_crop_left.clone();
  right_for_matcher = framer->cpu_crop_right.clone();
  cv::cvtColor(left_for_matcher,  left_for_matcher,  cv::COLOR_BGR2GRAY);
  cv::cvtColor(right_for_matcher, right_for_matcher, cv::COLOR_BGR2GRAY);

  block_matcher_->compute(left_for_matcher, right_for_matcher, framel->raw_disparity_map_);
  framel->raw_disparity_map_.convertTo(framel->disparity_map_8u_, CV_8UC1, 0.0625);
#endif
}


void Disparity_filter::filterDisparityMap(std::shared_ptr<Frame> framel, std::shared_ptr<Frame> framer){
  // note: cv::StereoBM expects 8-bit single-channel input, so the crops fed to
  // right_matcher_ here must be grayscale, like those used in computeDisparityMap()
  right_matcher_->compute(framer->cpu_crop_right.clone(),
                          framel->cpu_crop_left.clone(),
                          raw_right_disparity_map_);

  // the WLS filter only accepts CV_16S disparity maps
  wls_filter_->filter(framel->raw_disparity_map_,
                      framel->cpu_crop_left,
                      filtered_disparity_map_,
                      raw_right_disparity_map_);

  filtered_disparity_map_.convertTo(framel->filtered_disparity_map_8u_, CV_8UC1, 0.0625);
}
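
Not part of the question's code: since the goal is a cleaner disparity map, below is a sketch of the block-matcher parameters that are usually worth tuning for that purpose, assuming the CPU cv::StereoBM instance created in initDispParam() above. The concrete values are only starting points to experiment with, not recommendations.

// Illustrative StereoBM tuning (placeholder values, adjust per scene)
cv::Ptr<cv::StereoBM> bm = cv::StereoBM::create(num_disp_, block_size_);
bm->setPreFilterCap(31);        // clamp applied to the prefiltered image
bm->setTextureThreshold(10);    // drop low-texture regions (often noisy)
bm->setUniquenessRatio(15);     // reject ambiguous matches
bm->setSpeckleWindowSize(100);  // remove small disparity speckles
bm->setSpeckleRange(32);        // max disparity variation inside a speckle blob
bm->setDisp12MaxDiff(1);        // left-right consistency check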

Code for computing the point cloud


bool PointCloud::initPointCloud(){
  std::string stereo_c2 = "../calibration/sterolast.xml";  // alternatively: calib_stereo.xml
  cv::FileStorage ts(stereo_c2, cv::FileStorage::READ);
  if (!ts.isOpened()) {
    std::cerr << "Failed to open calibration parameter file." << std::endl;
    return false;
  }
  // P1 / P2 are the 3x4 rectified projection matrices written by stereoRectify()
  ts["P1"] >> param_proj_left_;
  ts["P2"] >> param_proj_right_;

  principal_x_ = param_proj_left_.at<double>(0, 2);
  principal_y_ = param_proj_left_.at<double>(1, 2);
  fx_ = param_proj_left_.at<double>(0, 0);
  fy_ = param_proj_left_.at<double>(1, 1);
  // for a rectified horizontal rig, P2(0,3) = -fx * baseline, so this stores fx * baseline
  baseline_x_fx_ = -param_proj_right_.at<double>(0, 3);
  std::cout << "** principal_x= " << principal_x_
            << "  ** principal_y= " << principal_y_
            << "  ** fx= " << fx_ << "  ** fy= " << fy_
            << "  ** baseline_x_fx= " << baseline_x_fx_ << std::endl;
  return true;
}
void PointCloud::unprojectPtCloud(std::shared_ptr<Frame> framel)
{
  // due to rectification the image borders are blank, so we cut them out
  int border_size = num_disp_;
  const int trunc_img_width_end  = WIDTH  - border_size;   // u (column) bound
  const int trunc_img_height_end = HEIGHT - border_size;   // v (row) bound

  mat_vec3_pt_ = cv::Mat_<cv::Vec3f>(HEIGHT, WIDTH, cv::Vec3f(0, 0, 0));
  for(int v = border_size; v < trunc_img_height_end; ++v)
  {
    for(int u = border_size; u < trunc_img_width_end; ++u)
    {
      cv::Vec3f &point = mat_vec3_pt_.at<cv::Vec3f>(v, u);

      // raw_disparity_map_ is CV_16S fixed point (x16); both the contrib and
      // non-contrib builds read the raw (unfiltered) disparity here
      float disparity = (float)(framel->raw_disparity_map_.at<short int>(v, u) * 0.0625);

      // skip points whose disparity is too small, i.e. too far away / unreliable
      if(disparity >= 60)
      {
        point[2] = baseline_x_fx_ / disparity;            // Z = fx * B / d
        point[0] = (u - principal_x_) * point[2] / fx_;   // X
        point[1] = (v - principal_y_) * point[2] / fy_;   // Y
      }
      color_buffer_[v * WIDTH + u] = framel->cpu_crop_left.at<uint8_t>(v, u);
    }
  }

  cv::Mat color_mat_ = cv::Mat(HEIGHT, WIDTH, CV_8UC1, &color_buffer_[0]).clone();
  framel->mat_vec3 = mat_vec3_pt_;
  framel->color_m  = color_mat_;
  pt_cloud_ = cv::viz::WCloud(mat_vec3_pt_, color_mat_);
}
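
Not in the original post: as an alternative to the manual unprojection above, OpenCV can reproject the whole disparity map at once with the Q matrix returned by cv::stereoRectify(). A minimal sketch, assuming Q was stored in the same calibration file (the "Q" key is an assumption):

// Sketch: dense reprojection with the Q matrix from stereoRectify()
cv::Mat Q;
cv::FileStorage fs("../calibration/sterolast.xml", cv::FileStorage::READ);
fs["Q"] >> Q;                      // assumes Q was saved alongside P1/P2

cv::Mat disp32f;
// StereoBM's CV_16S output is fixed point (x16): convert to float pixel units
framel->raw_disparity_map_.convertTo(disp32f, CV_32F, 1.0 / 16.0);

cv::Mat xyz;                       // CV_32FC3 image of (X, Y, Z) points
cv::reprojectImageTo3D(disp32f, xyz, Q, /*handleMissingValues=*/true);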

When I compute the disparity map and filter it, the map I get is not 100% clean (there are regions whose intensity keeps changing in the stream even though the cameras and the obstacles are static; it is not very clean, but acceptable). You can see a short video in which it was tested with the calibration file with RMS = 0.2.

Test of stereo vision: disparity map

Point cloud result

Questions

  • Is a stereo calibration performed with an RMS error of 0.20 good enough to obtain a clean disparity map and a complete point cloud of the field of view of both cameras?

  • How can I obtain a stable, clean disparity map and a clean depth map?

Thanks for your help :)

1 answer:

Answer 0 (score: 0)

"How can I obtain a stable, clean disparity map and a clean depth map?"

To answer this question, I looked at the video you shared. The filtered disparity map looks fine; the WLS filter you are using produces disparity maps like that, so there is nothing wrong there. In general, however, the filtered disparity map is not recommended as input for a point cloud, because the filter tends to fill in holes where the matching algorithm found nothing, in other words with unreliable data. So try feeding the unfiltered disparity map into the point cloud.
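
A minimal illustration of that advice, assuming the classes from the question (the wrapper objects disparity_filter and point_cloud are hypothetical names, not the asker's actual variables):

// Display the WLS-filtered map, but build the cloud from the raw one
disparity_filter->computeDisparityMap(framel, framer);  // fills framel->raw_disparity_map_
disparity_filter->filterDisparityMap(framel, framer);   // filtered map: visualization only

// unprojectPtCloud() reads framel->raw_disparity_map_ (the unfiltered CV_16S map),
// so the holes filled in by the WLS filter never enter the point cloud
point_cloud->unprojectPtCloud(framel);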

Also, the viewer you are using to look at the point cloud, MeshLab, tends to swallow some points, so you could try another viewer such as CloudCompare.

"Is the stereo calibration I performed with an RMS error of 0.20 good enough to obtain a clean disparity map and a complete point cloud of the field of view of both cameras?"

Yes, an RMS error of 0.20 is good enough in most cases. That said, the smaller the better.
