Google Tango: aligning the ADF frame with the color camera frame

Asked: 2015-07-15 18:16:35

Tags: augmented-reality google-project-tango

I am trying to create a Unity augmented reality application that determines its pose through ADF localization. I added the camera feed to the Persistent State demo, but my ADF-localized frame does not line up with the camera feed: the ground always appears to be shifted beyond the horizon.

2 Answers:

Answer 0 (score: 1)

I had the same problem and I found a solution. The problem is that the PoseController script does not set up the real-world frame correctly (the horizon ends up wrong), whereas the AugmentedReality example does it properly (in its ARScreen script). As a result, in my AR app the augmented objects seemed to drift upward as I moved away from them.

To fix it, you can either start your app from the AugmentedReality example, as Guillaume says, or make the following changes in the PoseController script:

- 1st: we have the matrix m_matrixdTuc initialized as follows:

```csharp
m_matrixdTuc = new Matrix4x4();
m_matrixdTuc.SetColumn(0, new Vector4(1.0f, 0.0f, 0.0f, 0.0f));
m_matrixdTuc.SetColumn(1, new Vector4(0.0f, 1.0f, 0.0f, 0.0f));
m_matrixdTuc.SetColumn(2, new Vector4(0.0f, 0.0f, -1.0f, 0.0f));
m_matrixdTuc.SetColumn(3, new Vector4(0.0f, 0.0f, 0.0f, 1.0f));
```

Well, we have to rename it to m_matrixcTuc and move the -1.0f value:

```csharp
m_matrixcTuc = new Matrix4x4();
m_matrixcTuc.SetColumn(0, new Vector4(1.0f, 0.0f, 0.0f, 0.0f));
m_matrixcTuc.SetColumn(1, new Vector4(0.0f, -1.0f, 0.0f, 0.0f));
m_matrixcTuc.SetColumn(2, new Vector4(0.0f, 0.0f, 1.0f, 0.0f));
m_matrixcTuc.SetColumn(3, new Vector4(0.0f, 0.0f, 0.0f, 1.0f));
```
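To see what moving the -1.0f actually does, here is a small sketch in plain Python (my own illustration with a made-up point, not Tango SDK code): both constants are pure axis mirrors between coordinate conventions; the original matrix mirrors the Z axis, while the renamed m_matrixcTuc mirrors the Y axis instead.

```python
# Illustration only (not Tango SDK code). Both matrices are diagonal,
# so the row-major nested lists below match Unity's column-filled versions.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

m_matrixdTuc = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, -1, 0], [0, 0, 0, 1]]  # mirrors Z
m_matrixcTuc = [[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]  # mirrors Y

p = [1.0, 2.0, 3.0, 1.0]  # a made-up homogeneous point
print(mat_vec(m_matrixdTuc, p))  # [1.0, 2.0, -3.0, 1.0]
print(mat_vec(m_matrixcTuc, p))  # [1.0, -2.0, 3.0, 1.0]
```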

That's because we are not going to use this matrix directly to get the camera's transform; we first have to do some operations on it to get the real m_matrixdTuc.

- 2nd: we have to declare these three variables:

```csharp
private Matrix4x4 m_matrixdTuc;
// Device frame with respect to IMU frame.
private Matrix4x4 m_imuTd;
// Color camera frame with respect to IMU frame.
private Matrix4x4 m_imuTc;
```

(the real _SetCameraExtrinsics and the other two come from the ARScreen script)

- 3rd: we need the _SetCameraExtrinsics method from the ARScreen script:

```csharp
/// <summary>
/// The function is for querying the camera extrinsics, for example: the transformation between
/// IMU and device frame. These extrinsics are used to transform the pose from the color camera
/// frame to the device frame. Because the extrinsics are queried using GetPoseAtTime()
/// with a desired frame pair, they can only be queried after ConnectToService() is called.
///
/// The device with respect to IMU frame is not directly queryable from the API, so we use the
/// IMU frame as a temporary value to get the device frame with respect to the IMU frame.
/// </summary>
private void _SetCameraExtrinsics()
{
    double timestamp = 0.0;
    TangoCoordinateFramePair pair;
    TangoPoseData poseData = new TangoPoseData();

    // Getting the transformation of the device frame with respect to the IMU frame.
    pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_IMU;
    pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
    PoseProvider.GetPoseAtTime(poseData, timestamp, pair);
    Vector3 position = new Vector3((float)poseData.translation[0],
                                   (float)poseData.translation[1],
                                   (float)poseData.translation[2]);
    Quaternion quat = new Quaternion((float)poseData.orientation[0],
                                     (float)poseData.orientation[1],
                                     (float)poseData.orientation[2],
                                     (float)poseData.orientation[3]);
    m_imuTd = Matrix4x4.TRS(position, quat, new Vector3(1.0f, 1.0f, 1.0f));

    // Getting the transformation of the color camera frame with respect to the IMU frame.
    pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_IMU;
    pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_CAMERA_COLOR;
    PoseProvider.GetPoseAtTime(poseData, timestamp, pair);
    position = new Vector3((float)poseData.translation[0],
                           (float)poseData.translation[1],
                           (float)poseData.translation[2]);
    quat = new Quaternion((float)poseData.orientation[0],
                          (float)poseData.orientation[1],
                          (float)poseData.orientation[2],
                          (float)poseData.orientation[3]);
    m_imuTc = Matrix4x4.TRS(position, quat, new Vector3(1.0f, 1.0f, 1.0f));

    m_matrixdTuc = Matrix4x4.Inverse(m_imuTd) * m_imuTc * m_matrixcTuc;
}
```
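Why does inverse(m_imuTd) * m_imuTc give a camera-to-device transform? Both extrinsics share the IMU frame, which cancels in the product. A minimal plain-Python sketch, using translation-only matrices and made-up offsets (not real Tango extrinsics):

```python
# Sketch with made-up numbers: dTc = inverse(imuTd) * imuTc maps
# camera-frame coordinates into the device frame; the IMU offsets cancel.

def mat_mul(a, b):
    """Multiply two 4x4 matrices stored as row-major nested lists."""
    return [[sum(a[r][k] * b[k][c] for k in range(4)) for c in range(4)]
            for r in range(4)]

def translation(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def invert_translation(t):
    # The inverse of a pure translation just negates the offset.
    return translation(-t[0][3], -t[1][3], -t[2][3])

imuTd = translation(0.0, 0.1, 0.0)   # made-up: device 10 cm from the IMU
imuTc = translation(0.0, 0.1, 0.05)  # made-up: camera 5 cm beyond the device

dTc = mat_mul(invert_translation(imuTd), imuTc)
# Only the camera-to-device offset survives the cancellation.
print([row[3] for row in dTc[:3]])  # [0.0, 0.0, 0.05]
```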

Now you can see that we obtain the real value of the m_matrixdTuc variable, which will then be used in the OnTangoPoseAvailable method.

I don't really understand the math behind this method, but I've found that it works perfectly in my app. Hope it works for you too :)
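One way to read the math (my own interpretation, not from the Tango docs): in the xTy naming these scripts use, a matrix maps points from frame y into frame x, so inverse(m_imuTd) * m_imuTc * m_matrixcTuc chains device ← IMU ← color camera ← Unity camera, and the multiplication order matters because matrix products do not commute. A plain-Python sketch with a made-up stand-in extrinsic:

```python
# Sketch: applying the convention flip on the wrong side of an extrinsic
# yields a different pose, which is why the product order above matters.

def mat_mul(a, b):
    """Multiply two 4x4 matrices stored as row-major nested lists."""
    return [[sum(a[r][k] * b[k][c] for k in range(4)) for c in range(4)]
            for r in range(4)]

flip_y = [[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]    # like m_matrixcTuc
rot_z_90 = [[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]  # made-up extrinsic

a = mat_mul(rot_z_90, flip_y)  # flip applied first, then the extrinsic
b = mat_mul(flip_y, rot_z_90)  # the other order
print(a == b)  # False
```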

Answer 1 (score: 0)

I already tried something similar. If I understand correctly, the objects you add to be augmented are rotated toward the top of the screen, and this is especially visible when they are far away, as they look lifted upward.

To make it work, I simply started my project from the Experimental Augmented Reality app (the one with the "red map marker"): there, the camera feed is already properly aligned with the world.