Ocean
Ocean::Geometry::EpipolarGeometry Class Reference

This class implements epipolar geometry functions. More...

#include <EpipolarGeometry.h>

Static Public Member Functions

static bool fundamentalMatrix (const Vector2 *leftPoints, const Vector2 *rightPoints, const size_t correspondences, SquareMatrix3 &right_F_left)
 Calculates the fundamental matrix based on corresponding image points in a 'left' and in a 'right' camera image.
 
static SquareMatrix3 reverseFundamentalMatrix (const SquareMatrix3 &right_F_left)
 Returns the reverted fundamental matrix.
 
static bool essentialMatrix (const Vector3 *leftImageRays, const Vector3 *rightImageRays, const size_t correspondences, SquareMatrix3 &normalizedRight_E_normalizedLeft)
 Calculates the essential matrix based on corresponding viewing rays from the 'left' and 'right' camera.
 
static bool essentialMatrixF (const Vector3 *flippedLeftImageRays, const Vector3 *flippedRightImageRays, const size_t correspondences, SquareMatrix3 &normalizedRight_E_normalizedLeft)
 Calculates the essential matrix based on corresponding viewing rays from the 'left' and 'right' camera.
 
static SquareMatrix3 essentialMatrix (const HomogenousMatrix4 &rightCamera_T_leftCamera)
 Calculates the essential matrix based on the 6-DOF camera pose between two cameras.
 
static SquareMatrix3 essential2fundamental (const SquareMatrix3 &normalizedRight_E_normalizedLeft, const SquareMatrix3 &leftIntrinsic, const SquareMatrix3 &rightIntrinsic)
 Calculates the fundamental matrix from a given essential matrix and the two intrinsic camera matrices.
 
static SquareMatrix3 essential2fundamental (const SquareMatrix3 &normalizedRight_E_normalizedLeft, const PinholeCamera &leftCamera, const PinholeCamera &rightCamera)
 Calculates the fundamental matrix from a given essential matrix and the two intrinsic camera matrices.
 
static SquareMatrix3 fundamental2essential (const SquareMatrix3 &right_F_left, const SquareMatrix3 &leftIntrinsic, const SquareMatrix3 &rightIntrinsic)
 Calculates the essential matrix from a given fundamental matrix and the two intrinsic camera matrices.
 
static SquareMatrix3 fundamental2essential (const SquareMatrix3 &right_F_left, const PinholeCamera &leftCamera, const PinholeCamera &rightCamera)
 Calculates the essential matrix by the given fundamental matrix and the two cameras.
 
static bool epipoles (const SquareMatrix3 &right_F_left, Vector2 &leftEpipole, Vector2 &rightEpipole)
 Determines the two epipoles corresponding to a fundamental matrix.
 
static bool epipoles (const HomogenousMatrix4 &extrinsic, const SquareMatrix3 &leftIntrinsic, const SquareMatrix3 &rightIntrinsic, Vector2 &leftEpipole, Vector2 &rightEpipole)
 Determines the two epipoles corresponding to two cameras separated by an extrinsic camera matrix.
 
static bool epipolesFast (const SquareMatrix3 &fundamental, Vector2 &leftEpipole, Vector2 &rightEpipole)
 Finds the two epipoles corresponding to a fundamental matrix.
 
static Line2 leftEpipolarLine (const SquareMatrix3 &fundamental, const Vector2 &rightPoint)
 Returns the epipolar line in the left image corresponding to a given point in the right image.
 
static Line2 rightEpipolarLine (const SquareMatrix3 &fundamental, const Vector2 &leftPoint)
 Returns the epipolar line in the right image corresponding to a given point in the left image.
 
static size_t factorizeEssential (const SquareMatrix3 &normalizedRight_E_normalizedLeft, const PinholeCamera &leftCamera, const PinholeCamera &rightCamera, const Vector2 *leftPoints, const Vector2 *rightPoints, const size_t correspondences, HomogenousMatrix4 &left_T_right)
 Factorizes an essential matrix into a 6-DOF camera pose composed of rotation and translation.
 
static bool rectificationHomography (const HomogenousMatrix4 &transformation, const PinholeCamera &pinholeCamera, SquareMatrix3 &leftHomography, SquareMatrix3 &rightHomography, Quaternion &appliedRotation, PinholeCamera *newCamera)
 Determines the homographies for two (stereo) frames rectifying both images, using the transformation between the left and the right camera.
 
static Vectors3 triangulateImagePoints (const HomogenousMatrix4 &world_T_cameraA, const HomogenousMatrix4 &world_T_cameraB, const AnyCamera &anyCameraA, const AnyCamera &anyCameraB, const Vector2 *imagePointsA, const Vector2 *imagePointsB, const size_t numberPoints, const bool onlyFrontObjectPoints=true, const Vector3 &invalidObjectPoint=Vector3(Numeric::minValue(), Numeric::minValue(), Numeric::minValue()), Indices32 *invalidIndices=nullptr)
 Calculates the 3D positions for a pair of image point correspondences with corresponding extrinsic camera transformations.
 
static ObjectPoints triangulateImagePointsIF (const PinholeCamera &camera1, const HomogenousMatrix4 &iFlippedPose1, const PinholeCamera &camera2, const HomogenousMatrix4 &iFlippedPose2, const Vector2 *points1, const Vector2 *points2, const size_t correspondences, const Vector3 &invalidObjectPoint=Vector3(Numeric::minValue(), Numeric::minValue(), Numeric::minValue()), Indices32 *invalidIndices=nullptr)
 Calculates the 3D positions for a set of image point correspondences with corresponding poses (Rt) in inverted flipped camera system.
 
static ObjectPoints triangulateImagePointsIF (const ConstIndexedAccessor< HomogenousMatrix4 > &posesIF, const ConstIndexedAccessor< Vectors2 > &imagePointsPerPose, const PinholeCamera *pinholeCamera=nullptr, const Vector3 &invalidObjectPoint=Vector3(Numeric::minValue(), Numeric::minValue(), Numeric::minValue()), Indices32 *invalidIndices=nullptr)
 Calculates the 3D positions for a set of image point correspondences in multiple views with corresponding camera projection matrices (K * Rt) or poses (Rt) in inverted flipped camera system.
 

Static Protected Member Functions

template<bool tRaysAreFlipped>
static bool essentialMatrix (const Vector3 *leftImageRays, const Vector3 *rightImageRays, const size_t correspondences, SquareMatrix3 &normalizedRight_E_normalizedLeft)
 Calculates the essential matrix based on corresponding viewing rays from the 'left' and 'right' camera.
 
static size_t validateCameraPose (const HomogenousMatrix4 &leftCamera_T_rightCamera, const PinholeCamera &leftCamera, const PinholeCamera &rightCamera, const Vector2 *leftPoints, const Vector2 *rightPoints, const size_t correspondences)
 Returns the number of 3D object points lying in front of two cameras for a given transformation between the two cameras.
 
static Line2 epipolarLine2Line (const Vector3 &line)
 Converts an epipolar line to a line object.
 

Detailed Description

This class implements epipolar geometry functions.

Member Function Documentation

◆ epipolarLine2Line()

Line2 Ocean::Geometry::EpipolarGeometry::epipolarLine2Line(const Vector3& line)
inline static protected

Converts an epipolar line to a line object.

Parameters
line: The epipolar line to be converted
Returns
Resulting line object

◆ epipoles() [1/2]

static bool Ocean::Geometry::EpipolarGeometry::epipoles(const HomogenousMatrix4& extrinsic, const SquareMatrix3& leftIntrinsic, const SquareMatrix3& rightIntrinsic, Vector2& leftEpipole, Vector2& rightEpipole)
static

Determines the two epipoles corresponding to two cameras separated by an extrinsic camera matrix.

Internally, the fundamental matrix is calculated from the extrinsic camera matrix of the right camera relative to the left camera
and the intrinsic camera matrices of both cameras.

Parameters
extrinsic: The extrinsic camera matrix of the right camera relative to the left camera (rightTleft)
leftIntrinsic: Intrinsic camera matrix of the left camera
rightIntrinsic: Intrinsic camera matrix of the right camera
leftEpipole: Resulting left epipole
rightEpipole: Resulting right epipole
Returns
True, if succeeded
See also
essentialMatrix().

◆ epipoles() [2/2]

static bool Ocean::Geometry::EpipolarGeometry::epipoles(const SquareMatrix3& right_F_left, Vector2& leftEpipole, Vector2& rightEpipole)
static

Determines the two epipoles corresponding to a fundamental matrix.

This method uses a singular value decomposition for the calculation.

Parameters
right_F_left: The fundamental matrix from which the epipoles are determined, must be valid
leftEpipole: Resulting left epipole
rightEpipole: Resulting right epipole
Returns
True, if succeeded

◆ epipolesFast()

static bool Ocean::Geometry::EpipolarGeometry::epipolesFast(const SquareMatrix3& fundamental, Vector2& leftEpipole, Vector2& rightEpipole)
static

Finds the two epipoles corresponding to a fundamental matrix.

This method calculates the intersection of two epipolar lines. If no intersection can be found, the SVD-based calculation is used.

Parameters
fundamental: The fundamental matrix to extract the epipoles from
leftEpipole: Resulting left epipole
rightEpipole: Resulting right epipole
Returns
True, if succeeded

◆ essential2fundamental() [1/2]

static SquareMatrix3 Ocean::Geometry::EpipolarGeometry::essential2fundamental(const SquareMatrix3& normalizedRight_E_normalizedLeft, const PinholeCamera& leftCamera, const PinholeCamera& rightCamera)
static

Calculates the fundamental matrix from a given essential matrix and the two intrinsic camera matrices.

Parameters
normalizedRight_E_normalizedLeft: The essential matrix to convert, must be valid
leftCamera: The left camera profile defining the projection, must be a pure pinhole model without any distortion parameters
rightCamera: The right camera profile defining the projection, must be a pure pinhole model without any distortion parameters
Returns
The resulting fundamental matrix 'right_F_left'

◆ essential2fundamental() [2/2]

static SquareMatrix3 Ocean::Geometry::EpipolarGeometry::essential2fundamental(const SquareMatrix3& normalizedRight_E_normalizedLeft, const SquareMatrix3& leftIntrinsic, const SquareMatrix3& rightIntrinsic)
static

Calculates the fundamental matrix from a given essential matrix and the two intrinsic camera matrices.

Parameters
normalizedRight_E_normalizedLeft: The essential matrix to convert, must be valid
leftIntrinsic: The left intrinsic camera matrix, must be valid
rightIntrinsic: The right intrinsic camera matrix, must be valid
Returns
The resulting fundamental matrix 'right_F_left'

◆ essentialMatrix() [1/3]

static SquareMatrix3 Ocean::Geometry::EpipolarGeometry::essentialMatrix(const HomogenousMatrix4& rightCamera_T_leftCamera)
static

Calculates the essential matrix based on the 6-DOF camera pose between two cameras.

The resulting essential matrix is defined by the following equation:

normalizedRightPoint^T * normalizedRight_E_normalizedLeft * normalizedLeftPoint = 0
with normalizedRightPoint = [flippedObjectPoint.x() / flippedObjectPoint.z(), flippedObjectPoint.y() / flippedObjectPoint.z(), 1]^T

The flipped object points and the normalized image points are defined in the flipped camera, with default flipped camera pointing towards the positive z-space with y-axis upwards.

Parameters
rightCamera_T_leftCamera: The transformation transforming the left camera to the right camera, with default camera pointing towards the negative z-space with y-axis upwards, must be valid
Returns
The resulting essential matrix 'normalizedRight_E_normalizedLeft'

◆ essentialMatrix() [2/3]

static bool Ocean::Geometry::EpipolarGeometry::essentialMatrix(const Vector3* leftImageRays, const Vector3* rightImageRays, const size_t correspondences, SquareMatrix3& normalizedRight_E_normalizedLeft)
static

Calculates the essential matrix based on corresponding viewing rays from the 'left' and 'right' camera.

This function implements the 8-point algorithm based on corresponding viewing rays. The resulting essential matrix is defined by the following equation:

rightViewingRay^T * normalizedRight_E_normalizedLeft * leftViewingRay = 0

The normalized image points are defined in the flipped camera, with default flipped camera pointing towards the positive z-space with y-axis upwards.

Parameters
leftImageRays: The 3D rays with unit length, defined in the coordinate system of the left camera, starting at the camera's center of projection (equal to the origin) and hitting the image plane at known image points
rightImageRays: The 3D rays with unit length, defined in the coordinate system of the right camera, starting at the camera's center of projection (equal to the origin) and hitting the image plane at known image points, one for each left image ray
correspondences: The number of bearing vector correspondences, with range [8, infinity)
normalizedRight_E_normalizedLeft: The resulting essential matrix
Returns
True, if succeeded

◆ essentialMatrix() [3/3]

template<bool tRaysAreFlipped>
static bool Ocean::Geometry::EpipolarGeometry::essentialMatrix(const Vector3* leftImageRays, const Vector3* rightImageRays, const size_t correspondences, SquareMatrix3& normalizedRight_E_normalizedLeft)
static protected

Calculates the essential matrix based on corresponding viewing rays from the 'left' and 'right' camera.

This function implements the 8-point algorithm based on corresponding viewing rays. The resulting essential matrix is defined by the following equation:

rightViewingRay^T * normalizedRight_E_normalizedLeft * leftViewingRay = 0

The normalized image points are defined in the flipped camera, with default flipped camera pointing towards the positive z-space with y-axis upwards.

Parameters
leftImageRays: The 3D rays with unit length, defined in the coordinate system of the left camera, starting at the camera's center of projection (equal to the origin) and hitting the image plane at known image points
rightImageRays: The 3D rays with unit length, defined in the coordinate system of the right camera, starting at the camera's center of projection (equal to the origin) and hitting the image plane at known image points, one for each left image ray
correspondences: The number of bearing vector correspondences, with range [8, infinity)
normalizedRight_E_normalizedLeft: The resulting essential matrix
Returns
True, if succeeded
Template Parameters
tRaysAreFlipped: True, to interpret the rays as flipped (pointing towards the positive z-space with y-axis upwards); False, to interpret the rays as normal (pointing towards the negative z-space with y-axis downwards)

◆ essentialMatrixF()

static bool Ocean::Geometry::EpipolarGeometry::essentialMatrixF(const Vector3* flippedLeftImageRays, const Vector3* flippedRightImageRays, const size_t correspondences, SquareMatrix3& normalizedRight_E_normalizedLeft)
static

Calculates the essential matrix based on corresponding viewing rays from the 'left' and 'right' camera.

This function implements the 8-point algorithm based on corresponding viewing rays. The resulting essential matrix is defined by the following equation:

rightViewingRay^T * normalizedRight_E_normalizedLeft * leftViewingRay = 0

The normalized image points are defined in the flipped camera, with default flipped camera pointing towards the positive z-space with y-axis upwards.

Parameters
flippedLeftImageRays: The flipped 3D rays with unit length, defined in the flipped coordinate system of the left camera, starting at the camera's center of projection (equal to the origin) and hitting the image plane at known image points
flippedRightImageRays: The flipped 3D rays with unit length, defined in the flipped coordinate system of the right camera, starting at the camera's center of projection (equal to the origin) and hitting the image plane at known image points, one for each left image ray
correspondences: The number of bearing vector correspondences, with range [8, infinity)
normalizedRight_E_normalizedLeft: The resulting essential matrix
Returns
True, if succeeded

◆ factorizeEssential()

static size_t Ocean::Geometry::EpipolarGeometry::factorizeEssential(const SquareMatrix3& normalizedRight_E_normalizedLeft, const PinholeCamera& leftCamera, const PinholeCamera& rightCamera, const Vector2* leftPoints, const Vector2* rightPoints, const size_t correspondences, HomogenousMatrix4& left_T_right)
static

Factorizes an essential matrix into a 6-DOF camera pose composed of rotation and translation.

Beware: The translation can be determined up to a scale factor only.
The factorization provides the camera pose for the right camera while the left camera is located at the origin with identity pose.
The resulting transformation transforms points defined in the right camera coordinate system into points defined in the left camera coordinate system: pointLeft = left_T_right * pointRight.

Parameters
normalizedRight_E_normalizedLeft: The essential matrix to be factorized, must be valid
leftCamera: The left camera profile defining the projection, must be valid
rightCamera: The right camera profile defining the projection, must be valid
leftPoints: All image points in the left image to be checked whether they produce 3D object points lying in front of the camera, must be valid
rightPoints: All image points in the right image, one for each left point, must be valid
correspondences: The number of point correspondences, with range [1, infinity)
left_T_right: The resulting transformation between the left and the right camera, transforming points from right to left
Returns
The number of given image points resulting in valid object points, with range [0, infinity)

◆ fundamental2essential() [1/2]

static SquareMatrix3 Ocean::Geometry::EpipolarGeometry::fundamental2essential(const SquareMatrix3& right_F_left, const PinholeCamera& leftCamera, const PinholeCamera& rightCamera)
static

Calculates the essential matrix by the given fundamental matrix and the two cameras.

Parameters
right_F_left: The fundamental matrix to convert into an essential matrix, must be valid
leftCamera: The left camera profile defining the projection, must be a pure pinhole model without any distortion parameters
rightCamera: The right camera profile defining the projection, must be a pure pinhole model without any distortion parameters
Returns
The resulting essential matrix 'normalizedRight_E_normalizedLeft'

◆ fundamental2essential() [2/2]

static SquareMatrix3 Ocean::Geometry::EpipolarGeometry::fundamental2essential(const SquareMatrix3& right_F_left, const SquareMatrix3& leftIntrinsic, const SquareMatrix3& rightIntrinsic)
static

Calculates the essential matrix from a given fundamental matrix and the two intrinsic camera matrices.

Parameters
right_F_left: The fundamental matrix to convert into an essential matrix, must be valid
leftIntrinsic: Left intrinsic camera matrix
rightIntrinsic: Right intrinsic camera matrix
Returns
The resulting essential matrix 'normalizedRight_E_normalizedLeft'

◆ fundamentalMatrix()

static bool Ocean::Geometry::EpipolarGeometry::fundamentalMatrix(const Vector2* leftPoints, const Vector2* rightPoints, const size_t correspondences, SquareMatrix3& right_F_left)
static

Calculates the fundamental matrix based on corresponding image points in a 'left' and in a 'right' camera image.

The resulting fundamental matrix is defined by the following equation:

rightPoint^T * right_F_left * leftPoint = 0
Parameters
leftPoints: The left image points, must be valid
rightPoints: The right image points, must be valid
correspondences: The number of point correspondences, with range [8, infinity)
right_F_left: The resulting fundamental matrix
Returns
True, if succeeded

◆ leftEpipolarLine()

Line2 Ocean::Geometry::EpipolarGeometry::leftEpipolarLine(const SquareMatrix3& fundamental, const Vector2& rightPoint)
inline static

Returns the epipolar line in the left image corresponding to a given point in the right image.

Parameters
fundamental: The fundamental matrix
rightPoint: Point in the right image
Returns
Resulting epipolar line in the left image

◆ rectificationHomography()

static bool Ocean::Geometry::EpipolarGeometry::rectificationHomography(const HomogenousMatrix4& transformation, const PinholeCamera& pinholeCamera, SquareMatrix3& leftHomography, SquareMatrix3& rightHomography, Quaternion& appliedRotation, PinholeCamera* newCamera)
static

Determines the homographies for two (stereo) frames rectifying both images, using the transformation between the left and the right camera.

As the resulting homographies may not cover the entire input images when the same camera profile is used, a new (perfect) camera profile can be calculated instead.
Thus, the resulting rectified images will have a larger field of view but will cover the entire input frame data.
The projection center of the left camera is expected to be at the origin of the world coordinate system.
The viewing directions of both cameras are towards the negative z-axis in their particular coordinate systems.
The given transformation is equal to the extrinsic camera matrix of the right camera
and thus transforms points defined inside the right camera coordinate system to points defined inside the left camera coordinate system.
The resulting homography transformations transform 3D rectified image points (homogeneous 2D coordinates) into 3D unrectified image points for their particular coordinate system.
The coordinate system of the 3D image points has its origin in the top left corner, while the x-axis points to the right, the y-axis points to the bottom, and the z-axis to the back of the image.

Parameters
transformation: Extrinsic camera matrix of the right camera with negative z-axis as viewing direction
pinholeCamera: The pinhole camera profile used for both images
leftHomography: Resulting left homography
rightHomography: Resulting right homography
appliedRotation: Resulting rotation applied to both cameras
newCamera: Optional resulting new camera profile used to cover the entire input image data in the output frames, otherwise nullptr
Returns
True, if succeeded

◆ reverseFundamentalMatrix()

SquareMatrix3 Ocean::Geometry::EpipolarGeometry::reverseFundamentalMatrix(const SquareMatrix3& right_F_left)
inline static

Returns the reverted fundamental matrix.

In fact, the matrix is only transposed: from rightPoint^T * right_F_left * leftPoint = 0 it follows directly that leftPoint^T * right_F_left^T * rightPoint = 0.

Parameters
right_F_left: The fundamental matrix satisfying the equation rightPoint^T * right_F_left * leftPoint = 0
Returns
The resulting reverted fundamental matrix 'left_F_right'

◆ rightEpipolarLine()

Line2 Ocean::Geometry::EpipolarGeometry::rightEpipolarLine(const SquareMatrix3& fundamental, const Vector2& leftPoint)
inline static

Returns the epipolar line in the right image corresponding to a given point in the left image.

Parameters
fundamental: The fundamental matrix
leftPoint: Point in the left image
Returns
Resulting epipolar line in the right image

◆ triangulateImagePoints()

static Vectors3 Ocean::Geometry::EpipolarGeometry::triangulateImagePoints(const HomogenousMatrix4& world_T_cameraA, const HomogenousMatrix4& world_T_cameraB, const AnyCamera& anyCameraA, const AnyCamera& anyCameraB, const Vector2* imagePointsA, const Vector2* imagePointsB, const size_t numberPoints, const bool onlyFrontObjectPoints = true, const Vector3& invalidObjectPoint = Vector3(Numeric::minValue(), Numeric::minValue(), Numeric::minValue()), Indices32* invalidIndices = nullptr)
static

Calculates the 3D positions for a pair of image point correspondences with corresponding extrinsic camera transformations.

Parameters
world_T_cameraA: The extrinsic camera transformation of the first camera, with the camera pointing towards the negative z-space, y-axis pointing up, and the x-axis pointing to the right, must be valid
world_T_cameraB: The extrinsic camera transformation of the second camera, with the camera pointing towards the negative z-space, y-axis pointing up, and the x-axis pointing to the right, must be valid
anyCameraA: The first camera profile, must be valid
anyCameraB: The second camera profile, must be valid
imagePointsA: The set of 2D image points in the first image, each point must correspond to the point with the same index from the second image
imagePointsB: The set of 2D image points in the second image, each point must correspond to the point with the same index from the first image
numberPoints: The number of point correspondences, with range [0, infinity)
onlyFrontObjectPoints: True, to use only points located in front of the cameras; False, to use all points
invalidObjectPoint: Optional, the location of an invalid object point which will be used as the value for all object points which cannot be determined, e.g., because of parallel projection rays
invalidIndices: Optional resulting indices of the resulting object points with invalid locations
Returns
The resulting triangulated object points

◆ triangulateImagePointsIF() [1/2]

static ObjectPoints Ocean::Geometry::EpipolarGeometry::triangulateImagePointsIF(const ConstIndexedAccessor<HomogenousMatrix4>& posesIF, const ConstIndexedAccessor<Vectors2>& imagePointsPerPose, const PinholeCamera* pinholeCamera = nullptr, const Vector3& invalidObjectPoint = Vector3(Numeric::minValue(), Numeric::minValue(), Numeric::minValue()), Indices32* invalidIndices = nullptr)
static

Calculates the 3D positions for a set of image point correspondences in multiple views with corresponding camera projection matrices (K * Rt) or poses (Rt) in inverted flipped camera system.

This linear triangulation uses a singular value decomposition.

Parameters
posesIF: The given poses or projection matrices, one per view
imagePointsPerPose: The set of 2D image points per view, each point must correspond to the points with the same index in the other views
pinholeCamera: The pinhole camera profile, one for all views; if no camera profile is given, the posesIF act as projection matrices
invalidObjectPoint: Optional, the location of an invalid object point which will be used as the value for all object points which cannot be determined, e.g., because of parallel projection rays
invalidIndices: Optional resulting indices of the resulting object points with invalid locations
Returns
The resulting object points, in inverted flipped coordinates

◆ triangulateImagePointsIF() [2/2]

static ObjectPoints Ocean::Geometry::EpipolarGeometry::triangulateImagePointsIF(const PinholeCamera& camera1, const HomogenousMatrix4& iFlippedPose1, const PinholeCamera& camera2, const HomogenousMatrix4& iFlippedPose2, const Vector2* points1, const Vector2* points2, const size_t correspondences, const Vector3& invalidObjectPoint = Vector3(Numeric::minValue(), Numeric::minValue(), Numeric::minValue()), Indices32* invalidIndices = nullptr)
static

Calculates the 3D positions for a set of image point correspondences with corresponding poses (Rt) in inverted flipped camera system.

This linear triangulation uses a singular value decomposition.
If an object point cannot be determined, then the resulting object point will have the value (0, 0, 0).

Parameters
camera1: The camera profile used for the first image
iFlippedPose1: The given inverted flipped pose (projection matrix) of the first camera
camera2: The camera profile used for the second image
iFlippedPose2: The given inverted flipped pose (projection matrix) of the second camera
points1: The set of 2D image points in the first image, each point must correspond to the point with the same index in the second image
points2: The set of 2D image points in the second image
correspondences: The number of point correspondences, with range [1, infinity)
invalidObjectPoint: Optional, the location of an invalid object point which will be used as the value for all object points which cannot be determined, e.g., because of parallel projection rays
invalidIndices: Optional resulting indices of the resulting object points with invalid locations
Returns
The resulting object points, in inverted flipped coordinate space

◆ validateCameraPose()

static size_t Ocean::Geometry::EpipolarGeometry::validateCameraPose(const HomogenousMatrix4& leftCamera_T_rightCamera, const PinholeCamera& leftCamera, const PinholeCamera& rightCamera, const Vector2* leftPoints, const Vector2* rightPoints, const size_t correspondences)
static protected

Returns the number of 3D object points lying in front of two cameras for a given transformation between the two cameras.

The pose of the first camera is located at the origin (identity transformation), while the pose of the second camera is defined by the given transformation.

Parameters
leftCamera_T_rightCamera: The transformation between the right and the left camera, must be valid
leftCamera: The left camera profile defining the projection, must be valid
rightCamera: The right camera profile defining the projection, must be valid
leftPoints: The left image points, must be valid if correspondences != 0
rightPoints: The right image points, one for each left point, must be valid if correspondences != 0
correspondences: The number of provided point correspondences, with range [0, infinity)
Returns
Number of valid object points, with range [0, correspondences]

The documentation for this class was generated from the file EpipolarGeometry.h.