CPR in vtkImageCPRMapper stands for Curved Planar Reformation. This mapper can be used to visualize tubular structures such as blood vessels. It can be used in projected mode, stretched mode or straightened mode depending on the settings @see getUseUniformOrientation , @see getCenterPoint and the distance function of @see getOrientedCenterline .
This specialised mapper takes as input a vtkImageData representing a volume ( @see setImageData ) and a vtkPolyData representing a centerline ( @see setCenterlineData ). The mapper also needs an orientation, either per centerline point or shared by all points. This can be specified using a uniform orientation ( @see getUniformOrientation @see getUseUniformOrientation ) or a point data array ( @see getOrientationArrayName ). Every point, vector or length given to the mapper (centerline points, orientation, width…) uses the model coordinates of the volume used as input ( @see setImageData ).
For each segment of the centerline the mapper creates a quad of the specified width ( @see getWidth ) and of height equal to the length of the segment. The position and the orientation of the centerline are interpolated along the y-axis of the quad: the position is linearly interpolated (lerp) and the orientation is interpolated using spherical linear interpolation (slerp). For a point (x, y) on the quad, the value of y gives the interpolated position P and the interpolated orientation O, which combined with tangentDirection gives the sampling direction D ( @see getTangentDirection ). The value of x, between -0.5 and 0.5, then gives the position to sample in the volume: P + x*D.
By computing the right centerline positions and orientations, one can simulate Stretched CPR and Straightened CPR.
This class resolves coincident topology with the same methods as vtkMapper.
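Below is a minimal, hedged sketch of how such a mapper is typically wired up; `imageData`, `centerlinePolyData` and `renderer` are assumed to already exist, and the use of a vtkImageSlice actor is an assumption of this sketch rather than something mandated by this class.

```js
import '@kitware/vtk.js/Rendering/Profiles/All';
import vtkImageCPRMapper from '@kitware/vtk.js/Rendering/Core/ImageCPRMapper';
import vtkImageSlice from '@kitware/vtk.js/Rendering/Core/ImageSlice';

// `imageData` is a vtkImageData (the volume) and `centerlinePolyData` is a
// vtkPolyData holding one or more polylines plus an orientation point array.
const mapper = vtkImageCPRMapper.newInstance();
mapper.setImageData(imageData); // input 0: the volume to sample
mapper.setCenterlineData(centerlinePolyData); // input 1: the centerline
mapper.setWidth(100); // width of the reformatted image, in model coordinates

const actor = vtkImageSlice.newInstance();
actor.setMapper(mapper);
renderer.addActor(actor); // `renderer` is an existing vtkRenderer
```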
Methods
extend
Method used to decorate a given object (publicAPI+model) with vtkImageCPRMapper characteristics.
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| publicAPI | object | Yes | object on which methods will be bound (public) |
| model | object | Yes | object on which data structure will be bound (protected) |
| initialValues | IImageCPRMapperInitialValues | No | (default: {}) |
getBitangentDirection
For each point on the oriented centerline, the bitangent direction, together with the normal and the tangent direction, forms a new basis. Default value: [0, 1, 0]
getCenterPoint
A point used to offset each line of pixels in the rendering. Each line of pixels is offset so that the center of the line is as close as possible to the center point. This can be used in combination with @see getUseUniformOrientation and a custom distance function for @see getOrientedCenterline to visualize a CPR in projected mode or stretched mode. Defaults to null.
Returns
| Type | Description |
| ---- | ----------- |
| Nullable<vec3> | the center point |
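A minimal sketch, assuming `mapper` is the vtkImageCPRMapper instance and the coordinates below are placeholders expressed in the model space of the input volume:

```js
// Re-center each output row on a point of interest (e.g. a lesion location).
mapper.setCenterPoint([12.5, -4.0, 88.0]);

// Passing null removes the per-row offset again.
mapper.setCenterPoint(null);
```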
getCenterlinePositionAndOrientation
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| distance | number | Yes | Distance from the beginning of the centerline, following the centerline, in model coordinates |
Returns
| Type | Description |
| ---- | ----------- |
| { position?: vec3; orientation?: quat } | The position and orientation that are at the given distance from the beginning of the centerline. If the distance is negative or greater than the length of the centerline, position and orientation are not defined. If the centerline is not oriented, orientation is not defined. |
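A hedged sketch of querying the interpolated frame halfway along the centerline; it assumes getHeight() matches the total centerline length in model coordinates and uses gl-matrix to apply the returned quaternion:

```js
import { vec3 } from 'gl-matrix';

const halfway = 0.5 * mapper.getHeight(); // distance along the centerline
const { position, orientation } = mapper.getCenterlinePositionAndOrientation(halfway);
if (position && orientation) {
  // Rotate the tangent direction by the interpolated orientation to obtain the
  // sampling direction D at this point (the mapper samples P + x*D).
  const direction = vec3.transformQuat([], mapper.getTangentDirection(), orientation);
  console.log('P =', position, 'D =', direction);
}
```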
getCenterlineTangentDirections
Returns
| Type | Description |
| ---- | ----------- |
| Float32Array | A flat array of vec3 representing the direction at each point of the centerline. It is computed from the orientations of the centerline and tangentDirection. Uses caching to avoid recomputing at each frame. |
getDirectionMatrix
The direction matrix is the matrix composed of the tangent, bitangent and normal directions. It is used to orient the camera or the actor.
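For illustration, a sketch of orienting the camera from this matrix; the column layout (tangent, bitangent, normal) and the choice of view-up are assumptions of this example, so adjust axes or signs if the image appears flipped:

```js
// getDirectionMatrix() returns a gl-matrix mat3 (column-major).
const d = mapper.getDirectionMatrix();
const bitangent = [d[3], d[4], d[5]]; // assumed second column
const normal = [d[6], d[7], d[8]]; // assumed third column

const camera = renderer.getActiveCamera();
camera.setDirectionOfProjection(normal[0], normal[1], normal[2]);
camera.setViewUp(bitangent[0], bitangent[1], bitangent[2]);
renderer.resetCamera();
```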
getHeight
Returns
| Type | Description |
| ---- | ----------- |
| number | The total height of the image in model coordinates. |
getNormalDirection
For each point on the oriented centerline, the normal direction is the direction along the centerline. Default value: [0, 0, 1]
getOrientationArrayName
OrientationArrayName specifies the name of the data array which gives an orientation for each point of the centerline. The data array has to be in the PointData attribute of the centerline input. If null, the mapper looks for the orientation data array in: “Orientation”, “Direction”, Vectors, Tensors, Normals. The data array should be an array of mat4, mat3, quat or vec3, but using vec3 makes the CPRInteractor unusable. Defaults to null.
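A sketch of attaching one quaternion per centerline point; the 4-component layout and the `quaternions` Float32Array (length 4 × number of points) are assumptions of this example:

```js
import vtkDataArray from '@kitware/vtk.js/Common/Core/DataArray';

// One gl-matrix quat (x, y, z, w) per centerline point.
centerlinePolyData.getPointData().addArray(
  vtkDataArray.newInstance({
    name: 'Orientation',
    numberOfComponents: 4, // 4 => quat, 9 => mat3, 16 => mat4
    values: quaternions,
  })
);
// Optional here: the default (null) already looks for an array named 'Orientation'.
mapper.setOrientationArrayName('Orientation');
```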
getOrientationDataArray
Find the data array to use for orientation in the input polydata ( @see getOrientationArrayName )
getOrientedCenterline
Recompute the oriented centerline from the input polydata if needed and return the result. If there is no polydata as input, return the last oriented centerline. It means that if no polydata is given as input and the centerline is set using @see setOrientedCenterline , the given centerline will be used.
getPreferSizeOverAccuracy
This flag indicates whether the GPU should use half floats or not. When true, half floats are always used. When false, half floats may still be used if there is no loss of accuracy (see checkUseHalfFloat in Texture). Defaults to false.
Returns
| Type | Description |
| ---- | ----------- |
| boolean | the preferSizeOverAccuracy flag |
getProjectionMode
The different modes of projection. Available modes include MIP, MinIP and AverageIP.
getProjectionSlabNumberOfSamples
Total number of samples of the volume taken by the projection mode. If this number is less than or equal to 1, projection is disabled. Using an odd number is advised: if this number is even, the center of the slab will not be sampled.
getProjectionSlabThickness
Thickness of the projection slab in image coordinates (NOT in voxels). Usually in millimeters if the spacing of the input image is set from a DICOM.
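A sketch of enabling an average-intensity projection over a 5 mm slab; the Constants import path is assumed from the Constants.d.ts shown in the Source section below:

```js
import { ProjectionMode } from '@kitware/vtk.js/Rendering/Core/ImageCPRMapper/Constants';

mapper.setProjectionMode(ProjectionMode.AVERAGE); // MAX and MIN are also available
mapper.setProjectionSlabThickness(5); // in model coordinates (often millimeters)
mapper.setProjectionSlabNumberOfSamples(21); // odd count so the slab center is sampled
// A value of 1 or less disables projection:
// mapper.setProjectionSlabNumberOfSamples(1);
```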
getTangentDirection
For each point on the oriented centerline, the tangent direction is the direction in which the mapper will sample. Let O (a mat3) be the orientation at a point on the centerline, and N (a vec3) the tangent direction: the mapper will then sample along O * N. Default value: [1, 0, 0]
getUniformDirection
Returns
| Type | Description |
| ---- | ----------- |
| vec3 | The direction to sample, in model space, computed using uniform orientation and tangent direction |
getUniformOrientation
Use @see getUseUniformOrientation to use the uniform orientation instead of the orientation specified by the centerline.
Returns
| Type | Description |
| ---- | ----------- |
| TOrientation | the uniform orientation of the centerline |
getUpdatedExtents
Retrieves the updated extents.
This array is cleared after every successful render.
getUseUniformOrientation
This flag specifies whether the mapper should use the uniformOrientation ( @see getUniformOrientation ) or the orientation specified by the centerline input ( @see setCenterlineData ). Defaults to false.
Returns
| Type | Description |
| ---- | ----------- |
| boolean | the useUniformOrientation flag |
getWidth
Returns
| Type | Description |
| ---- | ----------- |
| number | the width of the image in model coordinates of the input volume |
isProjectionEnabled
Returns whether projection is enabled. It is based on the number of samples ( @see getProjectionSlabNumberOfSamples ).
newInstance
Method used to create a new instance of vtkImageCPRMapper
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| initialValues | IImageCPRMapperInitialValues | No | for pre-setting some of its content |
preRenderCheck
Returns
| Type | Description |
| ---- | ----------- |
| boolean | A boolean indicating if the mapper is ready to render |
setBitangentDirection
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| bitangent | vec3 | Yes | |
setCenterPoint
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| point | Nullable<vec3> | Yes | |
setCenterlineConnection
Same as setCenterlineData except it uses an output port instead of a polydata. You can also use `publicAPI.setInputConnection(centerlineConnection, 1);`
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| centerlineConnection | vtkOutputPort | Yes | |
setCenterlineData
Set the polydata used as a centerline. You can also use `publicAPI.setInputData(centerlineData, 1);`. All the segments of all the polylines are used (the centerline can be in multiple pieces). The polydata can contain a PointData DataArray to specify the direction in which the mapper should sample for each point of the centerline ( @see getDirectionArrayName @see getDirectionArrayOffset ). If no such point data is specified, a uniform direction can be used instead ( @see getUniformDirection @see getUseUniformOrientation ). The points of the centerline are in model coordinates of the volume used as input ( @see setImageData ) and not index coordinates (see the MCTC matrix of the OpenGL ImageCPRMapper). Use `imageData.getWorldToIndex();` or `imageData.getIndexToWorld();` to go from model coordinates to index coordinates or the other way around. A construction sketch follows the argument table below.
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| centerlineData | vtkPolyData | Yes | A polydata containing one or multiple polyline(s) and optionally a PointData DataArray for direction |
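A minimal construction sketch; the three placeholder positions are assumed to be in the model coordinates of the input volume, and an orientation array is attached as in the getOrientationArrayName example above (or a uniform orientation is enabled instead):

```js
import vtkPolyData from '@kitware/vtk.js/Common/DataModel/PolyData';

// Placeholder points along the vessel, in model coordinates.
const positions = Float32Array.from([0, 0, 0, 0, 0, 20, 5, 0, 40]);
const nPoints = positions.length / 3;

const centerlinePolyData = vtkPolyData.newInstance();
centerlinePolyData.getPoints().setData(positions, 3);

// A single polyline cell: [numberOfPoints, pointId0, pointId1, ...]
const lines = new Uint32Array(nPoints + 1);
lines[0] = nPoints;
for (let i = 0; i < nPoints; ++i) lines[i + 1] = i;
centerlinePolyData.getLines().setData(lines);

mapper.setCenterlineData(centerlinePolyData); // same as setInputData(centerlinePolyData, 1)
```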
setDirectionMatrix
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| mat | mat3 | Yes | |
setImageConnection
Set the connection for the volume. You can also use `publicAPI.setInputConnection(imageData, 0);`
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| imageData | vtkOutputPort | Yes | |
setImageData
Set the volume which should be sampled by the mapper. You can also use `publicAPI.setInputData(imageData, 0);`. The model coordinates of this imageData are used by this mapper when specifying points, vectors or width (see the MCTC matrix of the OpenGL ImageCPRMapper). You can use `imageData.getWorldToIndex();` or `imageData.getIndexToWorld();` to go from these model coordinates to index coordinates or the other way around, as shown in the sketch after the argument table below.
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| imageData | vtkImageData | Yes | |
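A sketch of converting a voxel (index) coordinate into the model coordinates the mapper expects; the voxel index below is a placeholder:

```js
import { vec3 } from 'gl-matrix';

mapper.setImageData(imageData); // same as publicAPI.setInputData(imageData, 0)

// Convert a voxel index to model coordinates with the image's indexToWorld matrix.
const ijk = [42, 30, 12]; // placeholder voxel coordinate
const modelPoint = vec3.transformMat4([], ijk, imageData.getIndexToWorld());
// The opposite direction uses imageData.getWorldToIndex().
mapper.setCenterPoint(modelPoint);
```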
setNormalDirection
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| normal | vec3 | Yes | |
setOrientationArrayName
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| arrayName | Nullable<string> | Yes | |
setOrientedCenterline
Set the internal oriented centerline. WARNING: this centerline will be overwritten if the polydata centerline is specified (input 1, @see setCenterlineData ).
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| centerline | vtkPolyLine | Yes | An oriented centerline |
setPreferSizeOverAccuracy
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| preferSizeOverAccuracy | boolean | Yes | |
setProjectionMode
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| projectionMode | ProjectionMode | Yes | |
setProjectionSlabNumberOfSamples
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| projectionSlabNumberOfSamples | number | Yes | |
setProjectionSlabThickness
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| projectionSlabThickness | number | Yes | |
setTangentDirection
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| tangent | vec3 | Yes | |
setUniformOrientation
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| orientation | TOrientation | Yes | |
setUpdatedExtents
Tells the mapper to only update the specified extents.
If there are zero extents, the mapper updates the entire volume texture. Otherwise, the mapper will only update the texture by the specified extents during the next render call.
This array is cleared after a successful render.
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| extents | Extent[] | Yes | |
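A sketch of restricting the next texture upload to a sub-region after editing the volume's scalars; the [iMin, iMax, jMin, jMax, kMin, kMax] extent layout and the values below are assumptions of this example:

```js
// Only re-upload the voxels inside this extent during the next render.
mapper.setUpdatedExtents([[0, 63, 0, 63, 10, 20]]);
renderWindow.render(); // `renderWindow` is an existing vtkRenderWindow; the list is cleared afterwards
```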
setUseUniformOrientation
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| useUniformOrientation | boolean | Yes | |
setWidth
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| width | number | Yes | |
useStraightenedMode
Configure the mapper and the centerline to be in straightened CPR mode
useStretchedMode
Configure the mapper and the centerline to be in stretched CPR mode (a usage sketch follows the table below).
| Argument | Type | Required | Description |
| -------- | ---- | -------- | ----------- |
| centerPoint | Nullable<vec3> | No | The center point; optional, defaults to the first point of the centerline or [0, 0, 0] |
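A short usage sketch; `renderWindow` is assumed to exist and the explicit center point is only a placeholder:

```js
// Straightened CPR: uses the per-point orientations of the centerline.
mapper.useStraightenedMode();

// Stretched CPR: uses a uniform orientation and re-centers each row around the
// center point (defaults to the first centerline point, or [0, 0, 0]).
mapper.useStretchedMode();
// or with an explicit center point:
// mapper.useStretchedMode([12.5, -4.0, 88.0]);
renderWindow.render();
```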
Source
Constants.d.ts
export declare enum ProjectionMode { MAX = 0, MIN = 1, AVERAGE = 2, }
export interface IImageCPRMapperInitialValues extends IAbstractMapper3DInitialValues {
  width: number;
  uniformOrientation: TOrientation; // Don't use vec3 if possible
  useUniformOrientation: boolean;
  preferSizeOverAccuracy: boolean; // Whether to use halfFloat representation of float, when it is inaccurate
  orientationArrayName: Nullable<string>;
  tangentDirection: vec3;
  bitangentDirection: vec3;
  normalDirection: vec3;
}
export interface vtkImageCPRMapper extends vtkAbstractMapper3D, CoincidentTopologyHelper { /** * @returns the width of the image in model coordinates of the input volume */ getWidth(): number;
/** * Use @see getUseUniformOrientation to use the uniform orientation instead of the orientation specified by the centerline * @returns the uniform orientation of the centerline */ getUniformOrientation(): TOrientation;
/** * This flag specifies wether the mapper should use the uniformOrientation ( @see getUniformOrientation ) or the orientation specified in centerline at centerline input ( @see setCenterlineData ) * Defaults to false. * @returns the useUniformOrientation flag */ getUseUniformOrientation(): boolean;
/** * A point used to offset each line of pixel in the rendering * The line of pixel is offseted such as the center of the line is as close as possible to the center point * This can be used in combination with @see getUseUniformOrientation and a custom distance function for @see getOrientedCenterline to visualize a CPR in projected mode or stretched mode * Defaults to null. * @returns the center point */ getCenterPoint(): Nullable<vec3>;
/** * This flag indicates wether the GPU should use half float or not * When true, will use half float * When false, may use half float if there is no loss of accuracy (see in Texture: checkUseHalfFloat) * Defaults to false. * @returns the preferSizeOverAccuracy flag */ getPreferSizeOverAccuracy(): boolean;
/** * OrientationArrayName specifies the name of the data array which gives an orientation for each point of the centerline * The data array has to be in the PointData attribute of the centerline input * If null, look for the orientation data array in: "Orientation", "Direction", Vectors, Tensors, Normals * The data array should be an array of mat4, mat3, quat or vec3 but using vec3 makes the CPRInteractor unusable * Default to null. */ getOrientationArrayName(): Nullable<string>;
/** * For each point on the oriented centerline, the tangent direction is the direction in which the mapper will sample * Let O (a mat3) be the orientation at a point on a centerline, and N (a vec3) the tangent direction * Then the mapper will sample along O * N * Default value: [1, 0, 0] */ getTangentDirection(): vec3;
/** * For each point on the oriented centerline, the bitangent direction forms with the normal and the tangent direction a new basis * Default value: [0, 1, 0] */ getBitangentDirection(): vec3;
/** * For each point on the oriented centerline, the normal direction is the direction along the centerline * Default value: [0, 0, 1] */ getNormalDirection(): vec3;
/** * The direction matrix is the matrix composed of tangent, bitangent and normal directions * It is used to orient the camera or the actor */ getDirectionMatrix(): mat3;
/** * Thickness of the projection slab in image coordinates (NOT in voxels) * Usually in millimeters if the spacing of the input image is set from a DICOM */ getProjectionSlabThickness(): number;
/** * Total number of samples of the volume done by the projection mode * If this number is equal or less than 1, projection is disabled * Using an odd number is advised * If this number is even, the center of the slab will not be sampled */ getProjectionSlabNumberOfSamples(): number;
/** * Returns wether projection is enabled * It is based on the number of samples * @see getProjectionSlabNumberOfSamples */ isProjectionEnabled(): boolean;
/** * The different modes of projection * Available modes include MIP, MinIP and AverageIP */ getProjectionMode(): ProjectionMode;
/** * Find the data array to use for orientation in the input polydata ( @see getOrientationArrayName ) */ getOrientationDataArray(): Nullable<vtkDataArray>;
/** * Recompute the oriented centerline from the input polydata if needed and return the result * If there is no polydata as input, return the last oriented centerline * It means that if no polydata is given as input and the centerline is set using @see setOrientedCenterline , the given centerline will be used */ getOrientedCenterline(): vtkPolyLine;
/** * Set the internal oriented centerline * WARNING: this centerline will be overwritten if the polydata centerline is specified (input 1 @see setCenterlineData ) * @param centerline An oriented centerline */ setOrientedCenterline(centerline: vtkPolyLine): boolean;
/** * @returns The total height of the image in model coordinates. */ getHeight(): number;
/** * @param distance Distance from the beginning of the centerline, following the centerline, in model coordinates * @returns The position and orientation which is at the given distance from the beginning of the centerline. * If the distance is negative or greater than the length of the centerline, position and orientation are not defined. * If the centerline is not oriented, orientation is not defined. */ getCenterlinePositionAndOrientation(distance: number): { position?: vec3; orientation?: quat; };
/** * @returns A flat array of vec3 representing the direction at each point of the centerline * It is computed from the orientations of the centerline and tangentDirection * Uses caching to avoid recomputing at each frame */ getCenterlineTangentDirections(): Float32Array;
/** * @returns The direction to sample, in model space, computed using uniform orientation and tangent direction */ getUniformDirection(): vec3;
/** * @returns A boolean indicating if the mapper is ready to render */ preRenderCheck(): boolean;
/** * Configure the mapper and the centerline to be in straightened CPR mode */ useStraightenedMode(): void;
/** * Configure the mapper and the centerline to be in strectched CPR mode * @param centerPoint The center point, optional, default to the first point of the centerline or [0, 0, 0] */ useStretchedMode(centerPoint?: Nullable<vec3>): void;
/** * Set the polydata used as a centerline * You can also use `publicAPI.setInputData(centerlineData, 1);` * Use all the segments of all the polylines (the centerline can be in multiple pieces) * The polydata can contain a PointData DataArray to specify the direction in which the mapper should sample for each point of the centerline ( @see getDirectionArrayName @see getDirectionArrayOffset ) * If no such point data is specified, a uniform direction can be used instead ( @see getUniformDirection @see getUseUniformOrientation ) * The points of the centerline are in model coordinates of the volume used as input ( @see setImageDataData ) and not index coordinates (see MCTC matrix of the OpenGL ImageCPRMapper) * Use `imageData.getWorldToIndex();` or `imageData.getIndexToWorld();` to go from model coordinates to index coordinates or the other way around * @param centerlineData A polydata containing one or multiple polyline(s) and optionnally a PointData DataArray for direction */ setCenterlineData(centerlineData: vtkPolyData): void;
/** * Same as setCenterlineData except it uses an output port instead of a polydata * You can also use `publicAPI.setInputConnection(centerlineConnection, 1);` * @see setCenterlineData * @param centerlineConnection */ setCenterlineConnection(centerlineConnection: vtkOutputPort): void;
/** * Set the volume which should be sampled by the mapper * You can also use `publicAPI.setInputData(imageData, 0);` * The model coordinates of this imageData are used by this mapper when specifying points, vectors or width (see MCTC matrix of the OpenGL ImageCPRMapper) * You can use `imageData.getWorldToIndex();` or `imageData.getIndexToWorld();` to go from this model coordinates to index coordinates or the other way around * @param imageData */ setImageData(imageData: vtkImageData): void;
/** * Set the connection for the volume * You can also use `publicAPI.setInputConnection(imageData, 0);` * @see setImageData * @param imageData */ setImageConnection(imageData: vtkOutputPort): void;
/** * Tells the mapper to only update the specified extents. * * If there are zero extents, the mapper updates the entire volume texture. * Otherwise, the mapper will only update the texture by the specified extents * during the next render call. * * This array is cleared after a successful render. * @param extents */ setUpdatedExtents(extents: Extent[]): boolean;
/** * Retrieves the updated extents. * * This array is cleared after every successful render. */ getUpdatedExtents(): Extent[]; }
/** * Method used to decorate a given object (publicAPI+model) with vtkImageCPRMapper characteristics. * * @param publicAPI object on which methods will be bounds (public) * @param model object on which data structure will be bounds (protected) * @param {IImageCPRMapperInitialValues} [initialValues] (default: {}) */ export function extend( publicAPI: object, model: object, initialValues?: IImageCPRMapperInitialValues ): void;
/** * Method used to create a new instance of vtkImageCPRMapper * @param {IImageCPRMapperInitialValues} [initialValues] for pre-setting some of its content */ export function newInstance( initialValues?: IImageCPRMapperInitialValues ): vtkImageCPRMapper;
/** * CPR in vtkImageCPRMapper stands for Curved Planar Reformation. This mapper * can be used to visualize tubular structures such as blood vessels. It can be * used in projected mode, stretched mode or straightened mode depending on the * settings @see getUseUniformOrientation , @see getCenterPoint and the distance * function of @see getOrientedCenterline . * * This specialised mapper takes as input a vtkImageData representing a volume * ( @see setImageData ) and a vtkPolyData representing a centerline * ( @see setCenterlineData ). The mapper also need to have an orientation per * point or for all points. This can be specified using a uniform orientation * ( @see getUniformOrientation @see getUseUniformOrientation ) or a point * data array ( @see getOrientationArrayName ). Every point, vector or length * specified to the mapper (centerline points, orientation, width...) use model * coordinates of the volume used as input ( @see setImageData ). * * For each segment of the centerline the mapper creates a quad of the * specified width ( @see getWidth ) and of height equal to the length of the * segment. The position and the orientation of the centerline are * interpolated along the y-axis of the quad. The position is linearly * interpolated (lerp) and the orientation is interpolated using * spherical linear interpolation (slerp). For a point (x, y) on the quad, * the value of y gives the interpolated position P and interpolated * orientation O which combined with tangentDirection gives D * ( @see getTangentDirection ). The value of x between -0.5 and 0.5 then gives * the position to sample in the volume: P + x*D. * * By computing the right centerline positions and orientations, one * can simulate Stretched CPR and Straightened CPR. * * This class resolves coincident topology with the same methods as vtkMapper. */ export declare const vtkImageCPRMapper: { newInstance: typeof newInstance; extend: typeof extend; } & StaticCoincidentTopologyMethods; export default vtkImageCPRMapper;
publicAPI.getOrientedCenterline = () => {
  const inputPolydata = publicAPI.getInputData(1);
  if (!inputPolydata) {
    // No polydata: return previous centerline
    // Don't reset centerline as it could have been set using setOrientedCenterline
    return model._orientedCenterline;
  }

  // Get dependencies of centerline
  const orientationDataArray = publicAPI.getOrientationDataArray();
  const linesDataArray = inputPolydata.getLines();
  const pointsDataArray = inputPolydata.getPoints();

  if (!model.useUniformOrientation && !orientationDataArray) {
    vtkErrorMacro(
      'Failed to create oriented centerline from polydata: no orientation'
    );
    publicAPI._resetOrientedCenterline();
    return model._orientedCenterline;
  }

// Check if the rendering can occur
publicAPI.preRenderCheck = () => {
  if (!publicAPI.getInputData(0)) {
    vtkErrorMacro('No image data input');
    return false;
  }
  return true;
};

publicAPI.useStretchedMode = (centerPoint) => {
  const centerline = publicAPI.getOrientedCenterline();
  // Set center point
  if (!centerPoint) {
    // Get the first point of the centerline if there is one
    const centerlinePoints = centerline.getPoints();
    const newCenterPoint =
      centerlinePoints.getNumberOfTuples() > 0
        ? centerlinePoints.getPoint(0)
        : [0, 0, 0];
    publicAPI.setCenterPoint(newCenterPoint);
  } else {
    publicAPI.setCenterPoint(centerPoint);
  }
  // Enable uniform orientation
  publicAPI.setUseUniformOrientation(true);
  // Change distance function
  centerline.setDistanceFunction((a, b) => {
    const direction = publicAPI.getUniformDirection();
    const vec = vec3.subtract([], a, b);
    const d2 = vec3.squaredLength(vec);
    const x = vec3.dot(direction, vec);
    return Math.sqrt(d2 - x * x);
  });
};

// One can also call setOrientedCenterline and not provide a polydata centerline to input 1
model._orientedCenterline = vtkPolyLine.newInstance();
publicAPI._resetOrientedCenterline();
}