ImageCPRMapper

Introduction

CPR in vtkImageCPRMapper stands for Curved Planar Reformation. This mapper
can be used to visualize tubular structures such as blood vessels. It can be
used in projected mode, stretched mode or straightened mode depending on the
settings @see getUseUniformOrientation , @see getCenterPoint and the distance
function of @see getOrientedCenterline .

This specialised mapper takes as input a vtkImageData representing a volume
( @see setImageData ) and a vtkPolyData representing a centerline
( @see setCenterlineData ). The mapper also needs an orientation, either per
point or shared by all points. This can be specified using a uniform orientation
( @see getUniformOrientation @see getUseUniformOrientation ) or a point
data array ( @see getOrientationArrayName ). Every point, vector or length
specified to the mapper (centerline points, orientation, width…) uses the model
coordinates of the volume used as input ( @see setImageData ).
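
A minimal setup sketch (assuming imageData, centerline and renderer already exist; vtkImageSlice is the usual actor type for image mappers in vtk.js):

import vtkImageCPRMapper from 'vtk.js/Sources/Rendering/Core/ImageCPRMapper';
import vtkImageSlice from 'vtk.js/Sources/Rendering/Core/ImageSlice';

const mapper = vtkImageCPRMapper.newInstance();
mapper.setImageData(imageData); // input 0: the volume to sample
mapper.setCenterlineData(centerline); // input 1: polyline(s), optionally oriented
mapper.setWidth(100); // width of the reformatted image, in model coordinates

const actor = vtkImageSlice.newInstance();
actor.setMapper(mapper);
renderer.addActor(actor);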

For each segment of the centerline the mapper creates a quad of the
specified width ( @see getWidth ) and of height equal to the length of the
segment. The position and the orientation of the centerline are
interpolated along the y-axis of the quad. The position is linearly
interpolated (lerp) and the orientation is interpolated using
spherical linear interpolation (slerp). For a point (x, y) on the quad,
the value of y gives the interpolated position P and interpolated
orientation O which combined with tangentDirection gives D
( @see getTangentDirection ). The value of x between -0.5 and 0.5 then gives
the position to sample in the volume: P + x*D.
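
As an illustration, the same sampling rule can be written with gl-matrix (P, O and x are hypothetical values; the mapper performs the equivalent computation on the GPU):

import { quat, vec3 } from 'gl-matrix';

const P = [10, 20, 30]; // interpolated (lerped) position at some height y
const O = quat.create(); // interpolated (slerped) orientation at the same y
const tangentDirection = [1, 0, 0]; // the default tangent direction
const width = 50; // width of the quad, in model coordinates

// D is the tangent direction rotated by O, scaled here by the width so
// that x in [-0.5, 0.5] spans the full quad
const D = vec3.transformQuat([], tangentDirection, O);
vec3.scale(D, D, width);

// The volume is sampled at P + x * D
const x = 0.25;
const samplePosition = vec3.scaleAndAdd([], P, D, x);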

By computing the right centerline positions and orientations, one
can simulate Stretched CPR and Straightened CPR.

This class resolves coincident topology with the same methods as vtkMapper.

Methods

extend

Method used to decorate a given object (publicAPI+model) with vtkImageCPRMapper characteristics.

Argument Type Required Description
publicAPI Yes object on which methods will be bound (public)
model Yes object on which the data structure will be bound (protected)
initialValues IImageCPRMapperInitialValues No (default: {})

getBitangentDirection

For each point on the oriented centerline, the bitangent direction, together with the tangent and normal directions, forms a new basis
Default value: [0, 1, 0]

getCenterPoint

A point used to offset each line of pixels in the rendering
Each line of pixels is offset so that its center is as close as possible to the center point
This can be used in combination with @see getUseUniformOrientation and a custom distance function for @see getOrientedCenterline to visualize a CPR in projected or stretched mode
Defaults to null.

Returns

Type Description
the center point

getCenterlinePositionAndOrientation

Argument Type Required Description
distance Yes Distance from the beginning of the centerline, following the centerline, in model coordinates

Returns

Type Description
The position and orientation at the given distance from the beginning of the centerline. If the distance is negative or greater than the length of the centerline, position and orientation are not defined.
If the centerline is not oriented, orientation is not defined.
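
A usage sketch, querying the point half-way along the centerline (assuming a configured mapper):

// getHeight() is the total length of the centerline in model coordinates
const { position, orientation } =
  mapper.getCenterlinePositionAndOrientation(mapper.getHeight() / 2);
if (position && orientation) {
  // position is a vec3 and orientation a quat; either may be undefined
}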

getCenterlineTangentDirections

Returns

Type Description
A flat array of vec3 representing the direction at each point of the centerline. It is computed from the orientations of the centerline and tangentDirection.
Uses caching to avoid recomputing at each frame.

getDirectionMatrix

The direction matrix is the matrix composed of the tangent, bitangent and normal directions.
It is used to orient the camera or the actor.
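
A short sketch (the identity matrix corresponds to the default tangent, bitangent and normal directions):

import { mat3 } from 'gl-matrix';

// The matrix packs the tangent, bitangent and normal directions
const directions = mapper.getDirectionMatrix();
// Setting it updates all three directions in a single call
mapper.setDirectionMatrix(mat3.identity(new Float64Array(9)));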

getHeight

Returns

Type Description
The total height of the image in model coordinates.

getNormalDirection

For each point on the oriented centerline, the normal direction is the direction along the centerline
Default value: [0, 0, 1]

getOrientationArrayName

OrientationArrayName specifies the name of the data array which gives an orientation for each point of the centerline
The data array has to be in the PointData attribute of the centerline input
If null, the mapper looks for the orientation data array in: “Orientation”, “Direction”, Vectors, Tensors, Normals
The data array should be an array of mat4, mat3, quat or vec3, but using vec3 makes the CPRInteractor unusable
Defaults to null.
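
For example, if the centerline polydata carries one quaternion per point in a PointData array (the array name here is hypothetical):

// Use a 4-component (quat) PointData array named 'VesselOrientation'
mapper.setOrientationArrayName('VesselOrientation');
// With null (the default), the mapper falls back to "Orientation",
// "Direction", then the Vectors, Tensors and Normals attributes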

getOrientationDataArray

Find the data array to use for orientation in the input polydata ( @see getOrientationArrayName )

getOrientedCenterline

Recompute the oriented centerline from the input polydata if needed and return the result
If there is no polydata as input, return the last oriented centerline
This means that if no polydata is given as input and the centerline is set using @see setOrientedCenterline , the given centerline will be used

getPreferSizeOverAccuracy

This flag indicates whether the GPU should use half floats or not
When true, half floats are always used
When false, half floats may still be used if there is no loss of accuracy (see checkUseHalfFloat in Texture)
Defaults to false.

Returns

Type Description
the preferSizeOverAccuracy flag

getProjectionMode

The different modes of projection
Available modes include MIP, MinIP and AverageIP

getProjectionSlabNumberOfSamples

Total number of samples of the volume taken by the projection mode
If this number is less than or equal to 1, projection is disabled
Using an odd number is advised
If this number is even, the center of the slab will not be sampled

getProjectionSlabThickness

Thickness of the projection slab, in image coordinates (NOT in voxels)
Usually in millimeters if the spacing of the input image comes from a DICOM
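
A sketch enabling a thick-slab average projection (the values are arbitrary):

import { ProjectionMode } from 'vtk.js/Sources/Rendering/Core/ImageCPRMapper/Constants';

mapper.setProjectionMode(ProjectionMode.AVERAGE);
mapper.setProjectionSlabThickness(5); // model coordinates, e.g. millimeters
mapper.setProjectionSlabNumberOfSamples(9); // odd, so the slab center is sampled
// isProjectionEnabled() now returns true since the number of samples is > 1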

getResolveCoincidentTopology

getResolveCoincidentTopologyAsString

getResolveCoincidentTopologyLineOffsetParameters

getResolveCoincidentTopologyPointOffsetParameters

getResolveCoincidentTopologyPolygonOffsetFaces

getResolveCoincidentTopologyPolygonOffsetParameters

getTangentDirection

For each point on the oriented centerline, the tangent direction is the direction in which the mapper will sample
Let O (a mat3) be the orientation at a point on a centerline, and N (a vec3) the tangent direction
Then the mapper will sample along O * N
Default value: [1, 0, 0]

getUniformDirection

Returns

Type Description
The direction to sample, in model space, computed using uniform orientation and tangent direction

getUniformOrientation

Use @see getUseUniformOrientation to use the uniform orientation instead of the orientation specified by the centerline

Returns

Type Description
the uniform orientation of the centerline

getUseUniformOrientation

This flag specifies whether the mapper should use the uniformOrientation ( @see getUniformOrientation ) or the orientation specified by the centerline input ( @see setCenterlineData )
Defaults to false.

Returns

Type Description
the useUniformOrientation flag

getWidth

Returns

Type Description
the width of the image in model coordinates of the input volume

isProjectionEnabled

Returns whether projection is enabled
It is based on the number of samples

newInstance

Method used to create a new instance of vtkImageCPRMapper

Argument Type Required Description
initialValues IImageCPRMapperInitialValues No for pre-setting some of its content

preRenderCheck

Returns

Type Description
A boolean indicating if the mapper is ready to render

setBitangentDirection

Argument Type Required Description
bitangent Yes

setCenterPoint

Argument Type Required Description
point Yes

setCenterlineConnection

Same as setCenterlineData except it uses an output port instead of a polydata
You can also use publicAPI.setInputConnection(centerlineConnection, 1);

Argument Type Required Description
centerlineConnection Yes

setCenterlineData

Set the polydata used as a centerline
You can also use publicAPI.setInputData(centerlineData, 1);
All the segments of all the polylines are used (the centerline can be in multiple pieces)
The polydata can contain a PointData DataArray to specify the direction in which the mapper should sample for each point of the centerline ( @see getOrientationArrayName )
If no such point data is specified, a uniform direction can be used instead ( @see getUniformDirection @see getUseUniformOrientation )
The points of the centerline are in model coordinates of the volume used as input ( @see setImageData ) and not index coordinates (see the MCTC matrix of the OpenGL ImageCPRMapper)
Use imageData.getWorldToIndex(); or imageData.getIndexToWorld(); to convert between model coordinates and index coordinates

Argument Type Required Description
centerlineData Yes A polydata containing one or multiple polyline(s) and optionally a PointData DataArray for direction
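
A minimal sketch of a two-point centerline with one identity quaternion per point (coordinates are made up; the cell array follows the usual [nPoints, pointId0, pointId1, …] layout):

import vtkPolyData from 'vtk.js/Sources/Common/DataModel/PolyData';
import vtkDataArray from 'vtk.js/Sources/Common/Core/DataArray';

const centerline = vtkPolyData.newInstance();
// Two points, in model coordinates of the input volume
centerline.getPoints().setData(new Float32Array([0, 0, 0, 0, 0, 100]), 3);
// One polyline made of both points
centerline.getLines().setData(new Uint32Array([2, 0, 1]));
// One quat per point, picked up through the "Orientation" fallback name
centerline.getPointData().addArray(
  vtkDataArray.newInstance({
    name: 'Orientation',
    numberOfComponents: 4,
    values: new Float32Array([0, 0, 0, 1, 0, 0, 0, 1]),
  })
);
mapper.setCenterlineData(centerline);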

setDirectionMatrix

Argument Type Required Description
mat Yes

setImageConnection

Set the connection for the volume
You can also use publicAPI.setInputConnection(imageData, 0);

Argument Type Required Description
imageData Yes

setImageData

Set the volume which should be sampled by the mapper
You can also use publicAPI.setInputData(imageData, 0);
The model coordinates of this imageData are used by this mapper when specifying points, vectors or width (see the MCTC matrix of the OpenGL ImageCPRMapper)
You can use imageData.getWorldToIndex(); or imageData.getIndexToWorld(); to convert between model coordinates and index coordinates

Argument Type Required Description
imageData Yes
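
A sketch of converting a voxel index to the model coordinates expected by the mapper (the index values are arbitrary):

import { vec3 } from 'gl-matrix';

mapper.setImageData(imageData);
// Centerline points, width, etc. are expressed in model coordinates
const indexPoint = [12, 34, 56]; // an arbitrary voxel index
const modelPoint = vec3.transformMat4([], indexPoint, imageData.getIndexToWorld());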

setNormalDirection

Argument Type Required Description
normal Yes

setOrientationArrayName

Argument Type Required Description
arrayName Yes

setOrientedCenterline

Set the internal oriented centerline
WARNING: this centerline will be overwritten if the polydata centerline is specified (input 1 @see setCenterlineData )

Argument Type Required Description
centerline Yes An oriented centerline

setPreferSizeOverAccuracy

Argument Type Required Description
preferSizeOverAccuracy Yes

setProjectionMode

Argument Type Required Description
projectionMode Yes

setProjectionSlabNumberOfSamples

Argument Type Required Description
projectionSlabNumberOfSamples Yes

setProjectionSlabThickness

Argument Type Required Description
projectionSlabThickness Yes

setRelativeCoincidentTopologyLineOffsetParameters

Argument Type Required Description
factor Number Yes
offset Number Yes

setRelativeCoincidentTopologyPointOffsetParameters

Argument Type Required Description
factor Number Yes
offset Number Yes

setRelativeCoincidentTopologyPolygonOffsetParameters

Argument Type Required Description
factor Number Yes
offset Number Yes

setResolveCoincidentTopology

Argument Type Required Description
resolveCoincidentTopology Yes

setResolveCoincidentTopologyLineOffsetParameters

Argument Type Required Description
factor Number Yes
offset Number Yes

setResolveCoincidentTopologyPointOffsetParameters

Argument Type Required Description
factor Number Yes
offset Number Yes

setResolveCoincidentTopologyPolygonOffsetFaces

Argument Type Required Description
value Yes

setResolveCoincidentTopologyPolygonOffsetParameters

Argument Type Required Description
factor Number Yes
offset Number Yes

setResolveCoincidentTopologyToDefault

setResolveCoincidentTopologyToOff

setResolveCoincidentTopologyToPolygonOffset

setTangentDirection

Argument Type Required Description
tangent Yes

setUniformOrientation

Argument Type Required Description
orientation Yes

setUseUniformOrientation

Argument Type Required Description
useUniformOrientation Yes

setWidth

Argument Type Required Description
width Yes

useStraightenedMode

Configure the mapper and the centerline to be in straightened CPR mode

useStretchedMode

Configure the mapper and the centerline to be in stretched CPR mode

Argument Type Required Description
centerPoint No The center point, defaults to the first point of the centerline or [0, 0, 0]
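
A sketch of switching between the two modes:

// Straightened CPR: per-point orientations, Euclidean distance function,
// no center point
mapper.useStraightenedMode();

// Stretched CPR: uniform orientation plus a center point; with no argument,
// the first centerline point (or [0, 0, 0]) is used
mapper.useStretchedMode();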

Source

Constants.d.ts
export declare enum ProjectionMode {
MAX = 0,
MIN = 1,
AVERAGE = 2,
}

declare const _default: {
ProjectionMode: typeof ProjectionMode;
};
export default _default;
Constants.js
export const ProjectionMode = {
MAX: 0,
MIN: 1,
AVERAGE: 2,
};

export default {
ProjectionMode,
};
index.d.ts
import { mat3, mat4, quat, vec3 } from "gl-matrix";
import { Nullable } from "../../../types";
import { vtkOutputPort } from "../../../interfaces";
import vtkAbstractMapper3D, { IAbstractMapper3DInitialValues } from "../AbstractMapper3D";
import vtkDataArray from "../../../Common/Core/DataArray";
import vtkImageData from "../../../Common/DataModel/ImageData";
import vtkPolyData from "../../../Common/DataModel/PolyData";
import vtkPolyLine from "../../../Common/DataModel/PolyLine";
import { ProjectionMode } from "./Constants";

interface ICoincidentTopology {
factor: number;
offset: number;
}

type TOrientation = mat4 | mat3 | quat | vec3;

export interface IImageCPRMapperInitialValues extends IAbstractMapper3DInitialValues{
width: number;
uniformOrientation: TOrientation; // Don't use vec3 if possible
useUniformOrientation: boolean;
preferSizeOverAccuracy: boolean; // Whether to use halfFloat representation of float, when it is inaccurate
orientationArrayName: Nullable<string>;
tangentDirection: vec3;
bitangentDirection: vec3;
normalDirection: vec3;
}

export interface vtkImageCPRMapper extends vtkAbstractMapper3D {
/**
* @returns the width of the image in model coordinates of the input volume
*/
getWidth(): number;

/**
* @see getWidth
* @param width
*/
setWidth(width: number): boolean;

/**
* Use @see getUseUniformOrientation to use the uniform orientation instead of the orientation specified by the centerline
* @returns the uniform orientation of the centerline
*/
getUniformOrientation(): TOrientation;

/**
* @see getUniformOrientation
* @param orientation
*/
setUniformOrientation(orientation: TOrientation): boolean;

/**
* This flag specifies whether the mapper should use the uniformOrientation ( @see getUniformOrientation ) or the orientation specified by the centerline input ( @see setCenterlineData )
* Defaults to false.
* @returns the useUniformOrientation flag
*/
getUseUniformOrientation(): boolean;

/**
* @see getUseUniformOrientation
* @param useUniformOrientation
*/
setUseUniformOrientation(useUniformOrientation: boolean): boolean;

/**
* A point used to offset each line of pixels in the rendering
* Each line of pixels is offset so that its center is as close as possible to the center point
* This can be used in combination with @see getUseUniformOrientation and a custom distance function for @see getOrientedCenterline to visualize a CPR in projected or stretched mode
* Defaults to null.
* @returns the center point
*/
getCenterPoint(): Nullable<vec3>;

/**
* @see getCenterPoint
* @param point
*/
setCenterPoint(point: Nullable<vec3>): boolean;

/**
* This flag indicates whether the GPU should use half floats or not
* When true, half floats are always used
* When false, half floats may still be used if there is no loss of accuracy (see checkUseHalfFloat in Texture)
* Defaults to false.
* @returns the preferSizeOverAccuracy flag
*/
getPreferSizeOverAccuracy(): boolean;

/**
* @see getPreferSizeOverAccuracy
* @param preferSizeOverAccuracy
*/
setPreferSizeOverAccuracy(preferSizeOverAccuracy: boolean): boolean;

/**
* OrientationArrayName specifies the name of the data array which gives an orientation for each point of the centerline
* The data array has to be in the PointData attribute of the centerline input
* If null, the mapper looks for the orientation data array in: "Orientation", "Direction", Vectors, Tensors, Normals
* The data array should be an array of mat4, mat3, quat or vec3, but using vec3 makes the CPRInteractor unusable
* Defaults to null.
*/
getOrientationArrayName(): Nullable<string>;

/**
* @see getOrientationArrayName
* @param arrayName
*/
setOrientationArrayName(arrayName: Nullable<string>): boolean;

/**
* For each point on the oriented centerline, the tangent direction is the direction in which the mapper will sample
* Let O (a mat3) be the orientation at a point on a centerline, and N (a vec3) the tangent direction
* Then the mapper will sample along O * N
* Default value: [1, 0, 0]
*/
getTangentDirection(): vec3;

/**
* @see getTangentDirection
* @param tangent
*/
setTangentDirection(tangent: vec3): boolean;

/**
* For each point on the oriented centerline, the bitangent direction, together with the tangent and normal directions, forms a new basis
* Default value: [0, 1, 0]
*/
getBitangentDirection(): vec3;

/**
* @see getBitangentDirection
* @param bitangent
*/
setBitangentDirection(bitangent: vec3): boolean;

/**
* For each point on the oriented centerline, the normal direction is the direction along the centerline
* Default value: [0, 0, 1]
*/
getNormalDirection(): vec3;

/**
* @see getNormalDirection
* @param normal
*/
setNormalDirection(normal: vec3): boolean;

/**
* The direction matrix is the matrix composed of tangent, bitangent and normal directions
* It is used to orient the camera or the actor
*/
getDirectionMatrix(): mat3;

/**
* @see getDirectionMatrix
* @param mat
*/
setDirectionMatrix(mat: mat3): boolean;

/**
* Thickness of the projection slab in image coordinates (NOT in voxels)
* Usually in millimeters if the spacing of the input image is set from a DICOM
*/
getProjectionSlabThickness(): number;

/**
* @see getProjectionSlabThickness
* @param projectionSlabThickness
*/
setProjectionSlabThickness(projectionSlabThickness: number): boolean;

/**
* Total number of samples of the volume taken by the projection mode
* If this number is less than or equal to 1, projection is disabled
* Using an odd number is advised
* If this number is even, the center of the slab will not be sampled
*/
getProjectionSlabNumberOfSamples(): number;

/**
* @see getProjectionSlabNumberOfSamples
* @param projectionSlabNumberOfSamples
*/
setProjectionSlabNumberOfSamples(projectionSlabNumberOfSamples: number): boolean;

/**
* Returns whether projection is enabled
* It is based on the number of samples
* @see getProjectionSlabNumberOfSamples
*/
isProjectionEnabled(): boolean;

/**
* The different modes of projection
* Available modes include MIP, MinIP and AverageIP
*/
getProjectionMode(): ProjectionMode;

/**
* @see getProjectionMode
* @param projectionMode
*/
setProjectionMode(projectionMode: ProjectionMode): boolean;

/**
* Find the data array to use for orientation in the input polydata ( @see getOrientationArrayName )
*/
getOrientationDataArray(): Nullable<vtkDataArray>;

/**
* Recompute the oriented centerline from the input polydata if needed and return the result
* If there is no polydata as input, return the last oriented centerline
* This means that if no polydata is given as input and the centerline is set using @see setOrientedCenterline , the given centerline will be used
*/
getOrientedCenterline(): vtkPolyLine;

/**
* Set the internal oriented centerline
* WARNING: this centerline will be overwritten if the polydata centerline is specified (input 1 @see setCenterlineData )
* @param centerline An oriented centerline
*/
setOrientedCenterline(centerline: vtkPolyLine): boolean;

/**
* @returns The total height of the image in model coordinates.
*/
getHeight(): number;

/**
* @param distance Distance from the beginning of the centerline, following the centerline, in model coordinates
* @returns The position and orientation at the given distance from the beginning of the centerline.
* If the distance is negative or greater than the length of the centerline, position and orientation are not defined.
* If the centerline is not oriented, orientation is not defined.
*/
getCenterlinePositionAndOrientation(distance: number): { position?: vec3, orientation?: quat };

/**
* @returns A flat array of vec3 representing the direction at each point of the centerline
* It is computed from the orientations of the centerline and tangentDirection
* Uses caching to avoid recomputing at each frame
*/
getCenterlineTangentDirections(): Float32Array;

/**
* @returns The direction to sample, in model space, computed using uniform orientation and tangent direction
*/
getUniformDirection(): vec3;

/**
* @returns A boolean indicating if the mapper is ready to render
*/
preRenderCheck(): boolean;

/**
* Configure the mapper and the centerline to be in straightened CPR mode
*/
useStraightenedMode(): void;

/**
* Configure the mapper and the centerline to be in stretched CPR mode
* @param centerPoint The center point, optional, defaults to the first point of the centerline or [0, 0, 0]
*/
useStretchedMode(centerPoint?: Nullable<vec3>): void;

/**
* Set the polydata used as a centerline
* You can also use `publicAPI.setInputData(centerlineData, 1);`
* Use all the segments of all the polylines (the centerline can be in multiple pieces)
* The polydata can contain a PointData DataArray to specify the direction in which the mapper should sample for each point of the centerline ( @see getOrientationArrayName )
* If no such point data is specified, a uniform direction can be used instead ( @see getUniformDirection @see getUseUniformOrientation )
* The points of the centerline are in model coordinates of the volume used as input ( @see setImageData ) and not index coordinates (see MCTC matrix of the OpenGL ImageCPRMapper)
* Use `imageData.getWorldToIndex();` or `imageData.getIndexToWorld();` to convert between model coordinates and index coordinates
* @param centerlineData A polydata containing one or multiple polyline(s) and optionally a PointData DataArray for direction
*/
setCenterlineData(centerlineData: vtkPolyData): void;

/**
* Same as setCenterlineData except it uses an output port instead of a polydata
* You can also use `publicAPI.setInputConnection(centerlineConnection, 1);`
* @see setCenterlineData
* @param centerlineConnection
*/
setCenterlineConnection(centerlineConnection: vtkOutputPort): void;

/**
* Set the volume which should be sampled by the mapper
* You can also use `publicAPI.setInputData(imageData, 0);`
* The model coordinates of this imageData are used by this mapper when specifying points, vectors or width (see MCTC matrix of the OpenGL ImageCPRMapper)
* You can use `imageData.getWorldToIndex();` or `imageData.getIndexToWorld();` to convert between model coordinates and index coordinates
* @param imageData
*/
setImageData(imageData: vtkImageData): void;

/**
* Set the connection for the volume
* You can also use `publicAPI.setInputConnection(imageData, 0);`
* @see setImageData
* @param imageData
*/
setImageConnection(imageData: vtkOutputPort): void;

/**
*
*/
getResolveCoincidentTopology(): ICoincidentTopology

/**
*
*/
getResolveCoincidentTopologyAsString(): ICoincidentTopology

/**
*
*/
getResolveCoincidentTopologyLineOffsetParameters(): ICoincidentTopology

/**
*
*/
getResolveCoincidentTopologyPointOffsetParameters(): ICoincidentTopology

/**
*
*/
getResolveCoincidentTopologyPolygonOffsetFaces(): ICoincidentTopology

/**
*
*/
getResolveCoincidentTopologyPolygonOffsetParameters(): ICoincidentTopology;

/**
*
* @param {Number} factor
* @param {Number} offset
*/
setRelativeCoincidentTopologyLineOffsetParameters(factor: number, offset: number): boolean;

/**
*
* @param {Number} factor
* @param {Number} offset
*/
setRelativeCoincidentTopologyPointOffsetParameters(factor: number, offset: number): boolean;

/**
*
* @param {Number} factor
* @param {Number} offset
*/
setRelativeCoincidentTopologyPolygonOffsetParameters(factor: number, offset: number): boolean;

/**
*
* @param resolveCoincidentTopology
* @default false
*/
setResolveCoincidentTopology(resolveCoincidentTopology: boolean): boolean;

/**
*
* @param {Number} factor
* @param {Number} offset
*/
setResolveCoincidentTopologyLineOffsetParameters(factor: number, offset: number): boolean;

/**
*
* @param {Number} factor
* @param {Number} offset
*/
setResolveCoincidentTopologyPointOffsetParameters(factor: number, offset: number): boolean;

/**
*
* @param value
*/
setResolveCoincidentTopologyPolygonOffsetFaces(value: number): boolean;

/**
*
* @param {Number} factor
* @param {Number} offset
*/
setResolveCoincidentTopologyPolygonOffsetParameters(factor: number, offset: number): boolean;

/**
*
*/
setResolveCoincidentTopologyToDefault(): boolean;

/**
*
*/
setResolveCoincidentTopologyToOff(): boolean;

/**
*
*/
setResolveCoincidentTopologyToPolygonOffset(): boolean;
}

/**
* Method used to decorate a given object (publicAPI+model) with vtkImageCPRMapper characteristics.
*
* @param publicAPI object on which methods will be bounds (public)
* @param model object on which data structure will be bounds (protected)
* @param {IImageCPRMapperInitialValues} [initialValues] (default: {})
*/
export function extend(publicAPI: object, model: object, initialValues?: IImageCPRMapperInitialValues): void;

/**
* Method used to create a new instance of vtkImageCPRMapper
* @param {IImageCPRMapperInitialValues} [initialValues] for pre-setting some of its content
*/
export function newInstance(initialValues?: IImageCPRMapperInitialValues): vtkImageCPRMapper;

/**
* CPR in vtkImageCPRMapper stands for Curved Planar Reformation. This mapper
* can be used to visualize tubular structures such as blood vessels. It can be
* used in projected mode, stretched mode or straightened mode depending on the
* settings @see getUseUniformOrientation , @see getCenterPoint and the distance
* function of @see getOrientedCenterline .
*
* This specialised mapper takes as input a vtkImageData representing a volume
* ( @see setImageData ) and a vtkPolyData representing a centerline
* ( @see setCenterlineData ). The mapper also needs an orientation, either per
* point or shared by all points. This can be specified using a uniform orientation
* ( @see getUniformOrientation @see getUseUniformOrientation ) or a point
* data array ( @see getOrientationArrayName ). Every point, vector or length
* specified to the mapper (centerline points, orientation, width...) uses the model
* coordinates of the volume used as input ( @see setImageData ).
*
* For each segment of the centerline the mapper creates a quad of the
* specified width ( @see getWidth ) and of height equal to the length of the
* segment. The position and the orientation of the centerline are
* interpolated along the y-axis of the quad. The position is linearly
* interpolated (lerp) and the orientation is interpolated using
* spherical linear interpolation (slerp). For a point (x, y) on the quad,
* the value of y gives the interpolated position P and interpolated
* orientation O which combined with tangentDirection gives D
* ( @see getTangentDirection ). The value of x between -0.5 and 0.5 then gives
* the position to sample in the volume: P + x*D.
*
* By computing the right centerline positions and orientations, one
* can simulate Stretched CPR and Straightened CPR.
*
* This class resolves coincident topology with the same methods as vtkMapper.
*/
export declare const vtkImageCPRMapper: {
newInstance: typeof newInstance;
extend: typeof extend;
}
export default vtkImageCPRMapper;
index.js
import { mat3, mat4, quat, vec3 } from 'gl-matrix';
import CoincidentTopologyHelper from 'vtk.js/Sources/Rendering/Core/Mapper/CoincidentTopologyHelper';
import vtkAbstractImageMapper from 'vtk.js/Sources/Rendering/Core/AbstractImageMapper';
import macro from 'vtk.js/Sources/macros';
import vtkPoints from 'vtk.js/Sources/Common/Core/Points';
import vtkPolyLine from 'vtk.js/Sources/Common/DataModel/PolyLine';
import { ProjectionMode } from './Constants';

const { vtkErrorMacro } = macro;

const { staticOffsetAPI, otherStaticMethods } = CoincidentTopologyHelper;

// ----------------------------------------------------------------------------
// vtkImageCPRMapper methods
// ----------------------------------------------------------------------------

function vtkImageCPRMapper(publicAPI, model) {
// Set our className
model.classHierarchy.push('vtkImageCPRMapper');

const superClass = { ...publicAPI };

/**
* Public methods
*/
publicAPI.getBounds = () => {
const imageWidth = publicAPI.getWidth();
const imageHeight = publicAPI.getHeight();
return [0, imageWidth, 0, imageHeight, 0, 0];
};

publicAPI.getOrientationDataArray = () => {
const pointData = publicAPI.getInputData(1)?.getPointData();
if (!pointData) {
return null;
}
if (model.orientationArrayName !== null) {
return pointData.getArrayByName(model.orientationArrayName) || null;
}
return (
pointData.getArrayByName('Orientation') ||
pointData.getArrayByName('Direction') ||
pointData.getVectors() ||
pointData.getTensors() ||
pointData.getNormals() ||
null
);
};

publicAPI.getOrientedCenterline = () => {
const inputPolydata = publicAPI.getInputData(1);
if (!inputPolydata) {
// No polydata: return previous centerline
// Don't reset centerline as it could have been set using setOrientedCenterline
return model._orientedCenterline;
}

// Get dependencies of centerline
const orientationDataArray = publicAPI.getOrientationDataArray();
const linesDataArray = inputPolydata.getLines();
const pointsDataArray = inputPolydata.getPoints();

if (!model.useUniformOrientation && !orientationDataArray) {
vtkErrorMacro(
'Failed to create oriented centerline from polydata: no orientation'
);
publicAPI._resetOrientedCenterline();
return model._orientedCenterline;
}

// If centerline didn't change, don't recompute
const centerlineTime = model._orientedCenterline.getMTime();
if (
centerlineTime >= publicAPI.getMTime() &&
centerlineTime > linesDataArray.getMTime() &&
centerlineTime > pointsDataArray.getMTime() &&
(model.useUniformOrientation ||
centerlineTime > orientationDataArray.getMTime())
) {
return model._orientedCenterline;
}

// Get points of the centerline
const linesData = linesDataArray.getData();
if (linesData.length <= 0) {
// No polyline
publicAPI._resetOrientedCenterline();
return model._orientedCenterline;
}
const nPoints = linesData[0];
if (nPoints <= 1) {
// Empty centerline
publicAPI._resetOrientedCenterline();
return model._orientedCenterline;
}
const pointIndices = linesData.subarray(1, 1 + nPoints);

// Get orientations of the centerline
const orientations = new Array(nPoints);
// Function to convert from mat4, mat3, quat or vec3 to quaternion
let convert = () => null;
const numComps = model.useUniformOrientation
? model.uniformOrientation.length
: orientationDataArray.getNumberOfComponents();
switch (numComps) {
case 16:
convert = (outQuat, inMat) => {
mat4.getRotation(outQuat, inMat);
quat.normalize(outQuat, outQuat);
};
break;
case 9:
convert = (outQuat, inMat) => {
quat.fromMat3(outQuat, inMat);
quat.normalize(outQuat, outQuat);
};
break;
case 4:
convert = quat.copy;
break;
case 3:
convert = (a, b) => quat.rotationTo(a, model.tangentDirection, b);
break;
default:
vtkErrorMacro('Orientation does not match mat4, mat3, quat or vec3');
publicAPI._resetOrientedCenterline();
return model._orientedCenterline;
}
// Function to get orientation from point index
let getOrientation = () => null;
if (model.useUniformOrientation) {
const outQuat = new Float64Array(4);
convert(outQuat, model.uniformOrientation);
getOrientation = () => outQuat;
} else {
const temp = new Float64Array(16);
getOrientation = (i) => {
const outQuat = new Float64Array(4);
orientationDataArray.getTuple(i, temp);
convert(outQuat, temp);
return outQuat;
};
}
// Fill the orientation array
for (let i = 0; i < nPoints; ++i) {
const pointIdx = pointIndices[i];
orientations[i] = getOrientation(pointIdx);
}

// Done recomputing
model._orientedCenterline.initialize(pointsDataArray, pointIndices);
model._orientedCenterline.setOrientations(orientations);
return model._orientedCenterline;
};

publicAPI.setOrientedCenterline = (centerline) => {
if (model._orientedCenterline !== centerline) {
model._orientedCenterline = centerline;
return true;
}
return false;
};

publicAPI._resetOrientedCenterline = () => {
model._orientedCenterline.initialize(vtkPoints.newInstance());
model._orientedCenterline.setOrientations([]);
};

publicAPI.getMTime = () => {
let mTime = superClass.getMTime();
if (!model._orientedCenterline) {
return mTime;
}

mTime = Math.max(mTime, model._orientedCenterline.getMTime());
return mTime;
};

publicAPI.getHeight = () => {
const accHeights = publicAPI
.getOrientedCenterline()
.getDistancesToFirstPoint();
if (accHeights.length === 0) {
return 0;
}
return accHeights[accHeights.length - 1];
};

publicAPI.getCenterlinePositionAndOrientation = (distance) => {
const centerline = publicAPI.getOrientedCenterline();
const subId = centerline.findPointIdAtDistanceFromFirstPoint(distance);
if (subId < 0) {
return {};
}
const distances = centerline.getDistancesToFirstPoint();
const pcoords = [
(distance - distances[subId]) / (distances[subId + 1] - distances[subId]),
];
const weights = new Array(2);

const position = new Array(3);
centerline.evaluateLocation(subId, pcoords, position, weights);

const orientation = new Array(4);
if (!centerline.evaluateOrientation(subId, pcoords, orientation, weights)) {
// No orientation
return { position };
}
return { position, orientation };
};

publicAPI.getCenterlineTangentDirections = () => {
const centerline = publicAPI.getOrientedCenterline();
const directionsTime = model._centerlineTangentDirectionsTime.getMTime();
if (directionsTime < centerline.getMTime()) {
const orientations = centerline.getOrientations();
model._centerlineTangentDirections = new Float32Array(
3 * orientations.length
);
const localDirection = new Array(3);
for (let i = 0; i < orientations.length; ++i) {
vec3.transformQuat(
localDirection,
model.tangentDirection,
orientations[i]
);
model._centerlineTangentDirections.set(localDirection, 3 * i);
}
model._centerlineTangentDirectionsTime.modified();
}
return model._centerlineTangentDirections;
};

publicAPI.getUniformDirection = () =>
vec3.transformQuat(
new Array(3),
model.tangentDirection,
model.uniformOrientation
);

publicAPI.getDirectionMatrix = () => {
const tangent = model.tangentDirection;
const bitangent = model.bitangentDirection;
const normal = model.normalDirection;
return new Float64Array([
tangent[0],
tangent[1],
tangent[2],
bitangent[0],
bitangent[1],
bitangent[2],
normal[0],
normal[1],
normal[2],
]);
};

publicAPI.setDirectionMatrix = (mat) => {
if (mat3.equals(mat, publicAPI.getDirectionMatrix())) {
return false;
}
model.tangentDirection = [mat[0], mat[1], mat[2]];
model.bitangentDirection = [mat[3], mat[4], mat[5]];
model.normalDirection = [mat[6], mat[7], mat[8]];
publicAPI.modified();
return true;
};

// Check if the rendering can occur
publicAPI.preRenderCheck = () => {
if (!publicAPI.getInputData(0)) {
vtkErrorMacro('No image data input');
return false;
}
return true;
};

publicAPI.useStraightenedMode = () => {
publicAPI.setCenterPoint(null);
publicAPI.setUseUniformOrientation(false);
publicAPI.getOrientedCenterline().setDistanceFunction(vec3.dist);
};

publicAPI.useStretchedMode = (centerPoint) => {
const centerline = publicAPI.getOrientedCenterline();
// Set center point
if (!centerPoint) {
// Get the first point of the centerline if there is one
const centerlinePoints = centerline.getPoints();
const newCenterPoint =
centerlinePoints.getNumberOfTuples() > 0
? centerlinePoints.getPoint(0)
: [0, 0, 0];
publicAPI.setCenterPoint(newCenterPoint);
} else {
publicAPI.setCenterPoint(centerPoint);
}
// Enable uniform orientation
publicAPI.setUseUniformOrientation(true);
// Change distance function
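// The returned distance is the length of (a - b) projected onto the
// plane orthogonal to the uniform sampling direction: the centerline
// is stretched along that direction instead of being straightened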
centerline.setDistanceFunction((a, b) => {
const direction = publicAPI.getUniformDirection();
const vec = vec3.subtract([], a, b);
const d2 = vec3.squaredLength(vec);
const x = vec3.dot(direction, vec);
return Math.sqrt(d2 - x * x);
});
};

publicAPI.isProjectionEnabled = () => model.projectionSlabNumberOfSamples > 1;

publicAPI.setCenterlineData = (centerlineData) =>
publicAPI.setInputData(centerlineData, 1);

publicAPI.setCenterlineConnection = (centerlineConnection) =>
publicAPI.setInputConnection(centerlineConnection, 1);

publicAPI.setImageData = (imageData) => publicAPI.setInputData(imageData, 0);

publicAPI.setImageConnection = (imageData) =>
publicAPI.setInputConnection(imageData, 0);

publicAPI.getIsOpaque = () => true;

// One can also call setOrientedCenterline and not provide a polydata centerline to input 1
model._orientedCenterline = vtkPolyLine.newInstance();
publicAPI._resetOrientedCenterline();
}

// ----------------------------------------------------------------------------
// Object factory
// ----------------------------------------------------------------------------

const DEFAULT_VALUES = {
width: 10,
uniformOrientation: [0, 0, 0, 1],
useUniformOrientation: false,
centerPoint: null,
preferSizeOverAccuracy: false,
orientationArrayName: null,
tangentDirection: [1, 0, 0],
bitangentDirection: [0, 1, 0],
normalDirection: [0, 0, 1],
projectionSlabThickness: 1,
projectionSlabNumberOfSamples: 1,
projectionMode: ProjectionMode.MAX,
};

// ----------------------------------------------------------------------------

export function extend(publicAPI, model, initialValues = {}) {
Object.assign(model, DEFAULT_VALUES, initialValues);

// Inheritance
vtkAbstractImageMapper.extend(publicAPI, model, initialValues);

// Two inputs: one for the ImageData and one for the PolyData (centerline)
macro.algo(publicAPI, model, 2, 0);

model._centerlineTangentDirectionsTime = {};
macro.obj(model._centerlineTangentDirectionsTime, { mtime: 0 });

// Setters and getters
macro.setGet(publicAPI, model, [
'width',
'uniformOrientation',
'useUniformOrientation',
'centerPoint',
'preferSizeOverAccuracy',
'orientationArrayName',
'tangentDirection',
'bitangentDirection',
'normalDirection',
'projectionSlabThickness',
'projectionSlabNumberOfSamples',
'projectionMode',
]);
CoincidentTopologyHelper.implementCoincidentTopologyMethods(publicAPI, model);

// Object methods
vtkImageCPRMapper(publicAPI, model);
}

// ----------------------------------------------------------------------------

export const newInstance = macro.newInstance(extend, 'vtkImageCPRMapper');

// ----------------------------------------------------------------------------

export default {
newInstance,
extend,
...staticOffsetAPI,
...otherStaticMethods,
};