vis4d.op.base.pointnetpp

Pointnet++ implementation.

Based on https://github.com/yanx27/Pointnet_Pointnet2_pytorch, with typing and named tuples added for convenience.

Functions

farthest_point_sample(xyz, npoint)

Farthest point sampling.

index_points(points, idx)

Indexes points.

query_ball_point(radius, nsample, xyz, new_xyz)

Query points within a ball of the given radius.

sample_and_group(npoint, radius, nsample, ...)

Samples and groups.

sample_and_group_all(xyz, points)

Samples and groups all points.

square_distance(src, dst)

Calculate the squared Euclidean distance between each pair of points.

Classes

PointNet2Segmentation(num_classes[, in_channels])

Pointnet++ Segmentation Network.

PointNet2SegmentationOut(class_logits)

Prediction for the pointnet++ semantic segmentation network.

PointNetFeaturePropagation(in_channel, mlp)

Pointnet++ Feature Propagation Layer.

PointNetSetAbstraction(npoint, radius, ...)

PointNet set abstraction layer.

PointNetSetAbstractionOut(coordinates, features)

Output of PointNet set abstraction.

class PointNet2Segmentation(num_classes, in_channels=3)[source]

Pointnet++ Segmentation Network.

Creates a new Pointnet++ network for segmentation.

Parameters:
  • num_classes (int) – Number of semantic classes

  • in_channels (int) – Number of input channels

__call__(xyz)[source]

Call implementation.

Parameters:

xyz (Tensor) – Pointcloud data shaped [N, n_feats, n_pts]

Return type:

PointNet2SegmentationOut

Returns:

PointNet2SegmentationOut, class logits for each point

forward(xyz)[source]

Predicts the semantic class logits for each point.

Parameters:

xyz (Tensor) – Pointcloud data shaped [N, n_feats, n_pts]

Return type:

PointNet2SegmentationOut

Returns:

PointNet2SegmentationOut, class logits for each point

class PointNet2SegmentationOut(class_logits: Tensor)[source]

Prediction for the pointnet++ semantic segmentation network.

Create a new instance of PointNet2SegmentationOut(class_logits).

class_logits: Tensor

Alias for field number 0

class PointNetFeaturePropagation(in_channel, mlp, norm_cls='BatchNorm1d')[source]

Pointnet++ Feature Propagation Layer.

Creates a pointnet++ feature propagation layer.

Parameters:
  • in_channel (int) – Number of input channels

  • mlp (list[int]) – list with hidden dimensions of the MLP.

  • norm_cls (Optional[str]) – Name of the norm class, resolved as nn.<norm_cls>, or None for no normalization.

__call__(xyz1, xyz2, points1, points2)[source]

Call function.

Input:

xyz1: input points position data, [B, C, N]
xyz2: sampled input points position data, [B, C, S]
points1: input points features, [B, D, N]
points2: sampled points features, [B, D, S]

Returns:

upsampled points data, [B, D', N]

Return type:

new_points

forward(xyz1, xyz2, points1, points2)[source]

Forward Implementation.

Input:

xyz1: input points position data, [B, C, N]
xyz2: sampled input points position data, [B, C, S]
points1: input points features, [B, D, N]
points2: sampled points features, [B, D, S]

Returns:

upsampled points data, [B, D', N]

Return type:

new_points
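The upsampling this layer performs is a k-nearest-neighbor inverse-distance interpolation: each dense point blends the features of its k closest sparse points, weighted by 1/distance². A minimal sketch in plain PyTorch — `interpolate_features` is a hypothetical helper name, and it uses a channels-last [B, N, C] layout for readability, whereas the layer itself takes channel-first [B, C, N] input:

```python
import torch


def interpolate_features(xyz1, xyz2, points2, k=3, eps=1e-8):
    """Inverse-distance weighted interpolation (sketch of the propagation core).

    xyz1: [B, N, 3] dense positions, xyz2: [B, S, 3] sparse positions,
    points2: [B, S, D] sparse features -> [B, N, D] upsampled features.
    """
    # Squared distances from every dense point to every sparse point.
    dists = torch.cdist(xyz1, xyz2) ** 2                     # [B, N, S]
    dists, idx = dists.sort(dim=-1)
    dists, idx = dists[:, :, :k], idx[:, :, :k]              # k nearest sparse points
    # Inverse-distance weights, normalized to sum to 1 per dense point.
    weight = 1.0 / (dists + eps)
    weight = weight / weight.sum(dim=-1, keepdim=True)
    # Gather the neighbours' features and blend them.
    batch = torch.arange(xyz1.shape[0], device=xyz1.device).view(-1, 1, 1)
    neigh = points2[batch, idx]                              # [B, N, k, D]
    return (neigh * weight.unsqueeze(-1)).sum(dim=2)
```

A dense point that coincides with a sparse point receives (up to numerical precision) exactly that point's feature, since its weight dominates.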

class PointNetSetAbstraction(npoint, radius, nsample, in_channel, mlp, group_all, norm_cls='BatchNorm2d')[source]

PointNet set abstraction layer.

Set Abstraction Layer from the Pointnet Architecture.

Parameters:
  • npoint (int) – How many points to sample

  • radius (float) – Size of the ball query

  • nsample (int) – Max number of points to group inside the ball

  • in_channel (int) – Input channel dimension

  • mlp (list[int]) – Channel dimensions of the MLP layers, e.g. [32, 32, 64] creates an MLP with three layers

  • group_all (bool) – If true, groups all points inside the ball; otherwise samples 'nsample' points.

  • norm_cls (Optional[str]) – Name of the norm class, resolved as nn.<norm_cls>, or None for no normalization.

__call__(coordinates, features)[source]

Call function.

Input:

coordinates: input points position data, [B, C, N]
features: input points data, [B, D, N]

Returns:

coordinates: sampled points position data, [B, C, S]
features: sampled points feature data, [B, D', S]

Return type:

PointNetSetAbstractionOut

forward(xyz, points)[source]

Pointnet++ set abstraction layer forward.

Input:

xyz: input points position data, [B, C, N]
points: input points data, [B, D, N]

Returns:

coordinates: sampled points position data, [B, C, S]
features: sampled points feature data, [B, D', S]

Return type:

PointNetSetAbstractionOut

class PointNetSetAbstractionOut(coordinates: Tensor, features: Tensor)[source]

Output of PointNet set abstraction.

Create new instance of PointNetSetAbstractionOut(coordinates, features)

coordinates: Tensor

Alias for field number 0

features: Tensor

Alias for field number 1

farthest_point_sample(xyz, npoint)[source]

Farthest point sampling.

Input:

xyz: pointcloud data, [B, N, 3]
npoint: number of samples

Returns:

sampled pointcloud index, [B, npoint]

Return type:

centroids
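The algorithm greedily picks, at each step, the point farthest from all previously chosen centroids, which yields good spatial coverage of the cloud. A self-contained sketch in plain PyTorch — this deterministic version starts from index 0, whereas the reference implementation typically starts from a random point:

```python
import torch


def farthest_point_sample(xyz: torch.Tensor, npoint: int) -> torch.Tensor:
    """Iterative farthest point sampling (sketch).

    xyz: [B, N, 3] -> centroids: [B, npoint] (indices into N).
    """
    B, N, _ = xyz.shape
    centroids = torch.zeros(B, npoint, dtype=torch.long, device=xyz.device)
    # Running minimum squared distance from each point to the chosen set.
    distance = torch.full((B, N), float("inf"), device=xyz.device)
    # Deterministic start at index 0 (the reference starts at a random index).
    farthest = torch.zeros(B, dtype=torch.long, device=xyz.device)
    batch = torch.arange(B, device=xyz.device)
    for i in range(npoint):
        centroids[:, i] = farthest
        centroid = xyz[batch, farthest].unsqueeze(1)          # [B, 1, 3]
        dist = torch.sum((xyz - centroid) ** 2, dim=-1)       # [B, N]
        distance = torch.minimum(distance, dist)
        # Next centroid: the point farthest from everything chosen so far.
        farthest = torch.argmax(distance, dim=-1)
    return centroids
```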

index_points(points, idx)[source]

Indexes points.

Input:

points: input points data, [B, N, C]
idx: sample index data, [B, S]

Returns:

indexed points data, [B, S, C]

Return type:

new_points
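This is a batched gather: for each batch element b, it selects the rows of points[b] named by idx[b]. A minimal sketch using PyTorch advanced indexing (hypothetical standalone version, also accepting extra index dimensions such as [B, S, K]):

```python
import torch


def index_points(points: torch.Tensor, idx: torch.Tensor) -> torch.Tensor:
    """Gather points[b, idx[b, ...], :] for every batch element (sketch).

    points: [B, N, C], idx: [B, S] or [B, S, K] -> [B, S, C] or [B, S, K, C].
    """
    batch = torch.arange(points.shape[0], device=points.device)
    # Reshape the batch indices so they broadcast against idx.
    batch = batch.view(-1, *([1] * (idx.dim() - 1)))
    return points[batch, idx, :]
```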

query_ball_point(radius, nsample, xyz, new_xyz)[source]

Query points within a ball of the given radius.

Input:

radius: local region radius
nsample: max sample number in local region
xyz: all points, [B, N, 3]
new_xyz: query points, [B, S, 3]

Returns:

grouped points index, [B, S, nsample]

Return type:

group_idx
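For each query point, the ball query keeps up to nsample neighbour indices whose squared distance is within radius², padding short groups by repeating the first valid neighbour. A self-contained sketch of this logic in plain PyTorch (a hypothetical standalone version, not the module's own code):

```python
import torch


def query_ball_point(radius, nsample, xyz, new_xyz):
    """Ball query (sketch): up to `nsample` neighbours within `radius`.

    xyz: [B, N, 3], new_xyz: [B, S, 3] -> group_idx: [B, S, nsample].
    """
    B, N, _ = xyz.shape
    S = new_xyz.shape[1]
    # Pairwise squared distances between query points and all points.
    sqrdists = torch.cdist(new_xyz, xyz) ** 2                  # [B, S, N]
    group_idx = torch.arange(N, device=xyz.device).view(1, 1, N).repeat(B, S, 1)
    # Push points outside the ball to index N so they sort last.
    group_idx[sqrdists > radius ** 2] = N
    group_idx = group_idx.sort(dim=-1)[0][:, :, :nsample]
    # Pad empty slots by repeating the first in-ball neighbour.
    first = group_idx[:, :, :1].repeat(1, 1, nsample)
    mask = group_idx == N
    group_idx[mask] = first[mask]
    return group_idx
```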

sample_and_group(npoint, radius, nsample, xyz, points)[source]

Samples and groups.

Input:

npoint: Number of centers to sample
radius: Grouping radius
nsample: Max number of points to sample for each ball
xyz: input points position data, [B, N, 3]
points: input points data, [B, N, D]

Returns:

new_xyz: sampled points position data, [B, npoint, nsample, 3]
new_points: sampled points data, [B, npoint, nsample, 3+D]

Return type:

new_xyz, new_points

sample_and_group_all(xyz, points)[source]

Samples and groups all points.

Input:

xyz: input points position data, [B, N, 3]
points: input points data, [B, N, D]

Returns:

new_xyz: sampled points position data, [B, 1, 3]
new_points: sampled points data, [B, 1, N, 3+D]

Return type:

new_xyz, new_points
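This variant treats the entire cloud as one group centred at the origin, so no sampling or ball query is needed; coordinates and features are simply concatenated along the channel axis. A self-contained sketch (hypothetical standalone version):

```python
import torch


def sample_and_group_all(xyz, points):
    """Group the whole cloud around a single centroid at the origin (sketch).

    xyz: [B, N, 3], points: [B, N, D] or None
    -> new_xyz: [B, 1, 3], new_points: [B, 1, N, 3+D].
    """
    B, N, C = xyz.shape
    # Single centroid per batch element, placed at the origin.
    new_xyz = torch.zeros(B, 1, C, device=xyz.device)
    grouped_xyz = xyz.view(B, 1, N, C)
    if points is not None:
        # Concatenate coordinates and per-point features along the channel axis.
        new_points = torch.cat([grouped_xyz, points.view(B, 1, N, -1)], dim=-1)
    else:
        new_points = grouped_xyz
    return new_xyz, new_points
```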

square_distance(src, dst)[source]

Calculate the squared Euclidean distance between each pair of points.

dist = (xn - xm)^2 + (yn - ym)^2 + (zn - zm)^2
     = sum(src**2, dim=-1) + sum(dst**2, dim=-1) - 2 * src^T * dst

since src^T * dst = xn * xm + yn * ym + zn * zm,
sum(src^2, dim=-1) = xn*xn + yn*yn + zn*zn, and
sum(dst^2, dim=-1) = xm*xm + ym*ym + zm*zm.

Input:

src: source points, [B, N, C]
dst: target points, [B, M, C]

Output:

dist: per-point square distance, [B, N, M]

Return type:

Tensor
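The expansion above lets the full distance matrix be computed with one batched matrix product plus two broadcasts, avoiding an explicit [B, N, M, C] difference tensor. A minimal sketch (hypothetical standalone version):

```python
import torch


def square_distance(src: torch.Tensor, dst: torch.Tensor) -> torch.Tensor:
    """Pairwise squared Euclidean distance via the binomial expansion (sketch).

    src: [B, N, C], dst: [B, M, C] -> dist: [B, N, M].
    """
    # Cross term: -2 * src @ dst^T.
    dist = -2 * torch.matmul(src, dst.transpose(-2, -1))
    # Add the squared norms of each side, broadcast over the other axis.
    dist += torch.sum(src ** 2, dim=-1, keepdim=True)         # [B, N, 1]
    dist += torch.sum(dst ** 2, dim=-1).unsqueeze(-2)         # [B, 1, M]
    return dist
```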