Table of Contents
Local Surface Normal Estimation
For some applications, not only is the local connectivity of a point region important, but so is the local surface normal for that particular neighborhood. Accordingly, many point cloud libraries include routines for computing local normal vectors, either over the entire point set or for a subset of it. There are two related approaches to computing the normal vector.
First, though, it is important to note that we are talking about local neighborhoods of the points in question. Since the normals are computed from local neighborhoods, it does not usually make sense to compute a normal at every single point. It is possible, and may be useful for some algorithms, but here we want a slightly coarse estimate of the normals over the entire point cloud. That means a subset of points should be selected for computing these local normals, much as in the Matlab Version noted below.
Getting a Local Neighborhood
There are actually many ways to get a local neighborhood. What we are really looking for in a neighborhood is a center point from the point cloud, plus the points locally surrounding that chosen center. As long as you identify a reasonable policy for establishing the center point and its nearby points, you should be good to go.
Some ideas are:
- Using the connectivity matrix.
- Using Matlab's internal $k$-nearest neighbors code to grab the local points.
- Using evenly chosen points from the point cloud list, then creating their neighborhoods.
More details on the underlying algorithms go here.
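As an illustration of the second and third ideas above, here is a minimal brute-force sketch in Python/NumPy (the function name `knn_neighborhood` and all of the data are made up for illustration; Matlab's own k-nearest neighbor routines serve the same purpose):

```python
import numpy as np

def knn_neighborhood(points, center_idx, k):
    """Return the k nearest neighbors (including the center) of one point.

    points: (3, n) array of cloud coordinates, column-wise as in the notes.
    center_idx: column index of the chosen center point.
    k: neighborhood size.
    """
    center = points[:, center_idx:center_idx + 1]    # (3, 1) column
    d2 = np.sum((points - center) ** 2, axis=0)      # squared distances to center
    nearest = np.argsort(d2)[:k]                     # indices of k closest points
    return points[:, nearest]

# Evenly chosen center points, as in the third bullet, each with its neighborhood:
rng = np.random.default_rng(0)
cloud = rng.standard_normal((3, 200))
centers = np.arange(0, cloud.shape[1], 20)           # every 20th point
neighborhoods = [knn_neighborhood(cloud, c, k=10) for c in centers]
```

The brute-force distance computation is fine for coursework-sized clouds; for large clouds a k-d tree would be the usual replacement.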
PCA
The principal component analysis, or PCA, approach does not really perform principal component analysis, but it does share the same processing up to a certain point. PCA is effectively a method for finding a linear subspace that represents the data well. It involves eigenvalue/eigenvector analysis of the covariance matrix for a set of data. Let $\Sigma$ be the covariance matrix of the data given by $P$ (a 3 x $n$ matrix of points). The eigenvalue decomposition of the covariance matrix is $$ \Sigma = V \Lambda V^T, \ \ \text{where}\ \ \Lambda = \text{diag}(\lambda_1, \lambda_2, \lambda_3).$$ The eigenvalue $\lambda_i$ reflects the variance of the data along the direction given by its associated eigenvector $\vec e_i$. Recall that the matrix $V$ consists of the eigenvectors $$ V = \left[ \begin{matrix} | & | & | \\ \vec e_1 & \vec e_2 & \vec e_3 \\ | & | & | \end{matrix} \right].$$ If the set $P$ is truly planar, then the normal vector to the points in $P$ should be aligned with the eigenvector having no variance (the one whose eigenvalue satisfies $\lambda_i = 0$). In practice there will be noise or some curvature, so what is sought is the minimal eigenvalue and its eigenvector. In general, eigenvalues can be complex valued, but that is not the case here: a covariance matrix is symmetric positive semi-definite, so its eigenvalues are non-negative real numbers (and its eigenvectors are real).
Finding the smallest of the non-negative, real eigenvalues and taking its eigenvector gives the local surface normal.
Matlab Notes: Implementation will involve proper use of the min, cov, and eigs functions.
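The procedure can be sketched in a few lines of Python/NumPy, where np.cov and np.linalg.eigh stand in for the Matlab cov and eigs calls mentioned above (the function name pca_normal is made up for illustration):

```python
import numpy as np

def pca_normal(P):
    """Estimate the local surface normal of a (3, n) patch of points via the
    eigendecomposition of its covariance matrix."""
    Sigma = np.cov(P)               # 3x3 covariance; rows of P are variables
    lam, V = np.linalg.eigh(Sigma)  # eigh: real eigenvalues, ascending order
    return V[:, 0]                  # eigenvector of the smallest eigenvalue
```

Because np.linalg.eigh returns eigenvalues in ascending order, the first column of V is the eigenvector for the minimal eigenvalue; with Matlab's eigs the ordering has to be checked (hence the min function in the notes above).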
SVD
Noting the similarity of the PCA approach to SVD calculations, one can immediately identify a similar procedure to the PCA one, but involving the SVD of a matrix. Given a set of points in matrix form, $P$, where the data is column-wise, the SVD is: $$ P = U \Sigma V^T.$$ We know that the normal to a surface is the vector that is orthogonal to the surface. Let's presume that the points in $P$ have been centered so that they have zero mean (so really, the coordinates represent vector directions from the mean value to the actual point in the cloud). If the local surface is a small, planar patch, then there exists a normal vector $\vec n$ such that $$ \vec n^T p_i = 0, \ \ \forall i \in 1 \dots n,$$ where $n$ is the number of points in $P$. Alternatively, we can write it relative to the matrix $P$, $$ \vec n^T P = 0,$$ where now the right hand side is the vector of 0 values (it is 1 x $n$). That means that the local normal vector is the left singular vector associated with the smallest singular value. Really, that singular value should be zero, but in the presence of noise or some local curvature, it will be a small number. In Matlab speak, since the singular values come sorted in decreasing order, what we are looking for is $\vec n = U(:,$end$)$.
Naturally, if we are given a small set $P$ and would like to compute its normal, the first step is to center the data. After that, computation of the SVD followed by extraction of the left singular vector for the smallest singular value gives the necessary estimate. By definition of the SVD, this vector already has unit length, so we've got the answer.
Matlab Notes: In slightly older Matlab versions, implementation will involve proper use of bsxfun to perform the mean offset computation (in one line of code). For R2016b and later, Matlab allows for implicit expansion of array operations, which means that it will automatically invoke bsxfun given the coded operation (discussed here). Be careful about whether Matlab wants/returns things row-wise or column-wise.
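In Python/NumPy the same two steps look as follows; broadcasting with keepdims plays the role of bsxfun / implicit expansion, and the function name svd_normal is made up for illustration:

```python
import numpy as np

def svd_normal(P):
    """Estimate the normal of a (3, n) patch of points: center the columns,
    then take the left singular vector of the smallest singular value
    (U(:,end) in the Matlab phrasing; numpy also sorts singular values
    in decreasing order)."""
    Pc = P - P.mean(axis=1, keepdims=True)  # zero-mean each coordinate row
    U, s, Vt = np.linalg.svd(Pc, full_matrices=False)
    return U[:, -1]                         # last column: smallest sigma
```

Note that no explicit normalization is needed at the end: the columns of U are unit length by construction.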
Visualizing in Matlab
Visualizing the normals involves plotting each normal vector at the point location of its base. The means to do so in Matlab is through the function quiver3. It takes in the actual point coordinates for all the vector bases, then the vector components for the vectors proper. In our case, the idea is to pass the points for which the normals have been computed, and to pass the actual normals for those points. Supposing that the variables were properly set, the function invocation would be:
quiver3(x, y, z, u, v, w);
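For readers working outside Matlab, a rough Python/matplotlib analogue of the quiver3 call might look like the following (the Agg backend, the output file name, and the example data are all assumptions for this sketch):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")              # headless backend, so no window is needed
import matplotlib.pyplot as plt

# Hypothetical data: vector base points (x, y, z) and normal components (u, v, w).
x, y, z = np.zeros(3), np.arange(3.0), np.zeros(3)
u, v, w = np.zeros(3), np.zeros(3), np.ones(3)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.quiver(x, y, z, u, v, w, length=0.5)  # 3D quiver, as quiver3 does in Matlab
fig.savefig("normals.png")
```

As with quiver3, the first three arguments place the arrow bases and the last three give the arrow directions.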
Matlab Version
Matlab has its own internal routine for performing the same operation. It is called pcnormals and its operation is sort of described in this tutorial. The tutorial can be followed as a check of your code, but should not be used as the solution to the weekly activity (unless explicitly noted).