PIMS/CSC Weekly Seminar: Usman Alim
Abstract:
The body-centered cubic (BCC) lattice is the optimal three-dimensional sampling lattice. Its optimality stems from the fact that its dual, the face-centered cubic (FCC) lattice, achieves the highest sphere-packing efficiency. To approximate a scalar-valued function from samples residing on a BCC lattice, compact spline-like kernels have recently been proposed. The lattice translates of an admissible BCC kernel form a shift-invariant approximation space that yields higher-quality approximations than similar spline-like spaces associated with the ubiquitous Cartesian cubic (CC) lattice. In this work, we focus on approximating derived quantities from the scalar BCC point samples and investigate two problems: the accurate estimation of the gradient, and the approximate solution of Poisson's equation within a rectangular domain with homogeneous Dirichlet boundary conditions. In each case, we seek an approximation in a prescribed shift-invariant space and obtain the necessary coefficients via a discrete convolution operation. Our solution methodology is optimal in an asymptotic sense: the resulting coefficient sequence respects the asymptotic approximation order provided by the space. To implement the discrete convolution operation on the BCC lattice, we develop efficient three-dimensional versions of the discrete Fourier and sine transforms. These transforms take advantage of the Cartesian coset structure of the BCC lattice in the spatial domain and the geometric properties of the Voronoi tessellation formed by the dual FCC lattice in the Fourier domain. We validate our solution methodologies by conducting qualitative and quantitative experiments on the CC and BCC lattices using both synthetic and real-world datasets. In the context of volume visualization, our results show that, owing to the superior reconstruction of normals, the BCC lattice leads to a better rendition of surface details. Furthermore, as with the approximation of the function itself, this gain in quality comes at no additional cost.
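As a rough illustration of the Cartesian coset idea mentioned in the abstract (not the transforms developed in the talk), the sketch below splits a periodic BCC-sampled sequence into its two interleaved Cartesian cosets, 2Z^3 and 2Z^3 + (1,1,1), and carries out a BCC discrete convolution as four ordinary FFT-based Cartesian convolutions. The function names cconv3 and bcc_convolve and the coset arrays f0, f1, h0, h1 are hypothetical and chosen only for this example.

```python
import numpy as np

def cconv3(a, b):
    # Circular 3-D convolution of two equally sized Cartesian arrays via the FFT.
    return np.real(np.fft.ifftn(np.fft.fftn(a) * np.fft.fftn(b)))

def bcc_convolve(f0, f1, h0, h1):
    """
    Periodic discrete convolution of two BCC-sampled sequences (illustrative sketch).

    The BCC lattice is split into its two interleaved Cartesian cosets:
      coset 0 : points 2Z^3            -> arrays f0, h0
      coset 1 : points 2Z^3 + (1,1,1)  -> arrays f1, h1
    The BCC convolution then reduces to four Cartesian convolutions.
    """
    # Output samples on coset 0: coset0*coset0 plus a (1,1,1)-shifted coset1*coset1 term,
    # since the difference of two coset-1 points lands back on coset 0.
    g0 = cconv3(f0, h0) + np.roll(cconv3(f1, h1), shift=1, axis=(0, 1, 2))
    # Output samples on coset 1: the two cross terms between the cosets.
    g1 = cconv3(f0, h1) + cconv3(f1, h0)
    return g0, g1

# Example usage with random periodic data on a 16^3-per-coset BCC grid:
rng = np.random.default_rng(0)
f0, f1, h0, h1 = (rng.standard_normal((16, 16, 16)) for _ in range(4))
g0, g1 = bcc_convolve(f0, f1, h0, h1)
```

This only demonstrates that a BCC filtering step can reuse standard Cartesian FFT machinery; the talk's transforms additionally exploit the Voronoi geometry of the dual FCC lattice in the Fourier domain.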
Additional Information
Usman Alim, GrUVi Lab, School of Computing Science, SFU