PPMGR Dataset

PPMGR Dataset: Plausible Planetary Material Geometry and Reflectance Imaging Dataset


Characterization of objects in all dimensions at a microscopic level is important in numerous applications, including surface analysis on planetary bodies. Existing microscopes fit for this task are large bench-top devices unsuitable for in-situ use, particularly in resource-constrained remote robotic exploration. Computational imaging techniques present a powerful means to overcome physical limitations in fielded sensors, but have seen comparatively little use in space applications. [Pettersson 2019] presented a miniature (150 gram) 3D microscopic imager without moving parts, capable of providing 1-megapixel images at approximately 1 micron horizontal and 5 micron vertical resolution. This device combines light-field imaging and photometric stereo to provide both 3D reconstruction and reflectance characterization of individual soil grains. This system opens vast opportunities for extension, demonstrating the potential of computational imaging to amplify sensing capabilities in space.

Ground truth geometry and reflectance data are infeasible to capture with the 3D Microscope, as doing so requires specialized fabrication. Therefore, we present a physically-based dataset and code that emulate the 3D Microscope, to support benchmarking of 3D Microscope algorithms. We use the NVIDIA OptiX ray tracing engine to render our dataset. We first generate 20 random meshes with controllable roughness based on the diamond-square algorithm, using Open3D. We then collect and provide open-source texture maps with varying specularity and complexity to simulate harsh terrain. Some textures are diffuse and consistent, such as sand or ash, while others are specular and varied, such as rocks in a pool of water or dimpled metal. We then render the data with optically accurate lighting and camera positions to simulate the 3D Microscope capture sequence from [Pettersson 2019]. Each scene in the dataset includes the mesh, the texture used on the mesh, the rendered image, and the ground truth depth, normals, and roughness. We validate our dataset with the multi-view stereo and photometric stereo algorithms from [Pettersson 2019]. We hope our dataset and code enable benchmarking of classical and learning-based inverse rendering algorithms to improve sensing capabilities in space.
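The mesh-generation step described above can be sketched as follows. This is a minimal pure-NumPy diamond-square implementation for illustration; the released code builds the mesh with Open3D and its exact parameterization may differ. The `roughness` parameter here (the per-pass decay of the noise amplitude) is an assumed stand-in for the dataset's roughness control.

```python
import numpy as np

def diamond_square(n, roughness, rng=None):
    """Generate a (2**n + 1)-square heightmap via the diamond-square algorithm.

    `roughness` in (0, 1] is the per-pass decay of the random perturbation;
    larger values keep fine-scale noise strong and yield rougher terrain.
    """
    rng = np.random.default_rng(rng)
    size = 2 ** n + 1
    h = np.zeros((size, size))
    # Seed the four corners with random heights.
    h[0, 0], h[0, -1], h[-1, 0], h[-1, -1] = rng.uniform(-1, 1, 4)
    step, scale = size - 1, 1.0
    while step > 1:
        half = step // 2
        # Diamond step: center of each square = mean of its 4 corners + noise.
        for y in range(half, size, step):
            for x in range(half, size, step):
                avg = (h[y - half, x - half] + h[y - half, x + half] +
                       h[y + half, x - half] + h[y + half, x + half]) / 4
                h[y, x] = avg + rng.uniform(-scale, scale)
        # Square step: edge midpoints = mean of up to 4 diamond neighbors + noise.
        for y in range(0, size, half):
            for x in range((y + half) % step, size, step):
                vals = []
                for dy, dx in ((-half, 0), (half, 0), (0, -half), (0, half)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < size and 0 <= nx < size:
                        vals.append(h[ny, nx])
                h[y, x] = np.mean(vals) + rng.uniform(-scale, scale)
        step = half
        scale *= roughness  # decay the noise amplitude each pass
    return h

# A 65 x 65 heightmap; each grid point can become a mesh vertex, with two
# triangles per grid cell (e.g., an Open3D TriangleMesh) before texturing.
heights = diamond_square(6, roughness=0.5, rng=0)
```

Raising `roughness` toward 1 produces the "rough" meshes with deep pits; lowering it produces the "smooth" meshes.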


In order to calibrate the microscope in [Pettersson 2019], three fiducials with varying surface textures were manufactured. Microscope data of these fiducials were analyzed with the multi-view stereo (MVS) and photometric stereo (PS) algorithms, and the mean absolute depth errors were computed. We compared the fiducial errors to the depth mean absolute error (MAE) of our renders. The results of this comparison are shown in Table 1.
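For reference, the MAE metric used in Table 1 is the mean of per-pixel absolute depth differences. A minimal sketch, with an optional validity mask as an assumed convention for excluding pixels without ground truth:

```python
import numpy as np

def depth_mae(pred, gt, valid=None):
    """Mean absolute depth error, optionally restricted to a validity mask."""
    err = np.abs(pred - gt)
    if valid is None:
        valid = np.isfinite(err)  # ignore NaN / inf pixels by default
    return float(err[valid].mean())
```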

The final depth is found by fusing the depth from MVS with the normals from PS. Because the ground truth depth and normals are known, we have also included the depth and normal errors before fusing. The mesh types are divided into rough and smooth; the difference between the two can be seen in the scene images below.
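One common way to fuse a coarse MVS depth map with PS normals is a screened least-squares integration: the normals supply target depth gradients, while the MVS depth anchors the low-frequency shape. The sketch below illustrates this idea on a small grid; it is not the fusion code from [Pettersson 2019], and the normal convention (unit normals with positive z toward the camera) and the weight `lam` are assumptions.

```python
import numpy as np

def fuse_depth(z_mvs, normals, lam=0.1):
    """Fuse an MVS depth map with PS normals by screened least squares.

    Solves min_z ||grad z - g||^2 + lam * ||z - z_mvs||^2, where the target
    gradients come from unit normals (n_x, n_y, n_z) with n_z > 0:
    dz/dx = -n_x / n_z, dz/dy = -n_y / n_z.
    """
    H, W = z_mvs.shape
    gx = -normals[..., 0] / normals[..., 2]
    gy = -normals[..., 1] / normals[..., 2]
    idx = np.arange(H * W).reshape(H, W)
    rows, b = [], []

    def add(i, j, target):
        # One finite-difference constraint: z[j] - z[i] = target.
        r = np.zeros(H * W)
        r[i], r[j] = -1.0, 1.0
        rows.append(r)
        b.append(target)

    for y in range(H):
        for x in range(W - 1):       # horizontal gradient constraints
            add(idx[y, x], idx[y, x + 1], gx[y, x])
    for y in range(H - 1):
        for x in range(W):           # vertical gradient constraints
            add(idx[y, x], idx[y + 1, x], gy[y, x])
    # Screening term pulls the solution toward the MVS depth.
    A = np.vstack(rows + [np.sqrt(lam) * np.eye(H * W)])
    b = np.concatenate([b, np.sqrt(lam) * z_mvs.ravel()])
    z, *_ = np.linalg.lstsq(A, b, rcond=None)
    return z.reshape(H, W)
```

On real-sized depth maps this dense formulation should be replaced with a sparse solver, but the objective is the same.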

As expected, the rough meshes with several dark pits and featured surfaces result in a larger error. Additionally, the scenes with high specularity, such as the dimpled metal and large rocks in water, result in a larger error. Overall, the errors of our rendered scenes are comparable to those of the fiducials, indicating that our rendered scenes are suitable for analysis.

Table 1: Mean absolute error of all meshes and fiducials.


This dataset was created while Brevin Tilmon was an intern with Dr. Uland Wong and Dr. Michael Dille in the Intelligent Robotics Group at NASA Ames Research Center through KBR/SGT. Jackson Arnold also contributed to the dataset creation through the FOCUS Lab at the University of Florida. 


Miniature 3D Microscope and Reflectometer for Space Exploration.
G. Pettersson, M. Dille, S. Abrahamsson, U. Wong
IEEE International Conference on Computational Photography (ICCP), 2019


Thanks to Uland Wong and Michael Dille for support on this project.


For questions about the code or dataset, contact: Dr. Sanjeev Koppal <sjkoppal@ece.ufl.edu>, Brevin Tilmon <btilmon@ufl.edu>, or Jackson Arnold <jarnold2@ufl.edu>.


Download the dataset and code here [.zip, 807 MB]. The download includes the dataset (each rendered image with its ground truth depth, normals, and roughness), the mesh used for each render, the textures for each scene, and the code to recreate the dataset.

Each mesh directory includes the parameters used to generate that scene's mesh. The texture directories include the color, height, normal, ambient occlusion, and roughness maps. Only the color, normal, and roughness maps were used in rendering, but the additional maps are provided for users who wish to incorporate them.

The textures are freely distributed at 3dtextures.me, and each scene's texture directory includes the specific texture link. The texture links are also provided with each scene preview below. The texture mapped to each mesh can be modified to better suit the user. The scenes here are not meant to be photorealistic representations of planetary geology; if a more accurate mesh or texture is available, it can be substituted.

Scene 1: Sparse Sedimentary Matrix

Texture Link: 3dtextures.me/2020/10/30/ground-wet-rocks-002

Scene 2: Dark Basalt

Texture Link: 3dtextures.me/2018/12/21/volcanic-ash-001

Scene 3: Granite

Texture Link: 3dtextures.me/2016/06/21/blue-marble-001

Scene 4: Light Basalt

Texture Link: 3dtextures.me/2018/12/21/volcanic-ash-001

Scene 5: Specular Igneous Matrix

Texture Link: 3dtextures.me/2020/10/14/metal-hammered-002

Scene 6: Diffuse Igneous Matrix

Texture Link: 3dtextures.me/2020/05/11/ground-wet-pebbles-001

Scene 7: Plagioclase

Texture Link: 3dtextures.me/2018/01/15/rock-ore-002

Scene 8: Sandstone

Texture Link: 3dtextures.me/2017/03/23/sand-002

Scene 9: Dense Sedimentary Matrix

Texture Link: 3dtextures.me/2020/01/01/stone-path-001

Bonus Scene: Icy Material

Texture Link: 3dtextures.me/2017/12/28/water-001