Measurement and Application of Spectral Reflectance in IR for Lidar Image Simulation


Introduction

Thanks to developments in deep learning, more powerful computing, and more advanced imaging systems, self-driving cars have become one of the hottest fields in industry. In the autonomous driving community, light detection and ranging (lidar) is one of the most commonly used imaging techniques. A key requirement for advancing the field is a huge amount of training data. In this case, the data are images of driving scenes, which can be captured in the real world or simulated with computer rendering algorithms. This class project serves as the initial part of a larger project that aims, eventually, to achieve realistic lidar image simulation for self-driving car applications.

Background

Due to the large demand for data when training deep network models for autonomous driving, generating realistic driving scene images with computer rendering algorithms is a promising approach. Compared to actually driving on the road and capturing images, generating such images on a computer is much more cost efficient. The Wandell Group at Stanford University has created a Flywheel library of computer models of physical objects common in driving scenes. Using this library, we can easily construct realistic driving scenes. Then, using physically based rendering (pbrt) and image simulation pipeline software such as IsetCam and Iset3d, we can create computer-simulated images of driving scenes efficiently and at large scale [3]. Fig. 1 shows such a computer-generated image [3].

Figure 1: Computer generated driving scene image

Lidar systems have become a crucial part of most self-driving cars due to their ability to detect objects at great distances. A lidar achieves this by sequentially sending out laser pulses in all directions and measuring the time between emitting a pulse and receiving its reflection. From this round-trip time, the distance between the object and the lidar can be calculated, and a lidar distance map (Fig. 2) can be generated. To make the data acquisition/generation process more efficient and scalable, a natural direction is to use rendering algorithms to generate such maps on a computer. In order to simulate realistic lidar images, we need to know the reflectance of different materials in the infrared regime, since most lasers in commercial lidar systems operate in the IR, around 850 nm.

Figure 2: Lidar distance map from Luminar.
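The distance computation behind such a map is simple time-of-flight geometry. As a minimal MATLAB sketch (all values are illustrative, not from any real lidar):

  % Time-of-flight ranging: distance from the round-trip time of a pulse.
  % The numbers here are illustrative only.
  c = 299792458;               % speed of light (m/s)
  tRoundTrip = 0.667e-6;       % round-trip time of one pulse (s)
  range = c * tRoundTrip / 2;  % divide by 2: the pulse travels out and back
  fprintf('Estimated range: %.1f m\n', range);   % prints ~100 m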

Method

Online Libraries

We first surveyed existing libraries and studies of material reflectance in the IR range. Three online sources are most relevant to our lidar project (see Appendix, [1][2][3]). However, they all have a somewhat different focus from driving scenes: the USGS Digital Spectral Library focuses on geology, and the study by Terence Haran et al. focuses on materials found on pedestrians (Fig. 3).

Figure 3: Reflectance spectra from Haran et al. [4]

Experiment

Due to the lack of existing libraries of IR reflectance spectra for materials common in driving scenes, we went out to an actual scene and measured the spectral reflectance of 24 different materials. Fig. 4 shows one example of the driving scenes we photographed at the Stanford Oval. From these driving scenes, we selected a list of 24 common subjects to measure, such as a car body, a car plate, trees, and the road. A Photo Research 705/715 spectroradiometer was used to measure the reflectance of these subjects over wavelengths from 380 nm to 1068 nm; details of this spectroradiometer can be found in ref. [1].

Figure 4: A driving scene at Stanford Oval


We measured each subject twice. The first time, we placed a white board on the surface of the subject, as shown in Fig. 5; this measured the background light arriving at the surface. The second time, we removed the white board and measured the light reflected by the surface itself. From these two measurements, we calculated the reflectance as their wavelength-by-wavelength ratio, R(λ) = I_surface(λ) / I_white(λ). The focus was set carefully on the white board and kept the same between the two measurements.

Figure 5: Illustration of the experiment.
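Assuming the white board is close to a perfect (unit-reflectance) diffuser, this ratio takes only a few lines of MATLAB. The sketch below is illustrative only; the file and variable names are hypothetical placeholders, not the actual names in our drive folder.

  % Reflectance as the per-wavelength ratio of the two measurements.
  % File/variable names are hypothetical; assumes each .mat file holds
  % a spectral vector named 'spd'.
  white = load('blackcar_whiteboard.mat');   % radiance off the white board
  subj  = load('blackcar_surface.mat');      % radiance off the subject itself
  reflectance = subj.spd ./ white.spd;       % R(lambda) = I_surface / I_white
  wave = linspace(380, 1068, numel(reflectance));  % assumed sampling grid
  plot(wave, reflectance);
  xlabel('Wavelength (nm)'); ylabel('Reflectance');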

Simulation with Measured Results

We first calculated the reflectance for all measured subjects and saved the results as .spd files. pbrt files generated based on [2] were used as the basis. The rendering was done after replacing the reflectance of the relevant subjects in the pbrt file with our measured results (wavelengths from 380 nm to 1068 nm). The results were displayed in ISET with the IR filter off. The whole pipeline is shown in Fig. 6.

Figure 6: Simulation pipeline
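pbrt-v3 reads spectra from plain-text .spd files containing whitespace-separated wavelength/value pairs, so converting a measured reflectance is straightforward. A minimal helper (the function name is ours, not part of pbrt or ISET):

  % Write a reflectance spectrum to a pbrt-style .spd file:
  % one "wavelength value" pair per line, plain text.
  function writeSPD(fname, wave, reflectance)
      fid = fopen(fname, 'w');
      for k = 1:numel(wave)
          fprintf(fid, '%g %.6f\n', wave(k), reflectance(k));
      end
      fclose(fid);
  end

The resulting file is then referenced from the scene's pbrt file as a material parameter, e.g. "spectrum Kd" "blackcar.spd".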

Results

Reflectance

All measured results and code have been uploaded to Google Drive; the Appendix gives detailed information. For each of the 24 subjects there are three files: one .jpg file showing a picture of the measured subject, and two .mat files containing the reflected spectra of the white board and of the subject. Fig. 7 shows the results for two measured subjects, a blue coat labeled 'clothes' and grass labeled 'greengrass'. The images on the left show the measured regions, i.e., where we placed the white board. The plots in the middle show the received light intensities from the material (blue) and from the white board (red), and their ratio (bottom), which is the reflectance we use for simulation. The color boxes on the right show the colors obtained by integrating the reflectance with the color matching functions; they approximately match the true colors of the subjects.

Figure 7: Measured results for 'clothes' and 'greengrass'
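The color check on the right of Fig. 7 amounts to integrating the reflected radiance against the CIE color matching functions and converting the resulting XYZ value to sRGB. A sketch using ISETCam utilities as we understand them (the data file name and all variable names are assumptions):

  % Color check: reflectance -> XYZ -> sRGB. Assumes ISETCam is on the path.
  wave = 380:4:1068;                    % assumed wavelength sampling
  cmf  = ieReadSpectra('XYZ', wave);    % CIE color matching functions; they
                                        % are ~zero above 780 nm, so the IR
                                        % tail does not influence the color
  radiance = reflectance(:) .* whiteSPD(:);   % reflectance x background light
  XYZ  = cmf' * radiance;                     % integrate over wavelength
  srgb = xyz2srgb(reshape(XYZ / max(XYZ), [1 1 3]));  % a 1x1 color patch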

Simulation

To test whether our measured reflectance data could be incorporated into our image rendering pipeline, we created a very simple scene containing a single uniform material. In the pbrt file of the scene, we replaced the material's "spectrum Kd" with 'blackcar.spd' from our measurement. After running it through the Iset pipeline, we plotted the irradiance from the 'Optics' window. We then compared this curve with our measured reflection spectrum for 'blackcar' (i.e., the reflectance multiplied by the background light), and the two curves look identical. In Fig. 8, part a shows the displayed simple scene with one uniform material; part b shows the irradiance plotted from part of the scene; part c shows the measured reflection spectrum and the calculated reflectance. This verifies that our measurement data are compatible with the image rendering pipeline and could be used to generate lidar images in the future.

Figure 8: a) a simple scene, b) irradiance from the scene, c) measured reflection spectrum and reflectance curve
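The curve comparison in Fig. 8 is a simple overlay after normalization, since only the spectral shapes matter. A sketch with hypothetical variable names:

  % Compare the simulated irradiance against the measured reflection
  % spectrum; normalize each curve so only the shapes are compared.
  simNorm  = simIrradiance  / max(simIrradiance);    % from the Optics window
  measNorm = measReflection / max(measReflection);   % from the PR-705/715
  plot(wave, simNorm, 'b-', wave, measNorm, 'r--');
  legend('Simulated irradiance', 'Measured reflection');
  xlabel('Wavelength (nm)'); ylabel('Normalized intensity');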

Summary & Future Work

Summary

For this class project, we measured the spectral reflectance in the visible and IR regimes of 24 materials common in driving scenes. We converted the measured spectra to XYZ/RGB values to verify that the results are consistent. We then created a simple blank scene with a uniform material in pbrt and replaced its spectral reflectance with our measured material reflectance. By comparing the measured spectrum with the one displayed by the IsetCam software, we verified that our measurement data are compatible with the image rendering pipeline.

Future Work

As future work, one simple extension is to measure reflectance spectra for more materials, so that our library better covers the variety of colors and textures found in driving scenes. To better simulate real scenes, we could also measure reflectance as a function of viewing angle. When simulating and rendering images, we would need to modify the light source to resemble how lidars work in the real world. Furthermore, we need to research the signal processing required to extract the reflected pulse signal from background noise; a sketch of one standard approach appears below.
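A common technique for that last step is a matched filter: cross-correlating the received waveform with the known transmitted pulse shape, which maximizes the signal-to-noise ratio against white noise. The sketch below is purely illustrative; none of the numbers come from a real lidar.

  % Illustrative matched-filter recovery of a lidar echo buried in noise.
  fs    = 1e9;                                 % sample rate: 1 GS/s
  tP    = (0:99) / fs;                         % 100-sample pulse window
  pulse = exp(-((tP - 50e-9) / 10e-9).^2);     % ~10 ns Gaussian pulse
  rx    = 0.3 * randn(1, 4096);                % background noise record
  delay = 1500;                                % true echo start (sample index)
  rx(delay:delay+99) = rx(delay:delay+99) + pulse;   % bury the echo in noise
  mf = conv(rx, fliplr(pulse));                % correlate with the known,
  [~, idx] = max(mf);                          % time-reversed pulse shape
  startEst = idx - numel(pulse) + 1;           % map the peak to the echo start
  fprintf('Echo found at sample %d (true: %d)\n', startEst, delay);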

References

[1] Photo Research PR-705 spectroradiometer specifications: http://www.cs.unc.edu/~stc/FAQs/spectroradiometer/705spec.pdf

[2] pbrt-v3 source code: https://github.com/mmp/pbrt-v3

[3] Henryk Blasinski et al., Electronic Imaging, 2018(5), 1-7.

[4] Terence Haran et al., "Infrared Reflectivity of Pedestrian Mannequin for Autonomous Emergency Braking Testing."

Appendix

All data and code can be found at the following link; a README file explains the contents of each folder and describes the functions. Please contact the authors for access. https://drive.google.com/drive/u/1/folders/1yl8QUrc8jkqNFVF8zVO-jhsCrhAnDOsP

Online libraries for material reflectance in IR:

[1] https://speclab.cr.usgs.gov/spectral.lib06/ds231/datatable.html

[2] http://gsp.humboldt.edu/OLM/Courses/GSP_216_Online/lesson2-1/reflectance.html

[3] https://scholarworks.iupui.edu/bitstream/handle/1805/14346/chien-2016-infrared.pdf?sequence=1&isAllowed=n