Google Pixel 4a Lens Shading

From Psych 221 Image Systems Engineering

Jiahui Wang

Introduction

Lens shading, also known as vignetting, is common in optics and photography. As shown in Fig. 1 [1,2], the peripheral region of the image is darker than the center region. This phenomenon is intrinsic to the geometry of the lens, which lowers the relative illumination at the edge of the field [2]. The artifact has been widely used by photographers and artists to add depth and a sense of history to their photographs [1]. As engineers, however, we should correct the artifact in the imaging pipeline and leave users the freedom to add the effect back later.

In this project, we use photos taken with a Google Pixel 4a to model the lens shading of the Pixel 4a camera. We analyze the lens shading for the different color channels and for different exposure times and ISO speeds. We compare our model with the standard cos4th model [2, 4, 5]. Finally, we build a lens shading correction pipeline and apply it to several images.

Figure 1: Left: lens geometry that causes lens shading. Right: a picture with the lens shading artifact (from [1]).

Datasets and pre-processing

The pictures were taken with a Google Pixel 4a pointed into an integrating sphere, which provides a uniform illumination field, so any nonuniformity in the pictures corresponds to lens shading and camera noise.

The pictures span several exposure times and ISO speeds, which we use later to analyze how the model depends on exposure time and ISO speed. The data are stored as raw images in the DNG format [7]. We use ieDNGRead [3,4] to read the files and extract the image data.
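A minimal reading sketch is shown below. It assumes ISETCam's ieDNGRead is on the MATLAB path; the file name and the metadata field names are our assumptions, not taken from the dataset.

% Read a raw Pixel 4a capture (sketch; assumes ISETCam's ieDNGRead is
% on the path; the metadata field names below are assumptions).
fname = 'pixel4a_capture.dng';      % hypothetical file name
[raw, info] = ieDNGRead(fname);     % raw Bayer mosaic + DNG metadata
exposure = info.ExposureTime;       % assumed metadata field
isoSpeed = info.ISOSpeedRatings;    % assumed metadata field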

The pictures are raw mosaicked images with pixel-wise information. As shown in Fig. 2, if we plot a row of the raw DNG data directly (black line), two color channels are interleaved and the trace oscillates strongly; this looks like large noise but is not real. We therefore need to separate the color channels before processing the images.

Figure 2: 1D raw data (black) vs color separated data (colored).

Methods

We can read the color filter array pattern from the DNG file; in this case, the uncompressed data have a 'bggr' pattern. We separate the raw data directly into b, g1, g2, and r planes, and then fit our models separately for each color channel using 2D polynomial models.
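A minimal sketch of this separation for a 'bggr' mosaic (here raw is the mosaicked image returned by ieDNGRead):

% Split a 'bggr' Bayer mosaic into its four color planes.
% Row 1 is B G B G ..., row 2 is G R G R ...
b  = raw(1:2:end, 1:2:end);   % blue
g1 = raw(1:2:end, 2:2:end);   % green, on blue rows
g2 = raw(2:2:end, 1:2:end);   % green, on red rows
r  = raw(2:2:end, 2:2:end);   % red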

Poly22

We first fit a polynomial surface with terms up to second order:

$$ \mathrm{sfr}(x,y) = p_{00} + p_{10}\,x + p_{01}\,y + p_{20}\,x^2 + p_{11}\,xy + p_{02}\,y^2, $$

where $p_{ij}$ are the fitting parameters and $x$, $y$ are the normalized pixel coordinates.

After obtaining the fitted map $\mathrm{sfr}(x,y)$, we element-wise divide the original data by the fitted map to get the corrected data; a code sketch of this fit-and-divide step is given below. Figure 3 shows the original data, the correction maps, and the corrected data for the different color channels as 2D color maps. Because g1 and g2 differ very little, we plot only g1, so the three channels shown correspond to the R, G, and B channels.
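A sketch of the per-channel fit and correction, assuming the Curve Fitting Toolbox and a black-level-subtracted color plane (here g1):

% Fit a poly22 shading surface to one color plane and correct it.
[h, w] = size(g1);
[X, Y] = meshgrid(linspace(-1, 1, w), linspace(-1, 1, h));  % normalized coords
sf  = fit([X(:), Y(:)], double(g1(:)), 'poly22');           % 2D polynomial fit
map = sf(X, Y) / sf(0, 0);        % correction map, normalized at the center
corrected = double(g1) ./ map;    % element-wise division by the fitted map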

Figure 3: The columns represent the R, G, B color channels. The first, second, and third rows are the original images, the correction maps, and the corrected images, respectively.

We can observe in Fig. 3 that the R, G, and B channels have similar correction maps. The corrected images are flatter than the uncorrected ones, although the corners still show larger residual error, and the corrected images show an obvious concentric ring-shaped artifact. If we normalize each corrected image to mean 1, the standard deviations for the color channels (r, g1, g2, and b) are 0.0732, 0.0723, 0.0713, and 0.0597, respectively.

To compare the fitting results for the different color channels, we summarize the fitting parameters in Table 1. In addition to the fitted parameters $p_{ij}$, we calculate the normalized parameters $N_{20} = p_{20}/p_{00}$ and $N_{02} = p_{02}/p_{00}$, since they dominate the shape of the correction map. We can see that the red and green channels have very similar correction maps, while the blue channel differs slightly from red and green.

Table 1: Poly22 fitting parameters for different channels.

Poly44

Poly22 fits terms up to second order, which corresponds to the squared distance $r^2 = x^2 + y^2$ from the center. We can try a higher-order approximation with the poly44 function, which includes terms up to fourth order (the code change is sketched below). Figures 5 and 6 show the comparison of the poly22 and poly44 correction results. We can see that the corner distortion effects are weaker with the poly44 model.
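In code, only the model string changes relative to the poly22 sketch above:

% Same data and coordinates as before, but terms up to fourth order.
sf44 = fit([X(:), Y(:)], double(g1(:)), 'poly44');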


Figure 5: Channel R correction map and fitting correction using poly22 and poly44 models.
Figure 6: Grayscale RGB correction map and fitting correction using poly22 and poly44 models.

The standard deviations for the different color channels (normalized to mean 1) are 0.0428, 0.0412, 0.0411, and 0.0461, respectively, which is a clear improvement over poly22.

MATLAB's fit function supports polynomial surfaces up to poly55. We also experimented with the poly55 model, but it showed no significant difference from poly44, so we do not describe the poly55 results in detail here. We also tried restricted fits that keep only some of the polynomial components, but without the remaining terms the fitting results were not as promising. You may experiment with our code to check these results.

Lens shading correction pipeline

After obtaining a correction map, we can correct the lens shading in real-world photographs. Here we build the correction pipeline shown in Fig. 7; a code sketch follows the figure. We show the correction results in the Results section.

Figure 7: Lens shading correction pipeline.
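A sketch of the pipeline in Fig. 7. The per-channel map variables (mapB, mapG1, mapG2, mapR) are the correction maps fitted earlier, and the BlackLevel field name is an assumption about the metadata returned by ieDNGRead.

% Lens shading correction pipeline (sketch).
[raw, info] = ieDNGRead('photo.dng');           % 1. read the raw mosaic
black = double(info.BlackLevel(1));             % 2. black level (assumed field)
raw   = max(double(raw) - black, 0);            %    subtract the black level
out = raw;                                      % 3. divide each 'bggr' plane
out(1:2:end, 1:2:end) = raw(1:2:end, 1:2:end) ./ mapB;   % B
out(1:2:end, 2:2:end) = raw(1:2:end, 2:2:end) ./ mapG1;  % G1
out(2:2:end, 1:2:end) = raw(2:2:end, 1:2:end) ./ mapG2;  % G2
out(2:2:end, 2:2:end) = raw(2:2:end, 2:2:end) ./ mapR;   % R
% 4. pass 'out' to the rest of the pipeline (demosaicking, color, ...)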

Results

Now we apply our model in different situations and compare it with the cos4th model.

Black Level and Relative illumination

In this section, we would like to highlight the importance of the black level intensity and of using the relative illumination. First of all, the lens shading artifact is caused by the lens geometry. The optical system is linear, so the shading function should not depend on the absolute illumination level. We define the relative illumination as

$$ RI(\theta) = \frac{I(\theta)}{I(0)}, $$

where $I(\theta)$ is the illumination at field angle $\theta$ and $I(0)$ is the illumination intensity at the center of the image.

If we do not subtract the black level intensity $I_b$, what we actually calculate is

$$ \frac{I(\theta) + I_b}{I(0) + I_b}, $$

which is no longer a pure ratio and leads to the wrong conclusion that the black level intensity and the absolute intensity influence the line shape and the fitting parameters.

Thus it is important to subtract the black level intensity and use the relative illumination to calculate the lens shading correction map.
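As a hypothetical numeric illustration (the values are ours, not from the dataset): suppose the black level is 64 DN and the true relative illumination at a corner is 0.5. At one exposure the sensor might report 1064 DN at the center and 564 DN at the corner, so the raw ratio is 564/1064 ≈ 0.53; at half the exposure the readings become 564 and 314 DN, giving 314/564 ≈ 0.56. Only after subtracting the black level do both exposures recover the same value, 1000/2000 = 500/1000 = 0.5 scaled per exposure, i.e., 500/1000 = 250/500 = 0.5.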

Exposure

We first fix the ISO speed (ISO speed = 198 in this analysis). There are 5 unique exposure durations, each with 5 sampled images at the same ISO speed. We average the 5 samples that share the same exposure duration and ISO speed. In Fig. 8, we plot the normalized and unnormalized blue-channel illumination along the center row (y = 0) of the images.

Figure 8: Blue-channel data for different exposure times at ISO 198, cut along the center row of the images. Left: unnormalized image data. Right: normalized data.
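A sketch of the averaging and normalization for one setting ('stack' is a hypothetical h-by-w-by-5 array of black-level-subtracted blue-channel planes):

% Average the 5 samples that share one exposure/ISO setting, then
% normalize by the center value to get the relative illumination.
meanPlane = mean(double(stack), 3);
center    = meanPlane(round(end/2), round(end/2));
relIllum  = meanPlane / center;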

We can observe that the normalized illumination curves are the same for the different exposure durations. In the previous sections we fitted our model to the data with ISO speed 198 and exposure time 0.0119 s; we now use it as a reference model to correct the data taken with other exposure times.

Figure 9: Corrected maps (blue channel) for different exposure durations.

We can see that the fitted model can be applied to different exposure durations. For a clearer comparison, we plot the mean and standard deviation of the corrected images for the different exposure times in Fig. 10. It matches what we concluded from Fig. 9: the standard deviation and mean are similar across exposure times.

Figure 10: Mean and standard deviation of the corrected images for different exposure times.

ISO speed

We now fix the exposure time and analyze the influence of the ISO speed. The ISO speed has a great influence on the noise level and the total illumination. However, once we subtract the black level and normalize the data, the shape of the relative illumination curve is not affected, as shown in Fig. 11. We can therefore draw a conclusion similar to that of the exposure duration study: the ISO speed does not influence the shape of the relative illumination, so the same lens shading correction map can be shared.

Figure 11: Blue-channel data for different ISO speeds with the same exposure time, cut along the center row of the images. Left: unnormalized image data for different ISO speeds. Right: normalized data for different ISO speeds.

Cos4th

As a comparison to our polynomial models, we fit the cos4th model [2, 5] to our datasets and compare it with our poly44 model. Since we do not know the distance from the lens to the sensor, we treat it as a fitting parameter and use the fitting function

$$ A + B\left(\frac{C}{x^2 + y^2 + C}\right)^2, $$

where $A$, $B$, and $C$ are the fitting parameters; $C$ plays the role of the squared lens-to-sensor distance, since $\cos^4\theta = \left(d^2/(x^2+y^2+d^2)\right)^2$ for a pixel at distance $\sqrt{x^2+y^2}$ from the center. We compare the cos4th-corrected red channel image with the poly44 result. We can see that the standard deviation of the cos4th model is smaller than that of the poly44 model, and the corrected image indeed looks more uniform. This is reasonable because $\cos^4\theta$ can be expanded as a Taylor series containing polynomial terms of arbitrarily high order, so we expect that even higher-order polynomial fits would give better correction results.

Figure 12: Red channel comparison between the cos4th model and our poly44 model. The first and second rows are the correction maps and the corrected red channel images, respectively.
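A sketch of the cos4th fit using a custom fittype (Curve Fitting Toolbox; the start point is a rough guess of ours, and r is the black-level-subtracted red plane):

% Fit A + B*(C/(x^2 + y^2 + C))^2 to one color plane.
ft = fittype('A + B*(C/(x^2 + y^2 + C))^2', ...
             'independent', {'x', 'y'}, 'coefficients', {'A', 'B', 'C'});
sfCos4 = fit([X(:), Y(:)], double(r(:)), ft, 'StartPoint', [0, 1, 1]);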

Lens shading correction for real pictures

The teaching team provided DNG files taken with a Google Pixel 4a in different situations. We use the correction maps obtained from our poly22 and poly44 models to correct the lens shading artifacts. From the examples shown in Figs. 13 and 14, we observe that the corners become lighter. Poly22 causes more unwanted distortion, while poly44 behaves better. There is a slight color aberration when we apply the lens shading algorithm, which might be studied in a future color correction project.

Figure 13: Example 1. From left to right: uncorrected image; poly22 corrected image; poly44 corrected image.
Figure 14: Example 2. From left to right: uncorrected image; poly22 corrected image; poly44 corrected image.

Conclusions

In conclusion, we have explored different polynomial models for lens shading correction under different conditions. The results show that lens shading can be corrected better with higher-order polynomial functions. We conclude that, when performing raw image processing, one needs to separate the color channels and subtract the black level intensity before processing the data. For different exposure durations and ISO speeds, one can use the same lens shading correction map, provided the relative illumination and proper normalization are used: lens shading is caused by the intrinsic geometry of the lens, and because the optical system is linear, the effect should not depend on the illumination intensity. We also encountered some other issues during the data analysis. For example, we noticed some color distortion when performing the correction, which has also been reported in Ref. [8]. It would be interesting in the future to consider color calibration together with the lens shading correction algorithm.

Slides and Codes

Check the OneDrive link: https://office365stanford-my.sharepoint.com/:f:/g/personal/jiahuiw_stanford_edu/EuQqRyY9ZmZLqCuHfQ_84GEBggFib-E5s55BCpqunB4ZAw?e=t2Y67k

References

[1]. When to use vignetting in photographs. https://www.lightstalking.com/vignetting/

[2]. Psych221 class slides; Wikipedia, "Vignetting." https://en.wikipedia.org/wiki/Vignetting

[3]. Farrell, Joyce E., et al. "A simulation tool for evaluating digital camera image quality." Image Quality and System Performance. Vol. 5294. International Society for Optics and Photonics, 2003.

[4]. Farrell, Joyce E., Peter B. Catrysse, and Brian A. Wandell. "Digital camera simulation." Applied Optics 51.4 (2012): A80-A90.

[5]. Kerr, Douglas A. "Derivation of the Cosine Fourth Law for Falloff of Illuminance Across a Camera Image." (2007).

[6]. Chen, Chih-Wei, and Chiou-Shann Fuh. "Lens Shading Correction for Dirt Detection." Pattern Recognition, Machine Intelligence and Biometrics. Springer, Berlin, Heidelberg, 2011. 171-195.

[7]. Adobe Digital Negative (DNG) specification. https://wwwimages2.adobe.com/content/dam/acom/en/products/photoshop/pdfs/dng_spec_1.5.0.0.pdf

[8]. Silva, Varuna De, Viacheslav Chesnokov, and Daniel Larkin. "A novel adaptive shading correction algorithm for camera systems." Electronic Imaging 2016.18 (2016): 1-5.

[9]. For more information and figures, please check our slides and codes.

Acknowledgement

Jiahui Wang would like to thank the teaching team for providing the datasets and pointing out the importance of the black level intensity subtraction.