Smartphone Camera Quality

From Psych 221 Image Systems Engineering

Revision as of 07:42, 15 December 2017

Introduction

With the increasing quality of smartphone cameras and software, everyone from professional photographers to amateurs taking selfies now uses a smartphone as the primary device for capturing images. In recent years, consumer desire to take and share high-quality photos with a device as portable as a smartphone has exploded, as evidenced by the popularity of image capture and sharing applications like Snapchat and Instagram and the ever-increasing need for photo storage through services like Apple iCloud and Google Photos. Following this trend, there is increasing interest in quantifying and comparing the image quality of the smartphones that leave the manufacturing plant. The DxOMark rankings are treated as a reliable metric for comparison, but the complexity of the tests required to generate these rankings limits the sample to only a few units of each phone model.

For our project, we would like to assess the unit-to-unit variation in image quality among smartphones in the real world. In particular, we are curious whether there are variations based on smartphone manufacturer, model, price, firmware/OS version, length of ownership/use, and/or physical condition. Because metrics tied to the physical properties of the camera seem most likely to vary from unit to unit, we expect that chromatic aberration, distortion, and/or color metrics might show some variation in our preliminary results. Smartphones now approach $1000, with almost all recent improvements devoted to the camera, so we believe this is an important and technologically relevant area to explore for our final project.
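As an illustration of how one such per-unit metric could be measured, lateral chromatic aberration shows up as a shift between the red- and blue-channel positions of a sharp edge. The sketch below locates the edge in each channel of a 1-D scanline; the pixel values are made up for illustration and this is only a crude proxy, not the measurement procedure we used.

```python
def edge_position(scanline):
    """Index of the largest adjacent-pixel step in a 1-D channel
    profile across a dark-to-light edge."""
    diffs = [abs(scanline[i + 1] - scanline[i]) for i in range(len(scanline) - 1)]
    return diffs.index(max(diffs))

def lateral_ca_proxy(red, blue):
    """Pixel offset between the red- and blue-channel edge positions;
    a consistently nonzero value hints at lateral chromatic aberration."""
    return edge_position(red) - edge_position(blue)

# Hypothetical scanlines across the same edge in two color channels:
red  = [10, 10, 12, 200, 250, 250]
blue = [10, 10, 11, 14, 210, 250]
offset = lateral_ca_proxy(red, blue)  # blue edge lands one pixel later
```

Comparing such offsets across many units of the same phone model would reveal whether the lens-to-sensor alignment varies in manufacturing.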

Background

Smartphone camera quality is an emerging area of interest for researchers and consumers. There are two primary standards for smartphone camera quality evaluation: the DxOMark ratings [1] and the IEEE Standard [2].

DxOMark

DxOMark is a company that performs camera and lens image quality assessments and publishes ratings for consumers. In addition to digital camera sensors and lenses, DxOMark reviews mobile phone cameras and ranks them based on a variety of measurements. On a smartphone, they analyze the performance of the imaging pipeline in its entirety, including the lens, sensor, camera control, and image processing. Their protocol combines lab testing with perceptual evaluation of images taken both in the lab and in a variety of indoor and outdoor scenes. DxOMark reports sub-scores in several categories in addition to an overall score, which is used to rank the smartphone cameras. DxOMark reviews both photos and videos captured on the smartphones. For photos, their evaluation metrics include:

  • Exposure and contrast, including dynamic range, exposure repeatability, and contrast
  • Color, including saturation and hue, white balance, white balance repeatability, and color shading
  • Texture and noise
  • Autofocus, including AF speed and repeatability
  • Artifacts, including softness in the frame, distortion, vignetting, chromatic aberrations, ringing, flare, ghosting, aliasing, moiré patterns, and more
  • Flash
  • Zoom at several subject distances
  • Bokeh
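To make one of these metrics concrete, white-balance repeatability asks whether repeat shots of the same neutral patch yield the same channel ratios. The sketch below is our own illustration of the idea, not DxOMark's protocol; the patch values are hypothetical.

```python
from statistics import stdev

def wb_repeatability(gray_patch_means):
    """Spread of the R/G and B/G ratios of a neutral gray patch
    across repeat shots; low spread means the camera's white
    balance is repeatable from shot to shot."""
    r_over_g = [r / g for r, g, b in gray_patch_means]
    b_over_g = [b / g for r, g, b in gray_patch_means]
    return stdev(r_over_g), stdev(b_over_g)

# Per-shot (R, G, B) means of the same gray patch (hypothetical values):
shots = [(118.0, 120.0, 122.0), (119.0, 120.5, 121.0), (117.5, 119.5, 122.5)]
rg_spread, bg_spread = wb_repeatability(shots)
```

The same pattern, a statistic computed over repeat captures, underlies the exposure- and autofocus-repeatability metrics as well.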

For videos, their evaluation metrics include:

  • Exposure
  • Color
  • Texture and noise
  • Autofocus
  • Artifacts
  • Stabilization

DxOMark is widely seen as the industry standard, and its rankings are referenced in the popular press and industry publications, including Forbes, The Verge, Wired, and TechRadar.

IEEE Standard

IEEE Std 1858-2016: Camera Phone Image Quality provides a detailed specification of test conditions and apparatus for evaluating smartphone image quality. The standard includes protocols for lab-based assessments as well as subjective perceptual evaluations. The evaluation metrics considered include spatial frequency response, lateral chromatic displacement, chroma level, color uniformity, local geometric distortion, visual noise, and texture blur.
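Two of these metrics can be approximated very simply: visual noise relates to the pixel spread in a nominally uniform patch, and color uniformity to how much the frame's corners deviate from its center. The sketch below uses crude pure-Python proxies for illustration only; the standard's actual metrics are perceptually weighted and measured under controlled conditions.

```python
from statistics import pstdev

def visual_noise_proxy(patch):
    """Standard deviation of pixel values in a nominally uniform
    patch, a simple stand-in for the standard's visual-noise metric."""
    return pstdev(patch)

def uniformity_proxy(corner_means, center_mean):
    """Largest deviation of the corner patch means from the center
    patch mean, a rough proxy for shading across the frame."""
    return max(abs(c - center_mean) for c in corner_means)

# Hypothetical pixel values from a flat gray target:
patch = [128, 130, 127, 129, 131, 128, 126, 130, 129]
noise = visual_noise_proxy(patch)
shading = uniformity_proxy([120.0, 118.5, 121.0, 119.0], 125.0)
```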


Methods

Data Collection

For our project, we recruited participants to capture images with their smartphones on Stanford’s campus. Our objective was to have a broad sample of smartphone cameras, so we collected data at several different locations on campus over a period of two weeks. Our data collection locations included the Stanford Bookstore, the Graduate School of Business, the SCIEN Image Systems Engineering Seminar, the Psychology 221 class, and the SCIEN Affiliates Meeting.

Collecting data at the Stanford Bookstore

Our objective was for the data collection procedure to be controlled and repeatable. We created a series of three image targets and had participants take multiple photos of each. To provide a consistent illuminant, we borrowed a light booth, brought it to each data collection location, and used the daylight illuminant for every image captured. We covered the front of the light booth with a piece of cardboard and cut a small opening to point the phone camera through, minimizing stray light entering the booth. A piece of Styrofoam mounted on the cardboard front cover served as a stand for the phones while participants were taking photos. Although different phones place the camera at different locations, the stand ensured that all phones were held in the same horizontal alignment during image capture.

We had participants capture three consecutive photos of each of three different image targets. All photos were taken with the phone in landscape orientation, using the following settings:

  • Photo mode
  • Flash, live, and filters off
  • Zoom out
  • Highest resolution
  • No image compression
  • All other settings (HDR, etc.) to auto
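Since every shot must satisfy these settings, deviations can be flagged with a simple validator. The sketch below uses hypothetical setting labels rather than real EXIF keys; in practice one could also cross-check the captured files' metadata.

```python
# Capture-protocol settings every shot must satisfy
# (the keys are hypothetical labels, not real EXIF fields).
REQUIRED = {
    "mode": "photo",
    "flash": "off",
    "live": "off",
    "filters": "off",
    "zoom": "out",
}

def protocol_violations(reported):
    """Return the settings that deviate from the capture protocol;
    an empty list means the shot is usable."""
    return [k for k in REQUIRED if reported.get(k) != REQUIRED[k]]

good = {"mode": "photo", "flash": "off", "live": "off", "filters": "off", "zoom": "out"}
bad = dict(good, flash="on")
```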

The image targets were mounted on pieces of cardboard cut to match the dimensions of the rear panel of the light booth. For each round of photos with a given image target, we removed the light booth’s lid, placed the cardboard (with the image target on it) flush against the rear panel, and fastened it to the upper edge of the rear panel with binder clips. We then replaced the lid and had the participant take three photos. After these three photos were taken, we removed the lid and swapped in another panel with a different image target.

Results

Conclusions

References

Appendix I

Appendix II