App for Programmable Camera in Android
Group Members: Kaitlyn Benitez-Strine, Ronnie Instrella, Joe Maguire
Back to Psych 221 Projects 2014
Introduction

The basic description of our project is as follows:
We have a prototype programmable camera to be used with iOS or Android devices. The project's goal is to make an app that will run on iOS or Android and use the camera. Think of an interesting camera app, and we can work together to build it. Prior experience in iOS or Android is needed.
Our group's task was to use the power of a programmable camera to make seemingly difficult photos much easier to produce. We began by brainstorming types of photos that are fairly easy to take if you know how to manipulate your camera a little, but difficult enough that the average person would not automatically know how to reproduce them. Today the average digital camera can produce various striking effects, provided one understands how to navigate its numerous modes. However, most people don't understand the concept behind the shot and thus stick to the simple "point and shoot" automatic mode.
We wanted to home in on one single "cool" effect, and after turning to the internet for inspiration, we found our project: slow shutter speed photography. Inspired by the picture to the right, our group set out to make slow shutter speed photography an easy experience through the phone.
What Has Been Done in the Past
Our group investigated prior iPhone and Android applications to see what options a typical slow shutter speed photography app offers its users. We picked two iPhone apps, one paid and one free (Slow Shutter Cam and LongExpo, respectively), and analyzed how an average person would approach taking a photo with each. We had our subject try to match a photo we had taken with the app while performing a think-aloud protocol.
Slow Shutter Cam

Slow Shutter Cam has been featured by Apple in "App Store Essentials: Camera & Photography" and "Extraordinary Photo Apps", was nominated for the 2010 and 2011 "Best App Ever Award - Best Photo App", and was recently mentioned by the NY Times; it is arguably the best slow shutter speed camera app on the market right now for iOS [1].
For $0.99 one can take three types of slow shutter speed photos.
- Motion Blur: Essentially a shutter priority mode, it captures images over an extended period of time for lovely blurring effects (perfect for ghost images, waterfalls, and adding a sense of movement to photos).
- Low Light: Used under low light conditions so that the camera sensor picks up every last photon of light, this mode allows the user to capture people and moving objects in dim settings.
- Light Trail: Used to capture moving light (such as fireworks or cars at night), this mode allows the user to record the movement of a light source.
In addition to these modes, one has the option of tapping to adjust focus/exposure, locking the exposure/focus from shot to shot, seeing a live preview of the captured image, and much more while taking photos. Afterwards, the user can compensate for exposure and adjust brightness, contrast, saturation, and hue.
All of this is wonderful for a photographer wishing to take nearly Digital Single Lens Reflex (DSLR) quality shots with his or her iPhone camera, but we wondered what the average person would make of the features.
Review of Slow Shutter Cam
Never having been exposed to the app before, our subject was confused by a couple of things. First, she did not know what the AF/AE locks at the top of the screen were, and even after much experimentation she never discovered their purpose (locking exposure and focus across multiple shots). It also wasn't intuitive where to find the app's major modes (reached by pressing the shutter-shaped icon in the lower left of the screen). Once she discovered the button (after having exhausted all other options), she didn't know which option would be best for her task: taking a photo of a waterfall in a shady part of the Stanford Shopping Center on a March afternoon. Furthermore, after experimenting with the modes she was never entirely certain what terms like "exposure", "exposure boost", or "light sensitivity" meant.
As a result, we intend to make the "difficult" terms easier to understand and the icons for changing modes easier to find for the average non-photographer. This could be done with better symbols and pictures, plus short blurbs explaining the effect each option has on the captured image.
iPhone Application: LongExpo

LongExpo hasn't won as many awards as Slow Shutter Cam, but it is a great free app for long exposure and light trail photos. Its current rating for the latest update is 4.5 stars [2].
LongExpo also has three modes:
- Standard mode: Used for the typical blurring effect, this mode is for regular long exposure photography.
- Low Light Mode: With an adjustable exposure bar for low light conditions, this mode allows the user to take photos of moving objects and people in darker settings.
- Light Trail Mode: Focused on recording moving light sources, this mode captures light trails, and light stream photography.
Beyond these modes, one can edit the photo in numerous ways after taking the shot. The user can adjust brightness, contrast, saturation, and where the photo freezes, and can add filters (similar to Instagram's), frames, stickers, focus effects, drawings, and meme lettering, among other effects.
Again we had an average person who had never seen the app test it, attempting to match a photo we had taken previously.
Review of Long Expo
The user had difficulty finding the back/delete buttons when she wanted to take a different picture. It was also unintuitive how to switch shutter speed from the start. After tapping buttons at random, she eventually discovered the controls she had been looking for, but it took some time. Even then, among the variable shutter speeds she found a "B" (bulb) setting, which keeps the shutter open for as long as you want to take the picture, but its meaning was not obvious at first.
She also never found out that there were several modes (the three listed above); she just stuck with the default and never discovered the others.
All of this made it clear that the buttons for changing modes and going back should be clear and explicit as soon as the app opens. Labeling what the various adjusters do would also help.
What We Intend to Do
The best part about our app is that a DSLR camera takes the photos for our users, which means the user is free to move about while still getting high resolution photos. So although we intend to offer modes similar to the apps above, we have a couple of advantages that make for a better shot:
- Stability: The programmable camera can be placed on a table or tripod, so the user does not ruin the shot by pressing the camera button on the phone and shifting its position as the picture begins; even a slight change in position would blur the image.
- Movability: Because the programmable camera can be placed anywhere, the user does not have to stand where the camera is; the user can see what the camera sees and control it remotely. Thus, the user can be in the photo without racing a self timer.
- Fine Tuned Control: The DSLR has more capability than a phone's camera, allowing for a better shot - for instance, finer-grained choices for shutter speed, ISO speed, and aperture.
With these benefits, our group intends to create an app with three modes:
- Motion Blurring: Great for capturing movement either by blurring the moving objects/ water, or for blurring the background by panning means.
- Light Trail: Perfect for capturing the movement of a light source such as fireworks or taillights, but also of one's own phone screen. As soon as the user hits the camera button, the screen can turn a color that the camera's sensor picks up, and the user can trace out letters and shapes with the phone.
- Painting With Light: In this technique the user stands behind the camera and lights up an object in front of it with a flashlight or laser. The light becomes a sort of paint brush onto the night scene.
And along with these modes the user will be able to adjust the brightness, hue, and contrast after the picture has been taken.
Background
Methods
Camera Parameters
To accomplish the defined tasks, it is important to have control over the camera's settings: shutter speed, exposure compensation, f-stop, and ISO. Motion blurring occurs at nominal levels of illumination, so we must minimize the amount of light acquired by the CCD or the image will appear washed out. To do this we automatically reduce the ISO and exposure compensation while increasing the camera's f-stop. For light trail and painting with light, the background illumination is very low, so we can keep nominal ISO, exposure compensation, and f-stop while increasing the exposure time. This works because the light source is orders of magnitude brighter than the background, so it always leaves a distinct trail when the photo is integrated over many frames.
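As a concrete sketch of this parameter policy, consider the following. This is illustrative only; the numeric values are hypothetical placeholders, not the settings our app actually sends to the camera.

```python
def capture_settings(mode):
    """Return hypothetical capture settings for a shooting mode.

    Illustrative sketch, not the app's actual code: bright-scene modes
    trade ISO and exposure compensation down and f-stop up; dark-scene
    modes keep nominal values and extend the exposure instead.
    """
    if mode == "motion_blur":
        # Daylight: a long shutter would wash out the sensor, so drop ISO,
        # pull exposure compensation down, and stop the aperture down.
        return {"iso": 100, "ev": -2.0, "f_stop": 22, "shutter_s": 2.0}
    elif mode in ("light_trail", "light_painting"):
        # Dark scene: keep nominal ISO/EV/aperture and extend the exposure;
        # the light source is far brighter than the background, so it still
        # leaves a distinct trail.
        return {"iso": 400, "ev": 0.0, "f_stop": 5.6, "shutter_s": 15.0}
    raise ValueError("unknown mode: %s" % mode)
```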
Image Thresholding
Results
Ultimately the underwater color rig succeeded in providing convincing estimates of underwater illumination. A summary of intermediate and final results, along with overviews of the data processing techniques used, is given below.
Macbeth/Xrite Color Checker Reflectances
The Macbeth Color Checker patch reflectances are a part of the ISET suite and were not re-measured for the purposes of this experiment. A chart of the reflectances is shown below:

The reflectances easily cover the spectral extent of the human visual system and beyond into the infrared, making a useful set of spectrally independent color estimation targets. The Xrite color target used in the PVC color rigs is produced by a different manufacturer than the Macbeth target characterized by the ISET suite. While the color patches used by Xrite are assumed to be identical to Macbeth's, verifying this assumption would strengthen the results presented here.
Canon SX260HS Responsivities
The laboratory configuration described above was used to collect a series of 81 images of linearly independent spectra with the SX260HS with fixed color balance. An example of one of the images is shown below, with a red illuminant. The camera's exposure, ISO, and aperture were set manually to avoid saturation in any of the camera's RGB color channels.

A section of the illumination target within the images was extracted and compared to the dark portions of the image to render an estimate of the average RGB response of the camera to the illuminant. A graph of the RGB responses measured by the SX260HS is shown below. The x-axis is the wavelength setting of the Cornerstone 130 monochromatic light source.

The RGB responses alone are not sufficient to estimate the camera's responsivities; the intensity and spectral bandwidth of the illumination target must also be known. The intensity and bandwidth of the same illumination targets as measured by the PR-650 colorimeter are shown below. Note the monochromatic light source has both a finite bandwidth and variable intensity.

Combining the SX260HS RGB response and the measured PR-650 colorimeter spectra, the Canon's overall spectral response can be estimated. The linear system of digital color photography was rearranged to describe camera sensitivities as a function of RGB pixel response and illuminant spectra and solved using a non-negative least squares solver in MATLAB's optimization toolbox, lsqnonneg. A graph of the individual color filter responsivities is shown below.

The SX260HS sensor response is significant between 405nm and 660nm. This represents the limit of the spectral band the color rig can estimate. The PR-650 colorimeter has a 5nm resolution, resulting in 51 bands within the sensor range.
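The responsivity solve can be sketched as follows. This is an illustrative Python translation of the MATLAB lsqnonneg step, using SciPy's equivalent nnls routine; the matrix shapes match the 81 measurements and 51 spectral bands described above, and the function name is ours.

```python
import numpy as np
from scipy.optimize import nnls


def estimate_responsivities(E, rgb):
    """Estimate non-negative camera color filter responsivities S from
    the linear model rgb = E @ S, one color channel at a time.

    E:   (n_measurements, n_bands) illuminant spectra from the PR-650
    rgb: (n_measurements, 3) camera RGB responses
    Returns S with shape (n_bands, 3).
    """
    n_bands = E.shape[1]
    S = np.zeros((n_bands, rgb.shape[1]))
    for ch in range(rgb.shape[1]):
        # nnls solves min ||E s - rgb[:, ch]|| subject to s >= 0,
        # mirroring MATLAB's lsqnonneg.
        S[:, ch], _residual = nnls(E, rgb[:, ch])
    return S
```

With 81 measurements and 51 bands each per-channel system is overdetermined, so the non-negativity constraint mainly guards against noise-driven negative lobes in the recovered filters.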
Underwater Color Rig Data
The color rig's Canon SX260HS camera was programmed using the CHDK firmware to snap photos of the onboard Xrite color checker at an interval of approximately 2 seconds, limited by the camera's onboard processor. A single descent from the surface to 90 feet and back produces between 100 and 150 JPEG photos. On some of the dives, the camera was switched to HD video mode to collect higher rate data. The video recording subsystem on the SX260HS includes an automatic exposure control making the recorded videos unsuitable for seawater spectral absorption analysis, but they do provide an excellent qualitative overview of the selective absorption effect with depth. Two videos from the final configuration of the color rig are posted online at YouTube:
85ft Dive Number 1 (with Salp capture on return!)
85ft Dive Number 2 (with plastic bottle for fun)
Extracting Color Patch RGB Information
The PVC color rigs maintain a consistent field of view as the rig descends, so a map of the color patches within the Xrite color target was defined as shown below. Each of the photos taken by the rig results in a set of RGB values for each of the 24 color patches, or 72 data points per photo.
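That extraction step might look like the following minimal sketch, assuming rectangular patch regions; the function name and box format are ours, not the project's code.

```python
import numpy as np


def extract_patch_rgb(image, patch_boxes):
    """Average the RGB values inside each patch rectangle.

    image:       (H, W, 3) array of pixel values
    patch_boxes: list of (row0, row1, col0, col1) half-open rectangles,
                 one per color patch
    Returns an (n_patches, 3) array of mean RGB values.
    """
    return np.array([image[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
                     for (r0, r1, c0, c1) in patch_boxes])
```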

As the rig descends the patches take on a green tint and the red patches especially lose their intensity and appear a muted gray color. A rendering of the color checker values at different depths is shown below.


Light is quickly attenuated by the water column as the rig descends. A plot of RGB color values throughout a dive is shown below to illustrate the effect. The RGB values correspond to the white color patch on the bottom left of the color target.

Pixel Saturation
The SX260HS aperture, exposure, ISO, and color balance settings were manually fixed at the surface using the camera's onboard light meter as a reference. Unfortunately, the light meter allows individual color channels to saturate within a pixel. As a result many of the brighter patches within the Xrite color target result in saturated pixel readings, a value of 255 in this case. Since saturation is a nonlinear effect it must be avoided. A histogram of the pixel intensities recorded throughout a dive reveals many of the color patches saturate at least one color channel.

During the RGB pixel extraction process, color patches that experienced saturation were flagged as unusable for illuminant estimation. Of the 24 patches on the Xrite color target, only 7 were usable. A map of the usable color patches is shown below.
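A sketch of the flagging rule, assuming the extracted patch values for a whole dive are stacked into one array (the function name and layout are ours):

```python
import numpy as np


def usable_patches(patch_rgb, sat_value=255):
    """Mark a patch unusable if any of its channels reaches the
    saturation value in any photo of the dive.

    patch_rgb: (n_photos, n_patches, 3) array of extracted RGB values
    Returns a boolean mask of length n_patches (True = usable).
    """
    # A patch is discarded if saturation appears anywhere in the dive,
    # since saturation is a nonlinear effect that corrupts the estimate.
    return ~(patch_rgb >= sat_value).any(axis=(0, 2))
```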

Underwater Spectra
With known camera color responsivities, known color target reflectances, and a set of extracted RGB pixel data the underwater scene illuminant can be estimated. Rearranging the linear model for color digital photography, we get the following system for each camera photo:

The color patch reflectances in the model have been merged via element-by-element multiplication to form a single combined spectral response for each color patch. There are a total of 7 usable color targets within each image and 3 pixel color values per target, giving 21 linear equations for estimating the illuminant. There are 51 spectral bands in the responsivity matrix of the color filter array and consequently 51 unknown values in the illuminant estimate, so the system is underdetermined. To solve it, an additional smoothness constraint was applied by developing the following optimization objective function, inspired by the multispectral estimation techniques of Park et al. [4].

The system was solved via non-negative least squares optimization in MATLAB using the lsqnonneg function. Since natural illuminants tend to be smooth and are guaranteed to be non-negative this is a reasonable constraint. An estimated, normalized spectrum from a depth of 65 feet is shown below.
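One common way to implement this kind of smoothness-regularized non-negative fit (a sketch of the approach in Python, not the project's exact MATLAB code) is to stack a scaled second-difference operator beneath the system matrix and hand the augmented problem to a non-negative least squares solver:

```python
import numpy as np
from scipy.optimize import nnls


def estimate_illuminant(A, b, lam=1.0):
    """Solve min ||A e - b||^2 + lam * ||D e||^2 subject to e >= 0,
    where D is a second-difference operator penalizing rough spectra.

    A:   (n_equations, n_bands) combined reflectance-times-responsivity
         matrix (21 x 51 in the setup described above)
    b:   (n_equations,) extracted RGB pixel data
    lam: smoothness weight (hypothetical default)
    """
    n = A.shape[1]
    D = np.diff(np.eye(n), n=2, axis=0)          # (n-2, n) second differences
    # Augmenting the system with sqrt(lam) * D and zeros is algebraically
    # equivalent to adding the lam * ||D e||^2 penalty to the objective.
    A_aug = np.vstack([A, np.sqrt(lam) * D])
    b_aug = np.concatenate([b, np.zeros(n - 2)])
    e, _residual = nnls(A_aug, b_aug)            # SciPy analog of lsqnonneg
    return e
```

The stacking trick turns the regularized objective into a plain least squares problem, which is why a single call to the non-negative solver suffices.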

By sorting the estimated spectra by depth an intensity map of illuminants can be created, as shown below. Depth in feet is shown along the y-axis and wavelength is shown in nanometers along the x-axis. Illuminant intensity is shown in log scale. Note the dramatic attenuation of light with depth and the spectral peak near 532nm.
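Assembling the map is straightforward once the spectra are estimated; a minimal sketch follows (function and variable names are ours, and the epsilon guard is an assumption to keep the log finite):

```python
import numpy as np


def illuminant_depth_map(spectra, depths):
    """Sort estimated illuminant spectra by depth and stack them into a
    2-D intensity map (rows = depth, cols = wavelength) in log scale.

    spectra: (n_photos, n_bands) estimated illuminant spectra
    depths:  (n_photos,) recorded depths in feet
    """
    order = np.argsort(depths)                 # shallowest first
    stacked = np.asarray(spectra)[order]
    return np.log10(stacked + 1e-6)            # epsilon avoids log(0)
```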

Conclusions and Future Work
The proof-of-concept experiment reveals future improvements and experiments for the system: The color target chosen must be analyzed explicitly to confirm the color patch spectra match those in the ISET toolbox. The acrylic technique used to shield the color card must be refined to suppress reflections. A higher resolution setting on the PR-650 colorimeter should be used to refine the color filter array characterization of the digital camera. A shorter exposure setting should be used during data-collection dives to prevent pixel saturation. Finally, more illuminant data should be collected at both the Hopkins Marine Station site and other locations during different seasons to determine the geographic and temporal diversity of the seawater absorption characteristic.
Despite these design challenges, the underwater color rig successfully characterized the changing underwater illumination with depth using relatively inexpensive equipment. By carefully selecting and characterizing both the digital camera and the color target, a sufficient basis was established to generate a true multispectral estimate of the underwater illuminant spectrum. The techniques described in this report can be used to process captured images and estimate illumination at depth in a variety of applications, from digital photography color balancing to monitoring marine plankton populations or estimating the health of a coral population. Hopefully the design of this inexpensive system will enable further underwater multispectral exploration.
References - Resources and related work
References
[1] "Slow Shutter Cam on the App Store on iTunes." itunes.apple.com. Apple Online Store, 18 Dec 2013. Web. 15 Mar 2014. <https://itunes.apple.com/us/app/slow-shutter-cam/id357404131?mt=8>.
[2] "LongExpo - slow shutter and long exposure camera on the App Store on iTunes." itunes.apple.com. Apple Online Store, 09 Jan 2014. Web. 15 Mar 2014. <https://itunes.apple.com/us/app/longexpo-slow-shutter-long/id594078421?mt=8>.
[3] Brian A. Wandell, Foundations of Vision, Chapter 9. https://www.stanford.edu/group/vista/cgi-bin/FOV/chapter-9-color/#Linear_Models
[4] Jong-Il Park, Moon-Hyun Lee, Michael D. Grossberg, and Shree K. Nayar. "Multispectral Imaging Using Multiplexed Illumination." IEEE 11th International Conference on Computer Vision (ICCV), 2007.
Software
Image Systems Engineering Toolbox http://imageval.com/
Canon Hack Development Kit http://chdk.wikia.com/wiki/CHDK
Appendix I - Code and Data
In the belief that the techniques used may be illustrated best by example, the MATLAB code used to perform the multispectral analysis and calibration is available below along with sample data from the project.
Code
All code was written in MATLAB R2012a for Mac OS X Mountain Lion. External dependencies include the MATLAB Image Processing Toolbox, the MATLAB Optimization Toolbox, and the ISET toolbox.
Data
A total of 2.5GB of image data were collected for this project, and are available upon request. The extracted RGB color patch data, colorimeter responses, and SX260HS responsivity data are available below.
File:CanonSX260HSResponsivity.zip
Presentation
This project was given as a 5-minute presentation to the PSYCH221 Winter 2013 class at Stanford. The presentation files used are linked below.
5min Powerpoint Presentation File