Hardware Project: Refining a Multispectral Imaging System

From Psych 221 Image Systems Engineering
Revision as of 04:51, 18 March 2014 by imported>Projects221 (Conclusions)

Team

Sean McIntyre, Johannes Giorgis, Matisse Milovich, Paul Villers

Introduction

Background

Multispectral imaging uses wavelength-specific filters to capture light at particular frequencies, providing information about the spectrum of light reflected off the object of interest. Different sets of wavelength-specific filters can reveal information about the spectrum that is not available to the human eye, either by increasing the resolution of the spectrum (using many narrowband filters) or by extending its range (using UV and infrared filters). Multispectral imaging has a number of applications. Obtaining high-granularity spectral information about the reflected color in a painting, for example, can distinguish between two colors that are metamers to the human eye, and can therefore indicate which pigments and dyes may have been used in different geographic regions or time periods to create apparently similar colors.
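The idea above can be illustrated numerically: a band's sensor response is the sum, over wavelengths, of the filter's transmittance times the incoming spectrum. The sketch below (illustrative code, not from the project) shows how two spectra that match under a broad filter can be separated by a narrowband one.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Discrete per-band sensor response: r = sum_i T(lambda_i) * S(lambda_i),
// where T is the filter's transmittance and S the spectrum reaching the lens.
// Names and sample values here are illustrative, not from the project code.
double bandResponse(const std::vector<double>& transmittance,
                    const std::vector<double>& spectrum) {
    double r = 0.0;
    for (std::size_t i = 0; i < transmittance.size() && i < spectrum.size(); ++i)
        r += transmittance[i] * spectrum[i];
    return r;
}
```

For example, the spectra {1, 0, 1} and {0, 2, 0} give the same response under the broad filter {1, 1, 1} (a metameric match), but different responses under the narrow filter {1, 0, 0}.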

Project Description

The goal of the project was to build a multispectral imaging system based on a rotating color filter wheel and a monochrome camera. The system will be used to capture images of paintings in the Cantor Arts Museum. A high-level schematic (below) shows the multispectral imaging process; our objective was to design a system that connects the lens to the filter wheel and the filter wheel to the camera, so that transmitted light is recorded by the sensor and an image file is saved on the computer. Finally, a processing step operates on the set of images before viewing and interpretation.

Project Pipeline

At the project outset, the components were neither physically nor electronically connected. The filter wheel housing could not connect to the camera or lens, because only one side of the housing had C-mount threading. The stock adaptor was too thick: it would have increased the lens-to-sensor distance and thereby reduced the focal range of the lens. Furthermore, the filter wheel and camera were disconnected and operated only manually, which ruled out one-click multispectral image acquisition.

Design Specifications

The project had two principal design components. The first was to use mechanical adaptors in the lens-filter-camera system to ensure that light entering through the lens is filtered before reaching the sensor. The second was to implement an automated photo-capture system via communication among a computer, an Arduino, and the camera.
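The one-click acquisition described above reduces to a fixed sequence: for each filter slot, rotate the wheel, wait for it to settle, then trigger the camera. The sketch below models that sequence as a pure list of commands so the ordering can be checked without hardware; `wheel`, `settle`, and `capture` are stand-ins for the real filter-wheel and camera calls, which are not shown in this report.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Build the command sequence for one multispectral acquisition pass.
// Modeled as data rather than hardware calls so the ordering is testable;
// the command names are illustrative placeholders.
std::vector<std::string> acquisitionPlan(int numFilters) {
    std::vector<std::string> plan;
    for (int k = 0; k < numFilters; ++k) {
        plan.push_back("wheel:" + std::to_string(k));   // rotate wheel to slot k
        plan.push_back("settle");                       // wait for wheel to stop
        plan.push_back("capture:" + std::to_string(k)); // trigger monochrome camera
    }
    return plan;
}
```

In the real system, each command maps to an Arduino pulse (for the wheel) or a GPIO/SDK trigger (for the camera), and the saved frames are indexed by filter slot for the later processing step.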

Adaptors

Automated photo-capture

Components

  1. Point Grey monochrome camera with C-mount threading, USB port, and GPIO port
  2. Wavelength-specific filters, each with C-mount threading
  3. Edmund Optics Intelligent Filter Wheel (FW) with control box and housing
  4. Arduino Mega 2560 board
  5. PC with FlyCap software and SDK

Methods

System Overview

Design

Results

Conclusions

We met the project objective and delivered an improved system: the physical components now mate successfully, and automated photo capture works. For the purpose of imaging full art pieces, however, the distance between the camera sensor and the lens is still too large. Our proposed solution is to use a different camera-lens pair designed for a larger lens-to-sensor distance.


Engineering Challenges

The most challenging and time-consuming portions of the project were as follows:

  • Designing the C-Mount threads from scratch with limited information about the thread profile. We carefully researched C-Mount thread specifications and designed within the stated tolerances. Nevertheless, after testing our first iteration of 3D printed parts we needed to fine-tune the thread profile to reduce friction with the lens and camera threads.
  • Several full days were spent trying to use the SDK files from Point Grey. We eventually found that the files require a highly specific compiler and library set; the user manual gives very little guidance on this.
  • We were unable to establish serial communication with the filter wheel, possibly because of the version of filter wheel we have. If it is possible, it is certainly not as easy as it is made out to be in the documentation. We worked around this issue by using the filter wheel’s other communication protocol, the “Santa Barbara” protocol, which uses pulses of varying widths to set filter positions. This communication system is one-way; it does not allow us to receive signals from the filter wheel. To compensate, we used timed delays of appropriate length to coordinate the camera with the filter wheel.
  • Communication and timing between the filter wheel and camera were an ongoing challenge. The more the system is automated, the more it depends on every connection working seamlessly. The most critical step is correct initialization of the camera; a full understanding of the camera system is not possible from its technical reference document alone and requires contacting the manufacturer.
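The pulse-width workaround can be sketched as follows. A single pulse whose width encodes the slot number selects a filter, and because the wheel sends nothing back, the controller waits a fixed worst-case settle time before triggering the camera. The timing constants below are illustrative placeholders, not the wheel's documented values.

```cpp
#include <cassert>

// Illustrative timing constants (assumed, not from the wheel's manual).
const unsigned long kBasePulseUs = 300;  // pulse width for slot 0
const unsigned long kStepUs      = 300;  // extra width per slot
const unsigned long kSettleMs    = 1500; // worst-case wheel travel time

// Width of the pulse that selects a given filter slot.
unsigned long pulseWidthFor(int slot) {
    return kBasePulseUs + static_cast<unsigned long>(slot) * kStepUs;
}

// In the Arduino sketch this would become something like:
//   digitalWrite(pin, HIGH);
//   delayMicroseconds(pulseWidthFor(slot));
//   digitalWrite(pin, LOW);
//   delay(kSettleMs);   // one-way protocol: wait, then trigger the camera
```

The key design consequence of the one-way protocol is visible here: correctness depends on `kSettleMs` covering the longest possible wheel move, since there is no acknowledgment to wait on.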


With more time, we would calibrate the system using a Macbeth ColorChecker to obtain the light intensity per waveband. Our system may be used in the near future to capture data from artwork at Stanford's Cantor Arts Museum, as well as from fluorescent scenes.

References

Appendix I - Code and Data

Appendix II - Work Partition

  • Design and finalization of 3D printed adaptors - Matisse and Paul
  • Design and implementation of Arduino control - Sean and Matisse
  • Finalization of circuit - Sean
  • C++ program to calibrate camera and save images - Paul and Johnny
  • Testing of entire system - All

We would like to thank Henryk for pushing us to make the system as automated as possible, for guidance, and for use of his lab space.