Revision as of 19:22, 20 March 2013
Background
Original photon calculator utility suggestion
Build a program, perhaps based on the ISET library, that calculates the spectral irradiance at the sensor from the scene radiance and a specification of the optics. Doing this for diffraction-limited optics, specifying only the f/#, is sufficient.
The utility should be backed by a wiki page that illustrates all of the steps in doing that calculation. This project should produce an educational and useful calculator.
- Doing an implementation that can run on a browser on the Internet is best.
- Doing a straight Matlab implementation with a nice GUI is also good.
- Implementing the ISET (Matlab) routines as a Python calculator has value, as well.
Submitted project background
The program will (hopefully) run in a browser. It will accept lens transmission as a function of wavelength (or let the user choose from lens presets, such as the human eye, a glass thin lens, and camera lenses, though I can only find data for a Nikon 55mm as of now), along with an f/# and lens magnification/distances.
The calculation will also take into account cosine-fourth fall-off and irradiance blurring from diffraction. The program will get scene radiance from the two sample image scenes in ISET; if possible, I would like to make more scene radiance files (though that sounds like it requires special equipment) and more lens transmission data (which also requires special equipment, such as a spectroradiometer).
An article with graphics describing the calculations will be made.
Evolving project goals
An early draft of the project was completed and brought to Professor Wandell for suggestions. At that point, given a JPEG image, the project could extract spectral data; after the user selected an area, it could display radiance as a graph, and after a lens and aperture were selected, it could display irradiance as a graph.
I was also interested in asking Professor Wandell how ISET/VSET produces scene radiance from simple JPEG files. At that stage, the project used VSET to calculate scene radiance from a JPEG and an illuminant, and would then continue on to calculate the irradiance incident on the sensor using the irradiance formula (for a distant scene, with lens transmission T)

E(λ) = π · T(λ) · L(λ) / (4 · (f/#)²)

and furthermore apply the cosine-fourth fall-off / vignetting correction

E'(λ) = E(λ) · cos⁴(θ),

where θ is the angle between the image point and the optical axis. The plan was also to work on an algorithm for spatial blurring of the irradiance image (assuming diffraction-limited operation, so the blur is set by the Airy pattern of the circular aperture), and then to implement the VSET radiance calculations purely in Java, so the web application would be fully self-sufficient (i.e. not need VSET).
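The radiance-to-irradiance step and the cosine-fourth fall-off can be sketched numerically. Since the original suggestion noted that a Python version of the ISET routines would also have value, here is a minimal Python sketch; the function and parameter names are my own (not VSET's), and it assumes a distant scene with a scalar transmission:

```python
import math

def sensor_irradiance(radiance, f_number, transmission=1.0, off_axis_deg=0.0):
    """Irradiance at the sensor from scene radiance.

    Applies the thin-lens, distant-scene approximation
        E = pi * T * L / (4 * (f/#)^2)
    followed by the cosine-fourth off-axis fall-off
        E' = E * cos(theta)^4
    Units follow the inputs (e.g. W/sr/m^2/nm in, W/m^2/nm out).
    """
    theta = math.radians(off_axis_deg)
    on_axis = math.pi * transmission * radiance / (4.0 * f_number ** 2)
    return on_axis * math.cos(theta) ** 4

# On-axis: L = 100, f/4 -> E = pi * 100 / 64, about 4.91
print(sensor_irradiance(100.0, 4.0))
# 20 degrees off-axis keeps roughly 78% of the on-axis irradiance
print(sensor_irradiance(100.0, 4.0, off_axis_deg=20.0))
```

The full calculation also involves a magnification-dependent term for near scenes (the proposal mentions lens magnification/distances); the sketch above drops it by assuming a distant scene.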
Professor Wandell listened patiently and then asked how the application was structured: a layer of Javascript/HTML/CSS presented to the web browser, Java on the backend server, and VSET/Matlab running alongside Java (a part I was getting ready to excise in order to finish the project). Professor Wandell suggested a different approach, however: it would be a much more interesting teaching tool to keep the VSET backend and let users run different tutorials beyond just a photon calculator.
And so I worked to build a better photon calculator (now liberally including optics/sensor calculations from VSET) and to organize my code to better expose VSET/Matlab functions to the end-user in the web browser. The hope is to not only produce a good photon calculator tutorial for a good Psych 221 project, but to also lay the groundwork for anybody interested in vision science to build similar web tutorials quickly and easily!
Methods
Technology Stack
The user interface is displayed in a web browser (e.g. Chrome, Firefox) and is written in a blend of Javascript, HTML, and CSS. It is still a draft, so do not expect it to be standards-compliant or to work on compliance-problematic browsers like Internet Explorer 6.
User clicks in the browser are translated by Javascript into Java calls on the server (for example, the Ivory server). The server runs Tomcat, which serves Java-based "servlets" to the user.
matlabcontrol is a Java library that passes Matlab expressions from the Java programming language to a running Matlab session and returns the results. It was used to call ISET/VSET in order to perform image manipulation and calculations.
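The bridge idea (forwarding a text expression to an interpreter session and returning its printed output) can be sketched as follows. This is not matlabcontrol's API; a Python child process stands in for the Matlab session, and the function name is hypothetical:

```python
import subprocess
import sys

def eval_expression(expr: str, timeout: float = 10.0) -> str:
    """Send one expression to an interpreter child process and return its
    printed output, roughly what the proxy layer does against a live
    Matlab session (a Python child stands in for Matlab here).

    Note: expr is interpolated unsanitized, which mirrors the injection
    risk discussed under Limitations below.
    """
    result = subprocess.run(
        [sys.executable, "-c", f"print({expr})"],
        capture_output=True, text=True, timeout=timeout,
    )
    if result.returncode != 0:
        raise RuntimeError(result.stderr.strip())
    return result.stdout.strip()

print(eval_expression("1 + 2"))  # prints "3"
```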
Features added, as suggested
The website presents a tutorial mode (reached by clicking the "help?" button at the top right of the screen), which guides the user through the calculations and understanding needed to go from a JPEG image to irradiance numbers (and then on to the cone absorption histogram).
The website also presents sensor data (for human cone absorption), a way to get statistics from the VSET functions sensorGet, oiGet, and sceneGet, and a graphical get (creating an image in VSET/Matlab and then presenting it in the browser). For example, typing
figure(25); plot(1:5)
will produce a simple linear plot. Examples of expressions to type are also built into the tutorial mode.
The code is documented well enough that, hopefully, future project groups can jump in, deploy the project, and consider how to extend it.
Tutorial mode vs normal mode; screenshots
Results
There aren't many results to speak of! The resulting webpage should speak for itself; you can access it on the Ivory server on campus (VPN access may be required otherwise) or view the archival images on the Wiki.
The resulting webpage should be a useful tutorial for anyone who didn't watch the irradiance/radiance conversion videos (though they were highly instructive; thanks, Professor Wandell!) or who prefers to learn from text instead.
The resulting code has been made available on GitHub as noted below.
Limitations
- The server controls (and shares) one Matlab session among all current users. There are solutions (using HttpSession in Java for example), but implementation requires some thought.
- When re-deploying the project (for another user or onto another server, for example), Matlab paths are hard-coded and need to be changed appropriately (depending on operating system and computer setup).
- sensorGet and similar commands are fed raw to Matlab, without sanitization. A malicious user could wreak havoc on the server system.
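One way to address the last limitation is to whitelist commands before they reach the shared Matlab session. A minimal Python sketch of the idea (the accepted grammar here is my own illustration, not what the project implements):

```python
import re

# Accept only a fixed set of VSET "get" calls with a simple variable name
# and a quoted parameter string, e.g. sensorGet(sensor,'pixel size').
# Anything else is rejected before it ever reaches Matlab.
ALLOWED = re.compile(
    r"(sensorGet|oiGet|sceneGet)\(\s*\w+\s*,\s*'[\w /-]+'\s*\)"
)

def sanitize(command: str) -> str:
    """Return the command unchanged if it matches the whitelist,
    otherwise raise ValueError."""
    command = command.strip()
    if not ALLOWED.fullmatch(command):
        raise ValueError(f"rejected command: {command!r}")
    return command

print(sanitize("sensorGet(sensor,'pixel size')"))  # accepted, echoed back
```

A real deployment would likely need a richer grammar (numeric arguments, nested gets), but even a strict whitelist like this removes the worst of the risk.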
Conclusions
The results are promising: for instructional purposes, students do not necessarily have to install Matlab/VSET in order to complete a tutorial. Instead, it is possible to write code similar to this project's to display a tutorial, its comments, and its output.
Furthermore, because input is accepted, a user could go on and play with parameters. It's possible that in the future, VSET could live a happy and whole life in the "cloud"! There are of course many more extensions that can be done on this project.
For example, multi-client/user support could be built in, since matlabcontrol allows multiple Matlab sessions to some extent (though this would certainly crush the server without additional optimizations). Code optimizations can and should be made to avoid running duplicate Matlab code. Input sanitization should be performed, and the code could be adapted so that, given a tutorial name such as s_HumanSensor, it dynamically generates the page you see currently.
There's a lot to build on, and a lot left to build, so I would encourage anybody in Psych 221 in 2014 to consider extending this project.
Most of the data and images used come from Professor Wandell's excellent VSET and Psych 221 slides.
In addition, the project uses the following software packages:
- Matlab/ISET-4.0, developed by ImagEval Consulting, provided by Professor Brian Wandell and Dr. Joyce Farrell;
- matlabcontrol, developed by Joshua Kaplan;
- jqPlot, developed by Chris Leonello;
- imgAreaSelect, developed by Michal Wojciechowski.
And also data from:
- Nikon 55mm and filters data from NSF Graphics and Visualization Center
- Glass and CR-39 data from Ophthalmic Lens Tints and Coatings by Gregory L. Stephens and John K. Davis
Appendix I - Code and Data
Code
My code is hosted on GitHub (https://github.com/linjef/photon-calculator), as per Professor Wandell's suggestions (minus VSET, which should be in the starting directory and announced in Matlab with isetPath()).
Data
A zip file containing filter/lens data and some simple Matlab scripts; it should go in the Matlab starting directory (which should be set in the Java code as packaged on GitHub).