Psych 284

From Psych 221 Image Systems Engineering
Revision as of 20:44, 6 April 2011

This Psych 284 wiki page enables students to post code, comment on the organization of the project, and document aspects of their work.



Introduction

The course project is to build a computational infrastructure for modeling visual circuits and behavior.

Software organization

Results

Readings

Related work

Crazy ideas

Project ideas:

(A) How does the image amplitude spectrum change across the image processing pipeline?

Generate images containing noise patterns with specified amplitude distributions in the Fourier domain. For example, generate a series of noise patterns whose spatial-frequency amplitude spectrum follows 1/f^n, with n spanning a range of values: 0 (white noise), 1 (pink noise), 2 (strongly blurred), and so on. Then calculate the amplitude spectrum of each image at successive stages of processing through the visual system:

1. the pixel image
2. the irradiance image (assuming a typical LCD display)
3. the optical image (after passing through the eye's optics)
4. the cone absorption image
5. the retinal ganglion cell image

We would like to know: how does the amplitude spectrum change across these different stages of processing?
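As a starting point, the first step of this project idea (generating 1/f^n noise and measuring its amplitude spectrum) can be sketched in a few lines of numpy. This is only an illustrative sketch: the function names are made up for this example, and the later pipeline stages (display irradiance, optics, cone absorptions, ganglion cells) would come from the course's modeling infrastructure rather than from this code.

```python
import numpy as np

def make_noise(size, n, seed=0):
    """Generate a size x size noise image whose amplitude spectrum falls as 1/f^n.

    n = 0 gives white noise, n = 1 pink noise, n = 2 strongly blurred noise.
    """
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(size)
    fy = np.fft.fftfreq(size)
    f = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    f[0, 0] = 1.0                              # avoid division by zero at DC
    amplitude = 1.0 / f ** n                   # desired 1/f^n amplitude spectrum
    phase = rng.uniform(0, 2 * np.pi, (size, size))
    img = np.real(np.fft.ifft2(amplitude * np.exp(1j * phase)))
    return (img - img.mean()) / img.std()      # zero mean, unit variance

def radial_amplitude(img, nbins=32):
    """Radially averaged amplitude spectrum: mean |FFT| in annular frequency bins."""
    size = img.shape[0]
    amp = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    yy, xx = np.indices(img.shape)
    r = np.sqrt((xx - size // 2) ** 2 + (yy - size // 2) ** 2)
    bins = np.linspace(0, r.max(), nbins + 1)
    idx = np.digitize(r.ravel(), bins)
    return np.array([amp.ravel()[idx == i].mean() for i in range(1, nbins + 1)])
```

The same `radial_amplitude` function could then be applied to the image at each stage of the pipeline, and the log-log slope of the resulting curves compared to see how each stage reshapes the spectrum.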

(B)