Andy Lin

From Psych 221 Image Systems Engineering


Revision as of 05:45, 18 March 2011

Introduction

It is extremely useful to render 3D scenes and feed the radiance information of these scenes into camera simulation software such as ISET [CITATION: ISET]. Synthesizing scenes allows control over scene attributes that would be difficult to obtain when working with real-world scenes; for example, scene synthesis allows control over foreground objects, backgrounds, texture, and lighting. Most importantly, the depth information is very easy to obtain, in contrast to the complicated depth-estimation algorithms required for real-world scenes.

Synthesized scenes are often criticized as not being realistic enough. However, with enough effort and precision, highly realistic scenes can be generated. See Figure 1 for examples of scenes synthesized with the Radiance software [CITATION: Radiance].

Figure 1: Rendered scenes can be quite realistic. These scenes, in particular, are example scenes from Radiance.

If these rendered scenes can be fed into camera simulation software such as ISET, then we can simulate photographs of these artificial scenes. Moreover, the additional 3D information allows for renderings of advanced photographic attributes that have not been simulated extensively before in a well-controlled synthesized environment. For example, the 3D information allows for the simulation of multiple-camera systems, proper lens depth of field, synthetic aperture, flutter shutter, and even light-field cameras.

For this project, I used the RenderToolBox software tool, provided by David Brainard [CITATION: Brainard], to synthesize example 3D scenes. The radiance of these scenes, along with the depth map, was then fed into the ISET simulation environment for a simulated camera capture of the scene.

Methods

Using RenderToolBox requires a deeper understanding of its architecture. RenderToolBox is actually a wrapper around several other simulation tools; in particular, Radiance, PsychToolBox, the Physically Based Ray Tracer (PBRT), and SimToolBox are used to help render an artificial scene. The most important tool that RenderToolBox is built on is Radiance, a scene rendering package that contains a ray tracer.

RenderToolBox

RenderToolBox works by repeatedly calling a separate rendering engine. Several rendering engines are available in RenderToolBox, such as Radiance and PBRT. Although these rendering engines are very capable, they are limited in their output: most importantly, they can only output images in a tri-chromatic colorspace, whereas proper camera simulation requires multi-spectral data. This is the problem that RenderToolBox addresses. To allow for multi-spectral renderings, it provides multiple monochromatic input light sources to the rendering engine, one per wavelength, each at the intensity of the desired light source at that wavelength. After the rendering engine renders the scene under each of these light sources, RenderToolBox combines the resulting data. RenderToolBox also allows us to obtain a depth map of the scene at the specified camera position. See Figure 2 for the data flow of RenderToolBox.

Figure 2: Data flow for RenderToolBox and interaction with Radiance.
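The per-wavelength combination step described above can be sketched as follows. This is a NumPy analogue rather than the project's actual Matlab code, and the function name and data layout (a list of 2D renders, one per wavelength, each produced under a unit-intensity monochromatic source) are illustrative assumptions:

```python
import numpy as np

def combine_monochromatic_renders(renders, source_spectrum):
    """Combine per-wavelength renderings into a multi-spectral radiance cube.

    renders: list of 2D arrays, one per wavelength, each rendered with a
             unit-intensity monochromatic light source (assumed format).
    source_spectrum: 1D array of the desired light source's intensity at
             each of those wavelengths.
    Returns an H x W x N radiance cube, where N is the number of wavelengths.
    """
    cube = np.stack(renders, axis=-1).astype(float)
    # Scale each wavelength plane by the desired source intensity there.
    return cube * np.asarray(source_spectrum)[None, None, :]

# Toy example: 3 wavelengths, 2x2 scene.
renders = [np.ones((2, 2)), 2 * np.ones((2, 2)), 3 * np.ones((2, 2))]
spectrum = np.array([0.5, 1.0, 2.0])
cube = combine_monochromatic_renders(renders, spectrum)
```

Because the renders are produced under unit-intensity sources, scaling and stacking them reconstructs the scene radiance under any source spectrum sampled at those wavelengths.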

Available Objects To Render

A multitude of shapes, textures, lights, and more can be rendered with Radiance, and therefore with RenderToolBox as well. For example, Radiance can render different 3D shapes such as cones, spheres, cylinders, and cups, and all of these objects can be made of different materials such as plastics, metals, and textures. Interesting lighting effects can be rendered as well: not only can there be multiple light sources, but Radiance can also render glowing surfaces, mirrors, prisms, glass, and even mist. Any combination of materials and lighting can be assigned to the objects.

Moreover, RenderToolBox and Radiance can use triangular meshes to create objects. This is a very powerful feature because any arbitrary object that can be approximated with a 3D mesh can be rendered. In fact, RenderToolBox allows the import of 3D meshes from 3D modeling software such as Maya.

Rendering Options

RenderToolBox is also highly customizable in its rendering options. The user can choose among rendering engines (typically Radiance or PBRT) and set the camera angle, position, and field of view. The prominent light source can be changed, as can the number of rendered light bounces: if there are highly reflective surfaces that produce multiple light bounces, the user can choose how many bounces should be simulated when rendering.
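The options listed above can be thought of as a single settings structure passed to the renderer. The sketch below is purely illustrative; the field names are hypothetical and are not RenderToolBox's actual parameter names:

```python
# Hypothetical rendering-options structure for a RenderToolBox-style call.
# All field names here are illustrative assumptions, not the real API.
options = {
    "renderer": "radiance",              # or "pbrt"
    "camera_position": [0.0, 0.0, 1.5],  # camera location in scene coordinates
    "camera_angle": [0.0, -10.0, 0.0],   # viewing direction
    "field_of_view": 45.0,               # degrees
    "light_source": "D65",               # prominent illuminant
    "num_bounces": 3,  # more bounces capture inter-reflections, at higher cost
}
```

Raising `num_bounces` matters most for scenes with mirrors or other highly reflective surfaces, at the cost of longer render times.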

Interface With ISET

Rendering multi-spectral scenes is useful, but being able to render these scenes within a camera simulation framework such as ISET is even more useful. Interfacing with ISET is relatively straightforward. The output from RenderToolBox is a Matlab .mat file containing a cell array, with each cell specifying the radiance of the scene at every point in space. This cell array is converted to a 3D matrix, with the third dimension specifying wavelength and the first two dimensions specifying radiance in the camera plane. This 3D matrix is then fed into an ISET scene, for which the proper light source and brightness are also specified.
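The cell-array-to-matrix conversion can be sketched as follows. The project's conversion is a Matlab script; this is a NumPy analogue in which a Python list of per-wavelength images stands in for the Matlab cell array:

```python
import numpy as np

def cells_to_cube(cells):
    """Convert a list of per-wavelength radiance images (the analogue of the
    RenderToolBox cell array) into an H x W x N matrix, with the third
    dimension indexing wavelength, as an ISET scene expects."""
    return np.stack([np.asarray(c, dtype=float) for c in cells], axis=2)

# Toy example: two 2x2 wavelength planes.
cells = [[[1, 2], [3, 4]],
         [[5, 6], [7, 8]]]
cube = cells_to_cube(cells)
```

In practice the resulting matrix, together with the illuminant spectrum and a brightness level, is what gets handed to ISET to build the scene.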

The second output from RenderToolBox is a depth map, obtained by modifying a RenderToolBox function to add a rendering call to Radiance. Although Radiance offers an output file for the depth map, it is in a raw binary format, so a Matlab script was written to read this data properly. After reading the depth-map data, some data conditioning was performed to make sure the depth values fell within a proper range. See Figure 3 for a block diagram of how RenderToolBox can interface with ISET.

Figure 3: RenderToolBox and ISET can interface with each other quite easily.
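Reading the raw depth map and conditioning its range can be sketched as follows. The project's reader is a Matlab script; this NumPy analogue assumes one value per pixel in row-major order, and the element type and the [near, far] clamp limits are illustrative assumptions, not Radiance's documented format:

```python
import numpy as np
import os
import tempfile

def read_depth_map(path, width, height, dtype=np.float32, near=0.0, far=100.0):
    """Read a raw binary depth map (assumed: one value per pixel, row-major)
    and clamp it to a plausible range."""
    data = np.fromfile(path, dtype=dtype)
    depth = data.reshape((height, width)).astype(float)
    # Condition the data: clamp outliers so depths stay in [near, far].
    return np.clip(depth, near, far)

# Toy example: write 4 raw float32 values, then read them back as a 2x2 map.
raw = np.array([1.0, 2.0, 500.0, -3.0], dtype=np.float32)
tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".bin")
tmp.close()
raw.tofile(tmp.name)
depth = read_depth_map(tmp.name, width=2, height=2)
os.remove(tmp.name)
```

The clamping step mirrors the "data conditioning" mentioned above: out-of-range values (e.g. rays that hit nothing) are forced back into a usable depth interval before the map is handed to ISET.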

Results

(To be added: results, pictures, and discussion.)

Conclusion

References

Appendix I