Thursday, March 5, 2009

Sun Illumination in 3D Imagery: why 2 views are better than 1

Sun illumination is one of the key concepts in 3D imagery. Depending on which illumination angle you choose, certain features will either be highlighted or muted. Bad data can often be made to look good, and good data can certainly be made to look bad. One popular convention (or at least one I heard a lot back when I first started doing all this) is to simply illuminate everything from the northeast (45°). Another is to illuminate perpendicular to the features you wish to highlight. Recently, it has become common to provide two different images or scenes with two opposing sun illumination angles, so that end-users can get a better sense of the data quality.

I was thinking about all of this this morning as I was working with some lidar data. Below are two 3D images of that lidar data rendered in Fledermaus. All the input parameters for the sun illumination were kept exactly the same; only the illumination angle was changed. In the first image, the data is illuminated from the northeast (45°), while in the second, the data is illuminated from the southwest (225°). Note the dramatic differences.
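To make the effect concrete, here is a minimal sketch of how an illumination angle translates into shading. This is not how Fledermaus itself does it; it's a generic Lambertian hillshade (the same basic model used by GDAL and ESRI tools), and the `hillshade` function and its parameters are my own illustrative names. Swapping the azimuth between 45° and 225° while holding everything else fixed is exactly the experiment described below.

```python
import numpy as np

def hillshade(z, azimuth_deg, altitude_deg=45.0, cellsize=1.0):
    """Simple Lambertian hillshade of an elevation grid z.

    azimuth_deg: compass direction the light comes FROM (45 = NE, 225 = SW).
    altitude_deg: sun elevation above the horizon.
    Returns values in [0, 1], where 1 is fully lit.
    """
    # Surface gradients via finite differences (rows = y, cols = x).
    dzdy, dzdx = np.gradient(z, cellsize)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(dzdy, -dzdx)
    # Convert compass azimuth to a math-convention angle in radians.
    az = np.radians(360.0 - azimuth_deg + 90.0)
    alt = np.radians(altitude_deg)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# A toy tilted surface: the two opposing azimuths shade it very differently,
# even though every other parameter is identical.
z = np.outer(np.arange(50, dtype=float), np.ones(50)) * 0.5
hs_ne = hillshade(z, azimuth_deg=45.0)    # lit from the northeast
hs_sw = hillshade(z, azimuth_deg=225.0)   # lit from the southwest
```

For real data you would simply load your gridded lidar surface into `z` and render `hs_ne` and `hs_sw` side by side, as in the comparison below.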

At 45°, the imagery looks quite nice and the rocky areas are clearly delineated.

At 225°, the imagery still looks nice, but now you can also see linear NE-SW trending features in the data. These features could be real, or they could be artifacts in the data. Either way, if I had only rendered the data at 45°, I never would have seen them.

Here is a side-by-side comparison of the two illuminations:

Now that I have seen the linear features in the second view (225°), I can just start to make them out in the first one (45°), but they are still hard to see. By looking at different illumination angles, I can really start to get an idea of features and trends in the data, as well as any artifacts. 
