Browsing by Subject "Illumination estimation"
Now showing 1 - 3 of 3
Item Open Access
Differences in illumination estimation in #thedress (Association for Research in Vision and Ophthalmology Inc., 2017) Toscani, M.; Gegenfurtner, K. R.; Doerschner, K.
We investigated whether people who report different colors for #thedress do so because they have different assumptions about the illumination in #thedress scene. We introduced a spherical illumination probe (Koenderink, Pont, van Doorn, Kappers, & Todd, 2007) into the original photograph, placed in the foreground or background of the scene, and, for each location, let observers manipulate the probe's chromaticity and intensity and the direction of the illumination. Their task was to adjust the probe such that it would appear as a white sphere in the scene. When the probe was located in the foreground, observers who reported the dress to be white (white perceivers) tended to produce bluer adjustments than observers who reported it as blue (blue perceivers). Blue perceivers tended to perceive the illumination as less chromatic. There were no differences in chromaticity settings between perceiver types for the probe placed in the background. Perceiver types also did not differ in their illumination intensity and direction estimates across probe locations. These results provide direct support for the idea that the ambiguity in the perceived color of the dress can be explained by the different assumptions that people have about the illumination chromaticity in the foreground of the scene. In a second experiment, we explored the possibility that blue perceivers might be less sensitive to contextual cues overall and measured white and blue perceivers' dress color matches and labels for manipulated versions of the original photo. The results confirm that contextual cues predominantly affect white perceivers.

Item Open Access
Sun position estimation on time-lapse videos for augmented reality applications (2015-07) Balcı, Hasan
Realistic illumination of virtual objects in Augmented Reality (AR) environments is important for achieving visual coherence. This thesis proposes a novel approach that facilitates illumination estimation on time-lapse videos and makes it possible to combine AR technology with time-lapse videos in a visually consistent way. The proposed approach works for both outdoor and indoor environments where the main light source is the Sun. We first modify an existing illumination estimation method that aims to obtain a sparse radiance map of the environment in order to estimate the initial Sun position. We then track the hard ground shadows on the time-lapse video by using an energy-based pixel-wise method. The proposed method aims to track the shadows by utilizing the energy values of the pixels that form them. We tested the method on various time-lapse videos recorded in outdoor and indoor environments and obtained successful results.

Item Open Access
Sun position estimation and tracking for virtual object placement in time-lapse videos (Springer, 2017) Balcı, H.; Güdükbay, Uğur
Realistic illumination of virtual objects placed in real videos is important for achieving visual coherence. We propose a novel approach for illumination estimation on time-lapse videos that seamlessly inserts virtual objects into these videos in a visually consistent way. The proposed approach works for both outdoor and indoor environments where the main light source is the Sun.
We first modify an existing illumination estimation method that aims to obtain a sparse radiance map of the environment in order to estimate the initial Sun position. We then track the hard ground shadows on the time-lapse video by using an energy-based pixel-wise method. The proposed method aims to track the shadows by utilizing the energy values of the pixels that form them. We tested the method on various time-lapse videos recorded in outdoor and indoor environments and obtained successful results. © 2016, Springer-Verlag London.
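The two Sun-position abstracts above describe estimating an initial Sun position from a sparse radiance map and then following hard ground shadows via per-pixel energy values. The short Python sketch below is only a hypothetical illustration of that pixel-energy idea, not the authors' implementation: it treats luminance as a crude per-pixel energy, thresholds it to get a shadow mask, and follows the mask's centroid across frames. The names shadow_mask, track_shadow_centroids, and energy_threshold are invented for this sketch.

```python
# Hypothetical sketch of energy-based, pixel-wise shadow tracking.
# Assumes RGB frames as float arrays in [0, 1]; all names are illustrative.
import numpy as np

def shadow_mask(frame_rgb: np.ndarray, energy_threshold: float = 0.35) -> np.ndarray:
    """Boolean mask of low-energy (dark) pixels for one RGB frame."""
    # Use luminance as a simple stand-in for per-pixel energy.
    luminance = (0.2126 * frame_rgb[..., 0]
                 + 0.7152 * frame_rgb[..., 1]
                 + 0.0722 * frame_rgb[..., 2])
    return luminance < energy_threshold

def track_shadow_centroids(frames, energy_threshold: float = 0.35):
    """Follow the centroid (x, y) of the shadow mask across a frame sequence."""
    centroids = []
    for frame in frames:
        mask = shadow_mask(frame, energy_threshold)
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            centroids.append((float("nan"), float("nan")))
        else:
            centroids.append((float(xs.mean()), float(ys.mean())))
    return centroids

if __name__ == "__main__":
    # Synthetic 3-frame example: a dark square "shadow" drifting to the right.
    frames = []
    for shift in (10, 14, 18):
        frame = np.ones((64, 64, 3))              # bright background
        frame[20:40, shift:shift + 20, :] = 0.1   # dark region standing in for a hard shadow
        frames.append(frame)
    print(track_shadow_centroids(frames))
```

In the papers, the tracked shadows feed back into the Sun position estimate over the course of the time-lapse; this sketch stops at tracking the shadow region itself.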