Browsing by Subject "Uncertainty estimation"
Item (Open Access)
Refining 3D human texture estimation from a single image (IEEE, 2024-12)
Altındiş, Said Fahri; Meric, Adil; Dalva, Yusuf; Güdükbay, Uğur; Dündar, Ayşegül
Estimating 3D human texture from a single image is essential in graphics and vision. It requires learning a mapping function from input images of humans with diverse poses into the parametric (UV) space and reasonably hallucinating invisible parts. To achieve high-quality 3D human texture estimation, we propose a framework that adaptively samples the input by a deformable convolution whose offsets are learned via a deep neural network. Additionally, we describe a novel cycle consistency loss that improves view generalization. We further propose to train our framework with an uncertainty-based pixel-level image reconstruction loss, which enhances color fidelity. We compare our method against the state-of-the-art approaches and show significant qualitative and quantitative improvements.

Item (Open Access)
Three-dimensional human texture estimation learning from multi-view images (2023-12)
Altındiş, Said Fahri
In the fields of graphics and vision, accurately estimating 3D human texture from a single image is a critical task. This process involves developing a mapping function that transforms input images of humans in various poses into parametric (UV) space, while also effectively inferring the appearance of unseen parts. To enhance the quality of 3D human texture estimation, our study introduces a framework that utilizes deformable convolution for adaptive input sampling. This convolution is characterized by offsets learned through a deep neural network. Additionally, we introduce an innovative cycle consistency loss, which markedly enhances view generalization. Our framework is further refined by incorporating an uncertainty-based, pixel-level image reconstruction loss, aimed at augmenting color accuracy. Through comprehensive comparisons with leading-edge methods, our approach demonstrates notable qualitative and quantitative advancements in the field.
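
Both items describe training with an uncertainty-based pixel-level image reconstruction loss. The listed abstracts do not give the exact formulation, so the following is a minimal sketch assuming a common heteroscedastic (Laplace-likelihood) form, in which a predicted per-pixel log-uncertainty down-weights hard-to-reconstruct pixels; the function name, tensor shapes, and the use of an L1 residual are illustrative assumptions, not the authors' implementation.

```python
import torch

def uncertainty_reconstruction_loss(pred_rgb, target_rgb, log_sigma):
    """Heteroscedastic pixel-level reconstruction loss (illustrative sketch).

    pred_rgb, target_rgb: (B, 3, H, W) predicted and ground-truth texture images.
    log_sigma: (B, 1, H, W) predicted per-pixel log uncertainty.
    Assumes a Laplace likelihood: loss = |x - x_hat| * exp(-log_sigma) + log_sigma.
    """
    abs_err = (pred_rgb - target_rgb).abs().mean(dim=1, keepdim=True)  # per-pixel L1 residual
    loss = abs_err * torch.exp(-log_sigma) + log_sigma                 # attenuate by uncertainty, penalize large sigma
    return loss.mean()
```

Predicting log-sigma rather than sigma keeps the loss numerically stable and lets the network soften the penalty on pixels it cannot reconstruct (for example, occluded regions of the UV map) while still being discouraged from claiming high uncertainty everywhere.

The abstracts also describe adaptive input sampling via a deformable convolution whose offsets are learned by a deep network. A minimal sketch of that idea, assuming PyTorch and torchvision's DeformConv2d, is shown below; the module name, channel sizes, and the single offset-prediction layer are illustrative assumptions, not the papers' architecture.

```python
import torch.nn as nn
from torchvision.ops import DeformConv2d

class AdaptiveSampler(nn.Module):
    """Deformable convolution whose sampling offsets are predicted from the input (illustrative sketch)."""

    def __init__(self, channels, kernel_size=3):
        super().__init__()
        # Two offset values (dx, dy) per kernel position.
        self.offset_pred = nn.Conv2d(channels, 2 * kernel_size * kernel_size,
                                     kernel_size, padding=kernel_size // 2)
        self.deform_conv = DeformConv2d(channels, channels, kernel_size,
                                        padding=kernel_size // 2)

    def forward(self, x):
        offsets = self.offset_pred(x)        # input-dependent, learned sampling offsets
        return self.deform_conv(x, offsets)  # sample the input adaptively at the offset locations
```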