Texture Synthesis and Photo-realistic Re-Rendering of Room Scene Images

Kyle J Ziga, Purdue University

Abstract

In this thesis, we investigate methods for texture synthesis and texture re-rendering of indoor room scene images. The goal is to create a photorealistic redesign of an interior space by replacing surface finishes with a new product, based on a single room scene image. Specifically, we focus on automating this process to reduce manual input while providing a high-quality, easy-to-use experience. The most common method of rendering textures into a scene is texture mapping, which maps pixels in a texture sample to vertices in an object model. Texture mapping typically requires a large texture sample to work properly. Given only a small sample, texture synthesis creates a larger texture that appears to have been produced by the same underlying process. In the first part of this thesis, we present a texture synthesis method that automatically determines a set of parameters to produce satisfactory results based on the texture type. The next challenge is to create a photorealistic re-rendering of the synthesized texture in the room scene image. 3D scene information such as geometry, lighting and reflectance is crucial to making the re-rendered image realistic. These properties contribute to the image formation process and must be estimated to create a scene-consistent modification. Knowing these parameters allows effects like highlights, shadows and inter-object reflections to be preserved during re-rendering. We detail methods for estimating these parameters from a single indoor image. Finally, we present a web-based implementation of these methods using the WebGL library ThreeJS.
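As a rough illustration of the re-rendering step described above, the following minimal Three.js sketch (not taken from the thesis) applies a tiled texture to a planar surface under simple lighting. The texture file name, surface dimensions, and light placement are illustrative assumptions; in the thesis, surface geometry and lighting are estimated from the input image rather than set by hand.

import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  45, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(0, 1, 5);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Load a synthesized texture (hypothetical file name) and tile it
// across the surface via repeat wrapping.
const texture = new THREE.TextureLoader().load('synthesized_texture.png');
texture.wrapS = THREE.RepeatWrapping;
texture.wrapT = THREE.RepeatWrapping;
texture.repeat.set(4, 4);

// A plane standing in for a detected wall or floor; a physically based
// material lets the lighting interact with the new surface finish.
const surface = new THREE.Mesh(
  new THREE.PlaneGeometry(4, 3),
  new THREE.MeshStandardMaterial({ map: texture })
);
scene.add(surface);

// Hand-placed lights approximating scene illumination.
scene.add(new THREE.AmbientLight(0xffffff, 0.4));
const light = new THREE.DirectionalLight(0xffffff, 0.8);
light.position.set(2, 4, 3);
scene.add(light);

renderer.render(scene, camera);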

Degree

M.Sc.

Advisors

Zhu, Purdue University.

Subject Area

Design
