Image de-fencing

Yanxi Liu, Tamara Belkina, James H. Hays, Roberto Lublinerman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

44 Scopus citations

Abstract

We introduce a novel image segmentation algorithm that uses translational symmetry as the primary foreground/background separation cue. We investigate the process of identifying and analyzing image regions that present approximate translational symmetry for the purpose of image foreground/background separation. In conjunction with texture-based inpainting, understanding the different see-through layers allows us to perform powerful image manipulations, such as recovering a mesh-occluded background (as much as 53% occluded area) to achieve the effect of image and photo de-fencing. Our algorithm consists of three distinct phases: (1) automatically finding the skeleton structure of a potential frontal layer (fence) in the form of a deformed lattice, (2) separating foreground/background layers using appearance regularity, and (3) occluded foreground inpainting to reveal a complete, non-occluded image. Each of these three tasks presents its own special computational challenges that are not encountered in previous, general image de-layering or texture inpainting applications.
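Since the abstract only outlines the three-phase pipeline, the sketch below gives a minimal, illustrative end-to-end version in Python. It is not the authors' method: phase 1's deformed-lattice detection and phase 2's appearance-regularity layer separation are stood in for by simple color thresholding plus morphological dilation (assuming a roughly uniform, caller-supplied fence color), and phase 3's texture-based inpainting by OpenCV's diffusion-style Telea inpainting. The file names and the fence color are hypothetical.

import cv2
import numpy as np

def fence_mask(img_bgr, fence_color_bgr, tol=40.0):
    """Stand-in for phases 1-2: mark pixels near an assumed fence color.
    (The paper instead detects a deformed lattice and classifies layers
    by appearance regularity.)"""
    color = np.asarray(fence_color_bgr, dtype=np.float32)
    diff = np.linalg.norm(img_bgr.astype(np.float32) - color, axis=2)
    mask = (diff < tol).astype(np.uint8) * 255
    # Dilate so the mask also covers anti-aliased fence boundaries.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.dilate(mask, kernel)

def defence(img_bgr, fence_color_bgr):
    """Stand-in for phase 3: fill the masked fence region from its
    surroundings (Telea inpainting, not the paper's texture-based
    inpainting)."""
    mask = fence_mask(img_bgr, fence_color_bgr)
    # Args: source image, 8-bit mask, inpaint radius, method flag.
    return cv2.inpaint(img_bgr, mask, 5, cv2.INPAINT_TELEA)

if __name__ == "__main__":
    img = cv2.imread("fenced_photo.jpg")  # hypothetical input file
    if img is None:
        raise SystemExit("could not read input image")
    # (60, 70, 80) is a hypothetical BGR fence color for illustration.
    cv2.imwrite("defenced_photo.jpg", defence(img, (60, 70, 80)))

In practice the fence mask is the hard part; the paper's contribution is finding the deformed lattice automatically rather than relying on a known fence color as this sketch does.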

Original language: English (US)
Title of host publication: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
State: Published - 2008
Event: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR - Anchorage, AK, United States
Duration: Jun 23 2008 – Jun 28 2008

Publication series

Name: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR

Other

Other: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
Country/Territory: United States
City: Anchorage, AK
Period: 6/23/08 – 6/28/08

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Control and Systems Engineering

