dc.contributor.author: Isidoro, John
dc.contributor.author: Sclaroff, Stan
dc.date.accessioned: 2011-10-20T04:15:33Z
dc.date.available: 2011-10-20T04:15:33Z
dc.date.issued: 2003-07-18
dc.identifier.uri: https://hdl.handle.net/2144/1513
dc.description.abstract: An iterative method for reconstructing a 3D polygonal mesh and color texture map from multiple views of an object is presented. In each iteration, the method first estimates a texture map given the current shape estimate. The texture map and its associated residual error image are obtained via maximum a posteriori estimation and reprojection of the multiple views into texture space. Next, the surface shape is adjusted to minimize residual error in texture space. The surface is deformed towards a photometrically consistent solution via a series of 1D epipolar searches at randomly selected surface points. The texture space formulation has improved computational complexity over standard image-based error approaches, and allows computation of the reprojection error and uncertainty for any point on the surface. Moreover, shape adjustments can be constrained such that the recovered model's silhouette matches those of the input images. Experiments with real world imagery demonstrate the validity of the approach.
dc.language.iso: en_US
dc.publisher: Boston University Computer Science Department
dc.relation.ispartofseries: BUCS Technical Reports; BUCS-TR-2003-017
dc.title: Stochastic Refinement of the Visual Hull to Satisfy Photometric and Silhouette Consistency Constraints
dc.type: Technical Report
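
The abstract above outlines an iterative loop: estimate a texture map (and its residual error) from the current shape by reprojecting the views into texture space, then stochastically deform the surface with 1D searches at randomly chosen surface points. The sketch below is a minimal Python illustration of that loop structure only; it is not the authors' implementation. The reprojection, MAP fusion, epipolar geometry, and error terms are toy stand-ins, and every function and variable name is an assumption introduced for illustration.

```python
# Hypothetical sketch of the texture-then-shape refinement loop described in
# the abstract.  All names are illustrative assumptions, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def reproject_view_to_texture(view, vertices, faces, tex_size=64):
    """Placeholder reprojection: a real system would rasterize the mesh into
    texture space using the view's camera.  Here we return synthetic samples
    whose noise grows with how far the shape is from a 'true' unit sphere."""
    shape_error = np.abs(np.linalg.norm(vertices, axis=1) - 1.0).mean()
    return rng.normal(0.5, 0.05 + shape_error, size=(tex_size, tex_size, 3))

def estimate_texture_map(views, vertices, faces):
    """Stand-in for MAP texture estimation: fuse the per-view texture-space
    samples and measure their disagreement as the residual error image."""
    samples = np.stack([reproject_view_to_texture(v, vertices, faces) for v in views])
    texture = samples.mean(axis=0)                      # MAP estimate stand-in
    residual = np.abs(samples - texture).mean(axis=0)   # reprojection error per texel
    return texture, residual

def epipolar_search(vertices, idx, residual, views, steps=11, step_size=0.02):
    """1D search at one surface point: try small displacements along an assumed
    search direction and keep the one with the lowest error."""
    direction = vertices[idx] / (np.linalg.norm(vertices[idx]) + 1e-9)
    best, best_err = vertices[idx].copy(), np.inf
    for t in np.linspace(-step_size * steps, step_size * steps, steps):
        candidate = vertices[idx] + t * direction
        # Photometric error proxy: distance from the unit sphere stands in for
        # the texture-space residual a real system would recompute here.
        err = abs(np.linalg.norm(candidate) - 1.0)
        if err < best_err:
            best, best_err = candidate, err
    return best

def refine(views, vertices, faces, iterations=10, points_per_iter=50):
    for _ in range(iterations):
        texture, residual = estimate_texture_map(views, vertices, faces)
        # Stochastic shape refinement: 1D searches at randomly chosen vertices.
        for idx in rng.choice(len(vertices), size=points_per_iter, replace=False):
            vertices[idx] = epipolar_search(vertices, idx, residual, views)
        print(f"mean residual: {residual.mean():.4f}")
    return vertices, texture

if __name__ == "__main__":
    # Toy data: a perturbed sphere as the initial shape and dummy 'views'.
    vertices = rng.normal(size=(200, 3))
    vertices /= np.linalg.norm(vertices, axis=1, keepdims=True)
    vertices *= rng.uniform(0.9, 1.3, size=(200, 1))
    faces, views = None, list(range(4))                 # placeholders only
    refine(views, vertices, faces)
```

The point of the structure is that shape updates are driven by error measured in texture space rather than by re-rendering full images, which is what the abstract credits for the improved computational complexity; the toy error proxy above simply keeps the sketch self-contained.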

