
dc.contributor.author         Rosales, Rómer    en_US
dc.contributor.author         Siddiqui, Matheen    en_US
dc.contributor.author         Alon, Jonathan    en_US
dc.contributor.author         Sclaroff, Stan    en_US
dc.date.accessioned           2011-10-20T04:42:35Z
dc.date.available             2011-10-20T04:42:35Z
dc.date.issued                2001-06
dc.identifier.uri             https://hdl.handle.net/2144/1629
dc.description.abstract       An approach for estimating 3D body pose from multiple, uncalibrated views is proposed. First, a mapping from image features to 2D body joint locations is computed using a statistical framework that yields a set of several body pose hypotheses. The concept of a "virtual camera" is introduced that makes this mapping invariant to translation, image-plane rotation, and scaling of the input. As a consequence, the calibration matrices (intrinsics) of the virtual cameras can be considered completely known, and their poses are known up to a single angular displacement parameter. Given pose hypotheses obtained in the multiple virtual camera views, the recovery of 3D body pose and camera relative orientations is formulated as a stochastic optimization problem. An Expectation-Maximization algorithm is derived that can obtain the locally most likely (self-consistent) combination of body pose hypotheses. Performance of the approach is evaluated with synthetic sequences as well as real video sequences of human motion.    en_US
dc.language.iso               en_US
dc.publisher                  Boston University Computer Science Department    en_US
dc.relation.ispartofseries    BUCS Technical Reports;BUCS-TR-2001-008
dc.title                      Estimating 3D Body Pose Using Uncalibrated Cameras    en_US
dc.type                       Technical Report    en_US
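The abstract describes an Expectation-Maximization procedure that selects the most self-consistent combination of 2D pose hypotheses across the virtual camera views while recovering the 3D joint positions and one angular displacement per view. The sketch below is a minimal illustration of that idea only, not the report's derivation: it assumes orthographic virtual cameras parameterized by a single rotation angle about the vertical axis, a Gaussian reprojection likelihood, and a grid-search update for the angles; the function names (project, em_pose) and the synthetic data are hypothetical.

import numpy as np

def project(points_3d, angle):
    """Orthographic 'virtual camera': rotate the 3D pose about the vertical
    axis by a single angular displacement and drop depth (a hypothetical
    simplification of the virtual-camera construction in the abstract)."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0]])
    return points_3d @ R.T                                   # (n_joints, 2)

def em_pose(hypotheses, n_iters=50, sigma2=1.0):
    """Toy EM over per-view 2D pose hypotheses.

    hypotheses: list over views; each entry has shape (n_hyp, n_joints, 2)
                holding candidate 2D joint locations for that view.
    Returns estimated 3D joints X (n_joints, 3), one rotation angle per view,
    and soft responsibilities over the hypotheses in each view.
    """
    n_views = len(hypotheses)
    n_joints = hypotheses[0].shape[1]
    rng = np.random.default_rng(0)
    X = 0.1 * rng.standard_normal((n_joints, 3))             # initial 3D guess
    angles = np.linspace(0.0, np.pi, n_views, endpoint=False)

    for _ in range(n_iters):
        # E-step: weight each hypothesis by how well it matches the current
        # 3D pose reprojected through its view's virtual camera.
        resp = []
        for v in range(n_views):
            err = ((hypotheses[v] - project(X, angles[v])) ** 2).sum(axis=(1, 2))
            w = np.exp(-(err - err.min()) / (2.0 * sigma2))
            resp.append(w / (w.sum() + 1e-12))

        # Responsibility-weighted 2D "observations" for each view.
        targets = [np.einsum('h,hjk->jk', resp[v], hypotheses[v])
                   for v in range(n_views)]

        # M-step, part 1: refine each view's angular displacement with a
        # small local grid search around its current value.
        for v in range(n_views):
            cand = angles[v] + np.linspace(-0.2, 0.2, 21)
            costs = [((project(X, a) - targets[v]) ** 2).sum() for a in cand]
            angles[v] = cand[int(np.argmin(costs))]

        # M-step, part 2: re-fit the 3D joints by linear least squares
        # against all views' weighted observations.
        A = np.vstack([np.array([[np.cos(a), 0.0, np.sin(a)],
                                 [0.0, 1.0, 0.0]]) for a in angles])  # (2V, 3)
        B = np.concatenate([t.T for t in targets], axis=0)            # (2V, J)
        X = np.linalg.lstsq(A, B, rcond=None)[0].T                    # (J, 3)

    return X, angles, resp

# Tiny synthetic check: two views of a random 3D "pose", each offering the
# true projection plus a noisy distractor hypothesis.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X_true = rng.standard_normal((12, 3))
    views = []
    for a in (0.1, 1.2):
        good = project(X_true, a)
        bad = good + rng.standard_normal(good.shape)
        views.append(np.stack([good, bad]))
    X_est, angles_est, resp = em_pose(views)
    print("responsibilities per view:", [np.round(r, 2) for r in resp])

Here the per-view responsibilities play the role of the soft selection over hypotheses that the E-step of the report's EM formulation would provide; the alternating angle and joint updates stand in for its stochastic optimization over camera relative orientations and 3D body pose.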

