Title: Neural Sensor Fusion for Spatial Visualization on a Mobile Robot
Authors: Martens, Siegfried; Carpenter, Gail A.; Gaudiano, Paolo
Type: Technical Report
Publisher: Boston University Trustees
Date Issued: 1998-08
Date Accessioned: 2011-11-14
Date Available: 2011-11-14
URI: https://hdl.handle.net/2144/2362
Language: en-US
Keywords: Mobile robot; Neural networks; Sensor fusion; Occupancy grid; Sonar; Range

Abstract: An ARTMAP neural network is used to integrate visual information and ultrasonic sensory information on a B14 mobile robot. Training samples for the neural network are acquired without human intervention: sensory snapshots are retrospectively associated with the distance to the wall, provided by on-board odometry, as the robot travels in a straight line. The goal is to produce a more accurate measure of distance than the raw sensors provide. The neural network effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy grid visualizations of the robot's environment. The resulting maps point to specific problems in processing raw sensory information and demonstrate the benefits of using a neural network system for sensor fusion.

Rights: Copyright 1998 Boston University. Permission to copy without fee all or part of this material is granted provided that: (1) the copies are not made or distributed for direct commercial advantage; (2) the report title, author, document number, and release date appear, and notice is given that copying is by permission of BOSTON UNIVERSITY TRUSTEES. To copy otherwise, or to republish, requires a fee and/or special permission.
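The training scheme described in the abstract, retrospectively labeling sensory snapshots with an odometry-derived wall distance, can be illustrated with a minimal sketch. This is not code from the report; all names are hypothetical, and it assumes the distance to the wall at the start of a straight-line run is known, so the true distance at each snapshot is the start distance minus the distance traveled according to odometry.

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    sonar: float        # raw ultrasonic range reading (m); single sensor, for illustration
    vision: float       # visual distance cue (m), e.g. derived from image features
    odometry: float     # distance traveled since the start of the run (m)
    label: float = 0.0  # true distance to wall (m), filled in retrospectively

def label_run(snapshots, wall_distance_at_start):
    """Retrospectively attach a ground-truth wall distance to each snapshot.

    During a straight-line run toward a wall, odometry gives distance
    traveled, so the true distance to the wall at each snapshot is the
    starting distance minus the odometry reading at that snapshot.
    No human intervention is needed once the run geometry is fixed.
    """
    for s in snapshots:
        s.label = wall_distance_at_start - s.odometry
    return snapshots
```

The labeled (sonar, vision) pairs would then serve as input vectors and the labels as target distances for supervised training of the fusion network.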
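The occupancy grid visualization step can likewise be sketched using the standard update idea: cells along a range beam are marked as more likely free, and the cell at the fused distance estimate as more likely occupied. The report's exact update rule is not reproduced here; the function name, grid representation, and constants below are illustrative assumptions.

```python
import math
import numpy as np

def update_occupancy_grid(grid, pose, heading, rng, cell=0.1,
                          free_dec=0.05, occ_inc=0.3):
    """Update an occupancy grid with one fused range reading.

    grid    : 2D numpy array of occupancy values in [0, 1] (0.5 = unknown)
    pose    : (x, y) robot position in meters
    heading : beam direction in radians
    rng     : fused distance estimate in meters
    """
    # Cells traversed by the beam before the measured range: likely free.
    for i in range(int(rng / cell)):
        x = pose[0] + i * cell * math.cos(heading)
        y = pose[1] + i * cell * math.sin(heading)
        r, c = math.floor(y / cell), math.floor(x / cell)
        if 0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]:
            grid[r, c] = max(0.0, grid[r, c] - free_dec)
    # Cell at the measured range: likely occupied.
    x = pose[0] + rng * math.cos(heading)
    y = pose[1] + rng * math.sin(heading)
    r, c = math.floor(y / cell), math.floor(x / cell)
    if 0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]:
        grid[r, c] = min(1.0, grid[r, c] + occ_inc)
    return grid

# Example: a 10 m x 10 m grid, one reading of 2.4 m straight ahead.
grid = np.full((100, 100), 0.5)
update_occupancy_grid(grid, pose=(5.0, 5.0), heading=0.0, rng=2.4)
```

Feeding the network's improved distance percept (rather than a raw sonar or visual range) into such an update is what yields the cleaner maps the abstract describes.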