start-ver=1.4 cd-journal=joma no-vol=27 cd-vols= no-issue=15 article-no= start-page=1147 end-page=1160 dt-received= dt-revised= dt-accepted= dt-pub-year=2013 dt-pub=20130722 dt-online= en-article= kn-article= en-subject= kn-subject= en-title= kn-title=Positioning device for outdoor mobile robots using optical sensors and lasers en-subtitle= kn-subtitle= en-abstract= kn-abstract=We propose a novel method for positioning a mobile robot in an outdoor environment using lasers and optical sensors. Position estimation via a noncontact optical method is useful because the information from the wheel odometer and the global positioning system on a mobile robot is unreliable in some situations. Contact-type optical sensors, such as those in a computer mouse, are designed to operate in contact with a surface and do not function well under strong ambient light. To mitigate the challenges of an outdoor environment, we developed a translation-detecting optical device with a bandpass filter and a pipe that restrict solar light. The use of two such devices enables sensing of the mobile robot's position, including its posture. Furthermore, employing a collimated laser beam makes measurements against a surface invariant with the distance to the surface. In this paper, we describe motion estimation, device configurations, and several tests for performance evaluation. We also present experimental positioning results from a vehicle equipped with our optical device on an outdoor path. Finally, we discuss an improvement in postural accuracy achieved by combining the optical device with precise gyroscopes. en-copyright= kn-copyright= en-aut-name=NagaiIsaku en-aut-sei=Nagai en-aut-mei=Isaku kn-aut-name=永井伊作 kn-aut-sei=永井 kn-aut-mei=伊作 aut-affil-num=1 ORCID= en-aut-name=YamauchiGenki en-aut-sei=Yamauchi en-aut-mei=Genki kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=2 ORCID= en-aut-name=NagataniKeiji en-aut-sei=Nagatani en-aut-mei=Keiji kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=3 ORCID= en-aut-name=WatanabeKeigo en-aut-sei=Watanabe en-aut-mei=Keigo kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=4 ORCID= en-aut-name=YoshidaKazuya en-aut-sei=Yoshida en-aut-mei=Kazuya kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=5 ORCID= affil-num=1 en-affil=Graduate School of Natural Science and Technology kn-affil=自然科学研究科 affil-num=2 en-affil= kn-affil=Graduate School of Engineering, Tohoku University affil-num=3 en-affil= kn-affil=Graduate School of Engineering, Tohoku University affil-num=4 en-affil= kn-affil=Department of Intelligent Mechanical Systems, Okayama University affil-num=5 en-affil= kn-affil=Graduate School of Engineering, Tohoku University en-keyword=mobile robot kn-keyword=mobile robot en-keyword=position estimation kn-keyword=position estimation en-keyword=laser speckle pattern kn-keyword=laser speckle pattern en-keyword=optical sensor kn-keyword=optical sensor END
start-ver=1.4 cd-journal=joma no-vol= cd-vols= no-issue= article-no= start-page=6053 end-page=6058 dt-received= dt-revised= dt-accepted= dt-pub-year=2015 dt-pub=20150928 dt-online= en-article= kn-article= en-subject= kn-subject= en-title= kn-title=Path Tracking by a Mobile Robot Equipped with Only a Downward Facing Camera en-subtitle= kn-subtitle= en-abstract= kn-abstract=This paper presents a practical path-tracking method for a mobile robot equipped with only a downward camera facing the passage plane.
A unique algorithm for tracking and searching ground images with natural texture is used to localize the robot without the feature-point extraction scheme commonly used in other visual odometry methods. In our tracking algorithm, groups of reference pixels are used to detect the relative translation and rotation between frames. Furthermore, a reference pixel group of another shape is registered both to record a path and to correct errors accumulated during localization. All image processing and robot control operations are carried out on a laptop PC, with low memory consumption for image registration and fast calculation times for completing the searches. We also describe experimental results in which a vehicle implementing the proposed method repeatedly performed precise path tracking in indoor and outdoor environments. en-copyright= kn-copyright= en-aut-name=NagaiIsaku en-aut-sei=Nagai en-aut-mei=Isaku kn-aut-name=永井伊作 kn-aut-sei=永井 kn-aut-mei=伊作 aut-affil-num=1 ORCID= en-aut-name=WatanabeKeigo en-aut-sei=Watanabe en-aut-mei=Keigo kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=2 ORCID= affil-num=1 en-affil=Graduate School of Natural Science and Technology, Okayama University kn-affil=岡山大学大学院自然科学研究科 affil-num=2 en-affil=Graduate School of Natural Science and Technology, Okayama University kn-affil=岡山大学大学院自然科学研究科 END
start-ver=1.4 cd-journal=joma no-vol= cd-vols= no-issue= article-no= start-page=106 end-page=111 dt-received= dt-revised= dt-accepted= dt-pub-year=1997 dt-pub=19970923 dt-online= en-article= kn-article= en-subject= kn-subject= en-title= kn-title=Obstacle avoidance by changing running path for an autonomous running vehicle applying visual servoing en-subtitle= kn-subtitle= en-abstract= kn-abstract=This paper describes an improved running control algorithm based on visual servoing that takes into account turning back along a running path, so that the vehicle can avoid an obstacle by changing its running path. This paper also describes an experimental autonomous running vehicle built to demonstrate the algorithm. As a vision sensor, the vehicle is equipped with a video-rate stereo rangefinder, developed in the authors' laboratory, which processes color images from stereo CCD cameras. From several basic autonomous running experiments, it is concluded that the experimental vehicle runs smoothly along any planned path composed of several taught routes by transferring between routes. It is also concluded that the vehicle can turn back on a path, including turning back during a route transfer.
en-copyright= kn-copyright= en-aut-name=GofukuAkira en-aut-sei=Gofuku en-aut-mei=Akira kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=1 ORCID= en-aut-name=TanakaYutaka en-aut-sei=Tanaka en-aut-mei=Yutaka kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=2 ORCID= en-aut-name=SodaHiroyoshi en-aut-sei=Soda en-aut-mei=Hiroyoshi kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=3 ORCID= en-aut-name=NagaiIsaku en-aut-sei=Nagai en-aut-mei=Isaku kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=4 ORCID= affil-num=1 en-affil= kn-affil=Okayama University affil-num=2 en-affil= kn-affil=Okayama University affil-num=3 en-affil= kn-affil=Okayama University affil-num=4 en-affil= kn-affil=Okayama University en-keyword=CCD image sensors kn-keyword=CCD image sensors en-keyword=distance measurement kn-keyword=distance measurement en-keyword=mobile robots kn-keyword=mobile robots en-keyword=path planning kn-keyword=path planning en-keyword=robot vision kn-keyword=robot vision en-keyword=servomechanisms kn-keyword=servomechanisms en-keyword=stereo image processing kn-keyword=stereo image processing END
start-ver=1.4 cd-journal=joma no-vol= cd-vols= no-issue= article-no= start-page=932 end-page=937 dt-received= dt-revised= dt-accepted= dt-pub-year=1999 dt-pub=199909 dt-online= en-article= kn-article= en-subject= kn-subject= en-title= kn-title=Development of a video-rate range finder using dynamic threshold method for characteristic point detection en-subtitle= kn-subtitle= en-abstract= kn-abstract=This study develops a video-rate stereo range-finding circuit to obtain the depth of objects in a scene by processing video signals (R, G, B, and brightness signals) from binocular CCD cameras. The electronic circuit implements a dynamic threshold method to decrease the effect of signal noise in characteristic point detection, in which the video signal from each CCD camera is compared with multiple thresholds that shift dynamically through feedback of the previous comparison result. Several object depth measurement experiments on simple indoor scenes show that the dynamic threshold method gives higher acquisition and correctness rates of depth data than a fixed threshold method for the video signals and a relative method for the R, G, and B signals, both utilized in the authors' previous range finders.
en-copyright= kn-copyright= en-aut-name=TanakaYutaka en-aut-sei=Tanaka en-aut-mei=Yutaka kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=1 ORCID= en-aut-name=GofukuAkio en-aut-sei=Gofuku en-aut-mei=Akio kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=2 ORCID= en-aut-name=TakedaNobuo en-aut-sei=Takeda en-aut-mei=Nobuo kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=3 ORCID= en-aut-name=NagaiIsaku en-aut-sei=Nagai en-aut-mei=Isaku kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=4 ORCID= affil-num=1 en-affil= kn-affil=Department of Systems Engineering, Okayama University affil-num=2 en-affil= kn-affil=Department of Systems Engineering, Okayama University affil-num=3 en-affil= kn-affil=Graduate School of Engineering, Okayama University affil-num=4 en-affil= kn-affil=Department of Systems Engineering, Okayama University en-keyword=Video-Rate Range Finder kn-keyword=Video-Rate Range Finder en-keyword=Stereo Color CCD Camera kn-keyword=Stereo Color CCD Camera en-keyword=Autonomous Vehicle kn-keyword=Autonomous Vehicle en-keyword=Detection of Characteristic Point kn-keyword=Detection of Characteristic Point en-keyword=Real-Time Measurement kn-keyword=Real-Time Measurement END
start-ver=1.4 cd-journal=joma no-vol= cd-vols= no-issue= article-no= start-page= end-page= dt-received= dt-revised= dt-accepted= dt-pub-year=2007 dt-pub=20070323 dt-online= en-article= kn-article= en-subject= kn-subject= en-title=Guidance control of a moving body by visual tracking and memorization of travel-surface patterns kn-title=移動面模様の視覚追跡と記憶による移動体の誘導制御 en-subtitle= kn-subtitle= en-abstract= kn-abstract= en-copyright= kn-copyright= en-aut-name=NagaiIsaku en-aut-sei=Nagai en-aut-mei=Isaku kn-aut-name=永井伊作 kn-aut-sei=永井 kn-aut-mei=伊作 aut-affil-num=1 ORCID= affil-num=1 en-affil=Okayama University kn-affil=岡山大学 END