2011 RO-MAN
20th IEEE International Symposium on Robot and
Human Interactive Communication,
July 31 - August 3, 2011, Atlanta, GA, USA
A New Robotic System to Assist Visually Impaired People
Genci Capi, Member, IEEE, and Hideki Toda, Member, IEEE
Abstract— In this paper, we propose a new robotic system to assist visually impaired people in unknown indoor and outdoor environments. The robotic system, which is equipped with a visual sensor, laser range finders and a speaker, gives visually impaired people information about the environment around them. The laser data are analyzed using a clustering technique, making it possible to detect obstacles, steps and stairs. By using the visual sensor, the system is able to distinguish between objects and humans. The PC analyzes the sensor data and sends information to the visually impaired person by natural language or beep signals. The usefulness of the proposed system is examined experimentally.

I. INTRODUCTION

Recently, a major problem for society has been finding ways to help visually impaired people navigate safely in everyday environments. Several approaches have been proposed to address this problem, and they are mainly divided into two groups: devices for (1) outdoor and (2) indoor environments. Most of the devices for outdoor environments are based on the Global Positioning System (GPS) ([1], [2], [3], [4]) for localization and a Laser Range Finder (LRF) for obstacle avoidance. For example, Mori et al. ([1]) proposed a Robotic Travel Aid (RoTA), “Harunobu”, to guide visually impaired people on sidewalks or a university campus using a camera, sonar, a differential GPS system and a portable GIS, all mounted on an actuated wheelchair device. On the other hand, indoor devices use ultrasound or radio frequency identification transponders for localization and an LRF for obstacle identification and avoidance ([5], [6], [7], [8], [9]). For example, in [5] a new system is proposed that gives blind people information about the environment; the person carries two LRFs and a PC on his body.

Although many devices have been proposed and several improvements have been made to public roads and buildings, it is still difficult and dangerous for visually impaired people to navigate in most places. In order to determine the most important information needed by visually impaired people, we interviewed several of them at the Toyama Center of Disabled People. The survey results showed that steps and stairs in particular present a great danger to these people. In addition, human recognition is important in order not to hurt other people. In order to assist visually impaired people, we propose a new robotic system using a camera and laser range finders. The sensors and a small PC are placed in a trolley walker. While the user walks, the camera captures images and the LRFs scan the environment in front of the user in the horizontal and vertical planes. The PC analyzes these environmental data and conveys the extracted information to the person by natural language or beep signals. In addition, the user can find obstacle-free paths using the joystick potentiometer: based on the direction in which the joystick is pushed, the corresponding laser data are processed and the result is sent to the user. By using this system, visually impaired people will be able to obtain information about the environment, find obstacle-free moving directions and detect dangerous places such as stairs and steps.

This paper is organized as follows. First, the developed system is described (Section II). In Section III the software for step, stair and face recognition is presented. The performance of our system in different environment settings is verified experimentally and the results are presented in Section IV, before concluding in Section V.
II. DEVELOPED SYSTEM
The proposed robotic system is shown in Fig. 1. It consists of one PC, two LRFs, one camera, an H8 microprocessor and a joystick potentiometer. The PC and sensors are placed in a trolley walker produced in collaboration with Kanayama Machinery. The laser sensors scan the environment in front of the trolley walker in the horizontal and vertical planes, while the camera captures images of the environment. The PC analyzes the sensor data continuously: it detects objects that are in front of the guide robot and tells the user this information by beep signals or natural language. In our system, we use a bone-conduction headphone, which makes it possible for the user to listen to the PC's output while still hearing the surrounding environment. The intensity of the beep signals increases as the guide robot approaches an obstacle.
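As a rough sketch of how such a warning could be generated (the sector width, distance thresholds and beep timing below are illustrative assumptions, not the values used in our implementation), the horizontal laser beams around the direction in which the joystick is pushed can be searched for the closest return, and that distance mapped to a beep interval:

import math

# Assumed parameters for illustration only.
SECTOR_HALF_WIDTH = math.radians(15.0)   # beams considered around the joystick direction
MIN_RANGE = 0.3                          # [m] closest distance -> fastest beeping
MAX_RANGE = 3.0                          # [m] beyond this distance no warning is given

def min_distance_in_sector(scan, joystick_angle):
    # scan: list of (angle [rad], range [m]) pairs from the horizontal LRF;
    # joystick_angle is assumed to be expressed in the same angular frame.
    ranges = [r for (a, r) in scan
              if abs(a - joystick_angle) <= SECTOR_HALF_WIDTH and r > 0.0]
    return min(ranges) if ranges else float("inf")

def beep_interval(distance):
    # Shorter interval (faster, more intense beeping) as the obstacle gets closer.
    if distance >= MAX_RANGE:
        return None                      # no obstacle within range, stay silent
    d = max(distance, MIN_RANGE)
    return 0.1 + 0.9 * (d - MIN_RANGE) / (MAX_RANGE - MIN_RANGE)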
A. Laser sensor
In our system, we used two LRF sensors made by HOKUYO AUTOMATIC. The URG-04LX-UG01 ([10]) is a small and light LRF sensor. Its high accuracy, high resolution and wide scanning range make it a good solution for 2D scanning. Fig. 2 shows the appearance of the sensor and Table 1 gives its main specifications. Its low power consumption (2.5 W) allows a long working time, and it can be powered over USB.
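The clustering of the laser data is described in Section III; as a minimal illustration of how a single horizontal scan can be turned into object candidates, consecutive beams can be grouped whenever the range difference between neighbouring readings stays below a threshold (the threshold used here is an assumed value, not the one from our system):

import math

# Assumed range-jump threshold for illustration only.
JUMP_THRESHOLD = 0.10   # [m]

def to_points(scan):
    # Convert (angle [rad], range [m]) readings into Cartesian (x, y) points.
    return [(r * math.cos(a), r * math.sin(a)) for (a, r) in scan if r > 0.0]

def cluster_by_range_jump(scan):
    # Group consecutive valid beams; start a new cluster at large range jumps.
    clusters, current, prev_r = [], [], None
    for (a, r) in scan:
        if r <= 0.0:                     # invalid or out-of-range reading
            continue
        if prev_r is not None and abs(r - prev_r) > JUMP_THRESHOLD:
            clusters.append(current)
            current = []
        current.append((a, r))
        prev_r = r
    if current:
        clusters.append(current)
    return clusters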
G. Capi and Hideki Toda are with the Department of Electric and
Electronic Eng., University of Toyama, Toyama, Japan (phone/fax
0081-445-6745; e-mail: ).
978-1-4577-1573-0/11/$26.00 ©2011 IEEE