by
Thanh Hai Nguyen
Submitted to the Faculty of Engineering
in partial fulfilment of the requirements for the degree of
autonomous wheelchair with a stereoscopic camera system
Acknowledgements
I would like to express my sincere gratitude and appreciation to my supervisor, Professor Hung Tan Nguyen, for providing me with the valuable opportunity to complete this research, which was carried out from July 2006 to August 2009 in the Faculty of Engineering, University of Technology, Sydney. I am not only grateful to him for his invaluable encouragement, enthusiastic support and professional guidance; I also highly appreciate his expert knowledge in the area of Mechatronics and Intelligent Systems.
I would also like to thank my co-supervisor, Doctor Steven Su, for his assistance, for sharing his research experience, and for his constructive comments on the research.
I also greatly appreciate Gunasmin and Phyllis’s generous help during the time I have been working at the faculty.
Many thanks to Greg for helping me run the experiments using a head-movement sensor, and to Michelle for helping me edit the thesis.
I also want to thank my best friends and the students in my research group for helping me with the experiments. I wish them all the best and the best results in their studies.
Finally, I would like to express my deep gratitude to my family, who have always supported and encouraged me throughout my life, especially during the period of working at UTS. I hope to dedicate my knowledge and research experience to society.
Abstract

Assistive technologies have been proposed in recent years for installation in mobile wheelchairs, to provide severely disabled people with enhanced access to independent activities in their daily lives. In practice, ultrasound and laser sensors have been developed for navigating around obstacles, but they provide only two-dimensional (2D) grids.
The main contributions of this thesis are in the exploitation of three-dimensional (3D) information from the stereoscopic camera for the estimation of free space and the user’s intention. This is achieved using Bayesian Recursive (BR) algorithms, conditioned on measurements, control data and conditional probabilities, within a semi-autonomous wheelchair control system.
In order to provide 3D information for detecting free spaces and obstacles, a “Bumblebee” stereoscopic camera system has been mounted on a powered wheelchair. The Sum of Absolute Differences (SAD) algorithm is then used to construct an optimal disparity map. In particular, the colour intensity functions of the images are applied to obtain this optimal disparity map; moreover, the mask size and the disparity boundaries are chosen to increase the optimality of the disparity map. Given the optimal disparity map, both a 3D point map and a 2D distance map are produced for controlling the autonomous wheelchair. From these, the height and width of a free space are computed to decide whether the wheelchair can pass through it. Experimental results have shown the effectiveness of the SAD approach using the colour intensity function, and the benefits of computing the 3D point map and the 2D distance map for wheelchair control.
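The block-matching principle behind SAD can be sketched as follows. This is a minimal grayscale version for illustration only, not the thesis’s colour-intensity variant: here the window size `mask` plays the role of the mask size M, and `d_min`/`d_max` are the disparity boundaries.

```python
import numpy as np

def sad_disparity(left, right, mask=11, d_min=0, d_max=64):
    """Dense disparity by block matching with the Sum of Absolute
    Differences (SAD). `left`/`right` are rectified grayscale images
    as 2-D float arrays; `mask` is the matching-window size M."""
    h, w = left.shape
    half = mask // 2
    disparity = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        # start far enough right that every candidate window fits
        for x in range(half + d_max, w - half):
            block = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = d_min, np.inf
            for d in range(d_min, d_max + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(block - cand).sum()  # SAD cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disparity[y, x] = best_d
    return disparity
```

Larger mask sizes smooth the disparity map at the expense of detail, which is why the thesis tunes M together with d_min and d_max.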
The stereoscopic camera system can provide 3D information about free spaces in the environment. However, this free-space information can be uncertain, especially when the free-space height and/or width are close to the safe height and/or diameter of the wheelchair. It is then difficult for the wheelchair to estimate the height and/or width needed to move through the free space. To combat this, the BR algorithm is applied to estimate the free space. In order to make a Bayesian decision for the wheelchair to pass autonomously through a free space, the average optimal probability values are determined. Experimental results for estimating various free spaces show that the proposed BR approach is effective.
The semi-autonomous wheelchair control strategy combines the user’s intention with an autonomous mode. In the autonomous mode, a dynamic free space is estimated using the advanced BR algorithm conditioned on the “obstacle” distance; this free space can change when a moving obstacle is in front of the wheelchair. User intentions are often uncertain due to noise from the head-movement sensor, and it is therefore difficult for the wheelchair to determine them. Hence, the advanced BR algorithm, conditioned on the dynamic free-space estimate, is used to determine the user’s intention. The experimental results detailed in the thesis illustrate the effectiveness of these approaches.
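The resulting Bayesian decision between the two modes can be sketched as a simple rule. This is a simplified sketch with a hypothetical fixed threshold; the thesis instead derives the decision from the average optimal probabilities.

```python
def select_control(p_user, p_auto, u_user, u_auto, threshold=0.5):
    """Decision rule for a semi-autonomous controller (sketch).
    p_user, p_auto : posterior probabilities P(C_t = u_user), P(C_t = u_auto)
    u_user, u_auto : candidate (speed, steering) commands.
    The 0.5 threshold is an illustrative assumption."""
    if p_user >= threshold and p_user >= p_auto:
        return u_user  # follow the user's estimated intention
    return u_auto      # otherwise fall back to the autonomous mode
```

For example, a confident head-movement reading keeps the user in charge, while a noisy one hands control to the autonomous mode.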
Contents
Nomenclature v
List of Figures viii
List of Tables xxi
Abbreviations xxii
Abstract xxiv
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Thesis Contribution 5
1.3 Publication 6
1.4 Thesis Outline 8
Chapter 2 Literature Review 11
2.1 Introduction 11
2.2 Sensor-based Controls 12
2.3 Camera-based Wheelchairs 18
2.4 Stereo Vision Problems 22
2.5 Control problem 37
2.6 Discussion 38
Chapter 3 Obstacle and Freespace Detection using a Stereoscopic Camera System 41
3.1 Introduction 41
3.2 Stereoscopic Vision 44
3.2.1 Camera Model 44
3.2.2 Epipolar Geometry and the Fundamental Matrix 47
3.2.3 Image Pair Rectification and Edge Detection 50
3.2.4 Block Matching 55
3.2.5 Disparity Estimation 59
3.3 Obstacle and Freespace Detection 62
3.3.1 Distance Perception 63
3.3.2 Computation of 3D Point Map 63
3.3.3 Computation of 2D Distance Map 67
3.3.4 Obstacle and Freespace Detection 70
3.4 Comparison of 2D Maps using a ‘Bumblebee’ Stereoscopic Camera and ‘URG’ Laser Systems 80
3.4.1 2D Map using the ‘URG’ Laser System 80
3.4.2 2D Distance Map using the ‘Bumblebee’ Stereoscopic Camera System 81
3.5 Discussion 83
Chapter 4 Bayesian Recursive Algorithm for Freespace Estimation 85
4.1 Introduction 85
4.2 Freespace Estimation Algorithm 86
4.2.1 Bayesian Recursive Algorithm 87
4.2.2 Bayesian Decision 91
4.3 Experiments of Freespace Estimation 93
4.3.1 Experiment 1: Estimation of the width of a freespace 96
4.3.2 Experiment 2: Estimation of the width of two freespaces 103
4.3.3 Experiment 3: Estimation of the height and width of one freespace 111
4.3.4 Experiment 4: Estimation of the height and width of two freespaces 119
4.4 Discussion 133
Chapter 5 Advanced Bayesian Estimation in Semi-autonomous Wheelchair Control 136
5.1 Introduction 136
5.2 Semi-autonomous Wheelchair Control Strategy 138
5.2.1 Representation of User Commands 138
5.2.2 Autonomous Mode 145
5.2.2.1 Dynamic Freespace Estimation 146
5.2.2.2 Computation of Controls 149
5.2.3 Semi-autonomous Wheelchair Control 154
5.2.3.1 Bayesian Estimation for Intention of the User 155
5.2.3.2 Bayesian Decision For Control 159
5.3 Experiments of Bayesian Estimation For Semi-autonomous Wheelchair Control 160
5.3.1 Experiment 1: Bayesian estimation for a freespace with a moving obstacle 161
5.3.2 Experiment 2: Autonomous mode for passing through a freespace 176
5.3.3 Experiment 3: User intention for passing through a freespace 183
5.3.3.1 Intention Estimation 183
5.3.3.2 Bayesian Decision for Control 187
5.4 Discussion 189
Chapter 6 Conclusion and Future Work 192
6.1 Conclusion 192
6.2 Future Work 196
Appendix A Wheelchair Hardware Description 197
A.1 Overview of Power Wheelchairs 197
A.2 Description of the “Bumblebee” Stereoscopic Camera System 199
A.3 Description of the “URG” Laser System 201
A.4 Description of the Head-movement Sensor 204
A.5 Description of the National Instruments USB-6008 Multifunction Data Acquisition Device 209
Appendix B Implementations of the Semi-autonomous Wheelchair Control System with a Stereoscopic Camera System 212
B.1 Freespace and Obstacle Detection using the Stereoscopic Camera System in C++ 212
B.2 BR Algorithms for Freespace and User Intention Estimation in Semi-autonomous Wheelchair Control in LabVIEW 227
Appendix C Publications Related to the Thesis 232
Bibliography
Nomenclature

(i, j) - Pixel coordinates
(Q,Q ’ ) - Two centres of two cameras
A - Intrinsic parameter of stereo cameras
B - Baseline of the two camera centres (in metres)
C - User intention state in the BR algorithm
C L - Camera left centre
C R - Camera right centre
C t-1 - Previous state of User intention
D - Extrinsic parameter of stereo cameras
SAD color - Disparity function
d f - Width of the freespace
d max - Maximum disparities
d min - Minimum disparities
d s - Safe diameter
e L - Left epipole
e R - Right epipole
f - Focal length
Trang 10F - Fundamental matrix
F T - Fundamental matrix of two cameras (Q,Q ’) in the opposite order
h - Height of the wheelchair
h1 - Height of the first freespace
h c - Camera position on the wheelchair
h l - Height of the freespace
h obs - “obstacle” distance value from obstacles to the wheelchair
I L - Left image plane
I R - Right image plane
k s - Safe distance from obstacles to the wheelchair
l R - Right epipolar line
OH - Maximum distance on the Z-axis
P(C 0 ) - Equal prior probabilities of user intention
P(C t =u auto ) - Probability of the autonomous mode
P(C t =u user ) - Probability of the user intention
P(W d0 ) - Equal prior probability of dynamic freespace
P(x(t-1)) - Previous probability over state (the height or width)
P av (x(t)) - Average value of the optimal probabilities
p fs (W d ) - Dynamic freespace probability
p fs1:t-1 - Past dynamic freespace estimation
p L - Left image point
P po (C t ) - Optimal probability of user intention
P pr (C t ) - Posterior probability of user intention
P pr (W dt ) - Posterior probability of dynamic freespace
P pr (x t ) - Posterior probability of freespace
p R - Right image point
R - Rotation matrix
T - Translation vector
u 1:t - Past controls of freespace
u auto (v a ,ωa ) - Autonomous mode
u d1:t - All past controls in estimating dynamic freespaces
Trang 11u t - Control data in estimating freespaces
u user (v u ,ωu ) - Control using the user intention
V max - Maximum voltage
v u - Forward and backward speed
w - Width of freespace
W d - Width state of dynamic freespace
W dt-1 - Previous state of dynamic freespace
Z - Distance along the camera corresponding to Z-axis
z 1:t - Past measurements in estimating freespace
z c,t - Head measurements
z d1:t - Past measurement in estimating dynamic freespace
Z imax - Maximum distance values
Z imin - Minimum distance values
α1 - Maximum angle to the left of the wheelchair
α2 - Maximum angle to the right
List of Figures
2.1a 2D histogram grid map 12
2.1b 1D polar histogram map 12
2.2a Obstacle avoidance only using ultrasonic sensors 13
2.2b Obstacle avoidance only equipped with IR sensors 13
2.3a Results in the cone of detection for hard objects “seeing” objects that would not actually be in the wheelchair’s path 14
2.3b Multiple sensor system with 0.10 m required on either side of the wheelchair to not experience a false stop 14
2.4 Experiment of a door-sensing camera by a person using Bayesian filters 15
2.5a Autonomous mobile robot equipped with the 3D laser range finder 15
2.5b 3D point cloud image of obstacles using the 3D laser range finder 15
2.6a Uniform sampling the whole environment mapped with a constant density of search nodes 16
2.6b Non uniform sampling leaving open spaces with a lower number of search nodes, increasing density around tight places 16
2.7 360° real-time map explored by a laser range finder sensor 17
2.8 Computation of face image, (a) original image; (b) bright region; (c) face region; (d) face feature 18
2.9a Overview of the intelligent wheelchair equipped with the USB webcam 20
2.9b Mouth cavity detection, (a)-(b) original images; (c) mouth closed; (d) mouth open 20
2.10a Overview of the system diagram 20
2.10b Indoor and outdoor demonstration 20
2.11 Block diagram of the mobile wheelchair using sensor fusion 21
2.12a Traversed path for the mobile robot from A point to B point 22
2.12b Graphic representation of the traced path as shown in Figure 2.12a 22
2.13a Original document image 23
2.13b Distortion corrected image 23
2.14a Epipolar geometry 24
2.14b Rectified epipolar geometry 24
2.15a (a) depth map and (b) novel view rendered 26
2.15b (c) recovered discontinuity map and (d) white regions A, B, C and D are incorrect segments 26
2.16 3D position of (x,y) with depth z and its projection 27
2.17 (a) Key image; (b) True depth map; (c) Reference depth map; (d) Estimated depth map using DBPM; (e) Estimated depth map using CPM; (f) Estimated depth map using HANNA; (g) Error depth map for DBPM; (h) Error depth map for CPM; (i) Error depth map for HANNA; (j) Regions (in white) where C, 0:1 for (d); (k and l) Rendered scene from novel viewpoints using depth map estimated using DBPM 28
2.18a Humanoid robot platform equipped with a stereo vision system 29
2.18b Populated environment constructed with many obstacles and freespaces 29
2.18c Stereo depth points drawn with the gray scale intensity of the corresponding image pixel 29
2.19 Block diagram of disparity map using stereoscopic cameras 30
2.20a Natural stereoscopic left image SAXO 31
2.20b Natural stereoscopic right image SAXO 31
2.20c Dense disparity fields of the image SAXO obtained after quantization and filtering 31
2.20d Edges of the image SAXO converted from the disparity image 31
2.21a Grayscale box image 32
2.21b Disparity image 32
2.21c The maximum disparity in each column of the disparity image 33
2.21d Map of depth versus column graph 33
2.22a Robot equipped with a stereo camera system 34
2.22b An indoor scene and the green disparity map 34
2.23a A pair of the stereoscopic left and right images 34
2.23b (a) kanade variable window; (b) maas variable window 35
2.23c (c) implicit support; (d) proposed method 35
2.24 (a)-(b) “Head and lamp” left and right images; (c) Disparity map; (d) Occlusions 35
2.25a User participating in trials 37
2.25b Strategy map 37
3.1 Power wheelchair with the stereoscopic camera system and the URG laser 43
3.2 Hardware requirements for a mobile wheelchair to detect obstacle and freespaces 43
3.3 Pinhole image model with the image plane at the rear of the camera centre C 44
3.4 Pinhole image model with the image plane in front of the camera centre C 45
3.5 The “Bumblebee” stereoscopic camera system with “two eyes” 47
3.6 Dimensional configuration of the “Bumblebee” stereoscopic camera system 48
3.7 Epipolar geometry of a pair of the left and right images I L and I R 48
3.8 Block diagram of processing distorted images to produce a depth image 50
3.9 Rectified image planes I’ L and I’ R with p’ L and p’ R on the same horizontal 51
3.10a Distorted left image 52
3.10b Distorted right image 52
3.11a Rectified left image with white and black colour 53
3.11b Rectified right image with white and black colour 53
3.12a Colour rectified left image 53
3.12b Colour rectified right image 53
3.13a Edge left image 54
3.13b Edge right image 54
3.14 Correlation of two 3x3 windows along corresponding epipolar lines to search for the best matching region 55
3.15a Left image with two heights of two freespaces from bars to the ground 56
3.15b Right image with two heights of two freespaces from bars to the ground 56
3.16 Stereo disparity map based on the left and right images as shown in figure 3.15, in which the obstacle point A represented as light blue closer than the obstacle point B to the camera position 58
3.17a Left image with obstacles such as television, box, table and chair 59
3.17b Right image with obstacles such as television, box, table and chair 59
3.18 Stereo disparity map, in which obstacles are represented in different blue colours depending on their positions relative to the camera 60
3.19a M=3, d max =64, d min=0 61
3.19b M=5, d max =64, d min=0 61
3.19c M=9, d max =64, d min=0 61
3.19d M=11, d max =64, d min=0 61
3.19e M=13, dmax=64, dmin=0 61
3.19f M=21, d max =64, d min=0 61
3.20a M=11, d max =21, d min=0 62
3.20b M=11, d max =34, d min=0 62
3.20c M=11, d max =64, d min=0 62
3.20d M=11, d max =72, d min=0 62
3.20e M=11, d max =132, d min=0 62
3.20f M=11, d max =194, d min=0 62
3.21 Relationship between the disparity d and distance Z 63
3.22 Image and world coordinate systems related to determination of distances Z 64
3.23 The height h of the wheelchair computed to pass through freespaces based on the 3D point map 65
3.24a Viewed from the front side, 3D point map computed to convert to 2D distance map, in which the first freespace has the height h1 and width w1, and the second freespace has the height h2 and width w2 66
3.24b 3D point map viewed from the right side 66
3.25 Power wheelchair with the diameter d compared to the width w m of the first freespace 68
3.26 2D distance map showing only the first freespace with the width w1, since the height h1 of the first freespace is greater than the height of the wheelchair; the second freespace is considered an obstacle because its height is less than that of the wheelchair 69
3.27a Left image with many obstacles as bars and o 1 -o 5 70
3.27b Right image with many obstacles as bars and o 1 -o 5 70
3.28 Disparity map with many obstacles at different positions corresponding to different colors 70
3.29a 3D point map viewed from the front side including five obstacles and one freespace 70
3.29b 3D point map viewed from the right side 71
3.29c 3D point map viewed from the left side 71
3.30 2D distance map including five obstacles and one freespace 71
3.31 Wheelchair with the height h 72
3.32a Left image with one freespace 72
3.32b Right image with one freespace 72
3.33 Disparity map 73
3.34 3D point map 73
3.35 2D distance map with one freespace 73
3.36a Left image with one freespace 73
3.36b Right image with one freespace 73
3.37 Disparity map 74
3.38 3D point map 74
3.39 2D distance map with one freespace as the obstacle 74
3.40 Wheelchair at the left position of the freespace 75
3.41a Left image with one freespace and one bar 75
3.41b Right image with one freespace and one bar 75
3.42 Disparity map 76
3.43 3D point map 76
3.44 2D distance map with one freespace w and two obstacles o 1 and o 2 76
3.45 Wheelchair at the central position of the freespace 76
3.46a Left image with one freespace and one bar 77
3.46b Right image with one freespace and one bar 77
3.47 Disparity map with one dark blue freespace 77
3.48 3D point map with one freespace 77
3.49 2D distance map with one freespace w and two obstacles o 1 and o 2 77
3.50 Wheelchair at the right position of the freespace 78
3.51a Left image with one freespace and one bar 78
3.51b Right image with one freespace and one bar 78
3.52 Disparity map with a dark blue freespace 79
3.53 3D point map with a freespace 79
3.54 2D distance map with one freespace w and two obstacles o 1 and o 2 79
3.55a The ‘URG’ laser system 80
3.55b 2D map using the ‘URG’ laser, in which 2D map_1 (black) is the measured real-time map and 2D map_2 (blue) is the identified usable map; the laser sensor cannot measure the height of a freespace, so it shows two freespaces w1 and w2 81
3.56a Left image with the height h and width w1 of the first freespace, and the width w2 of the second freespace without the detected height to an obstacle 82
3.56b Right image with the height h and width w1 of the first freespace, the width w2 of the second freespace without the detected height to an obstacle 82
3.57 Stereo disparity map, in which obstacles close to the camera position are shown in light blue 82
3.58 3D point map with two freespaces w1 and w2 and three obstacles o1, o2 and o3 82
3.59 2D distance map using stereoscopic cameras, in which the second freespace has the width w2 The width w1 of the first freespace is considered as an obstacle, this being due to the height being less than that of the wheelchair 82
4.1 Wheelchair environment interaction 86
4.2 Height and diameter of the wheelchair 87
4.3 Left image with one freespace 93
4.4 Right image with one freespace 96
4.5 Stereo disparity map, in which the width of the freespace is dark blue 97
4.6a Left image with two freespaces 97
4.6b Right image with two freespaces 97
4.6c Stereo disparity map has two freespaces of dark blue color 97
4.6d 3D point map has a freespace with the width w and two obstacles o1 and o2 97
4.6e 2D distance map shows the width w of the freespace and two obstacles o1 and o2 98
4.7 Block diagram of estimating the width for two freespaces with optimal probability 103
4.8a Left image with two freespaces 103
4.8b Right image with two freespaces 103
4.8c Stereo disparity map has two freespaces of dark blue color 104
4.8d 3D point cloud map with two heights h ’ , h ” and two widths w ’ , w ” 104
4.8e 2D distance map shows two freespace with two widths w ’ , w ” 104
4.9 Block diagram of estimating a freespace having a height and a width 111
4.10a Left image with a freespace 111
4.10b Right image with a freespace 111
4.10c Stereo disparity map has a freespace of dark blue colour 111
4.10d 3D point map has a freespace with a height h, a width w, a “bar” obstacle and two obstacles o1 and o2 111
4.10e 2D distance map shows a freespace with a width w and two obstacles o1 and o2 111
4.11 Block diagram showing estimation of two freespaces 119
4.12a Left image 119
4.12b Right image 119
4.12c Stereo disparity map has two dark-coloured freespaces 120
4.12d 3D point map with two heights h’, h”, two widths w’, w”, two “bar” obstacles and three obstacles o1, o2, o3 120
4.12e 2D distance map shows two freespaces with two widths w’, w” and three obstacles o1, o2, o3 120
5.1 User using a head command to turn left 137
5.2 Block diagram showing user commands 139
5.3 A head-movement sensor mounted on the cap, providing head commands for wheelchair control 139
5.4a Transmitter of a head sensor system 140
5.4b Receiver of a head sensor system 140
5.5a A sensing axis for the accelerometer with X, Y and Z-axis 141
5.5b The gravity component of a tilted X-axis accelerometer 141
5.6a Typical nonlinear output of left turning with a range of 2.5÷4.2V corresponding to 0÷90 degrees 141
5.6b Typical nonlinear output of right turning with a range of 2.5÷1.1V corresponding to 0 ÷ −90 degrees 142
5.7a A range of degrees for user intentions using a head sensor 143
5.7b A range of degrees for controls using a head sensor to control forward and backward speed 143
5.8a Output of left turning with a range of 2.5 ÷ 3.6V corresponding to 10 ÷ 40 degrees 143
5.8b Plot of right turning with a range of 2.5 ÷ 1.6V corresponding to −10 ÷ −40 degrees 144
5.9a Output of forward speed with a range of 2.5 ÷ 3.4V corresponding to 10 ÷ 40 degrees 144
5.9b Plot of backward speed with a range of 2.5 ÷ 1.4V corresponding to −10 ÷ −40 degrees 144
5.10 Block diagram of Bayesian estimation for dynamic freespaces in an autonomous mode in populated environments 145
5.11 A wheelchair in a varied environment, in which w d1 and w d2 are two freespaces, o1, o3 are two static obstacles and o2 is a moving obstacle 146
5.12a Left centre of the freespace A 1 A 3; angles α1, α2 of the triangle A 1 OB; the steering angle θ from the centre of the freespace to H; A 3 A 4 is the obstacle 150
5.12b Right centre of the freespace A2A3; θ is the steering angle, and A 1 A 2 and A 3 A 4 are the detected obstacles 150
5.12c A binary histogram determined from the 2D distance map of Figure 5.11a 150
5.13a Output of the steering velocity with ωa (V) vs d c(m) 153
5.13b Output of the speed velocity with v a (V) vs d f(m) 154
5.14 Flow chart of a semi-autonomous wheelchair control system, in which intention estimation is applied to estimate the state, including u user and u auto 155
5.15 Head commands using a head sensor are ranges of degrees: (a1÷a2) and (b1÷b2) for left and right turning; (c1÷c2) and (d1÷d2) for controlling forward and backward motion 156
5.16 Block diagram of a semi-autonomous wheelchair control system including a populated environment and user commands 160
5.17 Moving wheelchair passing through a dynamic freespace in front of the wheelchair in a populated environment 161
5.18a Left image with a freespace 164
5.18b Right image with a freespace 164
5.18c Stereo disparity map 164
5.18d 3D point map with a freespace 164
5.18e 2D distance map having a freespace w d=1.53m and two static obstacles o1, o2 165
5.19a Left image with one moving obstacle 166
5.19b Right image with one moving obstacle 166
5.19c Stereo disparity map, in which obstacles are light blue 167
5.19d 3D point map with one freespace w d , three obstacles o1, o2 and o3 167
5.19e 2D distance map shows a dynamic freespace w d=1.11m, two static obstacles o1, o2 and a moving obstacle o3 167
5.20a Left image with one moving obstacle 169
5.20b Right image with one moving obstacle 169
5.20c Stereo disparity map, in which detected obstacles are light blue 169
5.20d 3D point map with one freespace w d and three obstacles o1, o2 and o3 169
5.20e 2D distance map shows a dynamic freespace w d=0.6m, two static obstacles o1, o2 and a moving obstacle o3 169
5.21a Left image with a moving obstacle 171
5.21b Right image with a moving obstacle 171
5.21c Stereo disparity map, in which detected obstacles are light blue and the moving obstacle, closer to the wheelchair, is yellow 171
5.21d 3D point map with one freespace w d, two static obstacles o1, o2 and one moving obstacle o3 171
5.21e 2D distance map shows a dynamic freespace w d=0.73m, two static obstacles o1, o2 and a moving obstacle o3 171
5.22a 2D distance map without a moving obstacle and the width w d=1.53m 173
5.22b 2D distance map without a moving obstacle and the width w d=1.53m 173
5.22c 2D distance map without a moving obstacle o3 and the width w d=1.53m 173
5.22d 2D distance map with one moving obstacle o3 and the width w d=1.54m 173
5.22e 2D distance map with one moving obstacle o3 and the width w d=1.34m 173
5.22f 2D distance map with one moving obstacle o3 and the width w d=1.11m 173
5.22g 2D distance map with one moving obstacle o3 and the width w d=0.78m 173
5.22h 2D distance map with one moving obstacle o3 and the width w d=0.6m 173
5.22i 2D distance map with one moving obstacle o3 and the width w d=0.65m 174
5.22j 2D distance map with one moving obstacle o3 and the width w d=0.73m 174
5.22k 2D distance map with one moving obstacle o3 and the width w d=1.25m 174
5.22l 2D distance map with one moving obstacle o3 and the width w d=1.27m 174
5.22m 2D distance map without a moving obstacle o3 and the width w d=1.52m 174
5.22n 2D distance map without a moving obstacle o3 and the width w d=1.53m 174
5.23a Left image with two freespaces 178
5.23b Right image with two freespaces 178
5.23c Disparity map with two freespaces 178
5.23d 3D point map has three obstacles o1, o2 and o3, and two freespaces w d1 and w d2 178
5.23e 2D distance map with three obstacles o1, o2 and o3, and two freespaces w d1 and w d2 178
5.24 The wheelchair using the autonomous mode to move from the central position to the centre of the first freespace w d1 182
5.25a Left image with two freespaces 183
5.25b Right image with two freespaces 183
5.25c Disparity map with two freespaces 184
5.25d 3D point map has two freespaces w d1 and w d2 184
5.25e 2D distance map with three obstacles o1, o2 and o3, and two freespaces w d1 and w d2 184
5.26a The user using a head command to move through the second freespace w2.188
5.26b The mobile wheelchair using the user intention to move from the start position to the centre of the second freespace w d2 189
List of Tables
2.1 Matching approaches to determine disparity, in which (i, j) are the pixel coordinates in windows w of two images I R , I L 36
4.1 The estimated width W of a freespace 101
4.2 Estimated widths W’ and W” of two freespaces 110
4.3 The estimated height H and width W of one freespace 118
4.4 The estimated height h’ and width w’ of the first freespace 126
4.5 The estimated height h” and width w” of the second freespace 132
5.1 The estimated states of the dynamic freespace w d impacted by a moving obstacle 175
5.2 The estimated states of the user’s intentions 181
5.3 The estimated states of the user’s intentions 187
Abbreviations
1D - One-dimensional
2D - Two-dimensional
3D - Three-dimensional
aVLSI - Analog Very Large Scale Integration
BCD - Binary Corner Detection
BR - Bayesian Recursive
CCD - Charge Coupled Device
DEM - Digital Elevation Map
EKF - Extended Kalman Filter
FPGA - Field Programmable Gate Array
FSAD - Fast Sum of Absolute Differences
GPS - Global Positioning System
INS - Inertial Navigation System
ISR - Intelligent Security Robot
LM - Local Map
LRF - Laser Range Finder
MBS - Multiple Baseline Stereo
NCC - Normalised Cross Correlation
NSSD - Normalised Sum of Square Differences
PCB - Printed Circuit Board
PI - Proportional Integral
PID - Proportional Integral Derivative
PPMs - Perspective Projection Matrices
RF - Radio Frequency
RFID - Radio Frequency Identification
RGB - Red Green Blue
RMBS - Robust Multiple Baseline Stereo
SAD - Sum of Absolute Differences
SCI - Spinal Cord Injury
SLAM - Simultaneous Localization and Mapping
SMAD - Smooth Median Absolute Deviation
SMC - Sliding Mode Control
SSD - Sum of Squared Differences
SSSD - Sum of Sum of Squared Differences
TNIP - Total Number of Interest Points
USB - Universal Serial Bus
VFH - Vector Field Histogram
VPH - Vector Polar Histogram
ZNCC - Zero-mean Normalized Cross Correlation
ZSAD - Zero-mean Sum of Absolute Differences
ZSSD - Zero-mean Sum of Square Differences
Chapter 1 Introduction

A recent report by Cripps estimated around 300 to 400 new cases of spinal cord injury (SCI), either traumatic or non-traumatic, each year in Australia (Cripps 2006). According to this report, in 2004-2005, 39 percent of disability cases arose from road transport, nine percent from water-related activities, 29 percent from falls, and a further 10 percent from being hit or struck by an object.
According to a report published by the US Bureau and the US Department of in 2002, there were 51.2 million people (or 18.1 percent of the population) with a disability, with some 32.5 million people (or 11.5 percent of the population) having severe cases. The US report also provided age-specific data on disabled people, finding that 2.7 million
Trang 29Arthritis, SCI, balance disorders, and other conditions are generally related to traumas caused by car accidents, falls, horse-related and wheelchair-related accidents (Cooper, Boninger et al 2006) In order to reduce wheelchair related accidents, as well as assist severely disabled people with everyday activities, wheelchair research and development has focused more on so called ‘smart’ wheelchairs, these being equipped with ‘high-technical’ apparatus and advanced options In particular, smart wheelchairs utilise laser, camera, voice and ultrasound sensors for navigating in around obstacles, and avoiding collisions (Simpson 2005) However, the options available at present mostly assist with problems using 2D information and simple tasks, the computation of two-dimension (2D) grid maps still being expensive
A single camera furnishes a great deal of information about obstacles, including colours, edges, corners and shapes. These serve to distinguish many different obstacles for mobile robots (Desouza and Kak 2002; Saeedi, Lawrence et al 2006). The facial movements of the wheelchair user can also be recognised using a Charge-Coupled Device (CCD) camera (Adachi, Goto et al 2003). A time-correlation strategy has been proposed for distinguishing between images taken at different times using a single camera; this style of camera has been used to detect freespace in front of vehicles driving on highways (Cerri and Grisleri 2005). However, the single-camera approach has its limitations, including difficulty in computing distances to the wheelchair. This affects its capacity to compute wheelchair controls in populated environments.
Stereo cameras mounted on mobile vehicles provide depth information derived from the disparity between stereo images. For the purposes of determining freespace as well as the positions of obstacles, this information provides a basis for building 2D maps from three-dimensional (3D) point maps (Murray and Little 2000; Thompson and Kagami 2005). In this application, a mobile robot utilises the camera system to navigate and autonomously explore the environment in its direction of movement. The key advantage of a stereo camera system is that it not only determines the features of the environment, but also computes their distances to the wheelchair. The implementation of freespace navigation, which detects obstacles together with the heights and widths of freespaces, still needs to be developed for power wheelchairs. This would help mobile wheelchairs avoid possible collisions with small obstacles or bars higher than the wheelchair, and would also heighten the safety of people with severe disabilities.
A recent application to freespace detection involves a mobile robot equipped with sonar sensors for detecting open freespace. In this approach, a Bayesian update is applied to compute the value of 'Freespace Probability' at each point-mark, according to a given direction (Ip, Rad et al. 2004). This, however, presents a complex strategy for obstacle and freespace detection, including collision avoidance, because of the number of algorithms used to achieve the objective: a HFFS reactive navigation scheme, a segment-based map, a probability-based open-space evaluation system, and a Bayesian update rule. These can be used for mobile wheelchairs; however, the Bayesian theory should ultimately be applied to sensor measurements from power wheelchairs in various environments, in order to estimate the required information.
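The 'Freespace Probability' update above can be illustrated with a minimal sketch of a per-cell Bayesian fusion of sonar readings. The sensor-model probabilities and readings below are hypothetical placeholders, not the values used by (Ip, Rad et al. 2004):

```python
def bayes_update(prior, p_hit_given_occupied, p_hit_given_free, hit):
    """One Bayesian update of a cell's occupancy probability from a sonar reading.

    `prior` is P(occupied); `hit` is True when the sonar returns an echo."""
    if hit:
        l_occ, l_free = p_hit_given_occupied, p_hit_given_free
    else:
        l_occ, l_free = 1 - p_hit_given_occupied, 1 - p_hit_given_free
    evidence = l_occ * prior + l_free * (1 - prior)
    return l_occ * prior / evidence  # Bayes' rule: posterior P(occupied)

# Start undecided, then fuse four readings; "freespace probability" is 1 - P(occupied).
p_occ = 0.5
for hit in (False, False, True, False):
    p_occ = bayes_update(p_occ, p_hit_given_occupied=0.8, p_hit_given_free=0.1, hit=hit)
p_free = 1.0 - p_occ
```

Each reading shifts the belief recursively, so mostly echo-free readings drive the cell's freespace probability well above the undecided prior of 0.5.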
In addition to the design of a 'smart' wheelchair, shared-control wheelchairs, which combine the user's intentions with autonomous operation, can make users more confident in their use and also add comfort. Using Bayesian rules to enable the system to adapt to the user, a shared-control algorithm has been proposed for estimating the user's intention (Demeester, Nuttin et al. 2003; Huntemann, Demeester et al. 2007). The proposed algorithm can also compute control velocities, depending on
system is to encourage people with severe disabilities to use a body part to control the wheelchair's operation. Meanwhile, the autonomous wheelchair system, equipped with sensors, assists the wheelchair in avoiding collisions and enhances user safety. Therefore, an autonomous system using sensors should be developed to guarantee the safety of users.
Mobile vehicles often use ultrasound or laser sensors for obtaining information about the environment, in order to enhance vehicle control in passageways and other collision-avoidance situations. Many algorithms for mobile vehicles are based on the density of obstacles as well as a cost function (Ulrich and Borenstein 2000). However, due to the cost of processing the 2D grid maps, these algorithms involve expensive computations. Given the significant advantage of 3D information from stereo camera systems, a trajectory-tracking algorithm has been proposed (Fierro and Lewis 1995; Chang and Chen 1996). This algorithm plans a trajectory based on a freespace position relative to the centre of the wheelchair.
In recent years, ultrasound and laser sensors have been used in mobile vehicles for detecting the environment, obstacles and freespaces (Borenstein and Koren 1991; Ulrich and Borenstein 1998; Murakami, Kuno et al. 2000; Xiang, Xu et al. 2003; An and Wang 2004; Parikh, Jr et al. 2005; Taha, Miró et al. 2006). In general, these sensors produce a map represented by a 2D grid. Their limitations are, firstly, that laser or ultrasound sensors ignore information above and below the scanning plane, and secondly, that they only provide 2D information about that plane. A 3D laser sensor, on the other hand, can provide 3D information, but the mechanical structure required for the third-dimension scan is both complex and expensive. This is a significant limitation for building mobile robots and autonomous wheelchairs.
A common difficulty in using laser and ultrasound sensors to control power wheelchairs and mobile robots is that the information they provide about the environment is not sufficient for mobile vehicles to recognise obstacles and freespaces. In order to solve this problem, a power wheelchair is equipped with a stereoscopic camera system, which collects 3D information from various environments
… disabilities (Nguyen, Nguyen et al. 2009).
This means that the power wheelchair autonomously detects the heights and widths of freespaces and obstacles in order to provide autonomous control and collision avoidance. Moreover, in the semi-autonomous wheelchair control strategy, the user's intentions, such as head-movement commands, are developed. Therefore, a combination of the head-movement sensor and the stereoscopic camera system is proposed, in order to create a semi-autonomous wheelchair control. This is in itself an innovative, novel solution for disabled people, whereby the user, through body sensors, can command the system to reach a desired target, thereby overriding the shared control system.
1.2 Thesis Contribution
The thesis addresses the problem of obstacle and freespace detection for a mobile wheelchair using a stereoscopic camera system in various environments. The thesis also proposes a semi-autonomous wheelchair control strategy, comprising an autonomous mode and the user's intention. In addition, a Bayesian Recursive (BR) algorithm is applied to estimate freespace and user intention, enabling decision-making control of the wheelchair's operation. The substantial contributions of this thesis are summarised as follows:
• A power wheelchair is equipped with a stereoscopic camera system providing left and right images. A Sum of Absolute Differences (SAD) correlation algorithm is applied to compute a stereo disparity map from the left and right images. Given this disparity map, a 3D point map is computed using a geometric projection. Moreover, a 2D distance map, converted from the 3D point map, presents obstacle and freespace information. A comparison of 2D maps obtained using the 'Bumblebee' stereoscopic camera system and the 'URG' laser sensor is presented, in order to show the advantage of the stereoscopic camera.
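As a rough illustration of the SAD correlation step, the sketch below block-matches a rectified left/right pair along the scanline. The window size, disparity range and synthetic images are illustrative only, not the parameters of the Bumblebee system:

```python
import numpy as np

def sad_disparity(left, right, window=3, max_disp=8):
    """Disparity map by SAD block matching on a rectified pair (left image as reference)."""
    h, w = left.shape
    half = window // 2
    disp = np.zeros((h, w), dtype=np.int32)
    left = left.astype(np.int32)
    right = right.astype(np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            # SAD cost of shifting the right-image window by each candidate disparity
            costs = [np.abs(ref - right[y - half:y + half + 1,
                                        x - d - half:x - d + half + 1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))  # keep the shift with the lowest cost
    return disp

# Depth then follows from the geometric projection Z = f * B / d
# (focal length f, stereo baseline B, disparity d).
```

With a textured pair in which the right image is the left shifted by a known amount, the recovered disparities in the interior match that shift.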
• In order to estimate freespace, a Bayesian Recursive (BR) algorithm is employed for the autonomous wheelchair. The BR algorithm uses measured information, control data and conditional probabilities for the height and width of freespaces to produce posterior probabilities. The average value of the posterior probabilities is determined and compared to a threshold probability, in order to make the Bayesian decision for the mobile wheelchair to pass through a freespace.
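A generic discrete predict–correct recursion of this kind can be sketched as follows. The three-state width model, transition matrix, likelihoods and decision threshold are hypothetical placeholders, not the thesis's actual conditional probabilities:

```python
import numpy as np

def br_step(belief, transition, likelihood):
    """One recursion: predict with the control/motion model, correct with the measurement."""
    predicted = transition @ belief      # prior after applying the control data
    posterior = likelihood * predicted   # weight by the measurement likelihood
    return posterior / posterior.sum()   # normalise to a probability distribution

# Hypothetical freespace-width states: 0 = too narrow, 1 = fits, 2 = wide.
belief = np.full(3, 1.0 / 3.0)
transition = np.array([[0.8, 0.1, 0.1],
                       [0.1, 0.8, 0.1],
                       [0.1, 0.1, 0.8]])
likelihood = np.array([0.1, 0.7, 0.2])   # repeated measurements favour "fits"
for _ in range(5):
    belief = br_step(belief, transition, likelihood)

# Bayesian decision: pass through only if the posterior clears a threshold.
passable = belief[1] + belief[2] > 0.9
```

After a few recursions the posterior concentrates on the state the measurements support, and the threshold comparison yields the pass/no-pass decision.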
• A semi-autonomous wheelchair control strategy that combines the user's intention with the autonomous mode is presented. In particular, steering and speed controls based on a 2D distance map are computed for the autonomous mode. Dynamic freespaces can change due to obstacles moving in front of the wheelchair, making it difficult for the wheelchair to pass through autonomously. For this reason, an advanced BR algorithm is utilised to estimate dynamic freespace. Furthermore, user commands are determined using a head-movement sensor, and the user can use one of these commands to control the wheelchair to reach a desired target. However, user commands can be uncertain due to noise from the head sensor; the advanced BR algorithm therefore estimates user intention, creating a safe and comfortable wheelchair for the user.
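One simple way to realise such a combination is to weight the two command sources by the estimated confidence in the user's intention. This linear blend is only an illustrative stand-in for the BR-based mode decision, with invented commands and weights:

```python
def shared_control(user_cmd, auto_cmd, user_confidence):
    """Blend user and autonomous (speed, steering) commands by intention confidence.

    user_confidence near 1.0 lets the user override; near 0.0 defers to autonomy."""
    w = min(1.0, max(0.0, user_confidence))  # clamp the weight to [0, 1]
    return tuple(w * u + (1.0 - w) * a for u, a in zip(user_cmd, auto_cmd))

# e.g. the user wants to slow down and turn while autonomy steers straight ahead:
cmd = shared_control(user_cmd=(0.5, 0.8), auto_cmd=(0.7, 0.0), user_confidence=0.75)
```

A confident head-movement command dominates the blended output, while a noisy, low-confidence command lets the autonomous mode take over.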
1.3 Publication
Three international conference papers have been presented from this thesis, and are listed as follows:
• Jordan Son Nguyen, Thanh Hai Nguyen and Hung Tan Nguyen (2009), "Semi-autonomous Wheelchair System Using Stereoscopic Cameras," Proceedings of the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Hilton Minneapolis, Minnesota, USA, September 2-6, 2009. Accepted.
• Thanh Hai Nguyen, Jordan Son Nguyen and Hung Tan Nguyen (2008), "Bayesian Recursive Algorithm for Width Estimation of Freespace for a Power Wheelchair Using Stereoscopic Cameras," Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, British Columbia, Canada, August 20-24, 2008, pp. 4234-4237 [ISBN 978-1-4244-1815-2] (CD-ROM).
• Thanh Hai Nguyen, Jordan Son Nguyen, Duc Minh Pham and Hung Tan Nguyen (2007), "Real-Time Obstacle Detection for an Autonomous Wheelchair Using Stereoscopic Cameras," Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, August 23-26, 2007, pp. 4775-4778 [ISBN 1-4244-0788-5] (CD-ROM).
The following IEEE journal article, based on this thesis, has been submitted:
• Thanh Hai Nguyen, Jordan Son Nguyen, Steven Su and Hung Tan Nguyen, "A Bayesian Recursive Algorithm for Freespace Estimation in an Autonomous Wheelchair Using a Stereoscopic Camera System," submitted to IEEE Transactions on Neural Systems and Rehabilitation Engineering on 18 May 2009.
I have also contributed to one conference paper and two journal articles on the control problem relating to Magnetorheological (MR) dampers, using a sliding-mode control approach. These publications are as follows:
• Thanh Hai Nguyen, Minh Tam Nguyen, Ngai M. Kwok, Quang Phuc Ha and Bijan Samali (2007), "Magnetorheological Damper Semiactive Control for Civil Structures with Symmetric Quantised Sliding Mode Controllers," Journal of the Japan Society of Applied Electromagnetics and Mechanics, vol. 15, pp. 184-187.
• Ngai M. Kwok, Quang Phuc Ha, Thanh Hai Nguyen, Jianchun Li and Bijan Samali (2006), "A novel hysteretic model for magnetorheological fluid dampers and parameter identification using particle swarm optimization," Sensors and Actuators (Elsevier), pp. 1-11.
• Thanh Hai Nguyen, Ngai M. Kwok, Quang Phuc Ha, Jianchun Li and Bijan Samali (2006), "Adaptive sliding mode control for civil structures using magnetorheological dampers," Proceedings of the International Symposium on Automation and Robotics in Construction (ISARC), pp. 636-641.
1.4 Thesis Outline
The six thesis chapters are organised as follows:
• Chapter 2 – This chapter reviews the current literature on autonomous vehicles, also providing a review of their historical development. The review commences with a consideration of the various sensors available, such as ultrasound, laser and camera sensors, which are mounted on power wheelchairs to create a 'smart' wheelchair. The review then turns to stereo vision problems, such as epipolar geometry, lens-distorted images, three-dimensional (3D) images, stereoscopic images and disparity algorithms using stereo cameras. In addition, this chapter describes the Bayesian approaches applied to estimating locations from uncertain sensor information in both mobile robots and autonomous wheelchairs. Finally, solutions for implementing shared-control wheelchairs are presented, in order to encourage and enhance the independent activities of severely disabled people in daily life.
• In Chapter 3, vision problems are first discussed, including epipolar geometry, image planes, baseline and depth. Secondly, a Sum of Absolute Differences (SAD) correlation algorithm is outlined for computing a disparity map, developed from a pair of left and right images captured by a stereoscopic camera system. Next, the height and width of freespace are computed based on a 3D point map; the information from the 3D point map is kept in a two-dimensional (2D) distance map to assist the wheelchair control. The fourth part of this chapter compares a 2D distance map obtained from the 'Bumblebee' stereoscopic camera with a 2D map from the 'URG' laser system, in order to demonstrate the effectiveness of the stereoscopic camera in mobile wheelchairs.
• Chapter 4 – This chapter discusses a Bayesian Recursive (BR) algorithm utilised to estimate the freespace height and width, which assists the control plan in mobile wheelchairs. Firstly, the conditional probabilities are computed based on the Bayesian theorem, uncertain measured information and the control data. Next, the Bayesian algorithm is applied recursively to produce optimal posterior probabilities. Finally, the average values of the optimal probabilities are determined and compared, in order to make the Bayesian decision for the autonomous wheelchair to pass through the freespace.
• In Chapter 5, an advanced BR algorithm for estimating dynamic freespace and user intention in populated environments is presented. Firstly, depending on the obstacles moving in front of a mobile wheelchair, dynamic freespaces may be altered; the advanced BR algorithm uses sensor measurements, the control data, the obstacle distances to the wheelchair and conditional probabilities to estimate the dynamic freespaces. Next, the user's commands, provided by a head-movement sensor, are computed for controlling the mobile wheelchair. Finally, a semi-autonomous wheelchair control is computed to combine the user's intention and the autonomous mode. The advanced BR algorithm based on the dynamic estimation is then applied to estimate the user's intention, in order to determine the control mode underlying the wheelchair control by the user.
• Chapter 6 – This chapter offers an overall conclusion on the detection of obstacles and the determination of the freespace height and width using the stereoscopic camera system, in which the SAD algorithm was applied to construct the disparity maps used for detecting freespaces and obstacles. The chapter also discusses the head commands obtained from a head-movement sensor, the BR algorithm for estimating freespace, and the advanced BR algorithm for estimating dynamic freespace and user intention in populated environments. In addition, this chapter proposes avenues for further research.
• A bibliography and three appendices are provided as part of the overall thesis
Chapter 2 Literature Review
In recent years, mobile vehicles have been equipped with sensors such as laser, ultrasound and cameras. These detect obstacles in the vicinity of the vehicle, thereby helping to avoid collisions. The information obtained from these sensors includes depth, edges, colour, shape or the gestures of the user, and is used to detect unknown obstacles and avoid collisions while simultaneously steering the mobile vehicle.
Several advanced methods are used to compute the mobility strategies of the vehicle from the information collected by the sensors. These methods are employed to estimate locations based on uncertain sensor information, or to determine depth, maps and environmental features. The data obtained in this way are used to develop the operations of the vehicle, such as collision avoidance and trajectory tracking.
2.2 Sensor-based Controls
Sonar sensors mounted on mobile robots are used to provide range data. In this application the sonar readings are continuously updated, building a two-dimensional (2D) Cartesian histogram grid as shown in Figure 2.1a. The grid is used as a world model, which is then converted into a one-dimensional (1D) polar histogram (Borenstein and Koren 1990; Borenstein and Koren 1991), shown in Figure 2.1b. In the vector field histogram (VFH) method, a 2D Cartesian histogram grid allows the detection of unforeseen obstacles and collision avoidance (Petillot, Tena Ruiz et al. 2001). To ensure high performance, an enhanced method called VFH+ provides several improvements, such as a smoother robot trajectory with greater reliability; selection of the most suitable direction is based on a masked polar histogram and a cost function (Ulrich and Borenstein 1998; Ulrich and Borenstein 2000).
Figure 2.1a: 2D histogram grid map Figure 2.1b: 1D polar histogram map
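The grid-to-polar conversion behind the VFH idea can be sketched as follows. The sector count and the distance-attenuated density weight are simplified placeholders for the actual VFH obstacle-density formula:

```python
import math

def polar_histogram(cells, robot, sectors=72):
    """Collapse a 2D histogram grid into a 1D polar obstacle-density histogram.

    `cells` maps (x, y) grid coordinates to a certainty value; `robot` is (x, y)."""
    hist = [0.0] * sectors
    width = 2.0 * math.pi / sectors
    rx, ry = robot
    for (cx, cy), certainty in cells.items():
        angle = math.atan2(cy - ry, cx - rx) % (2.0 * math.pi)
        dist = math.hypot(cx - rx, cy - ry)
        # Simplified density: certainty squared, attenuated with distance.
        hist[int(angle // width) % sectors] += certainty ** 2 / (1.0 + dist)
    return hist

# An obstacle due east of the robot loads sector 0:
h = polar_histogram({(5, 0): 3.0, (0, 6): 2.0}, robot=(0, 0))
```

Valleys (low-density runs of sectors) in the resulting histogram are then candidate steering directions; VFH+ additionally masks sectors that the robot's kinematics cannot reach.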
The main problem for mobile robots lies in applying the obstacle-avoidance algorithm to calculate the arrangement of navigation points, for comparison with past navigation points. These points are used to detect freespace in a long corridor (Hu and Brady 1994). Using uncertain information from several sonar sensors, the algorithm applies a Bayesian approach (Moshiri, Asharif et al. 2002) and decision theory to determine an optimal response. Based on the probabilities of the state of nature (passable or impassable) and the losses of the available actions (Hu and Brady 1997; Miura, Uozumi et al. 1999), the Bayesian decision is then made for the action (manoeuvre or backtrack).
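A decision-theoretic step of this kind can be sketched as follows; the loss values are invented for illustration, not taken from the cited works:

```python
def bayes_decision(p_passable, loss):
    """Pick the action (manoeuvre / backtrack) minimising expected loss.

    `loss[(action, state)]` is the cost of taking `action` when `state` holds."""
    actions = {a for (a, _) in loss}
    expected = {a: loss[(a, 'passable')] * p_passable
                   + loss[(a, 'impassable')] * (1.0 - p_passable)
                for a in actions}
    return min(expected, key=expected.get)  # minimum-expected-loss action

# Hypothetical losses: hitting an impassable gap is costly, backtracking mildly so.
LOSS = {('manoeuvre', 'passable'): 0.0, ('manoeuvre', 'impassable'): 10.0,
        ('backtrack', 'passable'): 4.0, ('backtrack', 'impassable'): 1.0}
```

With a high posterior probability that the gap is passable the robot manoeuvres through it; with a low one, backtracking has the smaller expected loss.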
Sixteen ultrasonic and eight infrared (IR) sensors are mounted on a mobile robot known as the Intelligent Security Robot (ISR), allowing it to avoid unknown static and dynamic obstacles. As shown in Figures 2.2a and 2.2b, these sensors assist the robot in reaching a real-world target safely. Based on the placement, shapes and areas of obstacles in the environment, a new environment map is constructed (Noykov and Roumenin 2007). While engaged in the process of obstacle avoidance, this allows the robot to modify its initial decision; ultimately, the time and length of the path will differ from when the robot navigates directly from the start position to the given position (Luo, Lin et al. 2005).
Figure 2.2a: Obstacle avoidance using only ultrasonic sensors. Figure 2.2b: Obstacle avoidance using only IR sensors.
A further application of ultrasound sensors is their deployment in powered wheelchairs, allowing individuals with severe disabilities to operate safely in an unknown environment (Dutta and Fernie 2005). The wheelchair system can detect static