2 Foveated Vision Sensor and Image Processing – A Review
14. K. Daniilidis, C. Krauss, M. Hansen and G. Sommer, "Real-time tracking of moving objects with an active camera", Journal of Real-Time Imaging, Academic Press, 1997.
15. L. Berthouze, P. Bakker, and Y. Kuniyoshi, "Learning of motor control: a prelude to robotic imitation", in Proc. IEEE International Conference on Intelligent Robots and Systems, Osaka, Japan, November.
19. L. Sanghoon and A.C. Bovik, "Very low bit rate foveated video coding for H.263", in IEEE International Conference on Acoustics, Speech, and Signal Processing, 1999, pp. 3113–3116, vol. 6.
20. G. Bonmassar and E.L. Schwartz, "Real-time restoration of images degraded by uniform motion blur in foveal active vision systems", IEEE Transactions on Image Processing, vol. 12, pp. 1838–1842, 1999.
21. H. Qian, C. Yuntao, S. Samarasekera and M. Greiffenhagen, "Indoor monitoring via the collaboration between a peripheral sensor and a foveal sensor", in IEEE Workshop on Visual Surveillance, 1998, pp. 2–9.
22. G.A. Baricco, A.M. Olivero, E.J. Rodriguez, F.G. Safar and J.L.C. Sanz, "Conformal mapping-based image processing: Theory and applications", Journal of Visual Communication and Image Representation, vol. 6, pp. 35–51, 1995.
23. J.M. Kinser, "Foveation from pulse images", in Proc. of Information Intelligence and Systems, 1999, pp. 86–89.
24. A.S. Rojer and E.L. Schwartz, "Design considerations for a space-varying sensor with complex logarithmic geometry", in Proc. Intl. Conf. on Pattern Recognition, 1990, pp. 278–285.
25. B.R. Frieden and C. Oh, "Integral logarithmic transform: Theory and applications", Applied Optics, vol. 15, pp. 1138–1145, March 1992.
26. J.J. Clark, M.R. Palmer and P.D. Lawrence, "A transformation method for the reconstruction of functions from non-uniformly spaced samples", IEEE Trans. Acoustics, Speech and Signal Processing, vol. 33(4).
29. Y. Kuniyoshi, N. Kita and K. Sugimoto, "A foveated wide angle lens for active vision", in Proc. of IEEE Intl. Conf. on Robotics and Automation, 1995.
M. Yeasin and R. Sharma
30. L. Berthouze, S. Rougeaux, and Y. Kuniyoshi, "Calibration of a foveated wide angle lens on an active vision head", in Proc. IMACS/IEEE-SMC Computational Engineering Systems Applications, Lille, France, 1996.
31. S. Rougeaux and Y. Kuniyoshi, "Velocity and disparity cues for robust real-time binocular tracking", in Proc. IEEE International Conference on Computer Vision and Pattern Recognition, Puerto Rico, 1997, pp. 1–6.
32. S. Rougeaux and Y. Kuniyoshi, "Robust tracking by a humanoid vision system", in Proc. First Int. Workshop on Humanoid and Human Friendly Robotics, Tsukuba, Japan, 1998.
33. S.K. Nayar, "Omnidirectional video camera", in Proc. of DARPA Image Understanding Workshop, New Orleans, May 1997.
34. S.K. Nayar, "Catadioptric omnidirectional camera", in Proc. of IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Puerto Rico, June 1997.
35. S.K. Nayar and S. Baker, "Catadioptric image formation", in Proc. of DARPA Image Understanding Workshop, New Orleans, May 1997.
36. S.K. Nayar, "Omnidirectional vision", in Proc. of the Eighth International Symposium on Robotics Research (ISRR), Shonan, Japan, October 1997.
37. S.K. Nayar and S. Baker, "A theory of catadioptric image formation", in Proceedings of the 6th International Conference on Computer Vision, India, Jan. 1998, pp. 35–42.
38. H. Ishiguro, K. Kato and S. Tsuji, "Multiple vision agents navigating a mobile robot in a real world", in Proceedings of IEEE Intl. Conf. on Robotics and Automation, USA, May 1993.
39. M.J. Barth and H. Ishiguro, "Distributed panoramic sensing in multiagent robotics", in Proceedings of 1994 IEEE Intl. Conf. on MFI '94, Las Vegas, USA, Oct. 1994.
40. E.R. Kandel, J.H. Schwartz and T.M. Jessell, Principles of Neural Science, 4/e, McGraw-Hill, New York, 2000.
41. M. Bolduc and M.D. Levine, "A real-time foveated sensor with overlapping receptive fields", Journal of Real-Time Imaging, Academic Press, vol. 3, pp. 195–212, 1997.
42. P.M. Daniel and D. Whitteridge, "The representation of the visual field on the cerebral cortex of monkeys", Journal of Physiology, vol. 159, pp. 203–221, 1961.
43. E.L. Schwartz, "Spatial mapping in the primate sensory projection: Analytic structure and relevance to perception", Biol. Cybern., vol. 25, pp. 181–194, 1977.
44. C. Braccini, G. Gambardella and G. Sandini, "A signal theory approach to the space and frequency variant filtering performed by the human visual system", Signal Processing, vol. 3(3), 1981.
45. C. Braccini, G. Gambardella, G. Sandini and V. Tagliasco, "A model of the early stages of the human visual system: functional and topological transformations performed in the peripheral visual field", Biological Cybernetics, vol. 44, 1982.
46. E.L. Schwartz, "Computational studies of the spatial architecture of primate visual cortex", Vol. 10, Chap. 9, pp. 359–411, Plenum, New York, 1994.
47. B. Fischl, A. Cohen and E.L. Schwartz, "Rapid anisotropic diffusion using space-variant vision", Int. Journal of Computer Vision, vol. 28(3), pp. 199–212, 1998.
48. E.L. Schwartz, "Computational anatomy and functional architecture of the striate cortex", Vision Research, vol. 20, pp. 645–669, 1980.
49. Y. Kuniyoshi, N. Kita, K. Sugimoto, S. Nakamura, and T. Suehiro, "A foveated wide angle lens for active vision", in Proc. IEEE International Conference on Robotics and Automation, Nagoya, Japan, 1995, vol. 3, pp. 2982–2985.
50. R.C. Jain, S.L. Bartlett and N. O'Brien, "Motion stereo using ego-motion complex logarithmic mapping", Technical report, Center for Research on Integrated Manufacturing, University of Michigan, February 1986.
51. R.C. Jain, S.L. Bartlett and N. O'Brien, "Motion stereo using ego-motion complex logarithmic mapping", IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 9(3), pp. 356–369, 1987.
52. M. Tistarelli and G. Sandini, "Estimation of depth from motion using an anthropomorphic visual sensor", Image and Vision Computing, vol. 8(4), 1990.
53. C. Busettini, F.A. Miles and R.J. Krauzlis, "Radial optical flow induces vergence eye movements at ultra-short latencies", Nature, vol. 390, pp. 512–515, 1997.
54. G. Sandini, F. Panerai and F.A. Miles, "The role of inertial and visual mechanisms in the stabilization of gaze in natural and artificial systems", in Motion Vision: Computational, Neural, and Ecological Constraints, J.M. Zanker and J. Zeil, Eds., Springer-Verlag, New York, 2001.
55. W. Klarquist and A. Bovik, "FOVEA: a foveated vergent active stereo system for dynamic three-dimensional scene recovery", in Proc. IEEE International Conference on Robotics and Automation, Leuven, Belgium, 1998.
56. A.J. Wavering, H. Schneiderman, and J.C. Fiala, "High-performance tracking with TRICLOPS", in Proc. Asian Conf. on Computer Vision, Singapore, 1995, vol. 1.
57. B. Scassellati, "A binocular, foveated, active vision system", Tech. Rep., Massachusetts Institute of Technology, January 1998.
58. Y. Suematsu and H. Yamada, "A wide angle vision sensor with fovea: design of distortion lens and the simulated image", in Proc. of IECON'93, 1993, vol. 1, pp. 1770–1773.
59. Y. Kuniyoshi, N. Kita, S. Rougeaux, and T. Suehiro, "Active stereo vision system with foveated wide angle lenses", in Proc. 2nd Asian Conference on Computer Vision, Singapore, 1995, vol. 1, pp. 359–363.
60. M. Yeasin and Y. Kuniyoshi, "Detecting and tracking human face and eye using a space-varying sensor and an active vision head", in Proc. of Computer Vision and Pattern Recognition, South Carolina, USA, June 2000, pp. 168–173.
61. G. Sandini and P. Dario, "Active vision based on space-variant sensing", in Proc. of Intl. Symp. on Robotic Research, 1989.
62. F. Pardo, "Development of a retinal image sensor based on CMOS technology", Tech. Rep., LIRA-TR 6/94, 1994.
63. E. Martinuzzi and F. Pardo, "FG II: new version of the CCD retinal sensor frame grabber", Tech. Rep., LIRA-TR 1/94, 1994.
64. F. Ferrari, J. Nielsen, P. Questa, and G. Sandini, "Space variant imaging", Sensor Review, vol. 15, no. 2, pp. 17–20, 1995.
65. R. Wodnicki, G.W. Roberts and M.D. Levine, "A foveated image sensor in standard CMOS technology", in Proc. of Custom Integrated Circuits Conference, 1995, pp. 357–360.
66. R. Wodnicki, G.W. Roberts and M.D. Levine, "Design and evaluation of a log-polar image sensor fabricated using a standard 1.2 µm ASIC CMOS process", IEEE Trans. on Solid-State Circuits, vol. 32, no. 8, pp. 1274–1277, 1997.
67. E. Schwartz, N. Greve and G. Bonmassar, "Space-variant active vision: Definition, overview and examples", Neural Networks, vol. 7/8, pp. 1297–1308, 1995.
68. G. Bonmassar and E.L. Schwartz, "Space-variant Fourier analysis: The exponential chirp transform", IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 19(10), pp. 1080–1089, Oct. 1997.
69. J. Portilla, A. Tabernero and R. Navarro, "Duality of log-polar image representations in the space and spatial-frequency domains", IEEE Transactions on Signal Processing, vol. 9, pp. 2469–2479, 1999.
70. S. Negahdaripour, "Revised definition of optical flow: Integration of radiometric and geometric cues for dynamic scene analysis", IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 20(9), pp. 961–979, 1998.
71. B. Horn and B. Schunck, "Determining optical flow", Artificial Intelligence, vol. 17, pp. 185–204, 1981.
72. P. Anandan, "Measuring visual motion from image sequences", PhD thesis, University of Massachusetts, Amherst, MA, 1987.
73. A.B. Watson and A.J. Ahumada, "A look at motion in the frequency domain", in Motion: Perception and Representation, J.K. Tsotsos, Ed., pp. 1–10, 1983.
74. J.L. Barron, D.J. Fleet, and S.S. Beauchemin, "Performance of optical flow techniques", International Journal of Computer Vision, vol. 12, no. 1, pp. 43–77, February 1994.
75. M. Yeasin, "Optical flow on log-mapped image plane: A new approach", in Lecture Notes in Computer Science, Springer-Verlag, NY, USA, Feb. 2001, pp. 252–260.
76. M. Yeasin, "Optical flow on log-mapped image plane: A new approach", IEEE Trans. Pattern Analysis and Machine Intelligence, to appear.
77. M. Yeasin, "Optical flow in log-mapped image plane: A new approach", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, pp. 125–131, 2002.
78. D. Todorovic, "A gem from the past: Pleikart Stumpf's anticipation of the aperture problem, Reichardt detectors, and perceived motion loss at equiluminance", Perception, vol. 25, pp. 1235–1242, 1996.
79. J. Marroquin, S. Mitter and T. Poggio, "Probabilistic solutions of ill-posed problems in computational vision", Journal of the American Statistical Association, vol. 79(387), pp. 584–589, 1987.
80. E.C. Hildreth, "The computation of the velocity field", in Proc. of the Royal Society of London B, vol. 221, 1984, pp. 189–220.
81. J.A. Movshon, E.H. Adelson, M.S. Gizzi and W.T. Newsome, "Pattern recognition mechanisms", in The Analysis of Moving Visual Patterns, C. Chagas, R. Gattass and C. Gross, Eds., Springer-Verlag, pp. 283–287, New York, 1985.
82. J.A. Movshon, "Visual processing of moving images", in Images and Understanding: Thoughts about Images; Ideas about Understanding, H. Barlow, C. Blakemore and M. Weston-Smith, Eds., pp. 122–137, Cambridge Univ. Press, New York, 1990.
83. B.K.P. Horn and B.G. Schunck, "Determining optical flow", Artificial Intelligence, vol. 17, pp. 185–203, 1981.
84. J.K. Kearney, W.B. Thompson and D.L. Boley, "Optical flow estimation: an error analysis of gradient based methods with local optimization", IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 14(9), pp. 229–244, 1987.
85. J.M. Foley, "Binocular distance perception", Psychological Review, vol. 87, pp. 411–435, 1980.
86. J.M. Foley, "Binocular distance perception: Egocentric visual task", Journal of Experimental Psychology, vol. 11, pp. 133–149, 1985.
87. E.C. Sobel and T.S. Collett, "Does vertical disparity scale the perception of stereoscopic depth?", in Proc. of the Royal Society of London B.
90. Y. Yeshurun and E.L. Schwartz, "Cepstral filtering on a columnar image architecture: a fast algorithm for binocular stereo segmentation", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 759–
93. D.J. Fleet, "Stability of phase information", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, no. 12, pp. 1253–1268, 1993.
94. D.J. Fleet, "Disparity from local weighted phase-correlation", IEEE International Conference on Systems, Man and Cybernetics, San Antonio.
3 On-line Model Learning for Mobile Manipulations

Yu Sun¹, Ning Xi¹, and Jindong Tan²

¹ Department of Electrical and Computer Engineering, Michigan State University, East Lansing, MI 48824, U.S.A.
For robotic automation, a manipulator mounted on a mobile platform can significantly increase the workspace of the manipulation and its application flexibility. The applications of mobile manipulation range from manufacturing automation to search and rescue operations. A task such as a mobile manipulator pushing a passive nonholonomic cart can be commonly seen in manufacturing and other applications, as shown in Fig. 3.1.

The mobile manipulator and nonholonomic cart system, shown in Fig. 3.1, is similar to the tracker-trailer system. Tracker-trailer systems generally consist of a steering mobile robot and one or more passive trailer(s) connected together by rigid joints. The tracking control and open-loop motion planning of such a nonholonomic system have been discussed in the literature. The trailer system can be controlled to track a certain trajectory using a linear controller based on the linearized model [6]. Instead of pulling the trailer, the tracker pushes the trailer to track certain trajectories in the backward steering problem. Fire truck steering is another example of pushing a nonholonomic system, and the chained form has been used in its open-loop motion planning [3]. Based on the chained form, motion planning for steering a nonholonomic system has been investigated in [16]. Time-varying nonholonomic control strategies for the chained form can stabilize the tracker and trailer system to certain configurations [13].

Y. Sun et al.: On-line Model Learning for Mobile Manipulations, Studies in Computational Intelligence (SCI) 7, 99–135 (2005)
www.springerlink.com © Springer-Verlag Berlin Heidelberg 2005
Fig. 3.1 Mobile Manipulation and Nonholonomic Cart
In robotic manipulations, the manipulator and cart are not linked by an articulated joint but by the robot manipulator. The mobile manipulator has more flexibility and control while maneuvering the nonholonomic cart. Therefore, the planning and control of the mobile manipulator and nonholonomic cart system is different from a tracker-trailer system. In a tracker-trailer system, control and motion planning are based on the kinematic model, and the trailer is steered by the motion of the tracker. In a mobile manipulator and nonholonomic cart system, the mobile manipulator can manipulate the cart by a dynamically regulated output force.

The planning and control of the mobile manipulator and nonholonomic cart system is based on the mathematical model of the system. Some parameters of the model, including the mass of the cart and its kinematic length, are needed in the controller [15]. The kinematic length of a cart is defined on the horizontal plane as the distance between the axis of the front fixed wheels and the handle.
A nonholonomic cart can travel along its central line and perform a turning movement about point C; in this case, the mobile manipulator applies a force and a torque on the handle at point A, as shown in Fig. 3.2. The line between A and C is defined as the kinematic length of the cart, |AC|, while the cart makes an isolated turning movement. As a parameter, the kinematic length |AC| of the cart can be identified from the linear velocity of point A and the angular velocity of line AC. The most frequently used parameter identification methods are the Least Square Method (LSM) and the Kalman filter [8]. Both have recursive algorithms for on-line estimation. Generally, if a linear (linearized) model of a dynamic system can be obtained and the system noise and observation noise are known, the Kalman filter can estimate the states of the dynamic system through observations of the system outputs. However, the estimation results are significantly affected by the system model and the noise models. The Least Square Method can be applied to identify static parameters in the absence of accurate linear dynamic models.

Fig. 3.2 Associated Coordinate Frames of the Mobile Manipulation System
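During an isolated turn, the identification of |AC| reduces to fitting the scalar model v_A = |AC|·ω, where ω is the angular velocity of line AC and v_A the linear velocity of point A. A minimal recursive least squares sketch of that fit (the cart length, signal values, and noise level below are hypothetical, for illustration only):

```python
import random

def rls_kinematic_length(omega, v, theta0=0.0, p0=1000.0):
    """Scalar recursive least squares for the model v = L * omega,
    where L is the kinematic length |AC|, omega the angular velocity
    of line AC and v the linear velocity of point A."""
    theta, p = theta0, p0
    for w, s in zip(omega, v):
        k = p * w / (1.0 + w * p * w)   # RLS gain
        theta += k * (s - w * theta)    # correct with the innovation
        p = (1.0 - k * w) * p           # shrink the error covariance
    return theta

# Hypothetical isolated turning movement, |AC| = 0.8 m, noisy velocities.
random.seed(1)
true_len = 0.8
omega = [0.2 + 0.01 * i for i in range(300)]
v = [true_len * w + random.gauss(0.0, 0.02) for w in omega]
print(round(rls_kinematic_length(omega, v), 2))  # close to 0.8
```

With a large initial covariance p0, the recursion reproduces the batch least-squares estimate while processing one sample at a time, which is what makes it suitable for on-line use.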
Parameter identification has been extensively investigated for robot manipulators. Zhuang and Roth [19] proposed a parameter identification method for robot manipulators; in their work, the Least Square Method is used to estimate the kinematic parameters based on a linear solution for the unknown kinematic parameters. To identify the parameters in the dynamic model of the robot manipulator [12], a least square estimator is applied to the linearized model. It is easy to see that LSM has been widely applied in model identification as well as in the field of robotics.
With the final goal of real-time on-line estimation, the Recursive Least Square Method (RLSM) has been developed to save computation resources and increase operation speed for real-time processing [9]. For identification, measurement noise is the most important problem. There are two basic approaches to processing a noisy signal: first, the noise can be described by its statistical properties, i.e., in the time domain; second, a signal with noise can be analyzed by its frequency-domain properties.

For the first approach, many LSM algorithms are used to deal with noisy signals to improve estimation accuracy, but they require knowledge of the additive noise signal. The Durbin algorithm and the Levinson-Wiener algorithm [2] require the noise to be a stationary signal with known autocorrelation coefficients. Each LSM-based identification algorithm corresponds to a specific model of noise [10]. Isermann and Baur developed a Two Step Process Least Square Identification with correlation analysis [14]. But for on-line estimation, especially in an unstructured environment, correlation analysis results and statistical knowledge cannot be obtained; in this case, the estimation errors of the traditional LSM are very large (Table 2 in Section 3.4). The properties of LSM in the frequency domain have also been studied. A spectral analysis algorithm based on least-square fitting was developed for fundamental frequency estimation in [4]. This algorithm operates by minimizing the square error of fitting a normal sinusoid to a harmonic signal segment. The result can be used only for fitting a signal by a mono-frequency sinusoid.
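The spectral least-squares idea can be sketched as a grid search: for each candidate frequency, the two sinusoid amplitudes are solved in closed form from the normal equations, and the frequency with the smallest squared fitting error wins. The signal and frequency grid below are made up for illustration and are not the algorithm of [4] in detail:

```python
import math

def fit_sinusoid(t, y, freqs):
    """For each candidate frequency f, least-squares fit
    a*sin(2*pi*f*t) + b*cos(2*pi*f*t) to y and return the frequency
    with the smallest squared fitting error."""
    best_f, best_res = None, float("inf")
    for f in freqs:
        s = [math.sin(2.0 * math.pi * f * ti) for ti in t]
        c = [math.cos(2.0 * math.pi * f * ti) for ti in t]
        # Normal equations for the linear amplitudes a and b.
        ss = sum(si * si for si in s)
        cc = sum(ci * ci for ci in c)
        sc = sum(si * ci for si, ci in zip(s, c))
        sy = sum(si * yi for si, yi in zip(s, y))
        cy = sum(ci * yi for ci, yi in zip(c, y))
        det = ss * cc - sc * sc
        if abs(det) < 1e-12:
            continue
        a = (sy * cc - cy * sc) / det
        b = (cy * ss - sy * sc) / det
        res = sum((yi - a * si - b * ci) ** 2
                  for yi, si, ci in zip(y, s, c))
        if res < best_res:
            best_f, best_res = f, res
    return best_f

t = [k / 100.0 for k in range(100)]
y = [math.sin(2.0 * math.pi * 5.0 * tk) for tk in t]  # 5 Hz test segment
f_hat = fit_sinusoid(t, y, [0.5 * i for i in range(1, 21)])
print(f_hat)  # 5.0
```

As the chapter notes, such a fit only recovers a single-frequency sinusoid; it cannot describe broadband measurement noise.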
In a cart-pushing system, positions of the cart can be measured directly by multiple sensors. To obtain other kinematic parameters, such as the linear and angular velocity of the object, numerical differentiation of the position data is used. This introduces high-frequency noise that is unknown in different environments. The unknown noise of the signal will cause large estimation errors; experimental results have shown that the estimation error can be as high as 90%. Therefore, the above least square algorithms cannot be used, and eliminating the effect of noise on model identification becomes essential.
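The noise amplification caused by numerically differentiating position data can be seen in a few lines. The sampling rate and noise level here are illustrative assumptions, not the chapter's experimental values:

```python
import math
import random

def finite_diff(x, dt):
    """First-order finite-difference estimate of the derivative."""
    return [(b - a) / dt for a, b in zip(x, x[1:])]

dt = 0.01                                    # 100 Hz sampling (assumed)
t = [k * dt for k in range(1000)]
pos = [math.sin(tk) for tk in t]             # true position
random.seed(2)
noisy = [p + random.gauss(0.0, 0.001) for p in pos]  # ~1 mm sensor noise

v_true = finite_diff(pos, dt)
v_noisy = finite_diff(noisy, dt)
rms = math.sqrt(sum((a - b) ** 2 for a, b in zip(v_true, v_noisy))
                / len(v_true))
# Differencing scales white position noise by roughly sqrt(2)/dt, so a
# 0.001 position noise becomes velocity noise on the order of 0.14.
print(rms > 0.05)  # True
```

The velocity error is two orders of magnitude larger than the position noise that produced it, which is why plain LSM on differentiated data fails.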
This chapter presents a method for solving the problem of parameter identification for nonholonomic cart modeling where the sensing signals are very noisy and the statistical model of the noise is unknown. When analyzing the properties of the raw signal in the frequency domain, the noise signal and the true signal have quite different frequency spectra. In order to reduce the noise, a method that separates the noise signal and the true signal from the raw signal is used to process them in the frequency domain. A raw signal can be decomposed into several new signals with different bandwidths. These new signals are used to estimate the parameters; the best estimate is obtained by minimizing the estimation residual in the least square sense.

Combined with the digital subbanding technique [1], a Wavelet-based model identification method is proposed. The estimation convergence of the proposed method is proven theoretically and verified by experiments. The experimental results show that the estimation accuracy is greatly improved without prior statistical knowledge of the noise.
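A toy version of the subband idea, with repeated Haar low-pass stages standing in for the wavelet filter bank: estimate the parameter on each subband copy of the signals and keep the estimate whose least-squares residual is smallest. The chapter's actual algorithm, signals, and noise are richer; everything below is illustrative:

```python
import math

def haar_approx(x):
    """One low-pass (approximation) stage of a Haar filter bank."""
    return [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2.0)
            for i in range(len(x) // 2)]

def subband_estimate(omega, v, levels=4):
    """Estimate L in v = L * omega on successively low-passed copies of
    the signals; keep the estimate with the smallest per-sample
    least-squares residual."""
    best_L, best_res = None, float("inf")
    for _ in range(levels):
        L = sum(w * s for w, s in zip(omega, v)) / sum(w * w for w in omega)
        res = sum((s - L * w) ** 2 for w, s in zip(omega, v)) / len(v)
        if res < best_res:
            best_L, best_res = L, res
        omega, v = haar_approx(omega), haar_approx(v)
    return best_L

# Smooth regressor plus a Nyquist-frequency disturbance on v, mimicking
# differentiation noise; one Haar stage removes it entirely.
omega = [1.0 + 0.5 * math.sin(2.0 * math.pi * k / 256.0) for k in range(512)]
v = [0.8 * w + 0.1 * (-1) ** k for k, w in enumerate(omega)]
print(round(subband_estimate(omega, v), 3))  # 0.8
```

The residual acts as the selection criterion exactly as in the chapter: the subband in which the noise has been filtered out fits the linear model best, and its estimate is returned.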
3.2 Control and Task Planning in Nonholonomic Cart Pushing

In this section, the dynamic models of a mobile manipulator and a nonholonomic cart are briefly introduced. The integrated task planning for manipulating the nonholonomic cart is then presented.

3.2.1 Problem Formulation
A mobile manipulator consists of a mobile platform and a robot arm. Fig. 3.2 shows the coordinate frames associated with both the mobile platform and the manipulator. They include:

- World frame Σ_W: the XOY frame is the inertial frame.

The dynamic model of the mobile manipulator can be written as

    M(p)\ddot{p} + c(p, \dot{p}) + g(p) = \tau    (3.2.1)

where \tau is the vector of generalized input torques, M(p) is the positive definite mobile manipulator inertia matrix, c(p, \dot{p}) is the vector of centripetal and Coriolis torques, and g(p) is the vector of gravity terms. The vector
p = {q_1, q_2, q_3, q_4, q_5, q_6, x_m, y_m, \theta_m}^T is the joint variable vector of the mobile manipulator, where {q_1, q_2, q_3, q_4, q_5, q_6}^T is the joint angle vector of the robot arm and {x_m, y_m, \theta_m}^T is the configuration of the platform in the unified frame Σ. The augmented system output vector x is defined as x = {x_1, x_2}, where x_1 = {p_x, p_y, p_z, O, A, T}^T is the end-effector position and orientation, and x_2 = {x_m, y_m, \theta_m} is the configuration of the mobile platform.
Applying the following nonlinear feedback control

    \tau = M(p)u + c(p, \dot{p}) + g(p)    (3.2.2)

with the desired position and orientation of the mobile manipulator in the world frame Σ given by p_d = {x_d, y_d, z_d, O_d, A_d, T_d, x_m^d, y_m^d, \theta_m^d}^T, a linear feedback control for model (3.2.3) can be designed.

The nonholonomic cart shown in Fig. 3.2 is a passive object. Assuming that the force applied to the cart can be decomposed into f_1 and f_2, the dynamic model of the nonholonomic cart in frame Σ can be represented by
    \ddot{x}_c = (f_1 \cos\theta_c - f_2 \sin\theta_c) / m_c
    \ddot{y}_c = (f_1 \sin\theta_c + f_2 \cos\theta_c) / m_c
    \ddot{\theta}_c = f_2 l_1 / I_c

where m_c and I_c are the mass and the moment of inertia of the cart, and \theta_c its orientation.
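To make the cart model concrete, assume planar dynamics in which f_1 acts along the cart heading and f_2 perpendicular to it with moment arm l_1, ignoring the nonholonomic wheel constraint for brevity. The masses, forces, and arm length below are invented for illustration; a minimal Euler-integration sketch:

```python
import math

def step_cart(state, f1, f2, m_c=20.0, i_c=2.0, l1=0.8, dt=0.01):
    """One explicit-Euler step of a simplified planar cart model
    (nonholonomic wheel constraint omitted for brevity).
    state = (x, y, th, vx, vy, om)."""
    x, y, th, vx, vy, om = state
    ax = (f1 * math.cos(th) - f2 * math.sin(th)) / m_c
    ay = (f1 * math.sin(th) + f2 * math.cos(th)) / m_c
    alpha = f2 * l1 / i_c
    return (x + vx * dt, y + vy * dt, th + om * dt,
            vx + ax * dt, vy + ay * dt, om + alpha * dt)

# Push straight ahead for one second: only vx grows, by a*t = (10/20)*1.
s = (0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
for _ in range(100):
    s = step_cart(s, f1=10.0, f2=0.0)
print(round(s[3], 2))  # 0.5
```

A pure f_1 push leaves the heading unchanged and accelerates the cart along its central line, while any f_2 component both translates the cart laterally and turns it, which is the coupling the mobile manipulator must regulate.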