All-In-Focus imaging and real-time microrobotic applications
Nguyễn Chánh Nghiệm
Can Tho University, email: ncnghiem@ctu.edu.vn
Văn Phạm Đan Thủy
Can Tho University, email: vpdthuy@ctu.edu.vn
Kenichi Ohara and Tatsuo Arai
Osaka University
Abstract:
In life sciences, observing and manipulating various microbiological objects is performed frequently and repeatedly, and focusing on the object is the operator's preliminary task. In order to reduce the manual focusing time, various autofocus algorithms have been proposed. These algorithms can also be implemented to automate microsensing and micromanipulation tasks such as measurement of cell stiffness, pick-and-place of various microobjects, immobilization of moving objects, etc. This paper proposes applying the All-In-Focus algorithm to automate the manipulation of microobjects while observing them clearly in real time. Pick-and-place of single microobjects of different sizes is performed to demonstrate the effectiveness of a real-time micromanipulation system.
Abbreviations
IQM Image Quality Measure
AIF All-In-Focus
LTPM Line-Type Pattern Matching
1 Introduction
Focusing on a target microobject is a frequent and preliminary task in observing the microobject and further manipulating it. The difficulty of this manual task depends on the size of the target object. A small microobject requires a larger magnification lens with a narrower depth of field; a thick microobject thus requires longer manual focus adjustment. The transparency of most microbiological objects adds further difficulty to precise focusing. In order to reduce the operator's time spent manually focusing microobjects, various autofocus algorithms have been proposed.
An introduction to and comparison of various autofocus algorithms, ranging from the well-known to the most recently proposed, can be found in [1]-[3]. Based on the choice of evaluation criteria for the best-focused position, these algorithms are classified into four categories, i.e., derivative-based, statistics-based, histogram-based, and intuitive-based algorithms [2].
Using the Image Quality Measure (IQM) to detect the in-focus area in an image, a Micro VR camera system was developed to provide a real-time all-in-focus image, which is a composite image created by merging all in-focus areas from various images of the observed object taken at different focal distances [4]. This algorithm can thus be called the All-In-Focus (AIF) algorithm and is classified into the derivative-based category. The system also provides a depth image in real time, so that the 3D positions of microobjects can be obtained to facilitate automated micromanipulation, e.g.,
automated grasping and transporting of an 8 μm microobject [5].
The real-time micro VR camera system estimates the depth from in-focus pixels extracted from a series of images taken along the z-direction. It is, therefore, independent of the shape of the object. There are, however, a few problems in obtaining accurate 3D information from this imaging system. For example, there is a trade-off between the frame rate and the accuracy of the system: in order to achieve real-time detection, fewer images are used to create the AIF image, which increases the resolution error. Furthermore, to capture images at different focal positions, an actuator is used to move the lens along the optical axis. Vibration from this actuator may also reduce the quality of the AIF image and contribute noise to the system. Thus, the error in the depth information of a transparent object in fast motion can be significant.
Fig 1 System overview
By integrating a micromanipulation system and utilizing the depth information obtained from the imaging system to find the 3D positions of both the end-effector of the micromanipulator and the target object, it is possible to develop an automated micromanipulation system. This paper proposes an automated micromanipulation system that uses a two-fingered microhand as the micromanipulator, because it is capable of dexterous micromanipulation such as cell rotation [7] and measurement of the mechanical properties of a living cell [8, 9].
To solve the inherent problems of real-time AIF imaging, this paper proposes Line-Type Pattern Matching and Contour-Depth Averaging to measure the 3D positions of a micromanipulator's tip and of a transparent target microobject, respectively. The effectiveness of the proposed methods is experimentally demonstrated with the pick-and-place of single microobjects of different sizes. The proposed method can also be applied to find the 3D positions of the transparent end-effector tips of common microtools, such as glass micropipettes, as well as of microbiological cells. This makes the All-In-Focus imaging system a versatile 3D imaging system that can be integrated into a micromanipulation system to provide not only a real-time extended depth of field through the AIF image but also the 3D positions of transparent microobjects, so that they can be handled automatically.
Fig 2 Illustration of All-In-Focus algorithm
2 System overview
2.1 All-In-Focus imaging system
The All-In-Focus imaging system is developed based on the Micro VR camera system [4] and consists of a piezo actuator and its controller, a processing unit to create the AIF and HEIGHT images, and a high-speed camera attached to the camera port of the microscope (Fig 1). The piezo actuator can move the objective lens cyclically up and down over a SWING distance of up to 100 μm along the optical z-axis. When the system is running, the high-speed camera (Photron Focuscope FV-100C) captures images at different focal planes at a rate of 1000 frames per second.
As the lens traverses a cyclic SWING distance, the focal plane changes and a stack of images at consecutive focal planes is collected. The images in the stack all have the same number of pixels. The best focal distance for each pixel location is obtained by evaluating the local frequency of image intensities around that pixel location in all images of the stack [10]. The AIF image is then created by combining all best-focused pixels from the image stack. Fig 2 illustrates the AIF imaging algorithm and the AIF image of a protein crystal. The best focal distance at each pixel location is normalized to a pixel value at that location in the HEIGHT image (Fig 2). Therefore, the AIF image provides good visualization of microobjects (Fig 3a), while the HEIGHT image provides their positions along the z-axis (Fig 3b).
Fig 3 AIF image (a) and HEIGHT image (b) of
protein crystal
Fig 4 The world coordinate system
The world coordinate system is shown in Fig 4. The Z-axis of the world coordinates is parallel to the optical axis of the microscope. The (X, Y) plane lies on the object plane, and its X-axis and Y-axis align with the horizontal x-axis and vertical y-axis of the AIF image, respectively. The relationship between a distance in the (X, Y) plane and the corresponding number of pixels in the AIF image is obtained by measuring the pixel size from an AIF image of a scale.
Let SWING ∈ {20, 40, 60, 80, 100} (μm) be the distance over which the piezo actuator moves the objective lens. This distance is normalized into a gray scale from 0 to 255 in the HEIGHT image. Therefore, the z-coordinate of a pixel at position (x, y) can be estimated from the corresponding pixel value H(x, y) in the HEIGHT image as

$$z(x, y) = \frac{H(x, y)}{256}\times SWING \ \ (\mu m) \hspace{2em} (1)$$
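As a minimal sketch of Eq 1 (assuming the HEIGHT image is available as an 8-bit NumPy array indexed as [row, column]; this is not the system's actual code):

```python
import numpy as np

def z_from_height(height_img: np.ndarray, x: int, y: int, swing_um: float) -> float:
    """Estimate the z-coordinate (in um) of pixel (x, y) using Eq. 1.

    height_img: 8-bit HEIGHT image; H(x, y) encodes the best focal
    distance at (x, y), normalized to 0..255 over the SWING range.
    """
    H = float(height_img[y, x])  # NumPy images are indexed [row, column]
    return H / 256.0 * swing_um
```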
The distance between two consecutive focal planes, which is also the resolution of the AIF imaging system, can be calculated as

$$d = \frac{SWING}{30\times FRAME} \ \ (\mu m) \hspace{2em} (2)$$
where FRAME ∈ {1, 2, 4, 6} determines the frequency of scanning, i.e., the frame rate of the AIF imaging system, as

$$\text{frame rate} = \frac{30}{FRAME} \ \ (\text{frames per second}) \hspace{2em} (3)$$
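Eqs 2 and 3 can be expressed directly. The helpers below are a sketch of these two trade-off formulas, with the experimental settings of Section 4 shown as a usage example:

```python
def aif_resolution_um(swing_um: float, frame: int) -> float:
    """Resolution d = SWING / (30 * FRAME) in um (Eq. 2)."""
    return swing_um / (30.0 * frame)

def aif_frame_rate_fps(frame: int) -> float:
    """AIF frame rate = 30 / FRAME in frames per second (Eq. 3)."""
    return 30.0 / frame

# The settings used in Section 4 (SWING = 80 um, FRAME = 2) give
# d = 80 / 60 ~ 1.3 um at 15 frames per second.
print(aif_resolution_um(80, 2), aif_frame_rate_fps(2))
```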
Fig 5 Two-fingered microhand for dexterous micromanipulation applications
The highest and lowest frame rates of the AIF imaging system are 30 and 5 frames per second, respectively (Eq 3). With the lowest frame rate (FRAME = 6) and the smallest scanning range (SWING = 20 μm), the best resolution of the system becomes d ≈ 0.1 μm. It should be noted that the higher the frame rate, the more vibration is introduced to the system, since the objective lens moves faster in its cyclic up-and-down motion.
2.2 Two-fingered microhand
Glass end-effectors are generally preferable for biological applications because of their biocompatibility. In this study, a two-fingered microhand [6] mounted on the stage of the microscope (Fig 5) is used as the manipulator of the micromanipulation system. The microhand has two microfingers that are fabricated by pulling glass rods or capillary tubes. It is a promising microtool with dexterous micromanipulation capability for potential biological applications.
One of the two microfingers of this microhand is controlled by a 3-DOF parallel link mechanism. The parallel link mechanism and the other microfinger are mounted on a three-dimensional motorized stage that provides the global motion of the microhand in a large workspace. Dexterous manipulation is realized by the microfinger controlled by the parallel link mechanism. This configuration enables manipulation of multisized microobjects in a large workspace.
3 Measuring microobject position in 3D
3.1 Measuring 3D positions of end-effectors
Because the microfinger has an elongated shape, a few lines can be detected along it in its AIF image. The 2D position of the fingertip can thus be obtained from these detected lines, and the z-position of the fingertip is then estimated from the HEIGHT image using the information of the detected lines. The process is as follows.
Fig 6 (a) Microfingers and 55 μm microsphere
(b) Detected lines superimposed on detected
microfingers
Fig 7 Line grouping using middle position of
lower endpoints of detected lines in
x-direction
3.1.1 Line detection
The two microfingers are set in the vertical direction and inclined toward each other (Fig 6). Due to the shallow depth of field, only part of a microfinger can be in focus. The curved surface of the microfinger functions as the surface of a lens; therefore, the middle region of this local area appears brighter when it is in focus. This phenomenon was shown in a relevant section and figure in [11]. The AIF imaging system merges all in-focus parts of the object; it thus creates an image of a microfinger with a brighter region inside. As a result, there are three regions with different intensity levels for each microfinger in the AIF image, of which the middle region is the brightest (Fig 6a).
By merging all in-focus regions along the elongated microfinger, four lines are ideally detected in the AIF image for each microfinger by the split-and-merge algorithm [12]. A threshold is set on the length of a detected line to eliminate false lines that may result from the ghost of a microfinger in its AIF image, especially when the microfinger is moving.
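The paper detects these lines with the split-and-merge algorithm [12]. Purely as an illustrative stand-in, a probabilistic Hough transform with a minimum-length threshold achieves a similar effect; the OpenCV parameters below are assumptions, not the system's actual values:

```python
import cv2
import numpy as np

def detect_finger_lines(aif_img: np.ndarray, min_len_px: int = 100):
    """Detect candidate lines along the microfingers in an 8-bit AIF image.

    The paper uses split-and-merge [12]; a probabilistic Hough transform
    is used here only for illustration. min_len_px discards short false
    lines such as those caused by the ghost of a moving microfinger.
    """
    edges = cv2.Canny(aif_img, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=min_len_px, maxLineGap=5)
    return [] if lines is None else [tuple(l[0]) for l in lines]  # (x1, y1, x2, y2)
```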
The four detected lines characterize a microfinger in the AIF image. Two of them are located at the borders of the microfinger and are thus termed border lines; the other two lines, which lie between the border lines, are termed inner lines.
3.1.2 Microfinger classification
Since there are two microfingers in the AIF image, it is necessary to classify the detected lines into two microfinger groups. The x-coordinates of the lower endpoints of all detected lines are compared to their average value x_midpoint, as shown in Fig 7. A detected line is classified into the left-microfinger group if its lower endpoint's x-coordinate is smaller than x_midpoint; otherwise, it belongs to the right-microfinger group.
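A minimal sketch of this grouping rule, assuming each detected line is an (x1, y1, x2, y2) endpoint tuple in image coordinates (y increasing downward):

```python
def classify_fingers(lines):
    """Split detected lines into left- and right-microfinger groups (Fig 7).

    The lower endpoint of a line is the endpoint with the larger image
    y-coordinate. Lines whose lower endpoint lies left of the average
    x_midpoint form the left group; the rest form the right group.
    """
    lower_xs = [x1 if y1 > y2 else x2 for (x1, y1, x2, y2) in lines]
    x_midpoint = sum(lower_xs) / len(lower_xs)
    left = [l for l, x in zip(lines, lower_xs) if x < x_midpoint]
    right = [l for l, x in zip(lines, lower_xs) if x >= x_midpoint]
    return left, right
```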
3.1.3 Line-type pattern matching for fingertip identification in 2D
The AIF imaging system needs at least 30 images to create an AIF image in real time at 30 frames per second. The system can provide good AIF observation of a microobject even when it is moving. However, line detection for identifying the two microfingers of the microhand becomes more difficult if the microhand moves at high speed: the edges along a microfinger may form broken line segments due to the limited processing speed of the AIF imaging hardware.
Because the microhand is set in the vertical direction in the image and three regions with different intensity levels are observed for each microfinger in the AIF image, the image intensity can change either "from bright to dark" or "from dark to bright" when crossing a detected line from left to right. Such a detected line is defined to be of type "0" or type "1", respectively. Let L1, L2, L3, L4 be the four detected lines of a microfinger, in order from left to right. The line-type pattern in the case of four lines correctly found for a microfinger is shown in Table 1. This holds because the microfinger is darker than the image background and the middle region is the brightest among the three image regions of the microfinger.
Table 2 shows the line-type patterns of three lines, inferred from the four-line case when a certain line Li cannot be detected. By matching against these patterns, the line-type pattern of three detected lines can also be used to identify a microfinger.
Table 1 Ideal line-type pattern of 4 detected lines

Line        L1   L2   L3   L4
Line type   0    1    0    1
Table 2 Line-type patterns of 3 detected lines

Missed line   Line type
L1            1, 0, 1
L2            0, 0, 1
L3            0, 1, 1
L4            0, 1, 0
Fig 8 (a) Detected lines from the microfingers
(b) Fingertip positions when microhand was
moving at 100 μm/s
Fig 9 Pixel values from the HEIGHT image along the inner line of the left microfinger (a) and the right microfinger (b) at initial setup. The fitted line is calculated from 80 points
It is also possible that the line-type pattern of four detected lines does not match that in Table 1. This can happen when the microhand is moving fast, so that two broken lines are found on the finger border (right finger in Fig 8a). In addition, a line can also be found from the ghost of the microfinger border (left finger in Fig 8b) due to limitations of the AIF processing hardware. In these cases, the line-type pattern of a set of three neighboring lines from the four detected lines can give a correct match, as shown in Fig 8.
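A minimal sketch of the matching step, with the patterns of Tables 1 and 2 hard-coded; representing line types as tuples and searching over three neighboring lines of a four-line set are illustrative assumptions:

```python
# Type 0: bright-to-dark edge; type 1: dark-to-bright edge (left to right).
PATTERN_4 = (0, 1, 0, 1)                      # Table 1
PATTERNS_3 = {                                # Table 2
    (1, 0, 1): "L1 missed",
    (0, 0, 1): "L2 missed",
    (0, 1, 1): "L3 missed",
    (0, 1, 0): "L4 missed",
}

def match_line_types(types):
    """Validate a microfinger from its left-to-right line-type sequence."""
    types = tuple(types)
    if types == PATTERN_4:
        return "all four lines found"
    if len(types) == 4:
        # Ghost or broken lines: try each subset of three neighboring lines.
        for drop in range(4):
            sub = types[:drop] + types[drop + 1:]
            if sub in PATTERNS_3:
                return PATTERNS_3[sub]
        return None
    return PATTERNS_3.get(types)              # three detected lines, or None
```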
Once the actual existence of a microfinger is validated from the detected lines by Line-Type Pattern Matching, the 2D position of the fingertip can be accurately found from these lines. Because the microfinger tip is quite sharp, the y-coordinate of a microfinger tip can be set equal to the y-coordinate of the topmost endpoint of all the lines detected from that microfinger. With the y-coordinate known, the x-coordinate of the tip is computed from the equation of either inner line, L2 or L3.
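The 2D tip computation can be sketched as follows, assuming each line is given by its two endpoints and modeling the near-vertical inner line as x = a*y + b (an implementation assumption):

```python
def fingertip_2d(lines, inner_line):
    """2D fingertip position of a validated microfinger.

    y_tip is the topmost endpoint (smallest image y) over all detected
    lines, since the fingertip is sharp; x_tip is then evaluated on an
    inner line (L2 or L3).
    """
    y_tip = min(min(y1, y2) for (_, y1, _, y2) in lines)
    x1, y1, x2, y2 = inner_line
    a = (x2 - x1) / (y2 - y1)   # fingers are near-vertical, so y2 != y1
    b = x1 - a * y1
    return a * y_tip + b, y_tip
```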
3.1.4 Inclination measurement and depth estimation of the end-effector
Depth estimation of the end-effector means finding the position of the microfinger tip along the z-axis. The z-position of the microfinger tip found at location (x_tip, y_tip) in the AIF image can be directly estimated from the gray value H(x_tip, y_tip) of the pixel at that location in the HEIGHT image using Eq 1. However, the HEIGHT image is very noisy; therefore, more information is required to obtain an accurate z-position of the tip. In this paper, the angle of inclination of the microfinger is utilized to obtain accurate depth information of the fingertip.
Given the positions of the pixels that lie on a line detected from the microfinger in the AIF image, the pixel values in the HEIGHT image at these positions are collected. A line is fitted to the values of 80 pixels along the tip's part of the detected line. The angle of inclination of the fitted line estimates the inclination angle of the microfinger with respect to the object plane. Figure 9 shows the values of the HEIGHT image's pixels along the inner lines of the left and right microfingers. Because of the limited SWING range of the AIF imaging system, only the upper part of the detected line in the AIF image (the tip's part) is used in this fitting process.
The z-coordinate of the fingertip is estimated from the fitted line at (x_tip, y_tip) rather than from the single pixel value H(x_tip, y_tip) in the HEIGHT image. In Fig 9, the ordinate of the rightmost point of the fitted line, at (x_tip, y_tip), relates to the z-coordinate of the tip of the microfinger according to Eq 1. In this sense, the inclination of the microfinger is utilized to suppress noise in the HEIGHT image and estimate accurate depth information of its tip. The inclination angle of the microfinger can also be useful when oriented micromanipulation is required, although the inclination angle is not controlled in the current microhand system.
The inclination angle and depth information can be obtained from either the border lines or the inner lines. However, it is observed that the inner lines are clearer and less broken, especially when the microfinger is in fast motion. For this reason, the inner lines of a microfinger are used to estimate its tip's position along the z-axis. If two inner lines can be found for a microfinger after Line-Type Pattern Matching, the z-position of the fingertip is estimated from the fitted line with the smaller regression error.
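A sketch of this fitting procedure, assuming the inner-line pixel positions are available ordered from the tip downward; numpy.polyfit stands in for whatever least-squares routine the actual system uses:

```python
import numpy as np

def fingertip_depth_um(height_img, inner_line_pixels, swing_um, n_fit=80):
    """Estimate the fingertip z-position (um) by line fitting (cf. Fig 9).

    inner_line_pixels: (x, y) positions along a detected inner line,
    ordered from the tip downward. A line is fitted to the HEIGHT values
    of the n_fit pixels nearest the tip; evaluating the fit at the tip
    rejects the per-pixel noise of the HEIGHT image.
    """
    pts = inner_line_pixels[:n_fit]
    t = np.arange(len(pts))                       # position along the line
    h = np.array([height_img[y, x] for (x, y) in pts], dtype=float)
    slope, intercept = np.polyfit(t, h, 1)        # least-squares line fit
    return intercept / 256.0 * swing_um           # fitted H at the tip, via Eq. 1
```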
Since microfingers and micropipettes are fabricated similarly, by pulling a glass rod or tube, they have similar elongated shapes. Thus, the proposed method can also be applied to measure the 3D position of a micropipette. However, a micropipette may have a less-invasive rounded tip; therefore, the method should be modified to identify the position of the tip in the 2D AIF image. Unlike the tip of a sharp microfinger, the x-coordinate of the rounded tip of a micropipette (pointing in the y-direction) should be determined as the average of the x-coordinates of the upper endpoints of the lines detected on the micropipette.
3.2 Measuring 3D positions of target objects
The AIF imaging system can also be used to find the 3D positions of transparent microobjects. Unlike the tip of a microfinger or of a sharp end-effector, whose position can be characterized by a single point in 3D space, the 3D position of a microobject is characterized by its 3D boundary. Under optical microscopes, it is difficult to reconstruct a 3D model of a transparent microobject. Thus, the contour of the object and its centroid in the AIF image provide its 2D position, and the z-coordinate of the object can be taken as the z-position of its centroid.
Assuming that the object is round and suspended on the glass plate, the contour of the object on the plane that passes through the object's center and is perpendicular to the z-axis can be considered the outermost contour in the 2D AIF image. Under this assumption, Contour-Depth Averaging is proposed to estimate the z-position of the object as

$$z_{object} = \frac{SWING}{256\, n_C}\sum_{(x,y)\in C} H(x, y) \ \ (\mu m) \hspace{2em} (4)$$
where C is the contour, or boundary, of the object in the AIF image and n_C is the number of pixel points on the contour C.
In this paper, a glass microsphere is used as the target object. The microsphere is transparent and satisfies our assumption. Its 2D contour in the AIF image is detected as a circle using the Hough gradient algorithm [13].
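A sketch combining circle detection and Eq 4, using OpenCV's Hough gradient implementation; the Hough parameters and the sampling of the contour at 360 points are illustrative assumptions:

```python
import cv2
import numpy as np

def sphere_z_um(aif_img, height_img, swing_um):
    """Contour-Depth Averaging (Eq. 4) for a transparent microsphere.

    The 2D contour is detected as a circle in the 8-bit AIF image with
    the Hough gradient method; the z-position is the SWING-scaled mean
    HEIGHT value along that circle.
    """
    circles = cv2.HoughCircles(aif_img, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                               param1=100, param2=30, minRadius=10, maxRadius=200)
    if circles is None:
        return None
    cx, cy, r = circles[0][0]
    # Sample n_C points on the contour C and average H(x, y) over them.
    theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
    xs = np.clip((cx + r * np.cos(theta)).astype(int), 0, height_img.shape[1] - 1)
    ys = np.clip((cy + r * np.sin(theta)).astype(int), 0, height_img.shape[0] - 1)
    return height_img[ys, xs].mean() / 256.0 * swing_um
```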
4 Experimental methods
The performance of the AIF system depends on the parameters SWING and FRAME. Adjusting the parameter FRAME trades off the resolution (Eq 2) against the frame rate of AIF imaging (Eq 3). The resolution of AIF imaging is also determined by the scanning range SWING of the AIF imaging system (Eq 2).
In the experiment, the values of these parameters are SWING = 80 μm and FRAME = 2. These settings achieve an adequate AIF imaging resolution of d ≈ 1.3 μm for objects of different sizes within the scanning range of 80 μm; however, the frame rate of AIF imaging is reduced to 15 frames per second.
Fig 10 Intensity histogram of pixels on the circle
around a microsphere in HEIGHT image
The AIF imaging system is integrated into an Olympus IX81 inverted microscope under transmitted-light bright-field observation mode. An Olympus LUCPlanFLN 20×/0.45 NA Ph1 objective lens is used to achieve comfortable visualization of microobjects of different sizes in the desired range from 10 μm to 100 μm.
4.1 Accuracy assessment of depth measurement
In order to evaluate the effectiveness of the AIF imaging system, it is necessary to assess the accuracy of the depth estimation, i.e., the measurement of the z-positions of both the end-effector tip and the target object.
4.1.1 Depth measurement of the target object
Figure 10 shows the histogram of the gray values of the pixels on the circular contour around a 55 μm microsphere in the HEIGHT image. Most of the pixels (88%) have gray values between 119 and 127. The standard deviation of these pixel values is about 4.0, which corresponds to about 1.24 μm, roughly the resolution of the AIF imaging system at the chosen settings. Therefore, the average gray value of all pixels along the detected circle in the HEIGHT image can be used to find the z-coordinate of the center of that microsphere using Eq 4.
In order to evaluate the linearity of the measurement against the z-position of the object, a microsphere was moved 60 μm in the z-direction with a step distance of 2 μm. The plot of the measured z-position of the microsphere versus its displacement is shown in Fig 11; high linearity can be observed from the dotted trend line.
4.1.2 Depth measurement of the microhand
A linear displacement of 30 μm in the z-direction was commanded to the microhand, and the measured z-position of the moving microhand is shown in Fig 12. Good linearity of the measured data can also be observed from the trend lines.
Fig 11 Measured z-position of a microsphere
Fig 12 Measured z-position of left microfinger f1 and right microfinger f2
4.3 Pick-and-place of different-sized microspheres
As an application of the AIF imaging system, a pick-and-place task was performed on single microspheres using the two-fingered microhand [6]. The microspheres were suspended in water on a glass plate to resemble biological cells in their culture medium. The 3D positions of the two microfingers of the microhand and of a microsphere, estimated by the AIF imaging system, were used to automate the pick-and-place task.
Because the microhand was developed to have multi-scale manipulability, microspheres of 96 μm, 55 μm, and 20 μm in diameter were used. This is also the size range of the objects we are currently interested in; for example, the lung epithelial cells whose stiffness was measured in [8] were about 20 μm in diameter.
In this experiment, the microhand is placed over 100 μm from a target microsphere in the 2D object plane. It is manually brought to about the same z-level as the microsphere and coarsely focused so that both the microhand and the target object are within the scanning range of the AIF imaging system. After this initial setup (Fig 13a), the positions of the two fingertips are calculated and an automatic z-alignment is performed by moving the right microfinger to the z-level of the left microfinger (Fig 13b). A cycle of the pick-and-place task is then performed on the target microsphere as follows.
Fig 13 (a) Initial setup (b) After automatic
z-alignment A cycle of pick-and-place: (c)
Approach, (d) pick-up, (e) transport, (f)
release target
Step 1: The position of the microsphere is calculated and the two fingers are automatically opened about 5 μm wider than the microsphere's diameter. The microhand then approaches the microsphere so that the microsphere lies between the two microfingers (Fig 13c).
Step 2: The microsphere is grasped by closing the right microfinger until the distance between the two microfingers is about 5 μm less than the microsphere's diameter, to hold it firmly. Microbiological objects grasped in this way may deform slightly, but they should not be damaged by this slight deformation. The microsphere is then picked up by a distance Δz that is about the object's diameter (Fig 13d).
Step 3: The microsphere is transported 100 μm in the x-direction, away from its original position (Fig 13e).
Step 4: The microsphere is moved down by the same distance Δz by the microhand and is released (Fig 13f).
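The four steps above can be sketched as a control sequence. The following is a minimal illustration against a hypothetical hand-controller API: the object hand and its methods open_to, close_to, move_to, and move_rel are assumptions, not the actual interface of the microhand system.

```python
def pick_and_place_cycle(hand, sphere_d_um, sphere_xyz, dx_um=100.0):
    """One pick-and-place cycle following Steps 1-4 (hypothetical API)."""
    margin = 5.0                         # grasp margin (um), per Steps 1 and 2
    dz = sphere_d_um                     # lift height ~ one object diameter
    hand.open_to(sphere_d_um + margin)   # Step 1: open ~5 um wider than object
    hand.move_to(sphere_xyz)             #   approach; object between fingers
    hand.close_to(sphere_d_um - margin)  # Step 2: grasp ~5 um narrower, then
    hand.move_rel(dz=dz)                 #   pick up by about one diameter
    hand.move_rel(dx=dx_um)              # Step 3: transport 100 um in x
    hand.move_rel(dz=-dz)                # Step 4: move back down and
    hand.open_to(sphere_d_um + margin)   #   release the target
```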
5 Results and discussion
5.1 Real-time tracking of the microhand
The microhand was tracked over 500 image frames in this experiment. The success rate was about 93.2%. The average computation time for searching for the microhand was about 14.5 ms, and the tracking frame rate was about 21 frames per second; thus, real-time tracking was achieved.
During tracking, the performance of LTPM was also recorded. In detecting the two microfingers in 500 successive AIF images, repeated 20 times, only 3 lines were found in about 58% of the cases, and about 93% of those cases matched the line-type patterns shown in Table 2.
Although the detection of a fast-moving transparent microobject is not within the scope of this paper, the microhand was moved at the highest speed of the system, which is limited to 100 μm/s. If the microhand moves faster, the success rate of real-time tracking may decrease dramatically due to the hardware limitations of the AIF imaging system.
5.2 Pick-and-place of different-sized microspheres
Table 3 shows the success rates of the pick-and-place experiment with different-sized microspheres over 20 trials. The success rate decreased for smaller objects.
It was observed that smaller objects adhered more strongly to the microfinger and were more difficult to release. In addition, the AIF imaging system was set up with a scanning range SWING = 80 μm appropriate for objects of different sizes. With FRAME = 2, the resolution of the system was about 1.3 μm, which may not be suitable for a small, perfectly spherical object such as a 20 μm microsphere. Since the experiment was performed to evaluate the method of obtaining 3D information from the AIF imaging system, no treatment was applied to the microfingers to overcome the adhesion problem, which might have contributed to the decrease in the success rate.
The decrease in the success rate might also be attributed to the vibration generated by the piezo actuator when grasping smaller microspheres: a microsphere can slide out of the two microfingers while being grasped if large vibration occurs. In the case of grasping a biological cell, vibration may not matter much at the grasping step, since cells are generally adhesive; however, releasing a cell will be more difficult. Using one fingertip to push a cell that has adhered to the other microfinger may help to release the cell successfully.
Table 3 Pick-and-place performance for microspheres of different sizes

Microsphere diameter   96 μm   55 μm   20 μm
Success rate           90%     80%     74%
Although a trade-off between the accuracy and the scanning frequency of AIF imaging is made through the parameter FRAME, better piezo actuators with less vibration and a higher scanning frequency may improve the accuracy as well as the real-time performance of the system. The success rate of the pick-and-place task can also be increased with a better experimental setup that reduces vibration, and by feeding back the object's size to adaptively change the parameter SWING for higher resolution, and hence accuracy, of AIF imaging.
In this experiment, the diameter of the smallest microsphere was 20 μm, and the z-resolution of the AIF imaging system may be large compared with this size. To achieve a higher success rate in the pick-and-place of smaller microobjects such as 20 μm microspheres, the parameter SWING should be adjusted to improve the AIF resolution according to the detected size of the target object before handling it. The resolution of AIF imaging can also be improved by increasing the value of the parameter FRAME; however, this adjustment lowers the frame rate and directly affects the real-time performance of AIF imaging.
6 Conclusion
This paper presents the AIF imaging system, which is used to extend the depth of focus when observing microobjects and also provides 3D information about the microobjects being observed. 3D position measuring techniques have been proposed for both the end-effector and the target object so that the handling of microobjects can be automated.
As a potential tool for micromanipulation, a two-fingered microhand was used in the experiments. Line-Type Pattern Matching was proposed to detect the 3D positions of the tips of its microfingers.
Multisized microspheres were used as target objects in the pick-and-place experiment, and their z-coordinates could be estimated with Contour-Depth Averaging.
As AIF observation of microobjects and their 3D information can be obtained in real time, an automated micromanipulation system for potential real-time microrobotic applications can be developed by integrating the AIF imaging system with a micromanipulation system such as the dexterous two-fingered microhand.
References
[1] Groen FC, Young IT, Ligthart G: A Comparison of Different Focus Functions for Use in Autofocus Algorithms. Cytometry, vol 6, no 2, pp 81–91, 1985.
[2] Sun Y, Duthaler S, Nelson BJ: Autofocusing in Computer Microscopy: Selecting the Optimal Focus Algorithm. Microscopy Research and Technique, vol 65, no 3, pp 139–149, 2004.
[3] Mateos-Pérez JM, Redondo R, Nava R, Valdiviezo JC, Cristóbal G, Escalante-Ramírez B, Ruiz-Serrano MJ, Pascau J, Desco M: Comparative Evaluation of Autofocus Algorithms for a Real-Time System for Automatic Detection of Mycobacterium Tuberculosis. Cytometry, vol 81A, no 3, pp 213–221, 2012.
[4] Ohba K, Ortega C, Tanie K, Rin G, Dangi R, Takei Y, Kaneko T, Kawahara N: Real-Time Micro Observation Technique for Tele-Micro-Operation. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, vol 1, pp 647–652, 2000.
[5] Ohara K, Ohba K, Tanikawa T, Hiraki M, Wakatsuki S, Mizukawa M: Hands Free Micro Operation for Protein Crystal Analysis. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, vol 2, pp 1728–1733, 2004.
[6] Avci E, Ohara K, Takubo T, Mae Y, Arai T: A New Multi-Scale Micromanipulation System with Dexterous Motion. In: International Symposium on Micro-NanoMechatronics and Human Science, pp 444–449, 2009.
[7] Inoue K, Tanikawa T, Arai T: Micro-Manipulation System with a Two-Fingered Micro-Hand and Its Potential Application in Bioscience. Journal of Biotechnology, vol 133, no 2, pp 219–224, 2008.
[8] Kawakami D, Ohara K, Takubo T, Mae Y, Ichikawa A, Tanikawa T, Arai T: Cell Stiffness Measurement Using Two-Fingered Microhand. In: IEEE International Conference on Robotics and Biomimetics (ROBIO), pp 1019–1024, 2010.
[9] Inoue K, Nishi D, Takubo T, Arai T: Measurement of Mechanical Properties of Living Cells Using Micro Fingers and AFM. In: International Symposium on Micro-NanoMechatronics and Human Science, pp 1–6, 2006.
[10] Ohba K, Ortega JCP, Tanie K, Tsuji M, Yamada S: Microscopic Vision System with All-In-Focus and Depth Images. Machine Vision and Applications, vol 15, no 2, pp 55–62, 2003.
[11] Boissenin M, Wedekind J, Selvan AN, Amavasai BP, Caparrelli F, Travis JR: Computer Vision Methods for Optical Microscopes. Image and Vision Computing, vol 25, no 7, pp 1107–1116, 2007.
[12] Jain R, Kasturi R, Schunck BG: Machine Vision. McGraw-Hill, New York, 1995.
[13] O'Gorman F, Clowes MB: Finding Picture Edges Through Collinearity of Feature Points. In: Proceedings of the 3rd International Joint Conference on Artificial Intelligence, pp 543–555, 1973.