Specific technical advice includes the following:
1. A system with built-in climate control may avoid maintenance problems in certain applications.
2. Avoid requiring unnecessary peripheral equipment to be included in the system; this will just complicate the application.
3. Define system interface requirements fully.
4. Avoid applications that require extended lengths of cable.
5. If possible, incorporate a manual mode to exercise the system for one full cycle to allow an easy test mode for servicing.
Expect that the vendor knows the process involved so he or she can make independent assessments of variables and reflect an awareness of the environment. Expect that the vendor will provide training, documentation, and technical support after as well as before installation.
Vendors should recognize that the application of machine vision technology is a learning experience for the user; it can lead to new expectations for the equipment, especially when new knowledge about the production process itself emerges as a consequence of observations that such machine vision equipment makes possible for the first time.
Recognize that software is not a "Band-Aid" for otherwise poor staging designs. As a last piece of advice, one user panelist suggested, "Never trust a machine vision vendor that uses the phrase 'piece of cake.'"
14—
Alternatives to Machine Vision
14.1—
Laser-Based Triangulation Techniques
These sensors (Figure 14.1) project a finely focused laser spot of light onto the part surface. As the light strikes the surface, a lens in the sensor images the point of intersection onto a solid-state array camera. Any deviation from the initial referenced point can be measured based on the number of sensor elements by which the imaged spot has shifted. Accuracy is a function of standoff distance and range. Figure 14.2 depicts an integrated system performing both 2-D and 3-D measurements using sensor data based on laser triangulation principles.
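The pixel-shift-to-height conversion described above can be sketched as a short calculation. The pixel pitch, magnification, and triangulation angle below are hypothetical values chosen for illustration, not figures from any particular sensor:

```python
import math

def height_deviation(pixel_shift, pixel_pitch_mm, magnification, tri_angle_deg):
    """Convert a spot shift on the detector (in sensor elements) to a height
    deviation on the part surface, for a simple triangulation geometry.

    pixel_shift    -- number of sensor elements the imaged spot moved
    pixel_pitch_mm -- center-to-center spacing of sensor elements (mm)
    magnification  -- optical magnification of the imaging lens
    tri_angle_deg  -- angle between projection and viewing axes (degrees)
    """
    shift_on_detector = pixel_shift * pixel_pitch_mm   # mm in the image plane
    shift_on_part = shift_on_detector / magnification  # mm on the surface
    return shift_on_part / math.tan(math.radians(tri_angle_deg))

# 12-pixel shift, 0.013 mm pitch, 2x magnification, 30-degree angle -> ~0.135 mm
dz = height_deviation(12, 0.013, 2.0, 30.0)
```

The standoff-versus-accuracy trade-off in the text shows up here directly: a shallower triangulation angle (longer standoff) makes tan small and each pixel of shift worth more height, reducing resolution.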
These techniques can be extended to making contour measurements (Figure 14.3). In this case, light sections, or structured light sheets, are projected onto the object. The behavior of the light pattern is a function of the contour of the object. When viewed, the image of the line takes on the shape of the surface, and a measurement of that contour is made. Again, a referenced position is measured, and deviations from the referenced position are calculated based on triangulation techniques. Determination of the normal-to-surface vectors, the radius of curvature,
and the distance from the apex to the sensor (range) can be made in a single measurement.

Figure 14.1 Laser-based triangulation technique.
Arrangements of multiples of such units can be configured to accommodate virtually any combination of shapes and sizes.
14.2—
Simple Photoelectric Vision
Optical methods can be used to provide edge guidance, typically associated with opaque web products (paper, rubber, etc.). Two photoelectric "scanners" are used, one above and one below the web. Each scanner includes an emitter and receiver arranged so that when the two units are in operation, each receiver sees light from the other's emitter. By phase-locking techniques, the two beams developed can provide edge-guidance feedback.
14.3—
Linear Diode Arrays
An alternate approach is to use two linear diode arrays positioned at the edges (Figure 14.4). Differences in edge locations are simultaneously detected and used to determine edge positional offset.
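A minimal sketch of this edge-offset scheme: each array scan is thresholded to find where the opaque web begins, and the difference between the two edge indices gives the offset. The scan values and threshold are invented for illustration:

```python
def edge_index(scan, threshold=128):
    """Index of the first pixel darker than threshold, i.e., where the
    opaque web begins to block this linear array."""
    for i, value in enumerate(scan):
        if value < threshold:
            return i
    return None  # edge not seen by this array

def edge_offset(scan_a, scan_b, threshold=128):
    """Positional offset between the edges seen by the two arrays."""
    a, b = edge_index(scan_a, threshold), edge_index(scan_b, threshold)
    if a is None or b is None:
        return None
    return a - b

# Two simulated 10-element scans: bright background, dark web
left  = [200] * 4 + [50] * 6   # edge at pixel 4
right = [200] * 6 + [50] * 4   # edge at pixel 6
offset = edge_offset(left, right)   # -2 pixels
```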
Linear array cameras are well suited to making measurements on objects in motion, both perpendicular to and along the line of travel. Perpendicular measurements are derived by pixel-counting techniques. The resolution of measurement along the axis of travel is determined by the scan rate of the system. A higher resolution can be achieved by increasing the frequency of data gathering and by increasing the number of pixels in the array.

Figure 14.2 System offered by CyberOptics that employs laser triangulation principles to make dimensional measurements.

Figure 14.3 Depiction of light-sectioning principles.

Figure 14.4 Gauging with linear array cameras.
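The two resolution relationships above reduce to simple ratios. The field of view, line speed, and scan rate below are hypothetical example figures:

```python
def cross_resolution_mm(field_of_view_mm, pixels):
    """Resolution perpendicular to travel: field of view per array element."""
    return field_of_view_mm / pixels

def along_resolution_mm(line_speed_mm_s, scan_rate_hz):
    """Resolution along the axis of travel: distance the part moves per scan."""
    return line_speed_mm_s / scan_rate_hz

# 2048-element array over a 100 mm field; web moving 500 mm/s, scanned at 10 kHz
cross = cross_resolution_mm(100.0, 2048)     # ~0.049 mm per pixel
along = along_resolution_mm(500.0, 10_000)   # 0.05 mm per scan
```

Doubling the scan rate halves the along-travel resolution cell, which matches the text's point that faster data gathering (or more pixels) buys resolution.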
Another application for which linear arrays are well suited is pattern recognition to control the amount of spray material released. In these systems the array scans the product as it passes on the conveyor. The image data associated with the object's extremities are stored and fed back to the spray mechanism to control the spray pattern. This is especially useful where different sizes and shapes are commingled on the conveyor.
14.4—
Fiber-Optic Arrangements
Fibers within a bundle can be custom arranged for specific applications (Figure 14.5). For example, to detect the presence of the edge of a moving web and to control its position, versions with three bundles can be used. Using this arrangement and a special photoelectric switch with one emitter and two receptors, two relay outputs can be obtained, capable of controlling web width and position.
2. intercepted by the object and scattered or specularly reflected by it, and
3. evident at the surface of the object at the incidence point on the line.
These techniques can be used to make measurements, check for presence and absence, and assess surface quality (e.g., pits, scratches, pinholes, dents, distortions, and striations).
In the case of making measurements, a typical laser gaging system (Figure 14.6) uses a rotating mirror to scan the laser across the part. The beam is converged by a lens into a series of highly parallel rays arranged to intercept the part being measured. A receiver unit focuses the scanning rays onto a sensor. Because the speed of the scanning mirror is controlled, the time the photodetector "sees" the part shadow can be accurately related to the dimensional characteristic of the part presented by the shadow.
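Since the beam sweeps at a controlled, known velocity, the shadow time maps directly to a dimension. The sweep velocity and shadow time below are hypothetical example values:

```python
def shadow_dimension_mm(scan_velocity_mm_s, shadow_time_s):
    """Part dimension from the time the photodetector is in shadow,
    assuming the collimated beam sweeps at a constant, known velocity."""
    return scan_velocity_mm_s * shadow_time_s

# Beam sweeping at 2.5e6 mm/s, shadowed for 4 microseconds -> 10 mm part
d = shadow_dimension_mm(2.5e6, 4e-6)
```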
In the case of surface characterization, a similar laser scanner arrangement projects light across the object. By positioning the photodetector properly, only light scattered by a blemish will be detected. Analysis of the amplitude and shape of the signal can, in some cases, provide characterization of the flaw as well as size discrimination.
14.6—
Laser Interferometer
Interferometers function by dividing a light beam into two or more parts that travel different paths and then recombine to form interference fringes. The shape of the interference fringes is determined by the difference in optical path traveled by the recombined beams. Interferometers measure the difference in optical paths in units of the wavelength of light.
Figure 14.6 Principles of laser-gauging approach to dimensional measurements.
Since the optical path is the product of the geometric path and the refractive index, an interferometer measures the difference in geometric path when the beams traverse the same medium, or the difference in refractive index when the geometric paths are equal. An interferometer can measure three quantities:
1. Difference in optical path,
2. Difference in geometric path, and
3. Difference in refractive index.
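Because the measurement is in units of wavelength, counting fringes converts directly to displacement. The sketch below assumes a double-pass (Michelson-style) geometry, where each fringe corresponds to half a wavelength of travel; the HeNe wavelength is standard, the fringe count is an example:

```python
def displacement_nm(fringe_count, wavelength_nm=632.8):
    """Displacement from counted fringes in a double-pass laser
    interferometer: distance = N * lambda / 2, since the beam travels
    to the retroreflector and back."""
    return fringe_count * wavelength_nm / 2.0

# 1000 fringes of HeNe light (632.8 nm) correspond to 316,400 nm of travel
d = displacement_nm(1000)
```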
Laser interferometers are used to perform in-process gaging on machine tools. The laser is directed parallel to the axis of the machine toward a combination 90° beam bender and remote interferometer cube. The beam bender-Z-interferometer is rigidly attached to the Z-axis slide and redirects the optical beam path parallel to the X-axis and toward the cutting position at the tool turret.

The beam is thus directed at a retroreflector attached to the moving element of a turret-mounted mechanical gage head. The actual measured distance is between the retroreflector on the gage head and the interferometer.
14.7—
Electro-Optical Speed Measurements
Laser velocimeters exist for the noncontact measurement of the speed of objects, webs, and so on. Some of these are based on the Doppler effect. In these cases, a beam splitter breaks the laser beam into two identical beams that are directed onto the surface of the object at slightly different angles with regard to the direction of motion.

Both beams are aligned to meet at the same point on the object's surface. The frequency of the reflected light beam is shifted, compared to the frequency of the original light, by the movement of the object. The shifted frequencies are superimposed so that a low-frequency beat (interference fringe pattern) is produced that is proportional to the speed of the moving object.
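For the dual-beam arrangement described, the beat frequency scales with speed through the fringe spacing of the crossed beams: v = f_beat · λ / (2 sin(θ/2)), with θ the full angle between the beams. The wavelength is the standard HeNe line; the crossing angle and beat frequency are example values:

```python
import math

def surface_speed_m_s(beat_freq_hz, wavelength_m, half_angle_deg):
    """Speed from the Doppler beat frequency for two beams crossing at a
    full angle of 2 * half_angle_deg: v = f_beat * lambda / (2 sin(theta/2))."""
    fringe_spacing_m = wavelength_m / (2.0 * math.sin(math.radians(half_angle_deg)))
    return beat_freq_hz * fringe_spacing_m

# HeNe laser (632.8 nm), beams crossing at 10 degrees (5-degree half angle),
# 1 MHz beat frequency -> roughly 3.63 m/s
v = surface_speed_m_s(1.0e6, 632.8e-9, 5.0)
```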
14.8—
Ultrasonics
Ultrasonic testing equipment beams high-frequency sound (1–10 MHz) into material to locate surface and subsurface flaws. The sound waves are reflected at such discontinuities, and these reflected signals can be observed on a CRT to disclose internal flaws in the material.

Cracks, laminations, shrinkage cavities, bursts, flakes, pores, bonding faults, and other breaks can be detected even when deep in the material. Ultrasound techniques can also be used to measure thickness or changes in thickness of materials. Such measurements are useful in finding defects in rod, wire, and tubing.
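Pulse-echo thickness gaging follows from the round-trip travel time: the pulse crosses the material twice, so thickness = velocity × time / 2. The sound velocity below is a typical textbook figure for steel; the echo time is an example:

```python
def thickness_mm(velocity_m_s, round_trip_time_s):
    """Material thickness from a pulse-echo round-trip time: the pulse
    travels down to the back wall and returns, so divide by two."""
    return velocity_m_s * round_trip_time_s / 2.0 * 1000.0  # meters -> mm

# Longitudinal velocity in steel ~5900 m/s, echo after 3.4 us -> ~10 mm plate
t = thickness_mm(5900.0, 3.4e-6)
```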
14.10—
Acoustics
Acoustic approaches based on pulse-echo techniques (Figure 14.7), where emitted sound waves reflected from objects are detected, can be used for part presence detection, distance ranging, and shape recognition. In the case of part presence, if the part is present, there is a return signal to a detector. In the case of ranging, the sensors detect the time of flight between an emitted pulse of acoustic energy and the received pulse reflected from an object.
Figure 14.7 Acoustic-based pattern recognition.
In shape recognition, the system uses sound waves of a fixed frequency, usually 20 or 40 kHz. The fixed-frequency sound wave reflects off objects and sets up an interference pattern as the waves interfere constructively and destructively. An array of ultrasonic transducers senses the acoustic field set up by the emitter at a number of distinct locations, typically eight. Pattern recognition algorithms deduce whether the shape is the same as a previously taught shape by comparing interference patterns.
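The comparison step can be sketched as matching an eight-element vector of transducer readings against previously taught templates. The source does not specify the algorithm; a sum-of-squared-differences nearest-template rule is assumed here, and the shape names, amplitudes, and tolerance are invented for illustration:

```python
def match_shape(pattern, taught, tolerance=0.5):
    """Compare an 8-element acoustic interference pattern against taught
    templates; return the best-matching shape name, or None if no taught
    pattern is within tolerance (sum of squared differences)."""
    best_name, best_err = None, None
    for name, template in taught.items():
        err = sum((p - t) ** 2 for p, t in zip(pattern, template))
        if best_err is None or err < best_err:
            best_name, best_err = name, err
    return best_name if best_err is not None and best_err <= tolerance else None

taught = {
    "disk":    [0.9, 0.8, 0.7, 0.8, 0.9, 0.8, 0.7, 0.8],
    "bracket": [0.2, 0.9, 0.3, 0.8, 0.2, 0.9, 0.3, 0.8],
}
shape = match_shape([0.88, 0.79, 0.72, 0.81, 0.9, 0.78, 0.69, 0.8], taught)
```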
14.11—
Touch-Sensitive Probes
Touch-sensitive probes employ some type of sensitive electrical contact that can detect deflection of the probe tip from a home position and provide a voltage signal proportional to the deflection.
When such probes are mounted on a machine, the electrical signal corresponding to probe deflection can be transmitted to a control system. In this manner they can serve as a means to determine where and when the workpiece has been contacted. By comparing the actual touch location with the programmed location in the part program, dimensional differences can be determined.
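The final comparison is a per-axis subtraction of programmed from actual touch coordinates. The coordinates below are hypothetical example values in millimeters:

```python
def dimensional_error(programmed_xyz, touched_xyz):
    """Per-axis difference between where the part program expected
    contact and where the probe actually touched."""
    return tuple(a - p for p, a in zip(programmed_xyz, touched_xyz))

# Programmed touch at X=50.000 mm, actual at X=50.012 mm -> +0.012 mm error in X
err = dimensional_error((50.0, 20.0, 5.0), (50.012, 20.0, 4.998))
```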
Accuracy
Extent to which a machine vision system can correctly interpret an image, generally expressed as a percentage to reflect the likelihood of a correct interpretation; the degree to which the arithmetic average of a group of measurements conforms to the actual value or dimension.
ACRONYM
Model-based vision technique developed at Stanford University that uses invariant and pseudoinvariant features predicted from the given object models; the object is modeled by its subparts and their spatial relationships.
Active Illumination
Illumination that can be varied automatically to extract more visual information from the scene; for example, by turning lamps on and off, by adjusting brightness, by projecting a pattern on objects in the scene, or by changing the color of the illumination.
(1) Angle formed between two lines drawn from the most widely separated points in the object plane to the center of the lens. (2) Angle between the axis of observation and the perpendicular to the specimen surface.
Aperture
Opening that will pass light; the effective diameter of the lens that controls the amount of light passing through a lens and reaching the image plane.
Area Analysis
Process of determining the area of a given view that falls within a specified gray level.
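As a minimal sketch of area analysis, the "area" is simply a count of pixels whose gray level falls inside a specified band. The image values and band limits are invented for illustration:

```python
def area_within(image, lo, hi):
    """Count pixels whose gray level falls in [lo, hi] -- the 'area'
    of the view occupied by that gray-level band."""
    return sum(1 for row in image for px in row if lo <= px <= hi)

image = [
    [10, 200, 200],
    [10, 200,  10],
    [10,  10,  10],
]
bright_area = area_within(image, 150, 255)   # 3 pixels
```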
Area Diode Array
Solid-state video detector that consists of rows and columns of light-sensitive semiconductors. Sometimes referred to as a matrix array.
Array Processor
Programmable computer peripheral, based on specialized circuit designs, that relieves the host computer of high-speed number-crunching calculations by simultaneously performing operations on a portion of the items in large arrays.
Artificial Intelligence
Approach in computers that emphasizes symbolic processes for representing and manipulating knowledge in solving problems. This gives a computer the ability to perform certain complex functions normally associated with human intelligence, such as judgment, pattern recognition, understanding, learning, planning, classifying, reasoning, self-correction, and problem solving.
Aspect Ratio
Ratio of width to height for the frame of a televised picture. The U.S. standard is 4:3. Also, the value obtained when the larger scene dimension is divided by the smaller scene dimension; e.g., if a part measures 9 × 5 in., the aspect ratio is 9 divided by 5, or 1.8.
Automatic Gain Control
Camera circuit by which gain is automatically adjusted as a function of input or another specified parameter.
Automatic Light Control
Television camera circuit by which the illumination incident upon the face of a pickup device is automatically adjusted as a function of scene brightness.
Automatic Light Range
Television camera circuit that ensures maximum camera sensitivity at the lowest possible light level as well as provides an extended dynamic operating range from bright sun to low light.
Automatic Vision Inspection
Technology that couples video cameras and computers to inspect various items or parts for a variety of reasons. The part to be inspected is positioned in a camera's field of view. The part's image is first digitized by the computer and then stored in the computer's memory. Significant features of the stored image are then "compared" with the same features of a known good part that has previously been placed in the computer's memory. Any difference between the corresponding characteristics of the two parts will be either within a tolerance, and hence good, or out of tolerance, and therefore bad. Also see Computer Vision and Machine Vision.
B
Back Focal Distance
Distance from the rearmost element in a lens to the focal plane.
Barrel Distortion
Effect that makes an image appear to bulge outward on all sides like a barrel. Caused by a decrease in effective magnification as points in the image move away from the image center.
Bayes Decision Rule
One that treats the units assigned by a decision rule independently and assigns a unit u having pattern measurements or features d to the category c whose conditional probability P(c | d), given measurement d, is highest.
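A toy sketch of the rule: since the evidence term P(d) is common to all categories, the argmax of P(c | d) can be computed from P(d | c) · P(c). The class names, priors, and class-conditional densities below are entirely hypothetical:

```python
def bayes_assign(measurement, priors, likelihoods):
    """Assign the category c maximizing P(c | d), computed from
    P(d | c) * P(c); the shared evidence term cancels in the argmax."""
    return max(priors, key=lambda c: likelihoods[c](measurement) * priors[c])

# Toy one-dimensional example with invented class-conditional densities
priors = {"good": 0.9, "bad": 0.1}
likelihoods = {
    "good": lambda d: 1.0 if d < 5 else 0.1,
    "bad":  lambda d: 0.05 if d < 5 else 0.8,
}
label = bayes_assign(7.0, priors, likelihoods)   # 0.1*0.9 = 0.09 vs 0.8*0.1 = 0.08
```

Note how the strong prior for "good" outweighs the higher likelihood of "bad" at d = 7; with equal priors the decision flips.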
Bit (Binary Digit)
Smallest unit of information that can be stored and processed by a computer. In image processing the quantized image brightness at a specific pixel site is represented by a sequence of bits.
Boolean Algebra
Process of reasoning, or a deductive system of theorems, using a symbolic logic and dealing with classes, propositions, or on-off circuit elements such as AND, OR, NOT, EXCEPT, IF, THEN, etc., to permit mathematical calculations.
Bottom-up Processing
Image analysis approach based on sequential processing and control, starting with the input image and terminating in an interpretation.
Boundary
Line formed by the adjacency of two image regions, each having a different light intensity.
Boundary Tracking (Tracing)
Process that follows the edges of blobs to determine their complete outlines.
Brightness
Total amount of light per unit area; the same as luminance.
Brightness Sliding
Image enhancement operation that involves the addition or subtraction of a constant brightness for all pixels in an image.
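Brightness sliding is a per-pixel constant offset; the only subtlety is clamping the result to the valid gray-level range. This sketch assumes 8-bit (0–255) pixel values:

```python
def slide_brightness(image, offset):
    """Add a constant offset to every pixel, clamping to the 0-255 range."""
    return [[max(0, min(255, px + offset)) for px in row] for row in image]

brighter = slide_brightness([[0, 100, 250]], 10)   # [[10, 110, 255]]
darker = slide_brightness([[5, 100]], -20)         # [[0, 80]]
```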
Burned-In Image (Burn)
Image that persists in a fixed position in the output signal of a camera tube after the camera has been turned to a different scene.
C

Centroid
Center; in the case of a two-dimensional object, the average X and Y coordinates.
Chain Code (Chain Encoding)
Method of specifying a curve by a sequence of 3-bit (or more) direction numbers; e.g., a starting point (X, Y) and the sequence of 3-bit integers (values 0–7) specifying the direction to the next point on the curve.
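The definition above can be sketched directly: successive lattice points on the curve are encoded as 8-direction numbers. The direction numbering below follows the common Freeman convention (0 = east, counterclockwise in 45° steps), which is one standard assignment, not necessarily the one the glossary's source used:

```python
# 8-direction chain code: 0 = east, then counterclockwise in 45-degree steps
DIRECTIONS = {
    (1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
    (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7,
}

def chain_code(points):
    """Encode a curve given as successive (x, y) lattice points: returns the
    start point plus a sequence of 3-bit direction numbers (values 0-7)."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        codes.append(DIRECTIONS[(x1 - x0, y1 - y0)])
    return points[0], codes

start, codes = chain_code([(2, 3), (3, 3), (3, 4), (2, 5), (1, 5)])
# start == (2, 3); codes == [0, 2, 3, 4]
```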