THE TECHNOLOGY COMPONENT OF INFORMATION SYSTEMS

Part of the document Introduction to Information Systems 3e, Wallace (pages 96-128)

This chapter provides an overview of these three important pieces, showing how the parts fit together and why they sometimes don’t fit together as well as one might hope.

In the online decision-making simulation called “Devil’s Canyon,” you will learn about a new mountain resort and the dreams that the young team of entrepreneurs have for it.

You will also hear about their challenges as they try to think through what hardware, software, and telecommunications will be the best choices for them. Then you will take on the responsibility for choosing those components. What hardware will they need? Should the resort go with cloud computing or build its own data center? What software should they implement? Should they buy smartphones for the staff? Should they invest in webcams along the slopes so customers can purchase videos of their ski runs? Costs matter, of course, and you have a budget to work with. But so does making sure all the parts fit together. The choices should also support the team’s business objectives and add competitive advantage so the new resort is a smash hit.

Hardware, software, and telecommunications work together to create the enterprise architecture (EA). For a new company like Devil’s Canyon, the EA is a guide on what to purchase and install, how long it will take, how everything will work together, why certain decisions were made, and what it will cost. For existing organizations, the EA also describes the current situation and how the EA should be changed or upgraded to support the mission, focusing especially on business strategy and the technology required to achieve it. A roadmap describing how to get from the present to that future state guides decision making about technology directions. The EA helps managers navigate through all the choices as they add new information systems and retire older ones.

Chapter 3: Information and Communications Technologies: The Enterprise Architecture

Introduction

An online, interactive decision-making simulation that reinforces chapter contents and uses key terms in context can be found in MyMISLab™.

MyMISLab™
• Online Simulation: Devil’s Canyon: A Role-Playing Simulation on Enterprise Architecture for a Mountain Resort
• Discussion Questions: #3-1, #3-2, #3-3
• Writing Assignments: #3-11, #3-12

The knowledge you gain from this chapter will help you understand what the technology options are and how they apply to organizations. People in all parts of an organization have a role to play in designing the enterprise architecture, and input from all stakeholders is needed to ensure the technology matches the business strategy.

enterprise architecture (EA): a roadmap created by an organization to describe its current situation and where it should head to achieve its mission, focusing on business strategy and the technology infrastructure required to achieve it.

Source: IM_photo/Shutterstock.

The Hardware

The physical basis of information systems covers an immense range of hardware, from mainframes and servers in giant data centers to robots, microprocessors, smartphones, printers, scanners, digital cameras, sensors, smart cards, and much more. These devices generally share two important features. First, they are digital, so they all process information using the same binary language of zeroes and ones. Second, they can all be considered computers or computer components.

The computer is any electronic device that can accept, manipulate, store, and output data and whose instructions can be programmed. That definition covers equipment you might not ordinarily think of as a computer, such as the smartphone, a game console, or a robotic rat with cameras for eyes and highly sensitive wire whiskers.

Times have changed since 1946, when the world marveled at ENIAC, the first electronic computer (Figure 3-1). Weighing 27 tons, that giant was 24 meters (80 feet) long and contained more than 17,000 vacuum tubes. With every breakthrough and each succeeding generation, the overarching goal is to make technology work for human beings, making the components smaller, less expensive, less power hungry, and considerably more intelligent.

Although the details vary considerably, computers typically have four components: input, output, processing, and storage (Figure 3-2).

Input and Output

Figure 3-3 includes various input devices that accept signals and convert them to a digital format that matches the signal’s meaning. Some also display the output, such as digital cameras and touchscreens.

Human Input

Most input devices rely on human input, so they are designed with human capabilities in mind. Hands can type on keyboards, and each key press is converted into a different string combination of zeros and ones. The ASCII code and its variants determine how characters are encoded into digital strings, so that a backspace might send 00001000 and a SHIFT + s sends 01010011, for capital S.
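These bit patterns are easy to verify with a short Python sketch (illustrative code added here; the textbook itself contains no code):

```python
# Each key press is converted to a numeric ASCII code, which the
# computer stores as a string of zeros and ones.
def ascii_bits(character):
    """Return a character's ASCII value as an 8-bit binary string."""
    return format(ord(character), "08b")

print(ascii_bits("S"))   # SHIFT + s, capital S -> 01010011
print(ascii_bits("s"))   # lowercase s          -> 01110011
print(ascii_bits("\b"))  # backspace            -> 00001000
```

Note that capital S and lowercase s differ by a single bit, which is how the keyboard distinguishes shifted characters.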

Productivity guru David Allen commented on Twitter.com, “Communicating without knowing how to type is like talking with marbles in your mouth.” As an interface for human beings, the keyboard is an underappreciated milestone in computer history.

Skilled typists can type more than 100 words per minute—faster than most people speak. Managers once disdained keyboards because they seemed linked to low-level clerk-typist jobs, but typing soon became an essential productivity skill for everyone.

Learning Objective 1: Describe the four hardware components of a computer, giving examples of each component.

Figure 3-1 ENIAC, the first electronic computer. Source: Pictorial Press Ltd/Alamy Stock Photo.

Figure 3-2 Hardware components. (Diagram labels: Input, Processing (CPU), Output.)

ASCII code: a code that defines how keyboard characters are encoded into digital strings of ones and zeros.

computer: any electronic device that can accept, manipulate, store, and output data and whose instructions can be programmed.

Unfortunately, the keyboard layout evolved from the typewriter—originally designed to slow down data entry to prevent collisions of the hammers striking the paper. Because so many people already knew the QWERTY layout, though, attempts to introduce a better design failed. As sometimes happens, a superior technology solution lost out, in this case because human behavior can be so difficult to change.

The ASCII keyboard also helps explain why some countries adopted computing much earlier than others. Although a standard keyboard handily encodes languages that use the Roman alphabet with its 26 letters, 10 numerals, and a few punctuation marks, it is very cumbersome for languages like Chinese and Japanese, which use thousands of characters. Such obstacles are overcome with more intelligent software, but they certainly made faxes more useful in those countries compared to email and delayed widespread computer use.
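The contrast shows up clearly in Unicode, the modern successor to ASCII. A brief Python sketch (illustrative only) counts how many bytes each character needs in the UTF-8 encoding:

```python
# ASCII fits Roman letters in a single byte, but Chinese and Japanese
# characters require multi-byte Unicode encodings such as UTF-8.
for character in ["A", "7", "漢", "字"]:
    encoded = character.encode("utf-8")
    print(character, "occupies", len(encoded), "byte(s):", encoded.hex())

# "A" and "7" each fit in one byte (their ASCII codes), while each
# Chinese character occupies three bytes in UTF-8.
```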

The mouse, joystick, and graphics tablet are other human input devices, and these can transmit motion and location information. Touch-sensitive screens, for example, respond to finger motions and convert them to digital signals. A screen is organized into x- and y-axes, and locations can be transmitted as coordinates. Large touchscreens that several people can swipe at the same time are gaining popularity as a way to collaborate. Gloves equipped with sensors can also transmit complex hand movements, such as those used in American Sign Language.

Microphones capture human speech and transmit it in digital format. Although the sounds can be represented digitally just as sounds, speech recognition software can also identify the words.

Figure 3-3 Input and output devices. Source: ArchMan/Shutterstock, Pokomeda/Shutterstock, Cliparea/Custom Media/Shutterstock, Igoriale/Shutterstock, Claudio Bravo/Shutterstock, Algabafoto/Shutterstock, Aperturesound/Fotolia.

Productivity Tip: Although the mouse is very useful, it can slow you down as you move your hands from the keyboard. Try the keyboard shortcuts in Figure 3-4 to eliminate some unneeded motion.

Windows    Macintosh    Function
CTRL+C     Command+C    Copy selected text
CTRL+V     Command+V    Paste
CTRL+S     Command+S    Save current document
CTRL+Z     Command+Z    Undo
CTRL+F     Command+F    Open a Find window

Figure 3-4 Keyboard shortcuts that improve productivity.

Why hasn’t speech input overtaken fingers and hands? It almost has for many applications, especially with software such as Apple’s Siri or Microsoft’s Cortana (see the Productivity Tip on this page). But many people still prefer typing to speaking for longer and more complex work.1 Long voice mails, for instance, are all but dead in some organizations, partly because people appreciate the opportunity to reread, edit, and avoid embarrassing mistakes.

Human preferences like these play an important role when designing the enterprise architecture.

Scanners and Sensors

Optical scanners capture text or images and convert them to digital format in thousands of settings. They can scan virtually anything into an image, but combined with software or special symbols, they can decipher much more detail. For example, the barcodes that appear on products, price tags, and postal mail represent specific numbers and other symbols, and scanners transmit those details, not just the image (Figure 3-5).
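The digits encoded in a barcode even carry their own error check, which is how a scanner catches a misread. A short Python sketch (an illustration added here; the 11-digit product number is just an example) computes the check digit of a UPC-A barcode:

```python
# A UPC-A barcode encodes 11 data digits plus a 12th check digit.
# Scanners recompute the check digit to verify the read was correct.
def upc_check_digit(digits_11):
    """Compute the UPC-A check digit for an 11-digit string."""
    digits = [int(d) for d in digits_11]
    odd_sum = sum(digits[0::2])   # 1st, 3rd, 5th, ... positions
    even_sum = sum(digits[1::2])  # 2nd, 4th, 6th, ... positions
    return (10 - (odd_sum * 3 + even_sum) % 10) % 10

print(upc_check_digit("03600029145"))  # -> 2, so the full code is 036000291452
```

If any single digit is scanned incorrectly, the recomputed check digit no longer matches and the scanner rejects the read.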

The quick response code (QR code) that appears on magazines, newspapers, and even restaurant menus is another type of barcode. Originally invented by Toyota to track vehicles in the factory, QR codes are now widely used in consumer advertising. Smartphone users can install QR code readers so they can scan the square image and hop directly to the website or other image (Figure 3-6). Such apps often collect user data for marketing purposes.2

Scanners, combined with optical character recognition (OCR) software, can interpret the actual letters and numbers on a page, creating a digital document that can be edited rather than a flat picture. Banks were early adopters of this technology, which they use to process checks. The unique font is standard throughout the industry, and magnetic ink allows scanners to read the characters even if someone writes over them. Google uses OCR to scan old books so that some contents become searchable, but legal wrangling over copyrights has slowed the project.3

Digital cameras are another important input device widely used for surveillance, security, and just entertainment. They monitor traffic patterns, building entrances, hallways, homes, ATMs, baby cribs, and even bird nests. In addition to the fixed cameras, mobile phones equipped with cameras are widespread, so the chances of passing a whole day without appearing in a photo or video are slim.

Radio frequency identification (RFID) tags, another key technology in the Internet of Things, are small chips equipped with a microprocessor, a tiny antenna to receive and transmit data, and sometimes a battery (Figure 3-7). RFID tags can store information on an object’s history and whereabouts and can be embedded in anything, including pets, livestock, and human beings. The Department of Energy relies on RFID tags to track shipments of hazardous nuclear material.

Productivity Tip: Expanding your use of speech recognition can save you much time and improve your productivity, particularly for setting reminders, entering events on your calendar, getting directions, sending short text messages, locating the nearest gas station, or conducting simple web searches.

Figure 3-5 Sample barcode.

Figure 3-6 QR code. If you have a smartphone, download a QR reader for it and scan the image below. Do you recognize what is in the picture? Source: U.S. Fish and Wildlife Service.

Figure 3-7 RFID tag. Source: Albert Lozano/Shutterstock; Isonphoto/Fotolia.

radio frequency identification (RFID): a technology placed on tags with small chips equipped with a microprocessor, a tiny antenna to receive and transmit data, and sometimes a battery that stores information on the tagged object’s history.

optical character recognition (OCR): the capability of specialized software to interpret the actual letters and numbers on a page to create a digital document that can be edited rather than a flat picture.

optical scanners: electronic devices that capture text or images and convert them to digital format.

With the Internet of Things, sensors are spreading extremely rapidly, and the trend is still in its infancy.5 Environmental sensors on smart buoys capture data such as water temperature and wind speed, and their data is transmitted in real time to the Internet (Figure 3-8).

Output Devices

The familiar flat-panel display is the most common computer output device for desktop computers. Falling prices make a large screen, or even two of them, quite affordable. For human beings, screen real estate is a valuable commodity, making it possible to view several applications at the same time.

On the other end of the spectrum are the small screens used for cell phones and handheld devices and the somewhat larger ones used in tablets and e-book readers. Other common output devices include computer printers and speakers as well as an enormous variety of controllers that operate machinery, from lawn sprinklers and lights to an aircraft’s landing gear. Powered USB ports open up opportunities for creative inventors, who came up with several oddball output devices: heated slippers, coffee warmers, and air darts fired off with the mouse.

Did You Know? Swiping a touchscreen is easier than using a mouse and keyboard, but you can also use gestures without touching the screen. Tiny controllers can sense hand and finger motions in the air so you can input touch-free commands in three dimensions. Many applications will benefit: Surgeons can perform operations with much smaller incisions,4 and gamers can crush foes with their bare hands.

Productivity Tip: Adding a second monitor can improve your productivity and also reduce the need to print documents. Sales figures show that corporations are buying at least two monitors for more than a third of their employees, and research confirms that most people work more efficiently with more screen real estate. You’ll appreciate the second monitor even more if you work with a laptop on a desk.

Figure 3-8 Buoy sensors collect live data that is made available on the Internet. Source: National Oceanic and Atmospheric Administration, National Data Buoy Center, http://www.ndbc.noaa.gov/, accessed February 18, 2016.

Processing

The computer’s brain is the central processing unit (CPU), which handles information processing, calculations, and control tasks. Early versions used vacuum tubes that frequently blew out, but with the invention of the transistor (a small electrical circuit made from a semiconductor material such as silicon), computers switched to using electrical signals to represent zeros and ones. The transistors are packed onto integrated circuits and mass produced at low cost (Figure 3-9).

Decades ago, Intel cofounder Gordon Moore predicted that the number of transistors fitting on a chip would about double every 2 years, a forecast that has proven surprisingly accurate. Now known as Moore’s Law, his prediction about density also captures advances in processing speed, storage capabilities, cost, and other computer features. Today’s low-cost laptop outperforms mainframes from the 1960s—and takes up far less space.
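The arithmetic behind Moore’s Law is simple to sketch (a hypothetical projection in Python, not a claim about any real chip):

```python
# Moore's Law as a rough projection: transistor counts double
# about every 2 years.
def projected_transistors(base_count, years):
    """Project a chip's transistor count `years` into the future."""
    return base_count * 2 ** (years / 2)

# Starting from a hypothetical 1 million transistors, a decade of
# doubling every 2 years multiplies the count by 2**5 = 32.
print(projected_transistors(1_000_000, 10))  # -> 32000000.0
```

Compounding is what makes the trend so dramatic: over 40 years the same rule predicts a factor of 2 to the 20th power, roughly a million-fold increase.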

The computing architectures in Figure 3-10 illustrate how the technology has evolved, as each generation took advantage of declining costs, increasing power, and advances that support mobility and ease of use. As we discuss later in this chapter and as you’ll see in the Devil’s Canyon decision-making simulation, decisions about these computing architectures should fit into the larger picture. Choices depend on the enterprise architecture. For example, a company that still relies on an old software system running on an expensive mainframe would need to keep that running temporarily but plan its replacement in the roadmap.

Computing Architectures and Descriptions

Mainframe: Developed for large businesses in the 1960s and often called “big iron,” mainframes are still used for massive bulk processing tasks and financial transactions requiring high reliability. They are also deployed as servers for large networks. The mainframe market is dominated by IBM.

Supercomputer: Introduced in the 1960s, these high-end computers feature the fastest processors for calculation-intensive tasks in areas such as physics, weather modeling, and molecular analyses.

Minicomputer: Designed to be smaller in size and less expensive than mainframes, minicomputers and the terminals connected to them worked well for small and midsized businesses through the 1990s, after which many were replaced by PC servers. Now they are called “midrange computers” and are used as servers.

Microcomputer: Called PCs for short, these devices proliferated in organizations in the 1990s, replacing the dumb terminals and offering far more capability on the desktop. Powerful PCs are widely used as servers as well.

Laptop: Valued for their integrated display screens and portability, these battery-powered PCs became popular in the late 1980s, facilitating mobility. They could run much of the same software as their desktop cousins, though more slowly. Many newer laptops offer touchscreen sensitivity, similar to tablets.

Netbook: Engineered to be even smaller and less expensive than laptops, netbooks gained attention in the late 2000s as a cost-effective means to wirelessly connect to the Internet. Their low cost also facilitates widespread distribution, especially in developing countries.

Smartphone: Offered initially in the 1990s, these devices combine cell-phone capabilities with data communications for web browsing, email, and text messaging.

Tablet: A mobile device with a large touchscreen and virtual keyboard, a tablet is smaller and thinner than a laptop but larger than a smartphone. They gained popularity with the introduction of Apple’s iPad, and many people add a regular keyboard.

Figure 3-10 Computing architectures.

Figure 3-9 Integrated circuits. Source: Olga Miltsova/Shutterstock.

random access memory (RAM): a computer’s primary temporary storage area accessed by the CPU to execute instructions.

in-memory computing: the use of primary storage as the main place information is stored, rather than in secondary storage devices such as hard drives, to vastly increase speed.

byte: a measurement unit for computer storage capacity; a byte holds eight zeros and ones and represents a single character.

Moore’s Law: a principle named for Intel cofounder Gordon Moore, which states that advances in computer technology, such as processing speed or storage capabilities, double about every 2 years.

transistor: a small electrical circuit made from a semiconductor material such as silicon.

central processing unit (CPU): the brain of a computer, which handles information processing, calculations, and control tasks.

Storage

How is all this digital information stored? Fortunately, Moore’s Law seems to apply to storage technology, so it is cheaper to satisfy the hungry appetite for more space. Storage capacities are measured using the byte, which typically holds eight zeros and ones—the equivalent of one key press for text. Figure 3-11 shows sample storage capacities. The Ethical Factor explores some of the ethical challenges surrounding these huge data repositories, called “big data.”
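Each unit in Figure 3-11 is 1,024 times the one before it, which a few lines of Python can confirm (illustrative only):

```python
# Storage units grow by factors of 1,024 (2**10) bytes.
UNITS = ["KB", "MB", "GB", "TB", "PB"]

def unit_in_bytes(unit):
    """Return the capacity of a storage unit in bytes."""
    power = UNITS.index(unit) + 1
    return 1024 ** power

for unit in UNITS:
    print(unit, "=", f"{unit_in_bytes(unit):,}", "bytes")

# A 3 MB digital song, for example, holds as many bytes as about
# 3 million key presses of text:
print(3 * unit_in_bytes("MB"))  # -> 3145728
```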

Primary Storage

A computer’s primary storage, typically on integrated circuits located close to the CPU, includes random access memory (RAM). RAM serves as a temporary storage area as the CPU executes instructions. It is a critical factor in the computer’s performance, and computers often have extra slots so that additional RAM can be inserted. RAM is volatile storage that is erased when power is turned off or lost; however, computers also have other nonvolatile chips that permanently store information.6

Secondary Storage

The massive quantities of digital information are written to secondary storage devices, including computer hard drives. Although easily accessible and searched, hard drives can be a million times slower than primary storage. Their rotating disks and moving heads cannot compare to solid-state integrated circuits, but capacity is higher and costs far lower.

Optical disks (CD-ROMs and DVDs) also offer low-cost secondary storage as well as backups for offline storage needed for archiving, disaster recovery, and portability. Magnetic tapes provide cost-effective long-term storage capability.

Solid-state storage with no moving parts is also gaining popularity as prices drop and capacity increases. This category includes flash memory used in USB keys, memory cards for cameras, and hard drive substitutes for rugged laptops.

As prices for primary storage drop and processor speeds increase, developers are beginning to explore in-memory computing for certain applications that benefit from very high-speed, real-time access. This term refers to the use of primary storage as the main place information is stored rather than in secondary storage devices. The far slower hard drives are used mainly for backup and recovery. The increase in speed for software applications is phenomenal, particularly for analyzing enormous volumes of big data quickly.7
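The idea can be sketched in a few lines of Python (an illustrative toy with made-up data, not a real in-memory platform): the working copy of the data lives in RAM, and the far slower disk is touched only for backup and recovery.

```python
import json
import os
import tempfile

# In-memory computing in miniature: the working copy of the data
# lives in a Python dictionary (RAM); the disk file exists only
# for backup and recovery.
sales_by_region = {"north": 120, "south": 95}   # primary storage (RAM)

# Queries and updates touch RAM only, which is why they are fast.
sales_by_region["north"] += 30

# Secondary storage is used just to persist a recovery copy.
backup_path = os.path.join(tempfile.gettempdir(), "sales_backup.json")
with open(backup_path, "w") as f:
    json.dump(sales_by_region, f)

# After a crash, the in-memory store is rebuilt from the backup.
with open(backup_path) as f:
    recovered = json.load(f)
print(recovered["north"])  # -> 150
```

Real in-memory platforms apply the same pattern at enormous scale, holding entire databases in RAM while writing periodic snapshots to disk.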

The business drivers that affect storage decisions include access, speed, cost, and safety. Organizations must have their most important data easily accessible to respond to customer queries and process transactions. For safety, all the organization’s data must also be backed up, and storage solutions depend partly on how much downtime the organization can risk. Reloading from magnetic tapes stored in secure warehouses will take much longer compared to reinstalling from hard drives. Strategies for storage, backup, and recovery should reflect the organization’s needs.

Name       Abbreviation   Capacity        Description
Kilobyte   KB             1,024 bytes     A short email message
Megabyte   MB             1,024² bytes    A digital song runs about 3 MB
Gigabyte   GB             1,024³ bytes    About 1 hour of TV recording (not HD)
Terabyte   TB             1,024⁴ bytes    About 150 hours of HD video recording
Petabyte   PB             1,024⁵ bytes    Facebook stores more than 300 PB of user data (2016)

Figure 3-11 Measures of storage capacity.
