Report to the Chairman and Ranking Member, Subcommittee on Readiness and Management Support, Committee on Armed Services, U.S. Senate
Appendixes

Appendix I: Objectives, Scope, and Methodology
Appendix II: Comments From the Department of Defense
Appendix III: Description of SEI Capability Maturity Models
Appendix IV: Detailed Results of Review of DOD Components' SPI Programs
Appendix V: GAO Contact and Staff Acknowledgments
Tables

Table 1: Comparison of Components With IDEAL™ Model
Table 2: Software/Systems Units Selected for Review
Table 3: Phases of the IDEAL™ Model
Table 4: Phases and Tasks of the IDEAL™ Model
Table 5: Army Examples of Alignment With IDEAL™
Table 6: Comparisons of Army SPI Activities With the IDEAL™ Model
Table 7: Air Force Examples of Alignment With IDEAL™
Table 8: Comparisons of Air Force SPI Activities With the IDEAL™ Model
Table 9: Examples of Navy Alignment With IDEAL™
Table 10: Comparisons of Navy SPI Activities With the IDEAL™ Model
Table 11: Examples of Marine Corps Alignment With IDEAL™
Table 12: Comparisons of Marine Corps SPI Activities With the IDEAL™ Model
Table 13: Examples of DFAS Alignment With IDEAL™
Table 14: Comparisons of DFAS SPI Activities With the IDEAL™ Model

Figures

Figure 1: Simplified Diagram of the IDEAL™ Model
Figure 2: SW-CMM® Levels and Descriptions
Figure 3: Partial Army Organization Chart Highlighting Units Responsible for Software/Systems
Figure 4: Partial Air Force Organization Chart Highlighting Units Responsible for Software/Systems
Figure 5: Partial Navy Organization Chart Highlighting Units Responsible for Software/Systems
Figure 6: Partial Marine Corps Organization Chart Highlighting Units Responsible for Software/Systems
Figure 7: Partial DFAS Organization Chart Highlighting Units Responsible for Information Systems
Figure 8: Partial DLA Organization Chart Highlighting Units Responsible for Software/Systems
Abbreviations

AFA      Air Force Academy
AFCA     Air Force Communications Agency
AMC      Army Materiel Command
AMCOM    Aviation and Missile Command
CECOM    Communications-Electronics Command
CIO      chief information officer
CMMI™    Capability Maturity Model Integration
DFAS     Defense Finance and Accounting Service
DLA      Defense Logistics Agency
DOD      Department of Defense
FSO      Financial Systems Organization
IDEAL™   initiating, diagnosing, establishing, acting, and leveraging
ITD      Information and Technology Directorate
MCTSSA   Marine Corps Tactical Systems Support Activity
MSG      Materiel Systems Group
NAVAIR   Naval Aviation Systems Command
OSD      Office of the Secretary of Defense
SEC      Software Engineering Center
SED      Software Engineering Directorate
SEI      Software Engineering Institute
SEO      systems engineering organizations
SEPG     software engineering process group
SPAWAR   Space and Naval Warfare Systems Command
SPI      software/systems process improvement
SSC      SPAWAR Systems Center
SSG      Standard Systems Group
SW-CMM®  Software Capability Maturity Model®
The Honorable James M. Inhofe
Chairman
The Honorable Daniel K. Akaka
Ranking Member
Subcommittee on Readiness and Management Support
Committee on Armed Services
United States Senate
With an annual information technology budget of about $20 billion, and tens of billions more budgeted for technology embedded in sophisticated
weaponry, the Department of Defense (DOD) relies heavily on software-intensive systems to support military operations and associated business functions, such as logistics, personnel, and financial management. One important determinant of the quality of these systems, and thus DOD's mission performance, is the quality of the processes used to develop, acquire, and engineer them. Recognizing the importance of these processes to producing systems that perform as intended and meet cost and schedule commitments, successful public and private organizations have adopted and implemented software/systems process improvement (SPI) programs.
This report is part of our response to your request to compare and contrast DOD information technology practices with leading practices. In particular, you asked us to review DOD components' (military services and Defense agencies) SPI management activities to ensure that DOD is taking the necessary steps to continuously strengthen its software and systems development, acquisition, and engineering processes. As agreed with your offices, our objectives were to (1) compare selected DOD components' SPI programs against Carnegie Mellon University's Software Engineering Institute's (SEI) IDEAL™ model, which is a recognized best practices model; (2) determine how these components have approached management of their SPI programs; and (3) determine what DOD-wide efforts are under way to promote and leverage the components' SPI programs. The components that we selected were the Departments of the Army, Air Force, and Navy; the Marine Corps; the Defense Logistics Agency (DLA); and the Defense Finance and Accounting Service (DFAS).

We performed our work from March through December 2000, in accordance with generally accepted government auditing standards. (See appendix I for details of our objectives, scope, and methodology, including the specific service units reviewed.) DOD provided us with written comments on a draft of this report. These comments are summarized in the "Agency Comments and Our Evaluation" section of this letter and are reproduced in full in appendix II.
Background

DOD maintains a force of about 3 million military and civilian personnel worldwide. To protect the security of the United States, the department relies on a complex array of computer-dependent and mutually supportive organizational components, including the military services and Defense agencies. It also relies on a broad array of computer systems, including

SEI is a nationally recognized, federally funded research and development center at Carnegie Mellon University whose mission is to advance software engineering practices.

IDEAL™ is a service mark of Carnegie Mellon University and stands for initiating, diagnosing, establishing, acting, and leveraging.
weapons systems, command and control systems, satellite systems, inventory management systems, financial systems, personnel systems, payment systems, and others. Many of these systems in turn are connected with systems operated by private contractors, other government agencies, and international organizations.

DOD's ability to effectively manage information technology is critical to its ability to accomplish its mission. Its reliance on software-intensive systems to support operations related to intelligence, surveillance, security, and sophisticated weaponry, along with financial management and other business functions, will only increase as the department modernizes and responds to changes in traditional concepts of warfighting.

The scope of DOD's information technology inventory is vast: over 1.5 million computers, 28,000 systems, and 10,000 computer networks. Further, many of DOD's most important technology projects continue to cost more than projected, take longer to produce, and deliver less than promised. As a result, we have designated DOD systems development and modernization efforts as a high-risk area.
The quality of the processes involved in developing, acquiring, and engineering software and systems has a significant effect on the quality of the resulting products. Accordingly, process improvement programs can increase product quality and decrease product costs. Public and private organizations have reported significant returns on investment through such process improvement programs. SEI has published reports of benefits realized through process improvement programs. For example, SEI reported in 1995 that a major defense contractor implemented a process improvement program in 1988 and by 1995 had reduced its rework costs from about 40 percent of project cost to about 10 percent, increased staff productivity by about 170 percent, and reduced defects by about 75 percent. According to a 1999 SEI report, a software development contractor reduced its average deviation from estimated schedule time

Observations on the Department of Defense's Fiscal Year 1999 Performance Report and Fiscal Year 2001 Performance Plan (GAO/NSIAD-00-188R, June 30, 2000).
High-Risk Series: An Update (GAO/HR-99-1, January 1999).
Technical Report CMU/SEI-95-TR-..., November 1995.
SEI Technical Report, 1999.
from 112 percent to 6 percent between 1988 and 1996. During the same period, SEI reported that this contractor reduced its average deviation from estimated cost from 87 percent to minus 4 percent.
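The report does not define how these deviation figures are computed. A common convention, shown here purely as an illustration (our assumption, not the report's or SEI's), is the percent deviation of an actual outcome relative to its estimate:

```python
def percent_deviation(actual: float, estimated: float) -> float:
    """Percent deviation of an actual figure from its estimate.

    A positive result is an overrun; a negative result means the
    project came in under its estimate.
    """
    return 100.0 * (actual - estimated) / estimated

# Under this convention, a project estimated at 10 months that takes
# 21.2 months shows a 112 percent schedule deviation, and one costing
# 9.6 against an estimate of 10 shows a minus 4 percent cost deviation.
schedule = percent_deviation(21.2, 10.0)
cost = percent_deviation(9.6, 10.0)
```

On this reading, moving from 112 percent to 6 percent means late deliveries shrank from more than double the estimated time to within a few percent of it.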
To aid organizations attempting to initiate and manage SPI programs, SEI has published a best practices model called IDEAL™, which defines a systematic, five-phase, continuous process improvement approach, with a concurrent sixth element addressing the program management tasks spanning the five phases (see figure 1).
Figure 1: Simplified Diagram of the IDEAL™ Model
• Initiating: During this phase, an organization establishes the management structure of the process improvement program, defines and assigns roles and responsibilities, allocates initial resources, develops a plan to guide the organization through the first three phases of the program, and obtains management approval and funding. Two key organizational components of the program management structure established during this phase are a management steering group and a software engineering process group (SEPG). Responsibility for this phase rests with senior management.
• Diagnosing: During this phase, the SEPG appraises the current level of process maturity to establish a baseline capability against which to measure progress and identifies any existing process improvement initiatives. The SEPG then uses the baseline to identify weaknesses and target process improvement activities. It also compares these targeted activities with any ongoing process improvement activities and reconciles any differences. Responsibility for this phase rests primarily with line managers and practitioners.

• Establishing: During this phase, the SEPG prioritizes the process improvement activities and develops strategies for pursuing them. It then develops a process improvement action plan that details the activities and strategies and includes measurable goals for the activities and metrics for monitoring progress against goals. Also during this phase, the resources needed to implement the plan are committed, and training is provided for technical working groups, who will be responsible for developing and testing new or improved processes. Responsibility for this phase resides primarily with line managers and practitioners.

• Acting: During this phase, the technical working groups, formed under the establishing phase, create and evaluate new and improved processes. Evaluation of the processes is based on pilot tests that are formally planned and executed. If the tests are successful, the working groups develop plans for organization-wide adoption and institutionalization and, once approved, execute them. Responsibility for this phase resides primarily with line managers and practitioners.

• Leveraging: During this phase, results and lessons learned from earlier phases are assessed and applied, as appropriate, to enhance the structures and plans of process improvement programs. Responsibility for this phase rests primarily with senior management.
The model's sixth element, continuous program management, specifies management structures and tasks for planning, organizing, directing, staffing, and monitoring the program. Responsibility for this element rests with senior management.
SEI has developed process maturity models for software development, software acquisition, and systems engineering, as well as an integrated model for improving software/systems acquisition and maintenance. See appendix III for more information.
Each phase of the IDEAL™ model contains several recommended tasks. Appendix I, which describes our objectives, scope, and methodology, identifies all tasks for each phase.
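The five phases and their assigned responsibilities described above can be sketched as a small data model. This is a minimal illustration of the report's summary of IDEAL™; the type names, function, and task wordings are ours, not an SEI artifact:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Phase:
    name: str
    responsibility: str   # who primarily owns the phase
    key_tasks: tuple      # condensed from the report's descriptions

IDEAL_PHASES = (
    Phase("Initiating", "senior management",
          ("establish program management structure",
           "define and assign roles and responsibilities",
           "allocate initial resources",
           "plan the first three phases",
           "obtain management approval and funding")),
    Phase("Diagnosing", "line managers and practitioners",
          ("appraise process maturity to set a baseline",
           "identify existing improvement initiatives",
           "target weaknesses and reconcile with ongoing activities")),
    Phase("Establishing", "line managers and practitioners",
          ("prioritize improvement activities",
           "develop an action plan with measurable goals and metrics",
           "commit resources and train technical working groups")),
    Phase("Acting", "line managers and practitioners",
          ("create and pilot-test new or improved processes",
           "plan and execute organization-wide adoption")),
    Phase("Leveraging", "senior management",
          ("apply results and lessons learned to future improvement cycles",)),
)

# The concurrent sixth element spans all five phases.
PROGRAM_MANAGEMENT = Phase(
    "Continuous program management", "senior management",
    ("planning", "organizing", "directing", "staffing", "monitoring"))

def phases_owned_by(role: str) -> list:
    """Names of phases whose primary responsibility rests with `role`."""
    return [p.name for p in IDEAL_PHASES if p.responsibility == role]
```

For example, `phases_owned_by("senior management")` returns the initiating and leveraging phases, matching the responsibility assignments summarized above.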
Results in Brief

The DOD components that we reviewed vary in how they compare to SEI's IDEAL™ model. In particular, the Air Force, Army, and DFAS generally satisfied the model's recommended tasks, as did certain Navy units. However, DLA, the Marine Corps, and other Navy units did not. Specifically, DLA does not have an SPI program, although during the course of our review the DLA Chief Information Officer stated that she intends to establish one. Further, although the Marine Corps is performing many SPI activities, core tasks associated with an effective SPI program, such as a plan of action or dedicated resources to implement recommended improvements, are missing. Finally, certain Navy units also do not have SPI programs aligned with the IDEAL™ model, although one is performing a few of the model's recommended tasks.

The four components with SPI programs (Army, Air Force, DFAS, and parts of the Navy) are using different management strategies for directing and controlling their respective programs. Nonetheless, all components with SPI programs report that they have realized benefits in product quality and productivity. For example, DFAS uses a centralized management approach and reports that its SPI program has helped decrease development costs to about one-third lower than those of similar organizations. In contrast, the Army uses a decentralized approach and also reports that the SPI program for one of its organizational units has helped it almost double its productivity in developing software.
DOD-wide activities to promote and leverage component SPI programs do not exist. According to the IDEAL™ model, leveraging SPI experiences is fundamental to continuous process improvement. Two organizational units within the Office of the Secretary of Defense (OSD) have important leadership roles to play in department software and system processes, and both are taking steps aimed at strengthening DOD software; however, these steps do not specifically include SPI. In particular, OSD does not have initiatives under way or planned to determine where in DOD SPI programs do and do not exist so that steps can be taken to promote programs in component units, such as DLA, where they do not. Similarly, actions do not exist to share information across the department about the experiences of successful SPI programs, such as those within the Army, Navy, Air Force, and DFAS. According to OSD officials, uncertainty about the costs versus
benefits of SPI, resource constraints, and other priorities have precluded such a focus. Without such actions, DOD is missing opportunities to realize potential SPI benefits in all DOD components. To address this, we are making recommendations to the Secretary of Defense.
DOD provided written comments on a draft of this report. In commenting, DOD agreed that SPI practices should be used and encouraged and that information about best practices should be shared. However, DOD stated that it is premature at this point to mandate SPI programs throughout the department, as we recommend, and that it has established a working group to review how best to proceed. While we believe that sufficient bases currently exist to mandate SPI, particularly in light of the evidence in this report on (1) components that are not implementing SPI in the absence of a mandate and (2) the benefits being reported by components that are implementing SPI, we do not view DOD's desire to await the results of its working group as being unreasonable or inconsistent with our recommendations.
Table 1 summarizes how each component and its units, if applicable, compared to the model. For example, in 1996 the Secretary of the Army mandated that all software development, acquisition, and maintenance activities establish SPI programs. Further, the Army requires that its software activities continually improve their process maturity and has set maturity goals for all of its units. Army regulations also mandate that contractors be evaluated for software process maturity. Moreover, the two specific units within the Army that we reviewed have SPI management structures, plans, and dedicated resources. In addition, these units have continuously evolved in software and system process maturity through many years of assessing their baseline process capabilities, implementing new and improved process initiatives, reassessing process maturity, and implementing lessons learned. Both Army units satisfy all IDEAL™ tasks.
Table 1: Comparison of Components With IDEAL™ Model

Component | Command/major organizational unit | Software/systems unit | Generally satisfied?
Army | Communications-Electronics Command | Software Engineering Center, Fort Monmouth, NJ | Yes
Army | Aviation and Missile Command | Software Engineering Directorate, Redstone Arsenal, AL | Yes
Navy | Naval Aviation Systems Command, Patuxent River, MD | Not applicable | No
Navy | Space and Naval Warfare Systems Command | Systems Center, San Diego, CA | Yes
Air Force | Electronic Systems Center | Standard Systems Group, Maxwell AFB-Gunter Annex, AL | Yes
Air Force | Electronic Systems Center | Materiel Systems Group, Wright-Patterson AFB, OH | Yes
Air Force | Air Force Academy, Colorado Springs, CO | Not applicable | Yes
Marine Corps | Marine Corps Systems Command | Marine Corps Tactical Systems Support Activity, Camp Pendleton, CA | No
DFAS | Information and Technology Directorate, Arlington, VA | Not applicable | Yes
DLA | Headquarters, Fort Belvoir, VA | Not applicable | No
In contrast, DLA, the Marine Corps, and two of the Navy's four units that we reviewed do not perform important IDEAL™ model tasks. In particular, DLA currently does not satisfy any of the model's recommended tasks. According to DLA officials, it had an SPI program prior to 1998, but at that time the program was terminated to reduce costs. During our review, DLA's CIO stated that the agency plans to begin a new SPI program and has taken a first step by assigning organizational responsibility.

The Marine Corps has many SPI activities under way that could form the foundation of a program. However, it is not performing several key SPI tasks that are fundamental to SPI program success. For example, the Marine Corps has assigned responsibility for process improvement, and it has begun assessing its software process maturity to establish a baseline
capability. However, it is not using this baseline as a basis for implementing recommended improvements, nor does it have an SPI plan or dedicated resources for these activities. As such, the likelihood of the Marine Corps' process improvement initiatives producing desired results is diminished.

Two of the four Navy software/systems units that we reviewed also do not have SPI programs that are aligned with the IDEAL™ model. To their credit, however, one has recently taken the first step toward initiating a program, and the other has activities under way that could form the beginnings of a program. (See appendix IV for more detailed results on each of the components that we reviewed.)

Components' SPI Management Approaches Vary, Yet All Report Positive Program Results
The four components that have SPI programs (Army, Air Force, DFAS, and parts of the Navy) have different approaches for directing and controlling their respective programs, ranging from centralized to highly decentralized; each, however, reports positive results. For example, DFAS has a centralized approach, with its headquarters office directing and controlling all SPI activities. In contrast, the Army, Air Force, and Navy have decentralized approaches to SPI program management. The Army, which began its SPI program centrally, has since delegated SPI responsibility to its commands, which, in the case of the two commands we reviewed, have further delegated SPI program management to their respective software/systems units. Similarly, the Air Force units that we reviewed further delegated SPI management to their respective software/systems units. The Navy commands follow different approaches: one manages its program centrally and the other has delegated SPI management to its software/systems units.
DOD-Wide Efforts to Promote and Leverage SPI Programs Do Not Exist
Within OSD, the Assistant Secretary for Command, Control, Communications, and Intelligence is responsible for establishing and implementing DOD's policies, processes, programs, and standards governing the development, acquisition, and operation of nonweapons systems software and information systems. Similarly, the Under Secretary for Acquisition, Technology, and Logistics is responsible for establishing DOD acquisition policies and procedures. Accordingly, OSD has an important leadership role to play in ensuring that DOD components reap the maximum possible benefits of effective SPI programs. Such leadership can include dissemination of policies and guidance promoting SPI programs and activities, knowledge of the nature and extent of components' SPI programs and activities and associated lessons learned and best practices, and facilitation of SPI knowledge-sharing across DOD components.

Both OSD organizational units have efforts under way aimed at improving some aspects of DOD's ability to develop and acquire software and systems. For example, they have established teams to conduct software acquisition maturity assessments and established a software collaborators group. They also are collecting software metrics and establishing training for managers.
However, OSD has no SPI actions under way or planned, such as issuing policy and guidance on SPI programs; determining where in DOD SPI programs do and do not exist; promoting the establishment of programs in component units, such as DLA, where they do not exist; and sharing knowledge across DOD about the experiences of reportedly successful SPI programs, such as those within the Army, Air Force, DFAS, and parts of the Navy. According to OSD officials, uncertainty about the costs versus benefits of SPI, resource constraints, and other priorities have precluded such a focus. However, as stated earlier in this report, various organizations, including some DOD components, report positive returns on investment from SPI programs that argue for SPI being treated as a funding priority.
Conclusions

Several DOD components have SPI programs that are aligned closely to the best practices embodied in the SEI IDEAL™ model and thus provide excellent examples of SPI. However, such programs are lacking in other parts of the department. Where they exist, these programs are being credited with producing higher quality software and systems products faster and at less expense, whether managed in a centralized or decentralized fashion.
OSD has an important leadership role to play in expanding SPI across the department. In particular, it can seize opportunities to build upon and leverage the existing base of SPI programs within DOD's components and help ensure that all of its components realize the strategic value (i.e., benefits that exceed costs) that both private- and public-sector organizations, including some DOD components, attribute to these programs. While OSD is faced with making funding choices among competing leadership initiatives, such as its efforts to conduct software acquisition maturity assessments and collect software metrics, these are some of the very tasks that are embedded within an effective SPI program. Thus, by ensuring that DOD components have effective SPI programs, OSD can leverage these programs to help accomplish its other priorities.

Recommendations for Executive Action

To strengthen DLA, Marine Corps, and Navy software and systems development, acquisition, and engineering processes, we recommend that the Secretary of Defense direct the Director of DLA, the Commandant of the Marine Corps, and the Secretary of the Navy to establish SPI programs where this report shows none currently exist. In so doing, these officials should consider following the best practices embodied in the SEI IDEAL™ model and drawing from the experiences of the Army, Air Force, DFAS, and some Navy units.
Further, to strengthen DOD-wide SPI, we recommend that the Secretary of Defense direct the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, in collaboration with the Under Secretary of Defense for Acquisition, Technology, and Logistics, to (1) issue policy requiring DOD components that are responsible for systems/software development, acquisition, or engineering to implement SPI programs, and (2) develop and issue SPI guidance and, in doing so, consider basing this guidance on the SEI IDEAL™ model and the positive examples of SPI within the Army, Air Force, DFAS, and some Navy units cited in this report.
We also recommend that the Secretary direct the Assistant Secretary for Command, Control, Communications, and Intelligence to (1) annually determine the components' compliance with the SPI policy and (2) establish and promote a means for sharing SPI lessons learned and best practices knowledge throughout DOD.
Agency Comments and Our Evaluation

In written comments on a draft of this report, the Deputy Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, who is also the DOD Deputy Chief Information Officer (CIO), agreed with the report's message that SPI practices should be used and encouraged and that information about SPI practices should be shared among DOD components. To this end, and since receiving a draft of this report, the Deputy CIO stated that the Under Secretary of Defense (Acquisition, Technology, and Logistics) has established a working group that is, among other things, to develop a plan for implementing SPI. According to the Deputy CIO, this plan will be ready for internal review in April 2001. Further, the Deputy CIO stated that a January 2001 revision to DOD Regulation 5000.2-R represents a policy step toward addressing software improvement by including in the regulation a section on software management. According to the Deputy CIO, while this section does not specifically call for an SPI program, the regulation provides guidance for improving software by using, for example, SEI Capability Maturity Model level 3 or its equivalent for major acquisition programs with procurement costs in excess of $2.19 billion.
This working group is called the Independent Expert Program Review Working Group; it was established in January 2001.

Interim Regulation 5000.2-R, "Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs," January 2001.

Interim Regulation 5000.2-R refers to these programs as Acquisition Category (ACAT) I programs.
In light of the above, the Deputy CIO stated that DOD agreed with our recommendation to establish and promote a means for sharing SPI lessons learned and best practices knowledge throughout DOD, and added that a DOD steering group, which was chartered during the course of our review, has been assigned responsibility for this function. However, the Deputy CIO disagreed with our recommendation that DOD issue a policy to mandate SPI programs for all DOD components and their relevant activities. According to the Deputy CIO, establishing a policy requiring or otherwise directing DOD components that do not have SPI programs to implement them would be premature at this time because there are insufficient data to justify the sole use of the SEI IDEAL™ model, and unless a specific model were used, compliance with such a policy or directive would be problematic. Therefore, the Deputy CIO stated, a decision regarding the issuance of DOD-wide policy mandating the implementation of SPI programs would not be made until the work group reports its results and develops its plan for implementing SPI. At this point and without the work group's findings, according to the Deputy CIO, issuance of SPI guidance (as opposed to "policy") would be "a more beneficial approach."
In our view, the Deputy CIO's comments are not inconsistent with our recommendations, and our point of disagreement appears to center around the timing of actions rather than the recommended actions themselves. Specifically, while we continue to believe that sufficient bases currently exist for issuance of a DOD SPI policy requirement, especially in light of the evidence in our report that (1) without this requirement not all components are implementing SPI and (2) those components that are currently implementing SPI are reporting substantial benefits, it is reasonable for DOD to await its work group's results before making a decision on how to proceed. Further, we agree with the Deputy CIO's comment that there are insufficient data to justify citing in DOD policy the SEI IDEAL™ model as the single model for SPI. Our report recognizes that not all of the DOD components that we cited as having effective SPI programs are using the same model. As a result, our recommendations did not prescribe a specific SPI model. Instead, we recommended that in developing SPI policy and associated guidance, DOD should consider basing this guidance on the SEI IDEAL™ model as well as the positive
This group is called the Software Intensive Systems Steering Group.
programs that exceed a certain dollar threshold. Therefore, the revised regulation does not fulfill the intent of our recommendations.
Young, in their capacities as Chairmen, Ranking Members, or Ranking Minority Members of Senate and House Committees and Subcommittees. In addition, we are sending copies of this report to the Secretaries of the Army, Navy, and Air Force; the Commandant of the Marine Corps; the Directors of DLA and DFAS; and the Director, Office of Management and Budget. Copies will also be available at GAO's web site, www.gao.gov.

If you have any questions about this report, please contact me at (202) 512-3439 or by e-mail at hiter@gao.gov. Key contributors to this report are listed in appendix V.

Randolph C. Hite
Director, Information Technology Systems Issues
Appendix I
Objectives, Scope, and Methodology
Our objectives were to (1) compare selected DOD components' SPI programs against SEI's IDEAL™ model, which is a recognized best practices model; (2) determine how these components have approached management of their SPI programs and what program results they are reporting; and (3) determine what DOD-wide efforts are under way to promote and leverage the components' SPI programs. The selected components include all four services (Army, Air Force, Navy, and Marine Corps) and two DOD agencies that have large, software-intensive system modernization programs under way: the Defense Finance and Accounting Service (DFAS) and the Defense Logistics Agency (DLA).
To address the first objective, we reviewed the components' respective information technology strategic plans as well as available SPI policies, guidance, and program documentation, and interviewed headquarters officials from each component. Using this information, we first ascertained whether SPI programs or activities existed for a component and, if so, how they were organized and structured. For the components in which we found SPI programs or activities, we then identified the units within the components responsible for implementing those programs and activities. In instances in which these responsibilities were decentralized (Army, Air Force, and Navy), we worked with component headquarters and command officials to select at least two units in each component that collectively (1) had missions involving both software-intensive weapons and business systems and (2) were responsible for the largest percentages of software and systems development, acquisition, and engineering activities within each component. Table 2 shows the DOD components and software/systems units where we reviewed SPI programs and activities. Where "not applicable" is indicated in the table, SPI responsibility resided at the "Command/major organizational unit," and therefore our work did not extend to a "Software/systems unit."
DFAS plans to spend over $2.7 billion through 2007 to modernize its finance and accounting systems. DLA plans to spend about $902 million to modernize its defense business systems.
Table 2: Software/Systems Units Selected for Review

Component       Command/major organizational unit          Software/systems unit
Army            Communications-Electronics Command         Software Engineering Center
                Aviation and Missile Command               Software Engineering Directorate
Navy            Naval Aviation Systems Command             Not applicable
                Space and Naval Warfare Systems Command    Systems Center, San Diego
Air Force       Electronic Systems Center                  Standard Systems Group; Materiel Systems Group
Marine Corps    Marine Corps Systems Command               Marine Corps Tactical Systems Support Activity
DLA             Information and Technology                 Not applicable
Designed to assist organizations in implementing and managing effective SPI programs, the SEI-developed IDEAL™ model comprises five specific phases; a sixth element addresses overall management of the five phases. Table 3 provides more information about the tasks involved in each phase, and Table 4 lists every task included under each phase.
Page 19    GAO-01-116 Defense Information Technology
Table 3: Phases of the IDEAL™ Model

Phase                     Description of typical tasks
Initiating                Senior managers establish the SPI program structure, define roles, allocate resources, and develop a plan to guide the organization through the remaining phases. The key structural components established in this phase are a management steering group and a software engineering process group (SEPG).
Diagnosing                To baseline capability, the SEPG, with line managers and practitioners, appraises the level of software process maturity and establishes baselines against which to measure progress. Any existing process improvement initiatives are documented, along with recommendations and needed resources.
Establishing              The SEPG, line managers, and practitioners prioritize SPI activities and develop strategies and an action plan, including measurable goals and metrics for monitoring progress. Resources needed to implement the plan are committed, and training is provided for technical working groups.
Acting                    Technical working groups develop and test new or improved processes. Solutions that prove successful in pilot use are developed for organizationwide adoption and institutionalization.
Leveraging                Lessons learned are gathered and analyzed, and senior managers review their decisions in light of the organization's needs.
Managing the SPI program  Senior managers oversee and manage the SPI tasks throughout the five phases.

Table 4: Phases and Tasks of the IDEAL™ Model

Phase         Tasks
Initiating    Identify business needs and drivers for improvement; organize a discovery team to develop an improvement proposal for management; build an SPI proposal; educate and build support; obtain approval for the SPI proposal and initial resources; establish the SPI infrastructure; assess the climate for SPI; define general SPI goals; define the guiding principles of the SPI program; launch the program.
Diagnosing    Determine what baseline(s) are needed; plan for the baseline(s); conduct the baseline(s); present findings; develop the final findings and recommendations report.
Establishing  Transform general SPI goals into measurable goals; create or update the SPI strategic plan; build consensus, review and approve the SPI strategic plan, and commit resources; establish technical working groups.
Acting        Package improvements and turn them over to the SEPG; transition to long-term support.
Leveraging    Gather lessons learned; analyze lessons learned; revise the organizational approach.
Managing      Set the stage for SPI; organize the SPI program; plan the SPI program; staff the SPI program; monitor the SPI program; direct the SPI program.
To address the second objective, we analyzed the aforementioned information, conducted additional interviews, and reviewed additional program information from the component units to which SPI management responsibility had been delegated. As part of this objective, we also reviewed program progress and performance reports and discussed program accomplishments with responsible officials to identify examples of SPI benefits. We then analyzed each component's SPI program results in relation to its program management approach to determine whether any patterns were evident. We did not independently validate components' reported accomplishments and benefits.
To address the third objective, we interviewed responsible component officials, reviewed supporting records and documentation, and visited Internet sites to identify SPI program best practices and lessons learned, along with what efforts are being made to share these with other activities and components throughout the department. We also identified two offices within the Office of the Secretary of Defense (OSD) that have responsibility and activities under way relating to the advancement of software and system management practice in the department: the Office of the Deputy Under Secretary of Defense for Acquisition, Technology, and Logistics; and the Office of the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence. For each office, we analyzed documentation describing their respective ongoing and planned activities and interviewed officials. In doing so, we focused on identifying any activities that specifically promoted and leveraged SPI programs and activities under way throughout DOD. We also discussed with SPI program officials in each component their awareness of the OSD efforts.
We performed our work at Army headquarters, the Pentagon, Arlington, Virginia; and interviewed officials and reviewed documentation from the Communications-Electronics Command Software Engineering Center at Fort Monmouth, New Jersey; and the Aviation and Missile Command Software Engineering Directorate at Redstone Arsenal, Alabama. We also performed our work at Navy headquarters in Arlington, Virginia; and interviewed officials and reviewed documentation from the Naval Aviation Systems Command at Patuxent River, Maryland; and the Space and Naval Warfare Systems Command Centers at San Diego, California; Chesapeake, Virginia; and Charleston, South Carolina. We also interviewed officials and reviewed documentation from the Air Force's Electronic Systems Center, Standard Systems Group at Maxwell Air Force Base, Alabama; the Materiel Systems Group at Wright-Patterson Air Force Base, Ohio; and the Air Force Academy in Colorado Springs, Colorado. We also performed our work at Marine Corps headquarters in Arlington, Virginia; and interviewed officials and reviewed documentation from the Marine Corps Systems Command in Quantico, Virginia; and the Marine Corps Tactical Systems Support Activity at Camp Pendleton, California. We also performed work at DFAS headquarters in Arlington, Virginia; and DLA headquarters at Fort Belvoir, Virginia. We conducted our work from March through December 2000, in accordance with generally accepted government auditing standards.