Portfolio-Analysis Methods for Assessing Capability Options

Paul K. Davis
Russell D. Shaver
Justin Beck
Copyright Date: 2008
Edition: 1
Published by: RAND Corporation
Pages: 202
https://www.jstor.org/stable/10.7249/mg662osd
    Book Description:

    An analytical framework and methodology for capability-area reviews is described, along with new tools to support capabilities analysis and strategic-level defense planning in the Defense Department and the Services. BCOT generates and screens preliminary options, and the Portfolio-Analysis Tool (PAT) is used to evaluate options that pass screening. The concepts are illustrated with applications to Global Strike and Ballistic Missile Defense. Recommendations are made for further defense-planning research.

    eISBN: 978-0-8330-4589-8
    Subjects: Political Science, Technology

Table of Contents

  1. Front Matter
    (pp. i-ii)
  2. Preface
    (pp. iii-iv)
  3. Table of Contents
    (pp. v-viii)
  4. Figures
    (pp. ix-x)
  5. Tables
    (pp. xi-xii)
  6. Summary
    (pp. xiii-xxx)

    The research reported in this monograph is part of RAND’s continuing work on practical theory and methods for capabilities-based planning in the Department of Defense (DoD) and other organizations. Its particular contribution is to describe and illustrate in some detail an analytic framework and methodology for defensewide capability-area reviews—including DoD’s experimental Concept Decision Reviews and related evaluations of alternatives (Krieg, 2007). The monograph also describes newly developed enabling tools—one for generating and screening preliminary options and one for evaluating in a portfolio-analysis structure those options that pass screening. Variants of the methods can be applied for analysis across...
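The Summary describes two enabling tools: one that generates and screens preliminary options, and one that evaluates surviving options in a portfolio-analysis structure. As a rough illustration of what portfolio-structured evaluation can mean, the sketch below scores options on several weighted measures and caps each option by its weakest measure. This is a hypothetical illustration only; the measure names, weights, and "weakest-link" rule are invented for the example and are not the actual PAT implementation or its API.

```python
# Hypothetical sketch of portfolio-style scoring (NOT the actual PAT code).
# Each option is scored 0..1 on several measures; the aggregate combines a
# weighted sum with a "weakest-link" cap, so one bad score limits the total.

def aggregate(scores, weights, weakest_link=True):
    """Combine per-measure scores (0..1) into one portfolio score."""
    total_w = sum(weights.values())
    weighted = sum(scores[m] * w for m, w in weights.items()) / total_w
    if weakest_link:
        # Illustrative rule: an option can exceed its worst measure
        # by at most a fixed slack (0.25 here, an arbitrary choice).
        return min(weighted, min(scores.values()) + 0.25)
    return weighted

# Invented options and measures, for illustration only.
weights = {"effectiveness": 0.5, "risk": 0.3, "cost_score": 0.2}
options = {
    "Option A": {"effectiveness": 0.9, "risk": 0.4, "cost_score": 0.7},
    "Option B": {"effectiveness": 0.6, "risk": 0.8, "cost_score": 0.8},
}
for name, s in options.items():
    print(f"{name}: {aggregate(s, weights):.2f}")
```

The weakest-link cap reflects a common portfolio-analysis idea: a high score on one measure should not fully compensate for a critical shortfall on another.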

  7. Acknowledgments
    (pp. xxxi-xxxii)
  8. Abbreviations and Acronyms
    (pp. xxxiii-xxxvi)
  9. Glossary of Terminology
    (pp. xxxvii-xxxviii)
  10. CHAPTER ONE Introduction
    (pp. 1-2)

    This monograph describes an analytical structure and related methods for conducting defensewide capability-area reviews for the Department of Defense (DoD). Such reviews should define and prioritize needs, assess alternative ways to improve capabilities, illuminate tradeoffs among options within a given capability area, and identify decisions and trades cutting across capability areas. In some respects, such reviews are comparable to what were once called mission-area reviews.*

    Our perspective is top-down and defensewide, intended to be comfortable to the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD (AT&L)), the Vice Chairman of the Joint Chiefs of Staff (VJCS), the...

  11. CHAPTER TWO Background: The Capabilities-Development Process
    (pp. 3-16)

    In recent years, the Department of Defense has sought to create a process suitable for the capabilities-based planning (CBP) mandated in 2001 and reinforced in the most recent Quadrennial Defense Reviews (Rumsfeld, 2001, 2006). The Aldridge Report (Joint Defense Capabilities Study Team, 2004) reviewed problems with the preexisting planning system and laid out objectives and schematics for an improved process. The Joint Staff has identified both operational and functional capability areas (see Appendix A) and has organized to address them systematically and comprehensively. The military Services have reorganized their planning systems accordingly. To a significant degree, commonalities of vocabulary and...

  12. CHAPTER THREE A Framework and a Generic Terms of Reference
    (pp. 17-34)

    Given the larger context described in Chapter Two, we next develop an analytic framework and methodology to support defensewide capability-area reviews, such as those in the new, experimental Concept Decision Reviews.* Some readers may wish to move forward to Chapters Five and Six to see concrete examples before reading the more-generic discussions in this chapter.

    The capability-area reviews that we have in mind are specifically for top DoD officials, such as the USD (AT&L), VJCS, and DPA&E. It is not the duty of such officials to manage programs. Rather, their responsibilities are at a higher level. The USD...

  13. CHAPTER FOUR Tools to Enable the Framework
    (pp. 35-46)

    A number of types of models and tools are needed to accomplish the goals described in previous chapters. Figure 4.1 indicates this schematically. The arrows are deliberately two-way. It would be folly to construct a bottom-up system for decision support, because so much of the reasoning and analysis needs to be top-down in character. Nevertheless, the quality of the framework used at higher levels and the validity of the information presented there sometimes depend on deep knowledge, such as that residing in the world of systems engineering, detailed modeling and simulation, and experimentation.

    Although this chapter discusses such cross-level issues...

  14. CHAPTER FIVE A Notional Example: Global Strike
    (pp. 47-104)

    In this chapter, we walk through a notional application of our methodology to the Global Strike problem. When we chose Global Strike for our application, it had not yet been defined as one of AT&L’s capability areas, and no major studies, such as analysis of alternatives (AoA), had yet been conducted. We saw this as an advantage for what was intended to be an unclassified illustration of methodology. Later, we were asked by AT&L to develop the work further as preparation for an evaluation of alternatives (EoA) for Global Strike within the new Concept Decision Process (Krieg, 2007;...

  15. CHAPTER SIX A Second Example: Ballistic Missile Defense
    (pp. 105-120)

    In this chapter, we sketch a second application of the methodology, one for the Missile Defense Agency (MDA).⁶³ This application illustrates choosing among investments in diverse technologies and systems for a single mission, whereas the Global Strike example in Chapter Five involved investing with alternative missions in mind. Classification guidelines prohibit much discussion of even hypothetical systems and their potential shortcomings and vulnerabilities, but we can make a number of methodological points. Consistent with our general methodology (Figure 3.1), we discuss, in turn, defining the mission; developing an appropriate scenario space and a parameterized spanning set of scenarios; defining CONOPS...

  16. CHAPTER SEVEN Conclusions and Next Steps
    (pp. 121-124)

    The primary conclusions of this monograph are the following:

    The relatively generic methodology we have described and demonstrated can be used to guide and report results of defensewide capability-area reviews such as those now under way in DoD, particularly the new integrative Concept Decision Reviews that are a highlight of USD (AT&L)’s strategy.

    The methodology can be depicted in a simple process diagram, but one that includes important new analytical features that are frequently not present in reviews.

    The enablers of the methodology include underlying capability models for the area under study, a tool for generating and screening options...

  17. APPENDIX A Joint Capability Areas
    (pp. 125-130)
  18. APPENDIX B Implications for Systems Engineering and Modeling and Simulation
    (pp. 131-136)
  19. APPENDIX C RAND’s Portfolio-Analysis Tools
    (pp. 137-144)
  20. Endnotes
    (pp. 145-154)
  21. Bibliography
    (pp. 155-164)