Motion Imagery Processing and Exploitation (MIPE)

Amado Cordova
Lindsay D. Millard
Lance Menthe
Robert A. Guffey
Carl Rhodes
Copyright Date: 2013
Published by: RAND Corporation
Pages: 52
Book Description:

    This report defines and investigates the potential of motion imagery processing and exploitation (MIPE) systems, which can enable military intelligence analysts to respond to the current information deluge and exploit a wide range of motion imagery collections. MIPE systems aid analysts in the detection, identification, and tracking of objects of interest and activities of interest in live and archival video.

    eISBN: 978-0-8330-8482-8
    Subjects: Political Science, History

Table of Contents

  1. Front Matter
    (pp. i-ii)
  2. Preface
    (pp. iii-iv)
  3. Table of Contents
    (pp. v-v)
  4. Figures
    (pp. vi-vi)
  5. Summary
    (pp. vii-ix)
  6. Acknowledgments
    (pp. x-x)
  7. Abbreviations
    (pp. xi-xv)
  8. Chapter One: Introduction
    (pp. 1-6)

    In the early 21st century, all sectors of society—from private businesses to governments, scientific and artistic institutions, and individuals—are experiencing a growing flood of information. The giant retail chain Wal-Mart processes more than one million customer transactions per hour, feeding databases estimated at 2,500 terabytes,[12] the equivalent of 167 times the information in the U.S. Library of Congress.[13] The telescope used for the Sloan Digital Sky Survey collected more data in its first few weeks of operation during the year 2000 than had been collected in the entire history of astronomy, and its archive has now...

  9. Chapter Two: Motion Imagery Automatic and Assisted Target Recognition
    (pp. 7-15)

    As stated in Chapter One, although many technologies provide MIPE capabilities, much of the relevant research and development lies in the field of MI-ATR. MI-ATR has its roots in the field of computer vision, a relatively new field of study encompassing the science and technology that enables machines to extract information from an image or sequence of images.[36] The original goal of computer vision was to develop visual perception techniques to help mimic human intelligence in robots; it was not until the 1970s, when computers became capable of processing moderately large sets of image data, that a more focused study...

  10. Chapter Three: Testing and Evaluating Current and Future MIPE Systems
    (pp. 16-20)

    All of the services and a number of different agencies affiliated with the Department of Defense (DoD) have worked or are working to develop systems that provide MIPE capabilities. However, much of the information on these systems is proprietary and thus not available to the general public. In this chapter, we briefly discuss three representative MIPE systems that use different methods: VIVID, the Video and Image Retrieval and Analysis Tool (VIRAT), and video analytics.[70] We then discuss how military intelligence organizations can go about testing and evaluating current and future systems.

    VIVID is a MIPE system developed by DARPA. Like...

  11. Chapter Four: Conclusions and Recommendations
    (pp. 21-25)

    Systems that provide MIPE capabilities have the potential to revolutionize the way that military intelligence organizations manage and exploit motion imagery collections. Properly tuned to minimize false negatives, a mature MIPE capability could allow an analyst to watch multiple screens at once, with the confidence that the automated system would quickly flag or call out any features that could potentially be of interest. Present practice is to keep “eyes on” only one video at a time to ensure that nothing is missed. Although this is necessary for current operations, a sufficiently proven and effective MIPE system could begin to relieve...
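    The trade-off described above — tuning a detection system to minimize false negatives so an analyst can trust that nothing of interest goes unflagged — can be illustrated with a minimal sketch. This example is not from the report; the per-frame detector scores, ground-truth labels, and threshold values are hypothetical, chosen only to show that lowering the flagging threshold eliminates missed targets at the cost of extra false alarms for the analyst to dismiss.

    ```python
    # Hypothetical sketch of threshold tuning for a frame-flagging system.
    # A lower threshold reduces false negatives (missed targets) but
    # raises false positives (frames flagged with nothing of interest).

    def flag_frames(scores, threshold):
        """Return indices of frames whose detector confidence meets the threshold."""
        return [i for i, s in enumerate(scores) if s >= threshold]

    def false_negatives(flagged, truth):
        """Count frames that truly contain a target but were not flagged."""
        return sum(1 for i in truth if i not in flagged)

    # Notional per-frame detector confidences and ground truth.
    scores = [0.92, 0.15, 0.48, 0.81, 0.45, 0.57]
    truth = {0, 2, 3, 5}  # frames that actually contain an object of interest

    strict = flag_frames(scores, 0.8)   # [0, 3]: misses target frames 2 and 5
    lenient = flag_frames(scores, 0.4)  # [0, 2, 3, 4, 5]: all targets, one false alarm

    print(false_negatives(strict, truth))   # 2 missed targets
    print(false_negatives(lenient, truth))  # 0 missed targets
    ```

    In the lenient setting, frame 4 is a false positive the analyst must review and dismiss — the cost a MIPE system accepts in exchange for the confidence that no genuine feature of interest is silently dropped.
    
    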

  12. Appendix: Summary of Video Analytics Technology from Pacific Northwest National Laboratory
    (pp. 26-30)
  13. References
    (pp. 31-37)