An M-JPEG video compression system has been modified to satisfy the unique requirements of space-based applications. ["M-JPEG" signifies the still-image-data-compression method of the Joint Photographic Experts Group (JPEG) as applied to moving (e.g., video) images.] This ruggedly constructed, modular system is compatible with NASA interfaces and meets Agency requirements for reduced size, weight, and power. The M-JPEG system generates test patterns, enables users to select compression characteristics and desired output rate, is inexpensive to modify and upgrade, and has features that are adaptable to mission-specific requirements.
Two categories of video compression/decompression systems are commercially available: (1) computer-based video compression/decompression systems for multimedia applications and (2) stand-alone box compression/decompression systems for broadcast applications. None of these systems satisfies the requirements for all of NASA's unique applications. The sizes, weights, and power demands of commercial video systems are too high. Commercially available systems also do not provide compatible interfaces or compression options that users can select and control from front panels, and they do not generate test patterns. Moreover, most commercial compression/decompression systems are not ruggedly constructed or modular in design, many do not provide adequate video quality, and none is capable of adjustment to any desired output rate. Finally, the cost of retrofitting any of these systems to satisfy NASA requirements is prohibitive. To summarize: While the video compression/decompression systems currently in use satisfy the commercial industry requirements for which they were designed, they fail to meet NASA's unique requirements for space-based applications.
The M-JPEG system was modified to perform the following functions specific to space flight: digitize a standard SMPTE-170M National Television Systems Committee (NTSC) signal using a variety of programmable color spaces; compress the digital video signal using an adaptive M-JPEG compression algorithm; enable the user to select compression modes and thereby modify compression parameters; packetize the compressed digital video signal for transport; provide both a fiber-optic and an electrical [emitter-coupled logic (ECL)] output interface; receive the compressed digital video signal, decompress the data, and produce an acceptable version of the original noncompressed digital video signal; and convert the decompressed digital video to a variety of formats for display [e.g., red, green, blue (RGB), composite NTSC, or component NTSC (Y/C)]. This range of capabilities enables NASA to (1) improve the quality of video transmissions over that of standard analog video transmissions, (2) transmit multiple video channels within bandwidths previously needed for one channel, and (3) make efficient digital recordings of compressed digital video signals and multiple-generation recordings without degradation.
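The programmable color-space step can be illustrated with the RGB-to-YCbCr conversion conventionally used ahead of JPEG compression (full-range ITU-R BT.601 coefficients). The sketch below is illustrative only and is not the flight implementation:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert full-range 8-bit RGB to YCbCr using the ITU-R BT.601
    coefficients conventionally used by JPEG encoders."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    # Round and clamp each component to the 8-bit range.
    clip = lambda v: max(0, min(255, int(round(v))))
    return clip(y), clip(cb), clip(cr)
```

Neutral grays map to Y with both chroma components at 128, which is why chroma channels compress so well for low-saturation scenes.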
The M-JPEG video system consists of two primary subsystems: the onboard compression system and the ground-based decompression system. The figure shows the onboard compression system, which contains the M-JPEG video encoder. This encoder is housed in an anodized aluminum case that contains the printed-circuit boards (PCBs). There are four such PCBs: (1) a video digitizer, (2) the M-JPEG encoder, (3) a packetizer, and (4) the power supply. Many components of this subsystem are programmable logic devices; these include an erasable programmable read-only memory (EPROM), erasable programmable logic devices (EPLDs), and a stand-alone microsequencer (SAM).
The M-JPEG video system is designed to interact with either the high-rate frame multiplexer (HRFM) of the International Space Station or a space shuttle multiplexer. Because of the modular design of the onboard compression system, three of the PCBs - the video digitizer, the M-JPEG encoder, and the power supply - can be retained unchanged, while an alternate communication circuit replaces the packetizer to serve as an interface with another system.
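The article does not specify the flight packet format, but the packetizer's basic job can be sketched as framing a compressed-video buffer into fixed-size packets. The header fields below (channel ID, sequence count, payload length) are hypothetical, chosen only to show the idea:

```python
import struct

def packetize(payload: bytes, channel_id: int, seq: int, max_len: int = 1024):
    """Split a compressed-video buffer into fixed-format packets.

    The 6-byte header used here is illustrative, not the flight format:
    2-byte channel ID, 2-byte sequence count, 2-byte payload length.
    Returns the packet list and the next sequence number.
    """
    packets = []
    for off in range(0, len(payload), max_len):
        chunk = payload[off:off + max_len]
        header = struct.pack(">HHH", channel_id, seq & 0xFFFF, len(chunk))
        packets.append(header + chunk)
        seq += 1
    return packets, seq
```

Because the sequence count lives in the header, the ground system can detect dropped packets before attempting to decompress a frame.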
The ground subsystem consists mainly of three personal computer (PC) advanced technology (AT) Industry Standard Architecture (ISA) boards. Depending on which of the two display circuits is in use, the user connects the display output either to an RGB monitor or to a composite NTSC or component Y/C monitor. Several programs initialize the PC AT boards and run the ground system. Programs have been written to read the telemetry data to determine the configuration of the onboard system. The settings of the ground-system boards can be read to verify proper configuration, and individual frames can be captured whenever the user wants to import an image into the ground computer. (The current file format for imported images is the Targa 24 image file format.) Because of the limitations of the PC AT ISA bus and the enormous storage capacity needed to hold even a short video sequence of the required quality, the system does not allow storage of compressed motion video data in the ground computer. The M-JPEG video system can provide broadcast-quality video to an existing analog video ground distribution system or, preferably, to a digital video ground distribution system. Decoding is done at the end viewing location.
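The Targa 24 format used for imported images is simple: an 18-byte header followed by uncompressed 24-bit pixel data stored as BGR, bottom-up by default. The minimal writer below illustrates the format; it is not part of the ground software:

```python
import struct

def write_targa24(path, width, height, rgb_rows):
    """Write an uncompressed 24-bit Targa (TGA image type 2) file.

    rgb_rows: rows top to bottom, each a list of (r, g, b) tuples.
    TGA stores pixels as BGR, and with a zero descriptor byte the rows
    are bottom-up, so the row order is reversed on write.
    """
    header = struct.pack(
        "<BBBHHBHHHHBB",
        0,        # no image-ID field
        0,        # no color map
        2,        # image type 2: uncompressed true-color
        0, 0, 0,  # color-map specification (unused)
        0, 0,     # x, y origin
        width, height,
        24,       # bits per pixel
        0,        # descriptor: bottom-up rows, no alpha bits
    )
    with open(path, "wb") as f:
        f.write(header)
        for row in reversed(rgb_rows):
            for r, g, b in row:
                f.write(bytes((b, g, r)))
```

A 2x1 red/green test image produces a 24-byte file: 18 header bytes plus 6 bytes of BGR pixel data.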
The M-JPEG video system is flexible enough to be modified to satisfy a variety of requirements and to suit various applications. For example, computer interfaces could be changed to enable use of the ground-based components of the system with alternate computers. Because the onboard system is modular, it can be modified for different interfaces, as for different communication protocols. A system currently under development - the Moving Picture Experts Group (MPEG) 2 codec - will incorporate the video digitizer and packetizer from the M-JPEG system, but the encoder board will be replaced with circuitry that implements an alternate compression algorithm.
While the concepts included in the design of the M-JPEG video system are not new, specific implementations of the design are new. The combination of existing techniques and equipment is unique and satisfies the similarly unique requirements of NASA's space-based applications.
This work was done by S. Douglas Holland of Johnson Space Center. MSC-22744