NTB: How does the IV&V validate and verify a project?
Jackson: We have a standard process that we tailor to the characteristics of a given mission. We start with what we call a criticality assessment analysis, where we look at everything in the software world and try to determine whether it is critical and/or risky to the program, and we concentrate our efforts there. We never look at all the software associated with a mission, just the things that come out of our criticality assessment as being where we ought to put the focus. Then, starting at the earliest stages, we look at the requirements as they are being formulated. We try to find requirements issues and get them identified and resolved before the software moves out of the requirements stage. Then we follow the development lifecycle, looking at development artifacts. After requirements, we'll look at design artifacts. Then we actually get the code in and do our analysis on it, using manual inspection techniques as well as some analysis tools we've built. Finally, we look at the testing program. So if there is a three-year development lifecycle for the software in a given project, we're involved with that project for the full three years.
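The triage step described above can be pictured as scoring and filtering. The following is a minimal illustrative sketch only, not NASA's actual tooling; the component names, scoring scale, and threshold are all invented for the example.

```python
# Toy criticality assessment: rank software components by a combined
# criticality/risk score and keep only the high scorers, so analysis
# effort concentrates on the riskiest, most critical software.

def assess(components, threshold=6):
    """Return (name, score) pairs meeting the threshold, highest first."""
    scored = [(c["name"], c["criticality"] * c["risk"]) for c in components]
    kept = [(name, score) for name, score in scored if score >= threshold]
    return sorted(kept, key=lambda item: -item[1])

# Hypothetical components scored 1-3 on each axis.
components = [
    {"name": "guidance", "criticality": 3, "risk": 3},       # flight-critical
    {"name": "telemetry", "criticality": 2, "risk": 3},
    {"name": "ground-display", "criticality": 1, "risk": 2},  # screened out
]
print(assess(components))  # [('guidance', 9), ('telemetry', 6)]
```

The point of the sketch is the filtering, not the arithmetic: everything below the threshold never enters the IV&V workload.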
The final steps come when the projects hold their milestone reviews. The last one is the Mission Readiness Review, where the project managers ask us to come back and report on whether we think there are any remaining issues with the software. Then, at the agency level, there is a final review with the chief engineer and the chief safety and mission assurance officer where we do the same thing. We have a number of techniques we have developed over the years, all of which are traceable to IEEE Standard 1012 for Software Verification and Validation. We also apply a number of tools we've built, as well as a number of commercial tools, to help us with the job.
We have some really smart and inquisitive analysts, but it is not a team of engineers at their computers going over individual lines of code. In fact, by the time we start looking at actual code, we understand the requirements. We think the requirements are right, all the way from the system-level requirements that get allocated to software; we look at that allocation process too. Through the design process, we understand what the code is intended to do. Then we look at the code itself, and at how that code is being tested to verify that it will meet the mission needs. One of the things we look for from a testing standpoint is areas the project's own testers may not have examined, so we can demonstrate where we think there might be weaknesses.
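Finding areas the project's testing may have missed is, at its simplest, a traceability question: which requirements have no test claiming to cover them? The sketch below is purely illustrative; the requirement IDs, test-case names, and data layout are invented, and real IV&V traceability analysis is far richer than this.

```python
# Toy requirements-to-test traceability check: flag any requirement
# that no test case in the trace matrix claims to cover.

def untested_requirements(requirements, test_trace):
    """Return sorted requirement IDs with no covering test case."""
    covered = set()
    for req_ids in test_trace.values():
        covered.update(req_ids)
    return sorted(set(requirements) - covered)

# Hypothetical trace data: two test cases, four requirements.
requirements = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]
test_trace = {
    "TC-01": ["REQ-1", "REQ-2"],
    "TC-02": ["REQ-2"],
}
print(untested_requirements(requirements, test_trace))  # ['REQ-3', 'REQ-4']
```

Each flagged requirement is a candidate weakness to raise with the project before the testing program is declared complete.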
NTB: With what sort of projects is IV&V involved?
Jackson: We have supported 23 missions so far. They have ranged from the space station through the rest of the human-rated software endeavors, both station and shuttle. The shuttle is flown by software. It is automated until it gets down to about 10,000 ft. in re-entry; only then do the pilots take over for the actual touchdown. We are about to get involved in Constellation, which will be another human-rated project. We look at the various robotic missions sponsored and developed under the Science Mission Directorate. We're also doing some analysis on the IEMP, the Integrated Enterprise Management Program. It's the big financial system that NASA is putting in place, which is SAP-based. It's one of NASA's ERP (Enterprise Resource Planning) projects.
We have been involved with missions or software being developed at every NASA center with the exception of Stennis. We have not been involved with any projects at Stennis, but we've hit the other 10 centers. Our focus is on JPL, Goddard, and JSC/KSC.