Prototype, Validate, and Deploy Multi-Patient Ventilator Systems, Part 4: Verification and Validation, and Certification Overview
From the series: Prototype, Validate, and Deploy Multi-Patient Ventilator Systems
Discover how to use various tools and workflows in MATLAB® and Simulink® that can help you with requirements management, traceability aspects, and verification compliance to standards and guidelines using a Model-Based Design approach.
So now that we've seen the overview of the multi-patient ventilator model as well as the deployment capabilities with MATLAB, let's take a look at the verification and validation workflow. In a typical development workflow, most of the errors are introduced during the development phase. Some of these errors get caught during the testing phase, but a few can propagate through the integration step or even into the field. And once we find errors in the field, the cost to repair them is enormous, and we definitely do not want to get there.
You've already seen that Simulink as a tool can be used to describe a complex medical system. We have mechanical, electrical, and hydraulic capabilities, state charts, and control systems, so you can model all of those complex scenarios in a single model.
But what completes this whole Model-Based Design setup is the verification and validation capability, which enables you to constantly test, verify, and analyze the models as you build them. And even before you go from your model to production code generation, you can perform equivalence checks and testing to make sure that none of these errors propagate downstream and that your design is flawless.
So let's talk a little bit more about the different tools in Simulink that enable you to do that. The first one I want to touch upon is Simulink Requirements. With Simulink Requirements, MathWorks introduced a new capability to address the challenge of integrating requirements directly into the Simulink Editor, so that you can work with requirements without ever leaving Simulink.
You can import requirements from different tools such as DOORS, or you can author them in Simulink. You can easily create traceability between the requirements, design, code, and tests, and you can identify gaps in your implementation and verification. And whenever changes occur, they are identified by highlighting, and you are guided to update whatever might be impacted. So it's a fairly automated tool.
Once the requirements are in Simulink, links are created to trace requirements to the design and tests. This is the same concept used in requirements management tools. Links define relationships, and they also contain properties, such as the link type (in this example, "Implemented by"), that specify the type of relationship between the elements. Traceability can be created across the entire development life cycle: it starts with high-level user or business goals that are decomposed into the next-level design requirements, which specify the implementation. These link to the blocks that implement them and the tests that verify them. When generating code, link information can also be added to the code.
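As a sketch of what this looks like programmatically, the Simulink Requirements `slreq` API can import a requirements file, look up a requirement, and create a trace link to the implementing block. The file, model, and requirement names below are hypothetical, and exact property names may vary by release:

```matlab
% Import requirements (e.g., a ReqIF export from DOORS) -- file name is hypothetical
slreq.import('ventilator_reqs.reqif');

% Open the imported requirement set and look up one requirement by its ID
rs  = slreq.load('ventilator_reqs');
req = find(rs, 'Type', 'Requirement', 'Id', '1.2');

% Trace the requirement to the block that implements it (block path is hypothetical)
slreq.createLink(req, 'ventilator_model/Controller');
```

The same links can later be queried to generate traceability and implementation-status reports.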
So that being said, the next step, once we have the models, is: how do we verify compliance with certain standards and guidelines? The problem MathWorks is really trying to solve here is: once we have the models in Simulink, how do we know whether the design is built correctly? Manual design reviews can be used, but they are tedious and error prone, particularly for complex components where errors are not easily visible.
If code is to be generated from the model, then we also want to make sure it doesn't contain anything that would prevent code generation, right? To address those needs, we have what we call Simulink Check, which helps us automate the manual review steps by running a static analysis on the model. We can check for readability, performance, and other errors.
The Simulink Check tool also includes built-in checks for industry standards such as IEC 62304, which can be run against our model to make sure that it is compliant and there are no errors from that perspective.
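These static checks run through the Model Advisor, which can also be driven from the command line. A minimal sketch, assuming a hypothetical model name and using one example built-in check ID:

```matlab
% Run a selected set of Model Advisor checks on the model (model name is hypothetical)
checkIDs = {'mathworks.design.UnconnectedLinesPorts'};   % example built-in check ID
results  = ModelAdvisor.run({'ventilator_model'}, checkIDs);

% Open a summary report of the results for review or audit
ModelAdvisor.summaryReport(results);
```

In an interactive workflow, the same checks can be selected and run from the Model Advisor dialog inside the Simulink Editor.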
After the checks have run, the results can be published in a comprehensive report to document the analysis for design reviews, or even presented as a record for audit purposes. If we are building a device that has to go through the FDA, these tools are quite handy to use in our model development.
That being said, let me switch gears and get to functional testing. From a testing perspective, we not only want to make sure that the model or component is functionally correct; we also want to ensure that the tests meet the design requirements, and that the design is completely tested with no part missed.
To meet those needs, we have a very comprehensive tool, Simulink Test, which gives you a systematic way to test your model. You can isolate the component under test using test harnesses. This allows you to write your tests without modifying the original model. Then you can author your test inputs in many ways, including MAT files, Signal Builder, test sequences, and even Excel files.
To assess the results, you can compare them against baseline outputs, write custom criteria with MATLAB unit tests, or even use the Test Assessment block to define online pass and fail conditions. You can compare against baselines stored in Excel files, and a lot more.
You can also test your models in different modes, such as SIL, PIL, and HIL, which is hardware-in-the-loop. And beyond that, for scalability, you can scale your test simulations with Parallel Computing Toolbox and even run them on continuous integration software. A lot of capabilities there.
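These tests can be driven programmatically through the Simulink Test manager API. A sketch, assuming a hypothetical test file name:

```matlab
% Load a Simulink Test file (file name is hypothetical)
tf = sltest.testmanager.load('ventilator_tests.mldatx');

% Run all loaded test files and capture the results
results = sltest.testmanager.run;

% Publish a report of the run for design reviews or audit records
sltest.testmanager.report(results, 'ventilator_test_report.pdf', ...
    'IncludeTestResults', 0);   % 0 = include all results
```

The same test file can be opened and run interactively from the Test Manager app.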
Now, to verify whether we have implemented enough tests, we have the capability to measure coverage. From this testing we can identify any testing gaps, missed requirements, or unintended functionality; all of that gets reported. We generate a comprehensive coverage analysis to measure our testing, to ensure that all the requirements have been linked with test cases and thoroughly tested, and that all the different parts of the model are exercised and covered in our testing.
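With Simulink Coverage, the coverage metrics described above can be collected from a simulation and published as an HTML report. A sketch, assuming a hypothetical model name:

```matlab
% Configure a coverage test for the model (model name is hypothetical)
testObj = cvtest('ventilator_model');
testObj.settings.decision  = 1;   % decision coverage
testObj.settings.condition = 1;   % condition coverage
testObj.settings.mcdc      = 1;   % MC/DC coverage

% Simulate the model while collecting coverage data
covData = cvsim(testObj);

% Generate an HTML coverage report highlighting untested parts of the model
cvhtml('coverage_report', covData);
```

The report marks each decision, condition, and MC/DC objective as covered or missed, which is what feeds the gap analysis discussed next.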
These coverage reports give us very detailed insight. But even beyond that, we have the capability to automatically address missing coverage. That capability comes from another tool, called Simulink Design Verifier.
What it does is assess the coverage reports and automatically figure out which parts of the model are not exercised and which components are not tested. It integrates with the Test Manager, and it can create new test cases automatically (I'm emphasizing the word automatically) to cover the missing coverage.
So that being said, now that our testing is complete, let's jump to equivalence testing, which I touched on briefly with software-in-the-loop, processor-in-the-loop, and hardware-in-the-loop. The third and final type of testing is referred to as in-the-loop testing, a type of equivalence testing that compares the results of testing the generated code against the model simulation.
This type of testing is sometimes also referred to as black-box testing, and we need to ensure that the testing coverage is sufficient due to potential differences in code and model semantics. To perform equivalence testing, we have SIL, or software-in-the-loop, which refers to reusing tests from the model to execute the generated code on a desktop PC and then comparing the results to the simulation results.
You can also measure coverage to see that the generated code, which could be C or C++ code, is completely tested. The second type, PIL testing, is processor-in-the-loop and refers to cross-compiling the generated code for the target processor, which can be a microcontroller or even an embedded board, executing that code on the target processor, and then comparing the results from the target execution to the simulation results.
There's also a third in-the-loop testing, called HIL or hardware-in-the-loop. This testing checks the real-time behavior of the design and code: you can simulate the actual signals from the field on the processor and emulate the targets, and we have that capability as well.
And mind you, for all this SIL, PIL, and HIL testing, we can again reuse the tests we had already developed for our model, and we can also collect all the reports. So we can collect the coverage reports and the test reports, and have them ready for audits, verification, or an FDA filing for our medical device.
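As an illustration of the SIL equivalence idea, the same model can be simulated in normal mode and then in SIL mode (which requires Embedded Coder), and the logged outputs compared. The model name, output indexing, tolerance, and the exact SIL mode string are assumptions and may differ by release:

```matlab
mdl = 'ventilator_controller';                       % hypothetical model name

% Reference run: normal model simulation
simNormal = sim(mdl, 'SimulationMode', 'normal');

% Equivalence run: execute the generated code on the host PC
set_param(mdl, 'SimulationMode', 'Software-in-the-Loop (SIL)');
simSil = sim(mdl);

% Compare the logged outputs between the two runs (tolerance is an assumption)
delta = max(abs(simNormal.yout{1}.Values.Data - simSil.yout{1}.Values.Data));
assert(delta < 1e-6, 'SIL results deviate from model simulation');
```

In practice, Simulink Test's equivalence test cases automate this comparison and report the pass/fail status per signal.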
So again, long story short, all the C/C++ code that gets generated can be thoroughly tested in the same fashion as the model. And last but not least among these capabilities: beyond functional testing, can we scale it with continuous integration? Yes, we can.
As of today, we support Jenkins as one of our continuous integration platforms, so that MATLAB and the V&V tools can seamlessly integrate into your software development cycle, especially with multiple people working on the same software, performing the testing and equivalence checking of your software design.
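A common pattern for CI is to have the Jenkins job invoke MATLAB in batch mode and run the test suite with the unit test framework. A sketch, where the folder name and script name are assumptions:

```matlab
% In CI (e.g., a Jenkins build step), the job can call MATLAB in batch mode:
%   matlab -batch "run_ci_tests"
% where run_ci_tests.m might contain:

results = runtests('tests', 'IncludeSubfolders', true);  % run all tests in a folder tree
assertSuccess(results);   % errors out (nonzero exit code) if any test failed
```

The nonzero exit code on failure is what lets Jenkins mark the build red, and the coverage and test reports generated during the run can be archived as build artifacts.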
So that being said, that was a very high-level overview of the V&V capabilities. One thing I would like to mention at this point: if you have questions on where to start or what kind of support you need, MathWorks does offer a lot of support resources to help you get started with these tools and learn more about them. We have technical support if you run into any technical issues while using the tools.
Our application engineering and pilot engineering teams give you complimentary guided support to help you adopt these new tools, understand your processes, and integrate the tools into your software development ecosystem. We also have training and consulting services: training helps you learn the specific tools you would like to adopt, and consulting helps you build custom solutions to meet your specific needs.
So there is a lot of support available from MathWorks for making you successful with Model-Based Design adoption. That being said, I would like to thank you all for attending this webinar today. Here are a few links: if you have any more questions for MathWorks, feel free to visit the MathWorks page and our social media pages.
And from the Quadrus side, if you have any questions specifically for Dr. Shabbat, you can also check his website, Quadrus Health, where you can find this ventilator app. So thank you everybody again for attending. I really appreciate your time. Have a good rest of your day.