In this article I want to present a principle whose value I am convinced of: continuous integration. I want to clarify what continuous integration is and how it helps improve software quality while continuously reducing development costs.
First, we have to understand that a major challenge in software development is managing the complexity of the system by breaking it down into specific sub-problems. In the end, the individual elements are put together to solve the overall problem. This principle is known as 'divide and conquer', from the Latin expression 'divide et impera'. However, this procedure creates dependencies between the individual parts of the system, each of which should remain a manageable, self-contained sub-problem. It therefore carries the risk that the parts do not fit together well. One of the best-known examples is NASA's Mars Climate Orbiter, where one component computed thrust in metric units (newtons) while another used US customary units (pound-force), a mismatch that led to the loss of the spacecraft.
Software development according to the principles of continuous integration is an ongoing process in which all components are continuously built and tested together. Automation guarantees that every test ever written is re-run after each update of the software.
According to the principles of Continuous Integration (CI), software development is understood as a continuous process: whenever one element changes, all elements of the executable system are rebuilt accordingly. The opposite approach combines the individual components into a common system only at certain stages, or only at the end of development. At first sight this sounds cheaper, but it carries the risk that system-level errors are discovered late, only after the components have been put together. In the CI approach, the components of the system are reassembled after every change to the source code. The key role here is played by the Continuous Integration server (CI server). Its task is to pick up changes from the version control system and, when necessary, to build a new application file or, in the case of embedded systems, a flashable binary.
So far this may not sound very spectacular, but through test automation real added value is generated:
Running the previously written tests, computing software metrics, and searching for failure patterns through code analysis can all be done after each change, on the entire software. Unwanted side effects on components that were not modified are detected very quickly, because after each change every individual element is tested again as part of the whole system, and this happens continuously while development is still under way.
Automated tests combined with continuous integration constantly deliver quality metrics and reveal regressions in the software.
In my experience, in practice it works as follows:
The developer writes the source code. When the development of a specific part is finished, including its automated tests, the source code is updated and the new version committed to the version control system.
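To make this step concrete, here is a minimal, hypothetical example of such a unit and its automated test, written with Python's standard unittest module before the change is committed. The function and its test are illustrative inventions, not code from a real project:

```python
import unittest

def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0

class TestConversion(unittest.TestCase):
    # These tests are committed together with the function, so the
    # CI server can re-run them after every future change.
    def test_freezing_point(self):
        self.assertEqual(celsius_to_fahrenheit(0.0), 32.0)

    def test_boiling_point(self):
        self.assertEqual(celsius_to_fahrenheit(100.0), 212.0)

if __name__ == "__main__":
    unittest.main()
```

Fittingly for the Mars Climate Orbiter example above, a unit conversion is exactly the kind of small, easily broken detail such a test pins down.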
These changes in version control are picked up by the monitoring process. The complete source code is then automatically checked out from the version control system onto the Continuous Integration server. There the source is automatically compiled and the entire system built. Finally, the tests are executed automatically, the metrics are generated, and the code analysis is performed.
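The sequence of stages described above can be sketched as a simple loop. The helper functions here are hypothetical stand-ins for the real version control, build, and analysis tools a CI server would invoke:

```python
def poll_version_control(last_seen: int, head: int) -> bool:
    """Return True if a new revision has appeared since the last build."""
    return head > last_seen

def run_pipeline() -> list:
    """Run the CI stages in order and report which ones completed."""
    completed = []
    for stage in ("checkout", "compile", "build", "test", "metrics", "code-analysis"):
        # A real CI server would call the external tool for each stage here
        # and abort the pipeline (and notify the developer) on failure.
        completed.append(stage)
    return completed

# Every detected change triggers the full pipeline, not just a subset of it.
if poll_version_control(last_seen=41, head=42):
    print(run_pipeline())
```

The essential design point is that the pipeline is triggered by every change, so no commit can silently skip the build and test stages.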
For my current projects I use the web-based software Jenkins for this purpose; it controls the builds and the tests throughout the process.
If Jenkins notices an error, the developer is informed without delay. But even the best automated tests are worthless if they are never executed. Unfortunately, in my experience it happens very often that developers do not run the automated tests themselves. Either they simply forget, or running the entire test suite takes too long, or, to save time, they execute only those tests that seem related to their changes. But the errors usually hide exactly in the parts that were not tested. And this is the strength of a Continuous Integration server: it helps guarantee that functions that already passed their tests keep working.
With the help of professional test environments, the formal requirements of integration testing can be fulfilled as well.
A suitable test environment is a Python framework. With such a framework one can quickly and intelligently stimulate the test target with the corresponding test inputs and compare its outputs against the expected values. In addition, test reports with full traceability can be generated, so that the formal documentation requirements are fulfilled at the same time. This is particularly interesting where certification is required, for example in the context of functional safety.
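As a sketch of this approach, here is a table-driven test in the widely used pytest framework: the target function is stimulated with inputs and its outputs are compared against expected values. The function under test (a scaled ADC reading) is a hypothetical embedded-style example, not taken from the article's projects:

```python
import pytest

def scale_raw_reading(raw: int) -> float:
    """Convert a hypothetical 10-bit ADC reading (0-1023) to volts (0-5 V)."""
    return raw * 5.0 / 1023.0

# Each row pairs a stimulus with its expected response; pytest runs
# every row as a separate, individually reported test case.
@pytest.mark.parametrize("raw, expected", [
    (0, 0.0),
    (1023, 5.0),
    (511, 2.4975562),
])
def test_scale_raw_reading(raw, expected):
    assert scale_raw_reading(raw) == pytest.approx(expected)
```

Running this with `pytest --junitxml=report.xml` produces a machine-readable test report that can be fed into a traceability or documentation tool chain.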
To make full use of the availability of the CI server, more detailed and longer-running tests can be executed at specific times, for instance during the night or over the weekend. In this way the test coverage can be raised even further. The worst that can happen is that a developer's Monday morning starts with bad news from the CI server at the first coffee. But regardless of the fate of the developer whose task it is to fix the problem, such errors will never be discovered later by the customer, and fixing them improves the final product. And an error-free product is what we are all aiming for.
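One common way to separate the fast per-commit suite from the long nightly runs is to mark the expensive tests. The sketch below uses a custom pytest marker (here called `slow`, an arbitrary name that would be registered in the project's pytest configuration); the exhaustive check itself is a hypothetical example:

```python
import pytest

def scale_raw_reading(raw: int) -> float:
    """Convert a hypothetical 10-bit ADC reading (0-1023) to volts (0-5 V)."""
    return raw * 5.0 / 1023.0

@pytest.mark.slow  # custom marker: excluded from the fast per-commit run
def test_exhaustive_input_range():
    # Exhaustively check every possible raw value; cheap here, but the
    # same pattern applies to tests that genuinely take hours.
    for raw in range(1024):
        assert 0.0 <= scale_raw_reading(raw) <= 5.0
```

The per-commit job can then run `pytest -m "not slow"`, while the nightly job scheduled by the CI server (for example via a Jenkins timed trigger) runs the full `pytest` suite including the marked tests.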