Legacy modernization is a challenge. Many companies face the need to modernize outdated systems - often on mainframes or in older programming languages such as Cobol. The process requires a deep understanding of the existing software and close collaboration with specialists to document business logic and functionality in a comprehensible way. Innovative approaches, such as using AI to analyze and document these systems, show promising ways to accelerate the transformation. Ultimately, the question arises as to how sustainable the newly emerging architectures are and how future legacy problems can be avoided. This makes regular appraisal and adaptation a central task for companies in the digital future.
In this episode, I talk to Erik Dörnenburg about the challenges and opportunities of legacy modernization in software development. We look at the complex world of legacy systems that often prevail in organizations and discuss how they can be transformed into modern architectures. I found Erik's approach of using AI support (retrieval-augmented generation, RAG) and specific tools particularly exciting. With these tools, many insights can be gained from the old code that help with modernization and new development.
"The use of retrieval augmented generation helps immensely in understanding legacy software." - Erik Dörnenburg
Erik Dörnenburg is a software engineer and passionate technologist. On his long journey through the tech industry, he has encountered many new technologies. It is always important to him to realize their potential while adapting best practices. At Thoughtworks, he helps clients solve their business challenges with modern technologies, platforms and practices.
[Legacy modernization](https://www.richard-seidl.com/en/blog/hysterical-grown) is becoming increasingly important in many industries. Companies are faced with the challenge of adapting legacy systems, which are often based on outdated technologies, to modern requirements. Software modernization is a central component in ensuring competitiveness and efficiency in the long term.
Typical sectors in which legacy software is still widely used include insurance and banking in particular. These sectors often rely on Cobol-based mainframe systems and early versions of Java and .NET. Such systems form the backbone of many business processes, but their technological basis makes adaptations and extensions difficult.
Challenges arise from the use of old technologies such as Cobol, which mainly runs in mainframe environments. These environments are often difficult to access and have limited test coverage. Early Java and .NET versions also pose a challenge, as they often do not meet today's architectural standards and have to compete with modern cloud-native architectures.
Legacy modernization aims to transform these legacy systems - be it through re-implementation on mainframes or migration to more modern architectures - to ensure long-term maintainability, scalability and integration. An important aspect of this transformation is security tests with static analysis, which can detect security flaws in the code at an early stage and thus improve the quality of the software in the long term.
The modernization of legacy systems brings with it a variety of challenges that must be overcome to ensure a smooth transition. Key issues include hard-to-access mainframe environments, limited test coverage, outdated architectural standards, and the scarce availability of subject matter experts.
Overcoming these challenges requires an in-depth understanding of the existing systems as well as careful planning and implementation of modernization strategies. In particular, a migration testing process is essential to ensure that the migrated system performs the same as the legacy system and meets all requirements.
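One way to make this parity requirement concrete is a characterization test that feeds the same recorded inputs to both implementations and compares the outputs. Below is a minimal sketch using Python's `unittest`; the two premium functions are purely hypothetical stand-ins for the legacy and the migrated system:

```python
import unittest

def legacy_premium(age: int, base: float) -> float:
    """Hypothetical stand-in for the legacy calculation."""
    surcharge = 0.2 if age > 60 else 0.0
    return round(base * (1.0 + surcharge), 2)

def modern_premium(age: int, base: float) -> float:
    """Hypothetical stand-in for the re-implemented calculation."""
    factor = 1.2 if age > 60 else 1.0
    return round(base * factor, 2)

class MigrationParityTest(unittest.TestCase):
    """Characterization test: both systems must agree on recorded inputs."""

    def test_outputs_match(self):
        cases = [(30, 100.0), (61, 100.0), (75, 250.5)]
        for age, base in cases:
            with self.subTest(age=age, base=base):
                self.assertEqual(legacy_premium(age, base),
                                 modern_premium(age, base))
```

In practice, the input cases would come from recorded production traffic rather than a hand-picked list, and the test suite is run with `python -m unittest` as part of every migration step.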
The modernization of legacy systems requires targeted approaches that minimize technical risks while ensuring the continuity of business processes. Two central strategies dominate here:
The first is re-implementation on the mainframe: existing systems are completely rewritten, often on the proven mainframe platform. This approach is particularly suitable when preserving the system environment is desired and a complete redevelopment makes economic sense.
The second is migration to the cloud: functionalities are gradually transferred to modern cloud environments. A cloud-native architecture makes it possible to design applications as smaller deployable units, which promotes flexibility and scalability.
Splitting monolithic systems into microservices is a key lever for minimizing risk. Breaking a monolith down into individual, independent services allows errors to be isolated and releases to be carried out more quickly. This modularity supports agile methods and facilitates continuous improvement.
The so-called modernization in slices approach does not modernize the entire system at once, but in small, manageable steps. This maintains system availability, gives stakeholders early insight into progress and allows unexpected problems to be addressed quickly.
Agile software companies use these strategies successfully by closely integrating both technical and functional expertise to meet the complex requirements of modern legacy modernization.
Subject matter experts (SMEs) play a crucial role in the modernization of outdated systems. Their expertise is essential in order to understand the complex requirements and functions of these systems and to modernize them successfully.
However, the availability of subject matter experts for legacy systems can be limited. This can lead to bottlenecks in the modernization process, as the knowledge and experience of these experts may not always be accessible.
To address this problem, it is important to develop strategies to increase the availability of subject matter experts. This can be achieved through targeted training programs, knowledge sharing or the use of modern technologies to support communication and collaboration.
In addition, efficient methods for knowledge extraction and documentation are crucial in order to capture the expertise of experts and make it usable for modernization. Modern tools can help to optimize this process by facilitating the extraction and documentation of knowledge.
The involvement of subject matter experts and the optimization of knowledge transfer are therefore key factors for the successful modernization of outdated systems. Through targeted measures, bottlenecks can be overcome and the required expertise can be used efficiently.
Artificial intelligence (AI) offers a wide range of possibilities for modernizing old software systems. GPT models, for example, enable code analysis and text extraction from outdated systems; they can help decipher complex code and extract relevant information.
However, a critical eye should be kept on tools such as GitHub Copilot. Although they promise faster coding, it is important to carefully consider their effectiveness and potential risks.
Furthermore, the use of Retrieval Augmented Generation (RAG) can provide improved accessibility to documents and knowledge. This approach enables a targeted search for relevant information in extensive databases, which can be particularly helpful in deepening the understanding of legacy systems and making the modernization process more efficient.
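The retrieval step behind RAG can be illustrated with a deliberately simplified sketch: rank knowledge-base snippets by similarity to a question and paste the best matches into the prompt. Real systems use embedding models instead of the bag-of-words similarity below, and all snippet texts and module names are invented for illustration:

```python
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    """Naive bag-of-words vector; real systems would use embeddings."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) \
         * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, snippets: list[str], k: int = 2) -> list[str]:
    """Return the k snippets most similar to the question."""
    q = vectorize(question)
    return sorted(snippets, key=lambda s: cosine(q, vectorize(s)),
                  reverse=True)[:k]

# Hypothetical snippets extracted from legacy documentation.
snippets = [
    "PAYCALC computes the monthly premium from age and tariff code.",
    "Module CUSTMGR maintains customer master data on DB2.",
    "Batch job NIGHTRUN reconciles premium bookings every night.",
]

question = "Which module computes the premium?"
context = retrieve(question, snippets)
prompt = ("Answer using only this context:\n"
          + "\n".join(context)
          + "\nQuestion: " + question)
print(prompt)
```

The assembled prompt is then sent to the language model, which answers from the retrieved context instead of from its training data alone.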
In addition to AI-powered modernization, cloud migration could also be considered. The cloud offers benefits such as scalability and cost savings, but also brings challenges, especially in terms of data migration and security.
The use of graph databases plays a central role in knowledge retrieval and augmentation in legacy modernization. These databases make it possible to map and efficiently search complex relationships between different elements of the legacy system. Knowledge graphs structure the knowledge from documentation, source code and expert knowledge in such a way that relevant information can be found quickly and incorporated into the modernization process in a targeted manner.
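A knowledge graph of this kind can be approximated with plain data structures before committing to a graph database. The sketch below stores hypothetical `calls`/`reads`/`writes` relationships extracted from a legacy codebase and answers a typical impact-analysis question; all node names are invented:

```python
from collections import defaultdict

# Hypothetical edges extracted from source code and documentation:
# (source node, relation, target node)
edges = [
    ("PAYCALC",  "calls",  "RATELKP"),
    ("PAYCALC",  "reads",  "TARIFF_TABLE"),
    ("RATELKP",  "reads",  "TARIFF_TABLE"),
    ("NIGHTRUN", "writes", "BOOKING_TABLE"),
]

# Forward adjacency: node -> list of (relation, target)
graph = defaultdict(list)
for src, rel, dst in edges:
    graph[src].append((rel, dst))

def neighbors(node: str, relation: str) -> list[str]:
    """All targets reachable from `node` via `relation`."""
    return [dst for rel, dst in graph[node] if rel == relation]

def readers_of(table: str) -> list[str]:
    """Reverse lookup: which programs read a given table?"""
    return sorted(src for src, rel, dst in edges
                  if rel == "reads" and dst == table)

print(neighbors("PAYCALC", "reads"))  # ['TARIFF_TABLE']
print(readers_of("TARIFF_TABLE"))     # ['PAYCALC', 'RATELKP']
```

A real graph database adds indexing, persistence and a query language on top, but the modeling idea - typed edges between programs, tables and documents - is the same.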
A key aspect of the technical implementation is dealing with token limits in language models. As the input capacity (context window) is limited, large amounts of information must be pre-processed and compressed so that important details are not lost. Compression techniques help to provide the relevant data in a condensed form, which increases the quality of analysis and code support.
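The budgeting part of this preprocessing can be sketched as a greedy packer that keeps only as many relevance-sorted snippets as fit into the context window. The whitespace token count is a crude stand-in for a real subword tokenizer, and the snippet texts are invented:

```python
def count_tokens(text: str) -> int:
    """Crude token estimate: whitespace-separated words.
    Real models use subword tokenizers with different counts."""
    return len(text.split())

def pack_context(snippets: list[str], budget: int) -> list[str]:
    """Greedily keep snippets that fit into the token budget.
    Snippets are assumed to be pre-sorted by relevance."""
    packed, used = [], 0
    for s in snippets:
        cost = count_tokens(s)
        if used + cost <= budget:
            packed.append(s)
            used += cost
    return packed

snippets = [
    "PAYCALC computes the monthly premium.",            # 5 tokens
    "It reads the tariff table maintained by RATELKP.",  # 8 tokens
    "Historic notes from 1998 about a fixed defect.",    # 8 tokens
]
print(pack_context(snippets, budget=13))  # keeps the first two snippets
```

Compression techniques such as summarization go one step further: instead of dropping low-priority snippets entirely, they shorten them so more information fits into the same budget.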
The combination of graph database-based knowledge management and advanced compression techniques allows even extensive legacy systems to be accessed efficiently. This creates a technical infrastructure that not only facilitates the extraction of knowledge, but also provides adaptive support for step-by-step modernization - always with a focus on preserving technical contexts and minimizing technical risks.
When modernizing legacy systems, it is crucial to apply efficient reverse engineering methods. By analyzing which files in a monolithic system are frequently changed together, developers can lay important foundations for successful modularization. This process makes it possible to divide the code into smaller, manageable units and thus reduce complexity.
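One common way to find such modularization seams is co-change analysis: counting how often two files appear in the same commit. Below is a minimal sketch over a hypothetical `git log --name-only --pretty=format:` excerpt, in which commits are separated by blank lines; the file names are invented:

```python
from collections import Counter
from itertools import combinations

# Hypothetical excerpt of `git log --name-only --pretty=format:` output.
log = """\
billing.cob
tariff.cob

billing.cob
tariff.cob
report.cob

customer.cob
report.cob
"""

def co_change_pairs(log_text: str) -> Counter:
    """Count how often two files appear in the same commit."""
    pairs = Counter()
    for commit in log_text.strip().split("\n\n"):
        files = sorted(commit.splitlines())
        pairs.update(combinations(files, 2))
    return pairs

pairs = co_change_pairs(log)
print(pairs.most_common(1))  # [(('billing.cob', 'tariff.cob'), 2)]
```

Files that change together very often are strong candidates to end up in the same module, while rarely coupled files suggest a natural cutting point.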
In addition, code quality plays a central role during modernization. It is essential to write sustainable code in order to avoid technical debt in the long term.
Implementing a test pyramid with different levels such as unit testing, integration testing and end-to-end testing ensures comprehensive test coverage and supports the stability of the system.
By combining effective reverse engineering, careful code quality and a well thought-out test strategy, companies can drive successful legacy system modernization, paving the way for future innovation.
In practice, legacy modernization often proves to be a complex undertaking in which niche functionalities need to be retained while modern technologies are integrated at the same time. A striking example is provided by Thoughtworks, which specializes in redeveloping Cobol and Java systems. Existing systems are not simply swapped out one-to-one, but replaced by customized software solutions in order to reduce technical debt and ensure long-term maintainability.
Artificial intelligence is playing an increasingly central role in supporting this process. The use of GPT models, for example, enables automated code analysis and knowledge extraction from complex legacy systems. Tools such as Retrieval Augmented Generation (RAG) significantly improve access to documentation and expert knowledge.
One particularly promising area is the AI revolution in test automation, where the use of AI is creating new possibilities for error detection. In addition, AI skills are shaping the future of software testing and are helping testers to be successful in agile teams.
Future developments in the field of AI promise even deeper integration into the modernization process.
These advances will make it easier for companies to make the transition from monolithic legacy systems to modern cloud-native architectures more secure, faster and more cost-efficient.