Effective management of custom systems and efficient use of available resources require insight into their various, largely technical, aspects. To support management information across a consortium in which multiple custom systems are in use, a quality model has been developed to provide insight into the condition of these systems. With this information in place, better-informed decisions can be made and the continuity of the systems’ functionality can be ensured.
This article was written based on work carried out by KPMG during a number of consulting assignments for Dutch government organizations. It builds on the Toetskader voor beheer en onderhoud (Assessment Framework for Management and Maintenance [ed.]) published by the ICT Assessment Advisory Board in September 2025 ([AICT25]).
Introduction
Many organizations and consortiums – particularly within central government – are responsible for maintaining, managing, and updating a portfolio of customized software systems, including robotic process automation (RPA) and AI applications. Because the processes these systems support are highly specific, custom software is often essential; such processes are tailored to the unique characteristics and requirements of a municipality, region, or country.
At the same time, reliance on custom software places responsibility on the organization to keep these systems current and to ensure the long-term continuity of services, for example through effective application lifecycle management (ALM). This calls for a systematic, multi-year approach that addresses all relevant resources in a balanced manner, including staff, external vendors, and financial capacity. In our view, such an approach should be developed and implemented under the leadership of the organization’s management.
A portfolio of systems inherently requires portfolio management, and various methods have already been developed for this purpose. These approaches typically focus on optimizing alignment with business processes and on the costs associated with system maintenance ([Groo15a], [Groo15b]). However, in a portfolio composed of systems with distinct and independent functions, such methods provide insufficient guidance for long-term planning. In these cases, insight is needed into how each system aligns – technically, functionally, and legally – with future requirements.
Recent ICT failures and reports by the ICT Assessment Advisory Board have made it clear that organizational leadership often lacks sufficient insight into the condition of their software systems, or that critical knowledge gaps exist. Key questions remain unanswered: which factors deserve attention? Should emphasis be placed on source-code maintainability metrics, such as star ratings; on the age of software or hardware; on incident frequency and resolution times; or on dependencies with surrounding systems? Each of these variables may be relevant. What is ultimately required is a coherent overview that offers in-depth insight into the overall condition of the ICT system.
KPMG has developed a model to address these questions for a government consortium involving multiple authorities and implementing organizations. The model is designed to provide concise insight into the condition of each software system, summarized in a report of just a few pages. Combined with information on the system’s functional development and the costs associated with required changes and ongoing maintenance, the model enables the consortium’s management to make well-informed decisions about allocating available resources across the application portfolio.
In this article, we describe the model developed in collaboration with the consortium’s executive support team. We also include perspectives from various stakeholders involved with the different systems. Finally, we outline how an organization or consortium can develop, implement, and sustain its own application portfolio model.
Background
A consortium was established between several government agencies around a set of facilities that play a critical role in delivering services to citizens. A defining characteristic of these systems is the requirement that their functionality remain available for many years. Initially, each organization, organizational unit, and/or department (hereinafter referred to as an “organization”) was individually responsible for the management, maintenance, and lifecycle management of the systems it operated. The objective of the consortium was to centralize decisions on resource allocation, based on the expectation that this would increase efficiency and lead to a more balanced use of available resources.
In practice, this proved more complex than anticipated. Many of the proposed activities were presented by the participating organizations as strictly necessary – an understandable position given the accumulated backlogs. However, when it became clear that demand for resources consistently exceeded the available budget, and that backlogs were not being resolved but instead reappeared in different forms, the need emerged for a more objective and independent assessment of the condition of the various software systems. Without such an objective view, the consortium’s management was unable to steer resource allocation based on substantive, evidence-based considerations.
The emergence of the quality model
KPMG regularly conducts assessments of the technical quality of software systems. A recurring observation is that essential information about these systems is often not readily available. Basic questions – such as what documentation exists, how many lines of source code the software contains, which technologies are used, whether automated tests are in place, and whether technical debt has accumulated – frequently cannot be answered immediately. This is noteworthy, as precisely this type of information should form the foundation for steering technological development. It is not uncommon for such assessments to result in a reassessment of priorities for the further development and management of the systems concerned.
Drawing on this experience, several organizations have been supported in defining quality targets for the custom software they manage. To this end, straightforward metrics were applied, including method complexity, code duplication, and unit test coverage, while also taking into account the quality of supporting components (dependencies). A deliberate choice was made to work with quality targets rather than rigid standards, as this approach allows organizations to set priorities and ambitions in line with their specific context (see also [Amor13] and [Koed19]). Experience has shown that periodic reporting – ranging from one to four times per year, depending on the pace of change – and explicitly highlighting improvements or deteriorations in quality motivate employees to actively improve software quality and work toward the defined targets.
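As a sketch of how such target-based periodic reporting might work, the following example compares current metric values against defined targets and against the previous reporting period, flagging improvements and deteriorations. The metric names, targets, and values are illustrative, not the actual figures used in any of the engagements described here:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """One quality metric with a target; some metrics are better when lower."""
    name: str
    value: float
    target: float
    higher_is_better: bool = True

    def meets_target(self) -> bool:
        if self.higher_is_better:
            return self.value >= self.target
        return self.value <= self.target

def quality_report(current: list[Metric], previous: dict[str, float]) -> list[str]:
    """Summarize each metric against its target and the previous period."""
    lines = []
    for m in current:
        status = "OK" if m.meets_target() else "BELOW TARGET"
        delta = m.value - previous.get(m.name, m.value)
        trend = "improved" if (delta > 0) == m.higher_is_better and delta != 0 else \
                "declined" if delta != 0 else "unchanged"
        lines.append(f"{m.name}: {m.value:.1f} (target {m.target:.1f}, {status}, {trend})")
    return lines

# Illustrative metrics only; real targets are set per organization.
metrics = [
    Metric("unit test coverage (%)", 68.0, 80.0),
    Metric("code duplication (%)", 4.2, 5.0, higher_is_better=False),
]
for line in quality_report(metrics, {"unit test coverage (%)": 65.0}):
    print(line)
```

Reporting the trend alongside the target reflects the approach described above: targets set the ambition, while visible period-on-period movement is what motivates teams.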
While establishing quality objectives for the software resulted in an initial model, it soon became clear that an effective portfolio management model must also address several additional aspects beyond the custom software itself. These include hardware, system architecture, and the organization of management and governance. To move toward a more comprehensive model, a series of inspiration sessions and interviews were conducted with stakeholders across the consortium, including both management representatives and system administrators. Insights from these discussions were used to develop an expanded quality model. This model was subsequently refined through multiple review sessions with the client, incorporating feedback from the participating IT management organizations.
Quality model for software systems
Following a thorough, iterative inventory of details relevant to decision-making, the findings were consolidated into a quality model comprising eight reporting aspects. For each aspect, a set of questions was defined to establish a clear and consistent view of the subject concerned. These questions are intended to be answered relatively easily and with minimal effort by the IT management organization, in order to limit the impact on other essential activities. At the same time, contributions from multiple departments or organizational units are required, as topics such as privacy and information security call for different expertise than issues related to technical debt.
The eight aspects are (in no particular order):
- Software architecture
A clear description of a system’s technical operation is essential, as it provides insight into the system’s capabilities and limitations. It also clarifies how non-functional requirements (NFRs), such as performance and scalability, are addressed. At a minimum, the non-functional requirements that have been considered should be explicitly documented.
- Technology
The technologies used provide insight into the age of a system. While legacy COBOL software does not inherently perform worse than systems built using currently popular languages such as Python, it does reflect the attractiveness of the development environment to current and future software engineers – who are responsible for maintaining and evolving the system – as well as the willingness of technology vendors to continue investing in it. Similar considerations apply to hardware, network technologies, and development platforms. This information helps indicate when individual components or the system as a whole are likely to require replacement, although in practice system lifespans are often extended through interim or stopgap measures.
- Technical debt
Technical debt refers to known maintenance backlogs that must still be addressed in order to bring a system up to date ([Amor13]). The term debt is used because it typically reflects deferred or overdue maintenance resulting from insufficient attention – often due to limited funding – being given to the management activities required to maintain software quality in line with the defined quality objectives.
- Open-source strategy
Within government in particular, the use of open-source components is considered important as a means of reducing dependence on large ICT vendors. A clear strategy for the use of these components is therefore required, including guidance on whether software developed by the organization will be made available to the broader community. In addition, an inventory of the open-source components in use, including their version numbers, should be maintained.
- Legislation and regulations
Legislation and regulations are often a key source of change requirements in the government domain. In practice, it is not uncommon for recent legislative or regulatory changes to have not yet been fully incorporated into system functionality. This applies not only to changes in operational domains, such as social security, but also to areas including AI, privacy, information security, and records management. In addition, procurement-related legislation – such as the Public Procurement Act – often plays a significant role in the engagement of suppliers and must, of course, be complied with.
- Privacy & information security
The protection of personal data is critically important, particularly because government organizations are authorized to process highly sensitive information, such as the addresses of public figures or records of donations to political parties or religious organizations. To prevent this data from falling into unauthorized hands, strong attention to privacy and information security is essential.
- Operations and release management
The ease of operations and overall system manageability – including the surrounding environment – are important characteristics, although they are difficult to capture in a single metric. Many systems do not yet meet the requirements for continuous integration (CI) and continuous deployment (CD), nor do they operate within a highly automated DevOps environment. Nevertheless, insight into testability and the speed with which releases can be deployed to production is essential for an accurate assessment of the system.
- Risk management
No system operates without risk. It is therefore essential that IT management organizations are aware of these risks, take appropriate measures to prevent them or mitigate their impact, and prepare for the possibility that risks may materialize.
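To illustrate how the eight aspects might be captured per system, the sketch below defines a minimal assessment record that collects a concise qualitative answer per aspect and flags what is still missing. The system name, example answers, and summary format are assumptions for illustration, not part of the model itself:

```python
# The eight aspects of the quality model, as named in the article.
ASPECTS = [
    "Software architecture", "Technology", "Technical debt",
    "Open-source strategy", "Legislation and regulations",
    "Privacy & information security", "Operations and release management",
    "Risk management",
]

def summarize(system: str, answers: dict[str, str]) -> str:
    """Produce a short per-system summary, flagging unanswered aspects."""
    lines = [f"Assessment summary: {system}"]
    for aspect in ASPECTS:
        lines.append(f"- {aspect}: {answers.get(aspect, 'NOT YET ASSESSED')}")
    return "\n".join(lines)

# Hypothetical system and answers, for illustration only.
print(summarize("permit-registry", {
    "Technology": "Java 8 backend; runtime version no longer vendor-supported",
    "Technical debt": "Framework upgrade deferred since 2022",
}))
```

Making the unanswered aspects explicit supports the intent described above: contributions come from multiple departments, and the summary shows at a glance where expertise still needs to be brought in.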
Figure 1 shows the eight aspects with their most important questions.
Figure 1. The eight aspects of the quality model for software systems.
IDE = Integrated Development Environment (such as Visual Studio or IntelliJ)
OTAP = Development, Test, Acceptance, and Production Environment
NORA = Dutch Government Reference Architecture
NFR = non-functional requirements
Quality Assurance refers to the second-line measures taken to guarantee the quality of the product.
Many of the questions do not lend themselves to quantitative answers and instead require a concise qualitative description of the situation. In a consortium setting, this may initially require some alignment across participants when conducting the first assessments. Nevertheless, we believe that answering these questions – together with a brief description of the system’s purpose and context, such as the number of users and stakeholders and the consequences of missed deadlines or service disruptions – provides a solid basis for setting priorities and allocating resources.
Reception
Following discussions with the executive support team, additional consultations were held with several IT management organizations. These organizations were less enthusiastic about the immediate implementation of the model. One of their primary concerns was that collecting the required information would demand significant time and capacity. They also noted that the various facilities differed substantially and were therefore, in their view, not readily comparable.
We received a notable response from one IT management organization regarding the request for a detailed inventory of the components in use. They indicated that the board was not interested in this level of detail. While such information may indeed need to be translated or summarized for board-level discussion, maintaining an inventory of standard components in use – including version numbers – is essential for any custom software system. The widespread concern following the discovery of the Log4j security vulnerability in December 2021 ([Hofm21]), during which many organizations were initially unable to determine whether they were affected, demonstrated the value of a well-maintained component inventory. Such an inventory enables rapid assessment when a commonly used component is found to have a vulnerability. Any IT management organization that aims to remain “in control” of its custom systems should therefore already have this information readily available.
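As a minimal illustration of why such an inventory pays off, the sketch below checks a component inventory against a vulnerability advisory. The system names, inventory contents, and the set of affected versions are hypothetical; a real check would rely on CVE feeds or SBOM tooling rather than a hand-maintained set:

```python
def affected_systems(inventory: dict[str, dict[str, str]],
                     advisory: tuple[str, set[str]]) -> list[str]:
    """Return systems whose inventory lists a vulnerable component version.

    inventory maps system name -> {component name: version};
    advisory is (component name, {affected versions}).
    """
    component, bad_versions = advisory
    return [system for system, components in inventory.items()
            if components.get(component) in bad_versions]

# Hypothetical per-system component inventories with version numbers.
inventory = {
    "permit-portal": {"log4j-core": "2.14.1", "spring-core": "5.3.9"},
    "case-archive": {"log4j-core": "2.17.1"},
}

# Simplified illustration of a Log4Shell-style advisory; the real
# CVE-2021-44228 advisory covered a broad range of log4j-core versions.
print(affected_systems(inventory, ("log4j-core", {"2.14.0", "2.14.1"})))
# → ['permit-portal']
```

With an up-to-date inventory, answering "are we affected?" reduces to a lookup like this, rather than the weeks of uncertainty many organizations experienced in December 2021.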
The primary objection raised by the IT management organizations was that they did not perceive a need for centralized information provision and oversight. From the perspective that these organizations had already been fully responsible for their systems prior to the formation of the consortium, this position is understandable. However, it also means that effective governance at the consortium level remains challenging, and that cross-system prioritization, efficiency gains, and coordinated lifecycle management are difficult to achieve.
Ultimately, the value of a quality model is demonstrated through its practical application. We are currently in discussions with an IT management organization that is willing to collaborate on mapping the defined quality aspects of its custom software system. Our aim is to validate the quality model in practice, enabling the IT management organization to make informed decisions about the future of its systems based on a clear and shared understanding of their characteristics and quality attributes.
Conclusion
To prevent incidents and foreseeable failures, custom software systems must be properly maintained and supported by effective application lifecycle management. This requires a management plan in which available resources are allocated in a balanced manner, with clear attention to the highest priorities. Meaningful oversight of system maintenance is not possible without substantive knowledge of the systems themselves. This holds equally true in a collaborative consortium setting. We are therefore convinced that, in the long term, a quality model for custom systems that provides management with clear insight is inevitable. Based on the responses from the IT management organizations consulted, we conclude that the quality model presented in this article is sufficiently robust to support an initial round of system assessments. In our view, this work should begin without delay. We also regard the Toetskader voor beheer en onderhoud (Assessment Framework for Management and Maintenance [ed.]) ([AICT25]) as a valuable complementary recommendation. Together, these instruments enable better-informed decision-making while allowing the quality model itself to be further refined based on practical experience.
References
[AICT25] Agentschap ICT-toetsing (2025, September 29). Toetskader voor beheer en onderhoud (only in Dutch). Retrieved from: https://www.adviescollegeicttoetsing.nl/onze-werkwijze/toetskader-beheer-en-onderhoud/toetskader-beheer-en-onderhoud1
[Amor13] Amoraal, J.M., Lanzani, G., Kuiters, P., & Koedijk, J.M.A. (2013). Grip op de kwaliteit van software. Compact 2013/2 (only in Dutch). Retrieved from: https://www.compact.nl/articles/grip-op-de-kwaliteit-van-software-2/
[Groo15a] Groosman, J.H.L. & Hoss, J.M. (2015). Application Portfolio Optimization Publication: Getting Ready for Digital Transformation. Compact 2015/2. Retrieved from: https://www.compact.nl/articles/application-portfolio-optimization
[Groo15b] Groosman, J.H.L., Kuiters, P., & Knip, S. (2015). Removing Application Entropy. Compact 2015/2. Retrieved from: https://www.compact.nl/articles/removing-application-entropy
[Hofm21] Hofmans, T. (2021, December 14). Wat weten we nu over Log4Shell, de kwetsbaarheid in de log4j-library? Tweakers (only in Dutch). Retrieved from: https://tweakers.net/reviews/9614/wat-weten-we-nu-over-log4shell-de-kwetsbaarheid-in-de-log4j-library.html
[Koed19] Koedijk, J.M.A. & Knottnerus, J.M. (2019). Zet je niet af, maar stel een doel. Compact 2015/2 (only in Dutch). Retrieved from: https://www.compact.nl/articles/zet-je-niet-af-maar-stel-een-doel-2/
