
Leveraging Existing Tools Effectively: Colorado’s Modern Software Delivery Maturity Matrix

Sechi Kailasa
Project on Digital Era Government
7 min read · Mar 30, 2022


Author: Leonie Bolte, Master of Public Administration, 2022

This article explores how digital service teams can benefit from adapting existing digital tools rather than creating them from scratch, and it highlights the importance of working in the open. The Colorado Digital Team’s Matrix provides a case study in how to do this.

The end of the beginning of digital service units

One of the outcomes of the 2018 Digital Services Convening at the Harvard Kennedy School was the theme of the end of the beginning of digital service units. As part of this theme, the group called for “maintaining or building political capital… to prepare for the next phase [of digital service units].” Only three years later, different digital service units — such as the Colorado Digital Service (CDS) — are following this call. CDS’s Modern Software Delivery Maturity Matrix is an exemplary attempt to develop a digital service standard minimum viable product (MVP), inspired by other digital service tools, to build political capital for the team’s project work.

The Digital Services at the Harvard Kennedy School initially proposed a maturity framework for public sector digital services in 2018. The framework helps to contextualize the current status of a digital service unit and allows setting milestones for the future. Since then, this maturity framework has inspired at least two other tools. The first tool is Public Digital’s Open Source Capability Model for Governments, which was developed to provide stakeholders with an overview of their open-source capabilities. The second tool — and the focus of this article — is the Modern Software Delivery Maturity Matrix developed by CDS.

What is the Colorado Digital Service?

CDS is a digital service team that operates at the state level. Established in October 2019, the team works on the governor’s top priorities. It describes itself as “a diverse cross-functional team of senior engineers, designers, product managers, and procurement specialists serving limited ‘tours of civic service’ (six months to two years) in government.” During these tours, CDS works jointly with civil servants in and across state agencies.

A project triggered the development of the Modern Software Delivery Maturity Matrix

The development of CDS’s own Modern Software Delivery Maturity Matrix was triggered in the summer of 2020, when the team worked with the group that oversees the state’s child welfare system. To kick off the work and better understand the project environment, CDS conducted a discovery sprint with two goals in mind: first, to evaluate the maturity of the state’s child welfare system project, and second, to identify measures to advance it.

The team collected the data to fulfill its two goals. However, as Karyn Lu, a CDS alumna and then project member, puts it, “our insights made sense to us, but felt fragmented and untethered as we cobbled together a report intended for an audience with very mixed levels of expertise in modern software delivery practices.” Hence, the team started to investigate potential options to organize and better communicate their thoughts.

While many foundational tools around service standards exist, none of the tools fully fit the two specific needs of CDS. For example, Harvard’s maturity framework for public sector digital services holistically evaluates the maturity of a digital service unit. CDS, however, needed a tool that would allow it to assess maturity on a project level. It set to work on developing a framework appropriate for its local context that would best enable it to deliver value to its citizens, and the idea of the Modern Software Delivery Matrix was born.

In its MVP version, the Modern Software Delivery Maturity Matrix has two axes. The first axis consists of eight categories:

● team culture

● purchasing vs. procurement

● modular contracting

● a user-centered approach

● product ownership

● agile software development

● DevSecOps

● building with loosely coupled parts

Each category is then detailed and can be scored on the second axis, a three-level maturity scale: low, medium, and high.
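The matrix’s two-axis structure (eight categories scored against a three-level maturity scale) can be sketched as a simple data model. This is an illustrative sketch, not CDS’s implementation: the category names come from the list above, while the `Maturity` enum, the `assess` helper, and the example scores are hypothetical.

```python
from enum import Enum

class Maturity(Enum):
    """The matrix's second axis: a three-level maturity scale."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# The eight categories on the matrix's first axis.
CATEGORIES = [
    "team culture",
    "purchasing vs. procurement",
    "modular contracting",
    "user-centered approach",
    "product ownership",
    "agile software development",
    "DevSecOps",
    "building with loosely coupled parts",
]

def assess(scores: dict) -> list:
    """Return the categories scored LOW: candidate levers for change."""
    return [c for c in CATEGORIES if scores.get(c) is Maturity.LOW]

# Hypothetical assessment of a project:
scores = {c: Maturity.MEDIUM for c in CATEGORIES}
scores["team culture"] = Maturity.HIGH
scores["modular contracting"] = Maturity.LOW
print(assess(scores))  # ['modular contracting']
```

Representing the assessment this way makes the low-scoring categories, the natural targets for intervention, fall out of the data directly.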

The Maturity Matrix builds on three existing tools

The framework for the Maturity Matrix draws on three foundational tools: the maturity framework developed by the Harvard Kennedy School, the UK GDS’s Service Standard, and the Quality Assurance Surveillance Plan (QASP) developed by 18F.

These three tools were developed with different targets in mind. The first — Harvard’s maturity framework — is a maturity benchmarking tool for digital service groups; it also allows one to define a future target for the group under consideration. In contrast, GOV.UK developed its Service Standard as a set of criteria that government teams, from central to local, must meet when they build a digital service. It enables teams to create and operate digital public services that are inclusive, effective, and commonly agreed to be good. 18F’s QASP focuses on assessing software requests for proposals; its ultimate goal is to ensure vendors deliver work that meets the needs of a custom technology project.

In this light, it is no surprise that the tool developed by CDS was inspired by different aspects of the three foundational tools. The team drew on aspects it deemed fundamental for digital service and product teams. For example, CDS’s categories of “purchasing vs. procurement” and “modular contracting” are inspired by the overall content of the quality assurance plan. As another example, all three tools strongly emphasize user-centric design; this consensus is echoed in CDS’s matrix by the “user-centered approach” category. And while CDS’s maturity axis drew on Harvard’s maturity framework for public sector digital services, it refrained from defining a future state.

CDS also went beyond the three tools to emphasize topics it deemed especially important. For example, it views culture as key for its Modern Software Delivery Maturity Matrix because “the human layer is the foundation that needs to be in place before a team can operate effectively.” None of the tools feature culture as prominently as the Modern Software Delivery Maturity Matrix. Hence, in this case, CDS shifted the tool focus to an area it considered crucial.

CDS used the Modern Software Delivery Maturity Matrix to identify levers for change

In practice, CDS used its Modern Software Delivery Maturity Matrix to evaluate the child welfare system project’s maturity and identify levers for change. First, the group assessed the project, and the assessment showed, for example, that the project was not well positioned to enforce quality standards on the team’s vendors. It also showed that the project had strong product managers.

Next, CDS focused on levers of change to drive the project forward. After negotiating with the human services agency, the team decided to hire talent to increase the level of software delivery maturity for the child welfare system team. In more specific terms, it focused on hiring the senior-most engineer for the project — a tech lead. The team chose this particular lever because it assumed that it would be an enabler for multiple categories in the framework. For example, the lever may help to increase maturity for team culture and purchasing vs. procurement simultaneously.
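CDS’s reasoning that a single lever (such as hiring a tech lead) can advance several matrix categories at once can be expressed as a small selection heuristic. Everything here is hypothetical: the lever names and the category sets they are assumed to touch are invented for illustration, not taken from CDS’s actual analysis.

```python
# Hypothetical mapping from candidate levers to the matrix categories
# each lever is expected to advance. The tech-lead hire is assumed to
# touch several categories at once, mirroring CDS's rationale.
LEVERS = {
    "hire tech lead": {
        "team culture", "purchasing vs. procurement",
        "agile software development", "DevSecOps",
    },
    "rewrite RFP with QASP": {
        "modular contracting", "purchasing vs. procurement",
    },
}

def best_lever(weak_categories: set) -> str:
    """Pick the lever that covers the most currently weak categories."""
    return max(LEVERS, key=lambda lever: len(LEVERS[lever] & weak_categories))

print(best_lever({"team culture", "DevSecOps"}))  # hire tech lead
```

Under this sketch, a lever that overlaps many weak categories wins, which is exactly why a multi-category enabler like a senior hire can beat narrower interventions.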

Other digital service groups can learn from the Maturity Matrix

While the Modern Software Delivery Maturity Matrix holds many lessons, three lessons are especially interesting:

● The development process shows one path to build up the lever of a service standard.

● The matrix positioning illustrates the value of transparency and openness for discussion.

● Levers such as standard tools must be used appropriately.

First, the Modern Software Delivery Maturity Matrix shows one possible path to build up the lever of a service standard. Developing and setting (service) standards is often viewed as one of the levers a digital service group can use to influence an organization or a project. Potential paths include forming a standard for one’s own digital service unit or using an MVP as a consulting tool to structure project conversations. CDS pursued the latter path, demonstrating that adapting existing tools is an opportunity to structure consulting conversations on the ground. This can then help identify the most critical levers so that change can be fostered even while other aspects of maturity remain low.

Next, CDS’s case illustrates the value of tool transparency and openness to discussion. For example, CDS published its tool on its GitHub page, made it open for reuse, and encourages users to provide feedback on it. This move fosters knowledge exchange among different digital service units and draws input for improvement from beyond those units.

Finally, levers such as standard tools must be used appropriately: selecting the right levers matters, but using them effectively is equally important. CDS found the tool helpful for the child welfare system project but noted that it did not reuse it as much as expected. While the team attempted to leverage an adjusted version of the tool for another project, this idea ultimately failed. Additionally, the tool has not been updated beyond its original publication on CDS’s GitHub page, partially because the team has since pivoted away from the discovery sprints where the tool was most useful. In hindsight, one might ask whether the team — knowing what it knows now — would develop the tool in the same manner, for example, whether it would dedicate the same amount of human resources to its development.

Overall, digital service teams considering adapting foundational tools should weigh the lessons above. The most important takeaway from CDS’s example, however, is that it illustrates how digital service teams can benefit from adapting foundational digital tools for their own purposes.
