The article “Data-Driven Ways to Maximize City Budgets Post-Pandemic,” from GovTech’s March issue, authored by Harvard Kennedy School Professor of Innovations in Government Stephen Goldsmith, was almost genius in the way it broke down the benefits of data use into simple components. Former government appointee and thought leader Jane Wiseman co-authored the piece, and together they hit the nail on the head.

I have long held that the word data is one of the most oft-used but least understood words in government today. What exactly is data, anyway? It means different things to different people. And to almost everyone, the word can suggest an intimidating reference to sophisticated technology and analytics.

But for an archeologist, an unearthed piece of broken clay is a data point. A cardiologist finds blood pressure and heart rate data useful. For a police chief, crimes per capita per square mile are significant data points. Certainly any police chief would understand those measures and their importance, but assuming everybody shares that ability would be a mistake. The point is, data must be presented in a manner anybody can understand, especially public-sector data, which is often lacking in structure and availability.

Adopting a Culture of Data Literacy to Combat Data Intimidation

As both a data analyst and an office-holding policymaker, I can say with certainty that many public-sector stakeholders with limited exposure to technology, finance or analytics shy away from asking questions for fear of embarrassment. This is particularly concerning with policymakers, who may be tasked with making decisions based on information they do not completely comprehend.

Goldsmith and Wiseman suggest, “The best strategy is to use data as a tool—to identify what works and find operational efficiencies and identify the areas in the greatest need.” One with a technical or analytical background might infer they’re referring to the Pareto Principle, also known as the “80/20 Rule,” which holds that roughly 80% of outputs are driven by just 20% of inputs.
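The Pareto idea can be checked directly against a budget: sort line items by spend and count how few of them account for 80% of the total. A minimal sketch, using hypothetical line-item figures (not from the article):

```python
# Hypothetical annual spend by line item, in $K. Illustrative
# numbers only, not data from the GovTech article.
budget = {
    "police": 4200, "fire": 2900, "public_works": 1800,
    "parks": 400, "library": 250, "clerk": 180,
    "planning": 150, "it": 120, "elections": 60, "misc": 40,
}

total = sum(budget.values())
running, drivers = 0, []
# Walk line items from largest to smallest until 80% of spend is covered.
for item, spend in sorted(budget.items(), key=lambda kv: kv[1], reverse=True):
    drivers.append(item)
    running += spend
    if running / total >= 0.80:
        break

print(f"{len(drivers)} of {len(budget)} line items "
      f"({len(drivers) / len(budget):.0%}) drive "
      f"{running / total:.0%} of spend: {drivers}")
```

With these made-up figures, 3 of 10 line items cover 88% of spend, which is the kind of concentration the 80/20 Rule predicts and the kind of output a dashboard can surface without any analyst on staff.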

Using data, one city discovered it was spending considerably more per capita on its police department than any of the other 17 cities in its region. (In the accompanying chart, blue bars represent regional averages and red bars the city’s actual spend.) By addressing the imbalance in its next contract, the city saved $2 million over the new contract term.
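A comparison like that reduces to a few lines of arithmetic. This sketch uses hypothetical stand-in figures for the city and a few peers, not the actual data behind the chart:

```python
# Hypothetical police spend ($M) and population (thousands) for a
# city and its regional peers; illustrative values only.
peers = {
    "our_city": (18.0, 50),
    "peer_a": (12.0, 48), "peer_b": (15.5, 62), "peer_c": (9.0, 35),
}

# Convert to dollars per resident so different-sized cities compare fairly.
per_capita = {c: spend * 1e6 / (pop * 1e3) for c, (spend, pop) in peers.items()}
ours = per_capita.pop("our_city")
regional_avg = sum(per_capita.values()) / len(per_capita)

print(f"Our city: ${ours:,.0f} per resident; "
      f"regional average: ${regional_avg:,.0f} "
      f"({(ours - regional_avg) / regional_avg:+.0%} vs. peers)")
```

The per-capita normalization is the whole trick: raw spend totals make bigger cities look like outliers no matter what, while dollars per resident isolates the genuine imbalance.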

But not every municipality can afford a business analyst on staff, nor does it need one. By investing in a simple-to-use analytics platform and providing minimal training, a “culture of data literacy” can take root and eventually erode the intimidation the typical layperson harbors toward data and technology. Given the right inputs (the 20% Pareto references), technology can be leveraged to produce a prescriptive set of outputs in an intuitive, dashboard-like visualization that even the broader community at large can understand, with no prior training at all.

In one real-world example, a community in Michigan found, by reviewing just such graphic outputs, that it was spending considerably less than the 11 other cities in its region. Expanding the analysis to cities of like size across the state (comparable budget, taxable value and population) showed the finding was even truer than officials realized. In one sense, community officials could be proud of their cost controls, but the question that remained was: How does this relate to the quality of service we are delivering at that budget, and was it appropriate and sufficient?

(Middle chart: the community’s spending compared with the 11 other cities in its region. Right: the same comparison against like-size cities across the state, matched on budget, taxable value and population.)

When they looked at their reported crime data, however, they realized the money saved was not benefiting residents: the community ranked near the top in all four major categories of reported crime.

Aha. A more complete data set reveals a much different takeaway. And nobody needed an advanced degree to read the digital tea leaves. The community clearly and confidently recognized that it needed to invest more in its police department, and made the decision to do so almost immediately.
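That “more complete data set” is really just a join: pair the spending benchmark with an outcome measure for the same communities. A minimal sketch with hypothetical values (the city names, spend figures and crime ranks are all invented for illustration):

```python
# Hypothetical per-capita police spend ($) and regional crime-rate
# rank (1 = worst) for a handful of communities; illustrative only.
spend = {"our_city": 210, "peer_a": 305, "peer_b": 290, "peer_c": 315}
crime_rank = {"our_city": 1, "peer_a": 7, "peer_b": 5, "peer_c": 9}

# Joining the two data sets is what changes the takeaway: low spend
# alone looks like good cost control, but low spend paired with poor
# outcomes signals underinvestment.
flagged = [c for c in spend if spend[c] < 250 and crime_rank[c] <= 3]

for city in spend:
    note = "  <- low spend, poor outcomes" if city in flagged else ""
    print(f"{city}: ${spend[city]}/resident, crime rank {crime_rank[city]}{note}")
```

Either table on its own tells an incomplete story; only the combination flags the community that is saving money at the expense of its residents.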

Survey Says!

The authors cite a survey they conducted in 2020 with the chief data officers of 20 cities. They report that the responses were contradictory: many cited the need for massive increases in data use but anticipated less funding would be available for the infrastructure that creates value from data. One respondent said, “The appetite for data has tremendously increased, and data insights are becoming the norm.”

One concern is that as data needs increase, budget reductions “will eliminate new initiatives and impact some ongoing operations,” according to Philadelphia CIO Mark Wheeler. The other concern confronting them is the loss of experienced staff as baby boomers reach retirement eligibility, coupled with a shortage of qualified or interested talent entering public service.

According to the article, consulting firm McKinsey estimated that globally, “Government could capture $1 trillion of value by using data analytics, both to identify revenue not collected and to recoup payments made in error, and estimates that using data analytics to eliminate waste, fraud and abuse in government can have returns as high as 10 to 15 times the cost.”

At Munetrix, we’ve seen this in situations as simple as analyzing an accounts payable report. Why pay a single vendor 50 to 70 times per year when payment terms of once per month can easily be established? If we calculate the full cost of processing a payment, from purchase order to bank reconciliation, transaction costs range from $35 to almost $150, based on estimates from governments of different types and sizes. If 50 payments at an average cost of $50 each were eliminated, we could free up $2,500 for other operational purposes. Multiply that by dozens of vendors and we’re talking real money.
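That back-of-the-envelope math generalizes easily. A minimal sketch using the $50 average processing cost from the example above; the per-vendor payment counts are hypothetical:

```python
# Estimated savings from consolidating vendor payments to monthly terms.
# $50 is the average per-payment processing cost (order through bank
# reconciliation) used in the example above; the $35-$150 range varies
# by government type and size.
COST_PER_PAYMENT = 50

def consolidation_savings(payments_per_year, monthly_payments=12):
    """Dollars freed by moving one vendor to once-a-month payment terms."""
    eliminated = max(payments_per_year - monthly_payments, 0)
    return eliminated * COST_PER_PAYMENT

# One vendor paid 62 times a year: 50 payments eliminated, $2,500 freed.
print(consolidation_savings(62))   # 2500

# Across a hypothetical AP report covering several vendors:
vendors = [62, 55, 70, 48, 30, 14, 12]   # payments per year, per vendor
total = sum(consolidation_savings(n) for n in vendors)
print(f"Total freed: ${total:,}")
```

Vendors already at or below monthly frequency contribute nothing, which is why the `max(..., 0)` guard matters; the savings come entirely from the high-frequency payees.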

Simple, intuitive and rich data visualizations can turn the invisible into the visible.

Looking at data visually can provide insight into trends or anomalies that might otherwise go undiscovered.

Removing Uncertainty, Instilling Confidence in Decision Making

Goldsmith-Wiseman’s article concludes by saying that public officials should be able to check the following five boxes if they are interested in fostering a culture that respects the power of data to unlock insight. Does your community…

  1. Use public scorecards to show returns on investment measured in terms of customer service and dollars saved.  
  2. Use a predictive analytics program.
  3. Make widespread use of layered data and spatial analytics to identify trends and relationships.
  4. Use data to identify revenue opportunities in service areas or by examining unpaid fees.
  5. Establish an internal “culture of data literacy” initiative with employees. 

And I would recommend a #6 be added: Don’t be afraid of data, and don’t be afraid to ask for help! There is nothing to fear in making more informed decisions and more confidently prescribing solutions to today’s challenges as we continue to recover from the effects of a global pandemic.

When data is Reliable, Timely, Relevant, Useful, Comparable and Consistent (the six qualitative characteristics of data per the Governmental Accounting Standards Board, or GASB), the invisible becomes visible, and the added clarity will ultimately improve decisions, outcomes and literacy.
