For more information on the subject, we invite you to attend our next Telfer MBA Conference on November 21, 2015.

Register at telfer.uOttawa.ca/mbaconferences.

Gregory Richards, MBA, PhD, FCMC
Director, Telfer School of Management MBA Program & the Centre for Business Analytics and Performance.

A recent study by the McKinsey Global Institute suggests that governments around the world could unlock $3 trillion in economic value by leveraging data more effectively. Yet most government organizations will tell you that they struggle with sharing data, with working through privacy issues, and with finding the time and skills to put data to good use. In this short article, I will address some of the successes and challenges facing public sector organizations. I’ll conclude with a brief overview of a case study demonstrating how to solve one of the core problems: integrating analytics into the “way we do business” in a public sector organization.

In terms of successes, some organizations have established analytics offices to crunch through data. In Canada, organizations such as the Canada Revenue Agency and Service Canada have fairly strong analytic practices in place. In the US, a variety of organizations have developed analytic approaches that improve program effectiveness and efficiency; the IBM Centre for the Business of Government in Washington has chronicled many of these efforts. These organizations have managed to solve data-sharing issues and have been able to partner with universities and other analytic institutions to leverage data in new and interesting ways.

Despite these pockets of success, one of the key challenges is to mainstream analytics as a core process within organizations. Cultural resistance is still strong. Part of the issue, of course, is that it is difficult to trust data if we are not sure of the source and if we don’t understand how the data have been transformed. Furthermore, books such as How to Lie with Statistics have demonstrated that it is possible to confuse analysis with interpretation and to tell almost any story with a particular data set.

So how might an organization overcome cultural resistance and integrate analytics as a core operational process? One provincial organization accomplished this long before the term Big Data became popular. Here’s the important point: it did not set out to launch a Big Data program; it set out to improve program effectiveness and efficiency and found that evidence-based decision making helped. There were three keys to success. First, the organization had a clear mandate from the deputy head, who insisted on measurable strategic goals. Second, the organization had reams of data with which to make informed decisions, but it made a significant investment in getting the data right. Third, the focus was on learning, not on finger pointing.

The organization in question (which asked not to be named) was able to weed through the mission statements and other required planning documentation to focus on three high-level measurable goals. A number of subordinate targets contributed to these goals, so the first step was the development of a clear network of measures for each responsibility centre across the organization.

The next step involved a bit of stumbling around as the organization realized that much of the data it had was out of date or contradictory. An investment of millions of dollars over a 12-month time frame helped to clean the data and to put in place practices and procedures for managing data quality and data validation. Along the way, decision makers were given education on the data stores, the use of data, and some of the analytic techniques. The organization was not naive enough to think that it could or should transform managers into analysts, but it knew enough to provide basic information so that its managers became data savvy. The managers understood enough to critically examine reports and to know when to call in experts.
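The case study does not describe the organization’s actual tools, but for readers who want a concrete picture of what routine data validation can look like, here is a minimal sketch in Python using pandas. The field names, thresholds, and file name are illustrative assumptions only, not details from the case.

# Minimal, hypothetical sketch of an automated data-validation rule.
# Field names and thresholds are illustrative; the organization's real
# tooling and data model are not described in the article.
import pandas as pd

def validate_records(df: pd.DataFrame, max_age_days: int = 90) -> pd.DataFrame:
    """Return records that look stale, incomplete, or internally contradictory."""
    issues = pd.DataFrame(index=df.index)

    # Staleness: record has not been updated within the expected window.
    age = pd.Timestamp.today().normalize() - pd.to_datetime(df["last_updated"])
    issues["stale"] = age.dt.days > max_age_days

    # Completeness: key planning fields must be present.
    issues["missing_target"] = df["target_value"].isna()

    # Consistency: reported spending should not exceed the approved budget.
    issues["over_budget"] = df["actual_spend"] > df["approved_budget"]

    # Keep only the rows with at least one flagged issue, for manual review.
    return df.loc[issues.any(axis=1)]

# Example usage with a hypothetical extract:
# problems = validate_records(pd.read_csv("responsibility_centre_results.csv"))
# print(problems)

The point of rules like these is not the code itself but the practice around it: running the checks routinely and routing flagged records back to the responsibility centres that own them.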

Finally, and this is a most important point because it speaks to the human aspect of the organization, the outputs of all this number crunching were used to stimulate learning and change. When a problem or opportunity was noted, managers would commission studies, conduct benchmarking exercises, and disseminate the lessons widely throughout the organization so that any action to be taken was fully understood by those who had to take it. It is easy to create toxic environments if we use analytics programs to point fingers at things that are going wrong: people feel that they are always under scrutiny, they become fearful of making mistakes, and the analytics program can have the opposite effect of what was intended. When the program is focused on learning, however, the spirit that emerges is that of a group intent on making things better, one that is not afraid to own up to mistakes and to learn from them. This does not mean that accountability is compromised, but it does mean that the hard number-crunching side of these programs is wrapped within a more humanistic context.

In this brief article, I’ve highlighted some of the challenges, but I’ve also pointed out some of the successes in the use of analytics in government organizations. Many departments and agencies have pockets of analytics, but it’s now time to think about embedding evidence-based decision making across the organization. The keys to success include a strong mandate from the top, a commitment to high data quality, and a culture that wraps analytics in an appropriate balance of accountability, learning, and growth.