Happy Wednesday! Today, I’m returning to more normal programming with a selection of quick-hit reads and commentary. All of them still fit within the framework and agenda I set out over the last few posts.
I’ve had a chance to speak to a few newsletter readers over the past couple of weeks, which has been a lot of fun. I’m so glad this work resonates with so many of you! I’ve been enjoying the process of writing this, and I’m looking forward to doing more. I’ll also be working with another organization to write a couple of deeper reports on some of the themes I’ve been discussing, so do keep your eye out for more over the coming weeks.
If you have been enjoying my work, I would appreciate it if you could share it with your networks. I’m very much in the early start-up phase of this newsletter and going solo with my business, and sharing this makes a big difference in the visibility of my work.
And with the blatant self-promotion over with, here’s the actual stuff you’re reading this for!
Run Better Programs
This new report from the Office of the Auditor General of Canada on the Industrial and Technological Benefits Policy picks up the themes of building state capacity and being more intentional with innovation policy. Administered by ISED, the policy “aims to leverage defence procurement and strengthen Canada’s economy through research and development, job creation, and economic growth for Canadians.” The policy applies to defence procurements over $100m and may apply to those between $20m and $100m. It requires companies awarded defence and Canadian Coast Guard procurement contracts to “engage in business activity in Canada for an amount equal to the value of the contracts.” Given calls for higher defence spending, better use of procurement, and more industrial R&D, this policy appears well-suited to some of Canada’s issues at first glance.
However, the devil is in the detail, and the details ain’t pretty.
The Auditor General has found that ISED “was unable to demonstrate that the policy met its objectives, which included supporting the long-term viability and growth of the defence industry.” Furthermore, ISED lacked “some elements to ensure sound administration of the policy, such as clear rules and guidance on how to apply the policy and good tracking of contract obligations, economic benefits, and job creation.”
The report goes into a lot of detail on these failings. I won’t go over all of them, but they are pretty damning. I’ll highlight one: the failure to measure the policy against its objectives. The Auditor General found that the indicator that ISED reported on “did not align with any policy objective or performance indicator in the department’s performance measurement framework for the policy.” In addition, some short-term performance indicators “had targets that either were set below the baseline used as a starting point or were not aligned with what the indicator was measuring. Therefore, the targets would not improve or increase outcomes”. When it comes to the target of developing skilled jobs, ISED “had no performance indicators to assess whether this objective was met.” Indeed, during the audit period, while total employment in the defence industry increased, the number of highly skilled jobs actually declined.
Unfortunately, this is not an isolated problem. As I’ve written about before, we still seem to be pursuing policies that lack transparent goals and evaluation frameworks. Alex Usher puts it well when he discusses the difference between making policy choices and implementing them, and why we can’t assume success just because a choice has been made and money has been spent:
In Canada, this is very poorly understood, and deliberately so. Most of Canadian governance is about evading responsibility in the event of non-achievement of results. This is why we are so bad at collecting data, and averse to doing things like actually announcing policy targets as opposed to simply announcing the initiatives themselves. This is not specific to the Canadian discourse on innovation, but it is a specific way in which Canadian discourse on innovation is impoverished.
The Auditor General’s report includes many actionable recommendations to improve the implementation of this policy. However, the government needs to consider them more broadly so that it doesn’t always take a report like this to fix poor policy implementation.
Growth v Equity - US Place-Based Policies
Going in a different direction, this article by Grete Gansauer in the academic journal Regional Studies looks at the various Biden administration place-based policies and their implications for US regional inequality. Gansauer breaks down the 33 place-based regional development programs into four categories:
National boosters, which prioritize national competitiveness
Regional boosters, which stimulate regional-scale growth in lagging places
Regional asset builders, which enable future regional growth by emphasizing distributive investments in baseline infrastructures and capacities
National equity builders, which prioritize redistributive investments for social and economic inclusion.
Out of the four, national boosters have received 71% of funding (US$73.3 billion). As Gansauer sets out, they “focus on investing in cutting-edge sectors and high-potential regions; in this sense they carry out a national competitiveness agenda ‘with’ regions. This contrasts place-based policies ‘for’ regions, where regional-scale economic development is the central aim, such as regional boosters”.
Most interesting to me, Gansauer highlights that regional boosters, which represent place-based policies “for” regional growth, have received relatively little investment (US$9.8 billion) - something that might be “a missed economic and political opportunity to first, unlock latent economic potential within lagging regions, and second, to answer the concerns of discontented voters within such places”. Gansauer urges international policymakers not to let these types of policies get “overlooked between national competitiveness and social equity agendas” and to consider “how growth policies tailored to the regional scale can be strategically applied.”
As I’ve argued before, Canada needs to think more about place-based policy with an eye to tailored regional strategies and policies. To do this well, though, we need to heed something else that Gansauer argues:
In all, this analysis demonstrates that place-based strategies can be employed toward varying policy aims, which sometimes produces contradictions. As national growth interventions, place-based strategies may leverage existing regional competitive advantages to build global market relevance. However, place-based strategies are also necessary for disadvantaged places which precisely lack such capacities, infrastructures and institutions. Policymakers and policy entrepreneurs in the US and internationally must therefore be clear – in their politics and in the bones of policy design – about which aims (growth or equity) and scale of impact (regional or national) place-based policies are meant to serve.
That brings us nicely full circle, back to the need for clear and transparent targets and effective evaluation. There was a reason improving our state capacity was first on my list of policy areas as part of the agenda for a more inclusive and innovative Canada.
I agree that Canada needs to think longer term in its place-based policies. I am particularly interested in the US Regional Technology and Innovation Hubs program - loosely based on this idea from ITIF/Brookings:
https://itif.org/publications/2019/12/09/case-growth-centers-how-spread-tech-innovation-across-america/
Focusing on places with strong fundamentals for a new innovation ecosystem makes sense to me. I also think that, in the Canadian context, any innovation-related place-based policy should include massive investments in housing and other supporting infrastructure (including cultural infrastructure, which was important for Pittsburgh's revitalization: https://nextcity.org/urbanist-news/how-the-once-struggling-pittsburgh-is-reinventing-itself-as-innovation-hub).
After the announcement of the high-speed rail stops, I thought some of them sounded like interesting B-tier cities that you could double down on to try and share growth... but hosting a competition, as the innovation hubs program and Canada's superclusters did, would likely make the most sense.
As an evaluator, I can say that this is such a common way to run programs, not only at the government level but in the not-for-profit sector as well. It's like evaluation is never invited to the table when it comes to structuring the intent of the initiatives and guiding the discussion about how change is supposed to happen. Yes, early stages are not an indication of how a program will work out in the end, but having clear expectations can help us measure how far from the goals we are, find the reasons behind results, and adapt accordingly.