We Need Evidence-Based Public Policy More Than Ever
About five years ago, Herb Emery and I published a research paper on the decline of Canadian economics. The paper analyzed the journal publications of academic economists at Canadian universities and found that explicit analysis or recognition of Canadian issues was in decline, especially among those hired since 1990. Since the incentives to conduct theoretical research, or to use U.S. or overseas data in empirical research, have not changed, we doubt that the pattern has changed; if anything, the senior economists most interested in Canadian issues grow fewer with each retirement cycle.
While the supply contracts, the need for good empirical analysis of Canadian issues continues unabated. Indeed, in the emerging era of fake news and alternative facts, insightful and timely data and analysis may be more valuable than ever in sorting through the rapidly expanding political, economic and social chatter. I would argue that Canadian policy discussion has so far escaped the worst of this trend, pointing to the federal government’s stated commitment to evidence-based policy as well as its apparent reliance on sound evidence in its treatment of issues such as the new Canada Child Benefit and the restoration of the long-form census, although there is no room for complacency.
The forces working against evidence-based public policy are not hard to identify. Such policy takes time and money, both of which are always in short supply. For one thing, it takes time to sort through the studies and to separate good research from bad, something academic economists could assist with if there were more focus on Canadian issues and evidence in academia. That would provide an important resource for the media, politicians and policy watchers at a time when traditional journalism faces unprecedented resource challenges. For another, it takes careful and expensive data development, which can also benefit from academic advice but ultimately requires leadership and dollars from government. There are already signs that a lack of data development is hindering policy development, which may offer convenient excuses for political expedience to intervene, even with a government committed to doing things differently.
Years of budget cuts at Statistics Canada have taken their toll. In one of my own areas of research, the Survey of Labour and Income Dynamics (SLID) has been an essential tool for analyzing labour market activity and economic well-being among Canadians, but its panel format has now been scrapped in favour of a cross-sectional survey that will be less expensive, less valuable and less attractive to researchers and policy analysts. This represents a regression in a long-standing Statistics Canada program of labour market and income data development, from the Survey of Work History and the Labour Market Activity Survey to the overlapping six-year panels of SLID from 1993 to 2010, which linked survey data on labour market activity with income data from tax records. It is hard to see how the current cross-sectional data will provide reliable information on labour market and income “dynamics,” the essence of which is change over time.
There is a sense that administrative data may fill the void created by such cuts. While Statistics Canada and other data providers have been developing this potentially important data source, it must be recognized that administrative data have significant limitations that prevent them from answering important social questions and from acting as a perfect substitute for well-crafted household surveys. Senior researchers have observed that well-designed household surveys may face greater obstacles from nonresponse and misreporting in the future, but there does not yet appear to be an alternative if evidence-based policy is a priority. In this regard, I was concerned that the low response rates for the voluntary National Household Survey in 2011 might carry over to the Census long-form survey in 2016, but that does not appear to have been the case: 2016 response rates appear to have exceeded the long-form response rate for 2006. From this experience, forecasts of the demise of survey microdata appear to have been greatly exaggerated. Now, can we stimulate new interest in academia and government to restore, utilize and expand valuable survey data resources for evidence-based policy?