The world of work is changing fast. Our evidence generation approaches need to keep pace.
The world of work is changing fast. Technology, global supply chains, demographics, and climate change are quickening the pace of change in our labour markets, and Canada’s skills development systems need to evolve in step. Workers may need to re-train several times in their careers to keep up with these rapid changes. And the COVID-19 pandemic has caused rapid and profound changes in the labour market, placing further pressure on our skills development systems.
In this context, how can we generate evidence to help us improve our skills development systems? Evaluations that take several years and extensive resources to find out if a training program works in a community will be of little use if, during the same time period, the local economy changes and employers no longer need the skills taught in the program. We need a more agile approach that delivers timely insights not only into whether a program works, but why it works and how it can be improved.
Traditional evidence generation approaches prioritize rigorous evaluation of skills development to understand what works. These approaches are useful for measuring a program’s impact, but don’t provide the full range of insights needed to “future-proof” our skills development systems.
Many skills development practitioners are rapidly developing, iterating and adapting programs and services in response to changes in the labour market. This means that program models may not be well-developed and outcomes may be uncertain or hard to define.
Traditional evaluation methods that focus on measuring outcomes are unlikely to be useful in these cases. Other evidence generation approaches are needed to provide timely insights that help develop and refine innovations. Approaches like user testing and rapid-cycle evaluation can give practitioners actionable information to help them iterate on and develop their programs.
Ongoing learning and improvement
Evaluations of skills development programs at a single point in time are not enough to ensure that a program’s effectiveness will be sustained in the long term. Given the rapid pace of change in the labour market, skills development practitioners need continuous evidence about how well their programs are performing.
Collecting data about program implementation and outcomes on an ongoing basis can help practitioners learn whether their programs are consistently achieving outcomes, identify opportunities for improvement, and track what happens when they make changes or adjustments. For example, short satisfaction surveys with program participants at regular intervals can help practitioners identify areas of potential improvement and monitor how program changes affect participant satisfaction.
How we get there
As a consortium partner of the Future Skills Centre (FSC), Blueprint is developing evidence generation approaches that can help prepare our skills development systems for the future of work.
We are continuously working to design, test, iterate and strengthen our evidence generation approaches – mirroring the continuous learning and improvement that we are trying to foster in the skills development ecosystem. Our aim is to develop tools and methods that will help us generate rigorous, timely and actionable evidence that is relevant to policy and practice.
Our focus on ongoing learning and improvement requires us to collaborate closely with practitioners and build strong and trusting relationships. We work closely with skills development organizations to ensure we have a deep understanding of their program models and co-design evidence approaches that align with their needs and goals.
By pursuing these new directions, we are positioning ourselves to generate the evidence needed to help our skills development ecosystem be more resilient and responsive to the changing world of work.
Read more about our evidence generation approach for the Future Skills Centre in our Evidence Summary 2021 and 2020 Annual Evidence Report.