The tyranny of Graduate Outcomes
On the release of the Graduate Outcomes data, I thought it would be useful to reflect on some of the complexity and unintended consequences inherent in this dataset. This started as a Twitter thread, but I felt it justified a slightly longer-form article…
Careers services have a pivotal role, but there is not always a causal link between careers service provision and Graduate Outcomes. In this article I will explore some of the other factors that influence performance and ask whether there is ever a good Graduate Outcomes result for a careers service.
Regional labour market conditions
Does the region have a wide range of employers from different sectors with a suitable volume of graduate-level roles?
This is particularly pertinent for institutions that recruit locally and whose graduates predominantly stay local after graduation.
Although the rise of remote working may start to alleviate this issue, graduate mobility will remain a structural challenge.
This also plays out within the Russell Group. This year, the six top performing Russell Group institutions on the High Skilled Employment metric are all based in the South of England.
Course mix
Institutions with specialisms in Nursing, Education and STEM areas will tend to perform quite strongly. This is not necessarily linked to the quality of careers service provision; it reflects the vocational focus of their courses and labour market demand in some key disciplines.
Graduate Outcomes for courses that serve the creative industries tend to be more acutely affected by recessions and, even in less tumultuous times, careers in those disciplines can take longer to come to fruition. We also know that part-time work and self-employment feature more heavily in these disciplines, and the Graduate Outcomes metrics are quite blunt when it comes to the nuances of these employment routes.
Uneven student growth
The UK Higher Education sector is based on student-led demand and, despite the obvious policy intentions of this Government, is not currently linked in any meaningful way to labour market demand.
If you recruit heavily in some Humanities disciplines but your STEM numbers remain static, you are likely to see a drop in Graduate Outcomes and LEO data. That is simply counting what we can measure: the positive impact Humanities graduates have on society is harder to define neatly but should never be underestimated. We must also be aware of the tyranny of averages, since thousands of Humanities graduates will outperform their STEM peers on GO and LEO measures.
Without wanting to sound fatalistic, strategic recruitment decisions, such as significantly lowering entry requirements or sharply growing numbers in disciplines that are more challenging for Graduate Outcomes, can be as influential on league table metrics as anything that happens through the delivery of courses or the work of careers services.
There are always improvements that can be made to provision and impact that can be delivered, but more recognition of, and radical honesty about, the challenges of uneven growth and how inputs inevitably shape outputs would be a helpful starting point for strategic planning discussions.
Graduate Outcomes is part of a much wider discussion about attainment gaps and cannot be solved in isolation.
The survey
Then we have the survey itself. Factors around survey response, or the lack of it, in high-performing areas can affect performance; one example is the challenge that emerged around securing responses from NHS workers during the pandemic.
So can disproportionately high response rates in areas where high-skilled outcomes are more challenging to deliver within a 15-month timeframe. Just this week I spoke to colleagues reflecting on a particularly keen and engaged academic department that had successfully encouraged its graduates to respond to the survey in very high numbers; this clearly well-intentioned effort had ended up being detrimental to the institution's overall Graduate Outcomes performance.
There is of course plenty of SOC code nuance at play, and a common criticism of the centralised survey is that callers are less invested in unpicking it. That criticism won't always be fully justified, but there will be a number of questionable cases.
What even is a good result in Graduate Outcomes for careers services?
There are some incredible careers services with sector-leading provision whose quality won't always obviously play out in Graduate Outcomes performance, often due to regional labour market or course mix factors. There are also some institutions with comparatively limited careers service provision that have stronger-than-expected Graduate Outcomes, often due to similar factors: comparatively favourable course mix, student demographics and regional labour market conditions.
You could argue there is no such thing as a good result for a careers service from Graduate Outcomes. Careers services with limited resource and provision can lose any form of burning platform to drive improvements when GO goes the institution's way (see also TEF Gold).
The flip side, of course, is those with slightly disappointing outcomes: they are at the mercy of knee-jerk reactions from senior leaders who believe they have a simple solution to a wicked problem, having only sporadically engaged with the employability agenda.
This can take many forms. The most obvious single solution is an ambitious target for growth in placements, set after sight of the correlation (not necessarily causation) between placements and positive destinations in Graduate Outcomes, and often unfeasible and undeliverable given regional and sectoral employer capacity and demand (see also T Levels).
The importance of strategic influencing
There is a distinct issue of senior leaders struggling to benchmark careers service provision effectively beyond Graduate Outcomes performance: they often have no idea whether the careers service is sector leading or bang average. This is why strategic influencing is crucial for careers leaders.
Strategic influencing is for life, not just for Graduate Outcomes. It can of course be challenging to stay on top of whilst putting out a million operational fires, but some of the ways this can be done include:
- Consistently educate colleagues on the nuances of the employability agenda and provide detailed labour market insights
- Evidence impact (not activity), build a bank of case studies and always share successes
- Seek external recognition, quality accreditation and entry to industry awards. Rightly or wrongly, even being shortlisted for national awards can reassure senior leaders you are delivering quality provision
- Take PR seriously. Be omnipresent on social media and through internal and local press articles
- Create employer advisory boards, share their market intelligence widely and formally capture their positive feedback and the issues they identify
There are of course universities that have achieved improved performance through an increased institutional focus on employability and stellar work by their careers services; don't shy away from taking credit for your successes. I am not in any way seeking to diminish those success stories.
All you can do is control the controllable. Consistently analyse how you can drive more impact, deliver at scale and provide targeted support to the students who need it most.
Don’t be defined by any single metric.
I would be interested in your thoughts.
Head of Careers and Enterprise Service at Queen Mary University of London
Great article Mike. I can identify with so much of it and with the comments, especially TEF Gold. Your point about correlation with placements is also spot on; I'd say the same is now true of many activities with a time lag in GO and lower response rates. Employability is everyone's business within a university, and encouraging students to connect both internally and externally is key to their success. Who can ever say who or what helped that student to get the graduate-level job, whether it was in-curriculum, co-curricular or extra-curricular? But that shouldn't stop us from doing what we instinctively feel would help them, whether we can pinpoint causation or not.
Learning & Education | Delivering high impact workforce and improvement Initiatives
Building social capital is critical to successful GOs. Careers services can assist students in understanding what this is and how to build it over a lifetime. Difficult to measure (alumni data?) but much more valuable.
Head of Careers at University of Bath
Some super insights here. I particularly like the section on benchmarking and the 'power' (or lack of!) of GO for influencing. Excellent suggestions from here and the comments, thank you!
Academic and researcher in vocational psychology and career development, University of Melbourne
It's a very interesting question about the value of careers services and how we measure their impact. Graduate Outcomes is absolutely not the measure that should be considered. As you have highlighted, there are other factors that have a greater impact on graduate outcomes; labour market supply and demand and institutional reputation are well-known factors that explain most employment outcomes.
The other question I would ask is how much impact you would expect a careers service to have on institution-level graduate outcomes. Let's say the service directly supports 30% of the student population. We could imagine that one third of those students accessing support are seeking assistance in making a career decision; that assistance will mostly affect choice of occupation rather than increase employment outcomes. Another third may seek assistance with preparing a resume and other aspects of job seeking. The service will help some of these students obtain a job more quickly than they would have without assistance, some will get a better-quality job because of their improved presentation, and they are likely to feel more confident in their job search, which could be assessed through measures of job search self-efficacy or perceived employability. The remaining third might seek guidance on changing subjects; as a result, some will change subject, others will continue, and some will pursue postgraduate study.
In this hypothetical example, the careers service has benefited significant numbers of students, yet it is very unlikely that we would see a measurable impact on graduate outcome statistics. We need to consider what is important in measuring the performance of careers services and move away from the single focus on employment as the only measure of success.