Which light came on? Understanding KEF dashboards

Written by Alison Price

 

The new KEF (Knowledge Exchange Framework) dashboard (released March 2021) seeks to showcase the impact of English HEIs across business, partnership, and community activities.  However, the attractive online presentation appears to be dimming long-held hopes that KEF could be the much-needed driver of long-term institutional commitment.

This blog explains what the dashboard shows (and what it doesn’t) and why keeping your own records remains critical.

The KEF data was released within an interactive dashboard webpage that provides sector scores and allows you to focus in detail on one institution, delve into each of the 7 perspectives or 7 HEI groupings (clusters), or directly compare 2 institutions on one page.  The data is presented in percentage bands, averaged over three years, and normalised to support institutional comparisons (using HEI income or student numbers as FTEs to reflect the size of the institution).
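As a rough sketch (the figures, the choice of denominator and the simplified calculation are illustrative only, not the published KEF methodology), a size-normalised, three-year-averaged metric might be computed along these lines before being placed into a percentage band:

    # Illustrative only: average a metric over three years, then scale by a
    # size denominator (e.g. HEI income) so that large and small institutions
    # can be compared. All figures are invented.
    def normalised_average(values_3yr, size_denominator):
        three_year_mean = sum(values_3yr) / len(values_3yr)
        return three_year_mean / size_denominator

    # Hypothetical institution: CPD/CE income over three years, normalised
    # by total HEI income to reflect institutional size.
    cpd_income = [350_000, 420_000, 390_000]
    hei_income = 180_000_000
    print(f"Normalised score: {normalised_average(cpd_income, hei_income):.6f}")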

Research England describes ‘Knowledge Exchange’ as including a set of activities, processes and skills that enable close collaboration between universities and partner organisations to deliver commercial, environmental, cultural, and place-based benefits, opportunities for students and increased prosperity.  KEF creates an overview (dashboard) of knowledge exchange activity as described by 7 different perspectives (below).

  1. Research partnerships
  2. Working with business
  3. Working with the public and third sector
  4. Skills, enterprise, and entrepreneurship (SEE)
  5. IP and commercialisation
  6. Local growth and regeneration
  7. Public and community engagement

With a commitment to reducing the admin burden of introducing a new annual institutional metric (joining TEF and REF), Research England sought to develop a metric that draws upon existing university returns, such as the Higher Education Business & Community Interaction (HE-BCI) survey and the Higher Education Innovation Fund (HEIF), and their derivations such as HESA data.

Inevitably, enterprise educators are drawn to Perspective 4 (skills, enterprise, and entrepreneurship – the ‘SEE’ perspective), which is collated from 3 measures:

  • CPD/CE learner days delivered (normalised by HEI income)
  • CPD/CE income (normalised by HEI income)
  • Registered student and graduate start-ups, taken from HE-BCI (normalised by student FTE)

But this has required some unpicking by enterprise educators, who can feel that the SEE perspective pools apparently unrelated data that does not match the perception created by its title: the specific metrics selected can produce diverse results by combining start-up activity with activities that most enterprise teams/units are not responsible for delivering.

For example, the KEF dashboard can show one institution scoring in the top 10% for Skills, Enterprise, and Entrepreneurship (SEE), where this “top 10%” scoring comes from strong CPD/CE income and learner days rather than graduate starts.  The HE-BCI data evidences graduate starts in the bottom 30%, but the average across all three SEE metrics places the institution in the top 10% as CPD boosts the results.  Conversely, institutions falling into the bottom 30% for the SEE perspective overall have found that a strong start-up record places them in the top 30% on that measure, yet at first glance the main dashboard metric disappoints.
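As a hypothetical, simplified sketch (the decile positions and the decile-averaging shorthand are invented for illustration and do not reproduce the exact KEF banding calculation), the blending effect looks like this:

    # Invented decile positions (10 = top 10%, 3 = bottom 30%) for the three
    # SEE measures of a hypothetical institution.
    see_deciles = {
        "CPD/CE learner days": 10,
        "CPD/CE income": 10,
        "Graduate start-ups": 3,
    }

    # Averaging blends strong and weak components, so the headline SEE band
    # need not reflect any single measure on its own.
    overall = sum(see_deciles.values()) / len(see_deciles)
    print(f"Average decile position: {overall:.1f}")  # well above the start-up decile

In other words, a strong CPD record can lift the headline SEE band even when start-ups lag, and a strong start-up record can be hidden by weak CPD figures.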

However, this is the first time the sector has seen the HE-BCI data normalised by student numbers, and the results are of note.  For example, a review of each specific cluster underlines the need for a wide range of KE activity to score well, but also highlights the arts cluster of institutions delivering a strong performance in terms of business starts when compared across the sector.

Ultimately the dashboard serves as a reminder of what KEF is, and was, intended to be: a knowledge exchange scorecard.  It might not necessarily be the clear driver of a heightened appreciation of our agenda, but the role of enterprise educators is key to demonstrating a good institutional response to it. With #ented enterprise, entrepreneurship and skills forming part of teaching, learning, extra-curricular activity and knowledge exchange, our agenda plays a key role in supporting all institutional metrics (REF, TEF and KEF) and can be the missing piece of the jigsaw in developing a ‘metric sweet spot’ (Powell 2018) which supports the student experience, on or off campus.

Where next?

The KEF dashboard evidences the need for metrics to be agreed within our community.  This is the challenge that EEUK is taking up and seeking your support with.  You are invited to explore the KEF dashboard and respond directly to the KEF survey (on its accessibility and presentation), but please also share your metric insights with us so that we can pool our learning and response across the sector.  Appreciating metrics will be critical as we move into understanding the sector within the ‘new normal’ and absorb 2020 data. In a year when many graduate businesses have decided not to start, or have delayed starting in order to pivot away from their original intention, our Chair, Gareth Trainer, wonders what the impact on these three-year averages will be.

Inevitably this means that keeping records of our impact, impact case studies and student stories remains key, and your work will also feed into the wider KEF work, which continues with at least 126 pilot institutions where the 8 KEF Concordat principles are fleshing out KE practice.  If you want to find out more, this collaborative work is being shared through regular online ‘deep dives’, which are worth a visit as you seek to understand your role in the jigsaw of HE practice.

Get in touch with me to share your metric insights.

Alison Price

Head of Policy and Professional Development

EEUK