Do we need a Higher Education Academy?

25 July 2014

Probably not any more, according to my recent piece for Research Fortnight (subscription required to view, but a version of it appears below):

Ten years ago the Higher Education Academy was set up as a single UK-wide organisation to support teaching and students’ learning experiences. It combined two bodies with very different track records. The Institute for Learning and Teaching in Higher Education had pushed an ideological agenda that all lecturers should become registered teachers, and had met a fairly disastrous reception across most of higher education. The network of subject centres, by contrast, had been received much more favourably: many were led by successful international scholars, and they went with the grain of most academics’ beliefs, supporting them within their disciplines and accepting that teaching and learning were inseparable from research and subject content.

Combining two organisations with such different cultures and personnel was never going to be easy. Through a series of painful restructures and changes of senior staff, the Higher Education Academy eventually created a working model, but deep divisions remained under the surface and hampered the development of a coherent reputation and value for money. During the first few years many areas continued to be over-staffed and inefficient; there was always at least one interested external body that fought against their reform.

It faced other challenges too. First was the need to adapt services to the differing needs of the four home nations. It also had to cope with frequently changing funding council priorities and directly competing and much better-resourced initiatives such as the Centres for Excellence in Teaching and Learning, a scheme which ultimately failed to deliver its promised benefits. It had to win over divergent communities such as educational developers, Universities UK and professional subject organisations. And at the same time it had to establish standing as a credible research-led organisation.

The first evaluation of the HEA in 2007, by the Higher Education Funding Council for England, recognised that it had had to overcome major challenges in establishing itself as a distinctive organisation and balancing expectations from a wide-ranging and demanding set of stakeholders. It found that the academy had had a positive influence on teaching and the student experience, but that it had yet to realise its full potential. It had to work harder at engaging with partners and customers, at evaluating impact and value for money, at managing subject centres more consistently, and at campaigning for better learning and teaching.

But progress towards these goals continued to be hampered by irreconcilable vested interests. It became abundantly clear that proper change would require radical surgery, and this led to a plan for a leaner organisation and sweeping restructure. The subject centres, despite some significant successes, would have to go; they were simply too devolved for efficient management and much too expensive. The plan was implemented between 2010 and 2013 and the HEA has since become more efficient and more focused.

It came as something of a surprise to me, therefore, to see that a more recent review, published in June 2014, has identified issues that are familiar from several years ago. On the positive side, the HEA’s greatest success has undoubtedly been the establishment of the UK Professional Standards Framework for teaching in higher education, and its associated and expanding professional accreditation services. Other achievements of note include a continuing record of support for individual academics through their disciplines and a range of survey services.

But, according to its reviewers, it has yet to establish better communications with institutional leaders, particularly in pre-1992 universities. It has failed to demonstrate impact clearly enough and still tries to spread itself across too many areas. It has not had notable success in influencing policy, except in some key areas such as its pro-vice chancellor network, and the quality of its research is variable. It comes across as an organisation that is still more concerned with managing its internal tensions than meeting its customers’ needs.

What of the future? The Higher Education Academy announced in April that its public funding (which accounts for 95 per cent of its £16 million annual income) would end in 2016. Its business development model for a sustainable organisation involves increasing subscriber and consultancy income, but it has already fallen short of its targets in this respect. Its chances in a competitive environment for higher education consultancy must be regarded as slim, unless it can appoint staff with immediate experience of the realities and uncertainties of a private sector business model.

More fundamentally, do British universities need a Higher Education Academy any more? Higher education institutions have come a long way since 2004 in improving the quality of their students’ experiences and engagement. Australia abolished its equivalent organisation a couple of years ago. The British version provides services, knowledge and expertise that institutions think are important. However, these valued functions could be delivered by opening up the remaining market for specialist support services to a range of providers. A small office attached to the funding councils could support competitive tendering by firms and universities for projects. The day of a central, taxpayer-funded body to support the enhancement of teaching in higher education may well be over.

Paul Ramsden is a key associate of PhillipsKPA, an educational consultancy based in Melbourne, Australia. He was the founding chief executive of the Higher Education Academy from 2004 to 2009.

Original article: https://www.researchprofessional.com/0/rr/he/views/2014/6/Is-the-HEA-fit-for-purpose-.html


Published: Review of NSS

4 July 2014

I have been involved in a review of the National Student Survey commissioned by the funding councils. The report came out this week.

Here’s a short summary of some of the conclusions that are not always explicit in the report:

1. A campaign to get the NSS dumped in favour of the U.S. National Survey of Student Engagement (NSSE) has failed. The NSS is valued, valid and impressively helpful as a way of enhancing teaching and the student experience. Universities and colleges don’t want to lose it. They don’t want it replaced by a survey that focuses on student engagement rather than on the quality of teaching.

2. The NSS is not a ‘satisfaction’ survey. It was designed as a student evaluation instrument (there is only one question about overall satisfaction).

3. A falsehood has been widely circulated that the NSS is not related to academic achievement or ‘learning gains’. Although the results have not been made public, it is certain that higher scores on the NSS are associated with better degree results, even after controlling for students’ entry qualifications.

4. Any modifications to the NSS will need to be carefully trialled and extensively tested to ensure that changes do not compromise the strengths of the survey and its considerable value to higher education institutions. Minor changes include the potential inclusion of a small number of extra questions about students’ engagement with quality processes and learning, many of which are already available in the optional set of questions.

5. The NSS has probably been the most effective and best value single policy initiative in the area of improving the UK student experience in the last 10 years.


Change of career!

20 December 2012

Having read the interesting ideas of Professor Howard Hotson about higher education policy, I realise I could benefit from a change of career.

Howard knows a lot about early modern intellectual history and teaches at Oxford. He isn’t an expert on higher education, although that hasn’t stopped him giving speeches about how it’s in a global crisis and writing about it for the Guardian.

Following Howard’s exemplary lead, I am going to stop talking about things I know about and shift to something I know nothing about.

I thought early modern intellectual history might do the trick.

Watch out for my forthcoming books and papers on Protestant Europe in this period. I rather fancy a special emphasis on international intellectual developments affecting Germany between 1555 and 1660. I know nothing about it at all.

I think I might supplement this with some stuff on traditions of intellectual innovations connecting late Renaissance humanism to the new philosophies of the 17th century; pretty much head to head with Howard, in fact.

One last thing, Howard. I’m already negotiating a generous advance on my next book on the revival of millenarianism in early modern Europe.


No apocalypse for higher education

14 December 2012

The future for English higher education is bright, not gloomy — if only it can escape its addiction to big government and take charge of its own destiny. My piece in Times Higher Education, here.


Non-news story of the week

18 May 2012

Even by the standards that now characterise policy debate and government pronouncements about the student experience, media coverage of the recent HEPI report plumbs new depths.

The report in question and the media recycle similar assertions to those made when the first report came out in 2006 – that variations in the contact time students have with lecturers are grave matters, that students are dissatisfied if they get fewer contact hours, that if students pay more in fees then they should expect and get more contact time (why?), that time consumed in studying is linearly and causally related to the quality of learning achieved, that students in the UK are getting a bad deal and probably inferior degrees compared with those in the rest of Europe…

These are claims that fly in the face of evidence and logic.

Just a few points:

  1. What students say they get is not a measure of how much they actually get. Self-reports of time spent are a poor substitute for hard data.
  2. The new report somewhat disingenuously says that it and the previous ones are not mainly about contact time. Apparently it was all the fault of universities for thinking that they were.
  3. The fees are not extra money; they replace taxpayers’ money. Whether or not more contact is a good thing, it is illogical to assert that the same amount of resource should lead to more teaching time. This is a rhetorical sleight of hand on HEPI’s (and the government’s) part.
  5. Beyond a certain minimum, the amount of time spent is unrelated to quality. It is not how many hours you study or spend in lectures that matters: it is the value you add to your learning during that time.
  5. The report itself provides evidence that there is no relation between national student survey results and either contact time or private study time. However, it invites us to draw the opposite conclusion.
  6. Does anyone really believe that students in Italy and Spain receive a better education than in the UK because they spend more time in class?

In fairness, quite a few comments on the Guardian’s story about this report pick up similar issues and articulately shred the media hype.

The report does raise some interesting issues, though. I stand by the response I made to the first HEPI report back in 2006 (except the parts that are a naked self-promotion of the HEA, of which I was then chief executive):

As a social science undergraduate, I studied statistics. Being something of a duffer at maths, I spent five hours of private study on what my friend Kevin (maths A-level) did in 30 minutes. I was at every lecture 10 minutes early. He skipped half of them. We both passed. We were both happy with the value for (taxpayers’) money the course gave us.

The point is that the number of hours you study does not tell you about quality of learning, student satisfaction, or value for money. It does tell you about students’ experiences, and we need to know more about them. That is why the Higher Education Academy funds research that tells us about those experiences.

The Higher Education Policy Institute (Hepi) report gives us a snapshot of how around 15,000 first and second-year students in English universities, taking a range of subjects, perceive the services and inputs they receive. A striking conclusion is the high level of student satisfaction with their academic experience. This is consistent with the findings of the National Student Survey.

What do the reported differences in hours spent tell us about the student experience in English universities?

We learn something about inputs, but very little about outcomes. People learn in different ways and at different paces. The relationship between hours invested and students’ learning outcomes is intricate. Research into students’ approaches to learning suggests that a high number of contact hours and private study does not automatically mean they learn better. You can fritter away 30 hours in front of the computer or your books and emerge with very little to show for it. You can go to a lecture and remember nothing significant. What is more important than time is the quality of the engagement – the degree to which you try to understand the material.

How any one student learns is a complex mixture of motivation, ability, peer pressure, available learning resources, previous knowledge, learning opportunities outside the classroom and other factors – as I found when I sat next to Kevin in statistics classes all those years ago.

The students surveyed by Hepi were broadly satisfied above and below a certain number of hours of set teaching time. It would be good to find out whether it is the hours themselves or the quality of what they do with those hours that affects their view. Are students who study longer hours more or less likely to be positive about their overall HE experience and to succeed?

Nor does the report tell us much about the degree system in English universities. Assessment does not depend on how many hours it took someone to complete a programme of study. A strength of the UK sector – not just in England – is the freedom it gives universities and colleges to set their own parameters and students the chance to find the method of learning that best suits them. It would be a pity if a crude cut of the data reported on hours spent became another form of league table (longer hours equals harder degree).

What the report does do very successfully is open up a number of policy areas around widening participation and definitions of “full-time” students. There is striking variation in the proportions of students at the different institutions who do differing hours of paid employment. This survey found that the impact of paid work on student satisfaction and on academic outputs is relatively small. The greatest reported impact is on perceptions of value for money among students who have paid jobs on top of their studies. This is an area crying out for further investigation.

The Higher Education Academy funded Hepi’s research because it is important to understand how students say they use their time. These are student experiences – not facts about quality. The quantitative benchmarks established, while limited, will provide a valuable basis for comparison as the impacts of fees and the next stages of the widening participation agenda begin to filter through to universities. We still need to find out more about the student experience: this report raises important questions about it.


How to widen participation. Really.

5 May 2012

Why we should allow universities to enrol as many students as they want to. Here.


Must try harder for £315m

16 March 2012

I have a piece in Times Higher this week about the fiasco of the “Centres for Excellence in Teaching and Learning” in England. Read it here.