Why we survey doctors about training across the UK

Each year, we run a national survey to ask all doctors in training for their views on the training they receive. In 2016, we also asked for feedback from medical trainers. But why and how do we do it? Tom Bandenburg, Project Manager of the national training survey at the GMC, explains.

Why does the GMC run the national training surveys each year and how are the results used?

The surveys are our way of reaching 55,000 doctors in training and 45,000 trainers, and asking them whether our standards for medical education and training are being met on the ground.

Essentially, the national training surveys are powerful screening tools that help identify concerns about training locally. We publish the survey results within a couple of months so concerns can be investigated quickly.

Within the GMC, we use the survey results to identify where doctors in training and trainers feel their training environment isn’t meeting our standards. We then work with postgraduate deans and local training programme directors to investigate concerns.

We also use the survey results to identify patterns. Because we run the surveys every year, we can see whether actions taken to address concerns raised in previous years have made a difference to doctors in training.

In response to concerns raised in the survey we have, for example, inspected and reported on training in emergency departments, and carried out a review into bullying and undermining in medical education and training across different sites. We will continue to use the survey to identify and focus on issues affecting doctors in training, as pressures in the NHS continue.

Chart: ‘How would you rate the quality of clinical supervision in this post?’ – National training survey key findings report 2016, p. 26

Why does the GMC produce a report a few months after the data has been released?

The survey results are published in our online reporting tool every summer. The tool is primarily designed for those who are responsible for medical education and training in a particular deanery or local education and training board (LETB), trust or board. They extract their local data, look at their red and green flags, and plan how to tackle issues raised by their trainees and trainers.

What this release doesn’t do is provide a high-level analysis of trends or patterns, so that’s why we produce a report later in the year.

We carefully analyse the results for trends that have been indicated by both doctors in training and by trainers. We also cross-reference some of the statistics to look for any potential links. For example, we looked at whether those doctors who report heavy workloads are the same doctors who raise patient safety concerns. We report these findings back to the deaneries/LETBs and trusts to drive improvements.

How have survey results driven improvements for doctors in training?

The survey results are taken very seriously. Across the UK, postgraduate deans and their teams work with local education providers to investigate concerns that doctors in training and trainers raise. They then have to report back to us on the actions they’ve taken to address issues. Every year, improvements are made across hundreds of training environments.

For example, in 2015, survey results showed a hospital with red flags for handover in rheumatology. Doctors in training worked across multiple sites, making face-to-face handover difficult. The local education provider acted on the results, introducing a number of changes, including a daily formal virtual handover via a shared database. Thanks to this action, handover in the department got a green flag in the 2016 results. That’s just one example of how the survey results lead to improvements – you can read more case studies in this year’s report.

Process map: how the national training survey results are used by local training providers and others

Sometimes the survey raises issues that require significant interventions – for example, where consultant vacancies mean that appropriate supervision isn’t always provided for doctors in training. If appropriate supervision can’t be put in place, doctors may be removed and re-posted to another training provider that can offer safe supervision and good education. These are often difficult decisions, but we are absolutely clear that training environments must be safe for patients and for doctors in training.

Since doctors in training rotate across different training providers, they don’t always see the improvements made in response to the survey, or the impact they’ve had by flagging up problems. It’s really important that they do flag up problems, though, because they’re essentially safeguarding that post for the doctors in training who come into it in the future.

Do you think the survey really shows how doctors feel about their training?

This year we were concerned that the survey results didn’t appear to reflect how doctors told us they were feeling.

The results showed most trainees continued to be very satisfied with their training, which seemed at odds with the many concerns that doctors were raising about working in the NHS, and with the industrial action in England. So, we decided to use focus groups and other opportunities to ask doctors in training what they thought about this apparent paradox.

Those we spoke to told us that they saw the quality of their training as separate from their feelings about being employed by the NHS, and about the NHS as a whole. So they weren’t surprised that the results hadn’t shown any significant dips in overall scores.

This really demonstrates the professionalism shown by doctors in responding to the survey – they didn’t allow their frustration with the system to cloud their responses, which were based on their individual, local experiences of their post and training site.

The survey hasn’t been set up to measure views of the NHS or government policy, and when we asked doctors whether the survey should attempt to measure morale more generally, many felt that it shouldn’t. They told us it was helpful that the survey is designed specifically to measure and address the quality of their training posts.

Of course, the high overall satisfaction score doesn’t mean that training is fine everywhere; when you look at local results you can clearly see there is variation – places where doctors have told us training is excellent, and others where they have reported it isn’t meeting our standards.

How does the GMC make sure that doctors in training are involved in developing the survey?

Every year, to make sure we’re keeping in touch with the people we survey and to keep the questions relevant, we speak to doctors in training about our work to develop the survey.

To do this, we use a group of around 10,000 doctors in training who volunteered (via a tick box in the survey) to help us develop and refine the survey questions, through online piloting and focus groups. The feedback we get is really valuable and always shapes the final questions. We also work with training organisations to evaluate results on the ground.

One of the key areas we’re working on now, for next year’s survey, is rota design, because doctors have told us that poor rota design has a huge impact on their training experience.

You can read this year’s report on the key findings from the 2016 national training surveys here.
