Alternative Medicine League Table - a ranking based on foundation programme score

Who even cares about rankings, right? Chances are that, however you feel about the validity of the CUG or Guardian tables, you will still be influenced by a university's rank in the league tables. Even the universities themselves pride themselves on strong rankings, whether national or international. But are the current medical school rankings reflective of the five years ahead? I decided to compile my own ranking based on foundation school applications, so read on if you are interested!

Right about now, you are probably window shopping for the medical schools you would like to go to. Chances are, you have been to a couple of open days and either loved or hated the place, and are now trawling the internet for any information that could help you pick exactly where you are going to spend the next five or six years of your life.

So when my sister needed help in making the decision and asked me about whether league tables were accurate or not, I didn't really know the answer. Naturally, I gave her an extremely biased viewpoint of how imperative league tables were, purely because UCL has ranked quite well in recent years!

However, I then pondered as to whether they really matter. Is there really a 'better' medical school? Does one university train you to become a better doctor? Will going to a specific medical school improve my career prospects?

I went on a mission to find some evidence of whether a university's league table position correlated at all with the doctors it was producing at the end of the course. After wasting way too much of my summer looking for data, I thought I would publish what I actually found, in case anybody else is interested in what I stumbled upon.

OK, so for those of you who don't know, the foundation programme is what you go into after you get your Medicine degree. You are scored based on your exam performance in medical school and any additional degrees/publications you have, as well as a Situational Judgement Test (SJT) like the one you had to sit in the UKCAT (but probably significantly harder).

What do these numbers even mean?
[Infographic: an alternative medical school ranking based on average foundation application scores]

I feel that I should explain what the numbers actually mean, otherwise they seem kind of random. Basically, I ranked all the universities based on the average FP score that their graduates achieved, doing this for each year. I then took the average of the 2013, 2014 and 2015 ranks, and this average is the mysterious number listed next to each university name. So if a university ranked 2nd, 3rd and 4th in the three years, its average rank would be 3.0.
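For anyone who likes to see the arithmetic spelled out, a minimal sketch of the averaging is below. The university name and ranks are made up to mirror the 2nd/3rd/4th example above, not taken from the real data:

```python
# Hypothetical example mirroring the text: ranks in 2013, 2014 and 2015.
ranks = {"Example University": [2, 3, 4]}

# The number next to each university is simply the mean of its yearly ranks.
average_rank = {uni: sum(r) / len(r) for uni, r in ranks.items()}
print(average_rank)  # {'Example University': 3.0}
```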

If you asked me where I got this graphic from, you probably wouldn't believe me if I told you that I made it myself, would you? Ever since I found out about Canva, I can't stop looking at anything without wondering what it would look like on an infographic. I am willing to admit, I went a little bit overboard, but when you don't have a summer job, time is not exactly a scarce resource. Who says medics can't be creative?!

So as you can see, it pains me to admit that Oxford has taken the top spot in all three years. London universities also seem to do well - King's and St. George's fare quite poorly in the traditional league tables, but do well when it comes to the foundation programme. Warwick being so high up may surprise you, but as I shall explain later, it makes total sense.

Just as a side note, if you actually care about the numbers, then don't hesitate to contact me - I will be happy to give you the spreadsheet where all the number crunching has been done!

Now, I am not for one second suggesting that you should use only this to inform your decision. I just thought I would post the rankings because they are another way to measure medical schools, and nobody else seems to have written much about them. The reason for that might be because they don't matter one little bit, but I thought I would put them on here anyway.

Why these rankings mean absolutely nothing...

I have thought about why these rankings don't really mean that much, and I came up with a couple of reasons:

Universities like Oxbridge/UCL/Imperial tend to have higher entry requirements and take more academically able students. Therefore, it is not the university teaching itself, but the fact that the more able students are going to these medical schools.
I think that this is partially true. I only have data for half the medical schools, and only for one year, but based on this small amount of data, there does seem to be a strong positive correlation (0.75, though remember the data is limited) between the percentage of students getting an A* at A level and the average score on the SJT.

Some may also say that the more traditional medicine courses focus more on exam technique, so there could be some kind of practice that means the medical school is specifically teaching students to perform well in the SJT exam, rather than training them well in general.
This is a wild idea, and frankly I have NO idea whether this even happens. Even if it did, how could we prove that this kind of practice goes on?
If anyone does have any suggestions, then I would love to hear about them! Please get in touch with me, as this is a topic that I am finding really interesting.

The intercalation effect...

The so-called 'top' universities make you do an intercalated degree. There is therefore an argument that the medical schools that rank highly on this table have more people intercalating. As Warwick only takes in students with a previous degree, you can understand why their average score is higher. You may have noticed Swansea is not on here; that is because they have only had one year of graduates, so I can't include them.

OK, so let's take away the 'intercalating factor' and just look at the average SJT scores achieved by each medical school's cohort. Even when you only look at these SJT scores, the rankings stay pretty much the same as the ones shown in the image. However, you do get a few shifts up/down, including Newcastle and QUB, which actually improve their relative ranking (which may indicate that not many people intercalate at these universities).

They are completely unrelated to any other rankings, right?

The short answer is yes and no. I decided to look at the correlations between the rank based on FP scores and the rankings from the Guardian, the Complete University Guide (CUG) and QS. What the numbers show is that the ranking tables we currently have in place aren't very predictive of FP score ranking, but some are more predictive than others.

The lowest correlation coefficient is with the Guardian rankings. There is a lot of controversy with the Guardian rankings, and many people think that they are quite poor at determining which medical schools are 'better', and based on the data here, they seem to be the least useful ranking to look at.
In the middle, we have the QS ranking. When looking at the overall QS rankings (not the Medicine ones), we get a correlation of 0.47.
The greatest correlation is with CUG; the strongest is the 2015 CUG ranking, with a correlation of 0.55 (though the average correlation with CUG is a little lower).

Essentially, because the rankings use different methodology, there will be some differences. If you believe that everything I have said is total rubbish, then the best rankings you could use are probably the Complete University Guide rankings.

Why these rankings shouldn't be totally dismissed

Following on from the point I made earlier, a lot of people say that the current league tables' methodology is not very relevant when it comes to measuring how good a medical school is. I decided to delve into the methodology for both the Guardian and the Complete University Guide; both of these take into account satisfaction, entry standards and spend per student, but assign slightly different weightings to them (mainly because the Guardian doesn't consider research, so the other measures carry a higher weighting).

So the Guardian doesn't take research into account for its Medicine rankings, but a strong research department may influence the number of people doing intercalated degrees, or may result in more publications by students. Maybe this is part of the reason why the CUG rankings have a higher correlation with FP scores?

Also, I don't see how either of the two rankings accurately measure the effectiveness of clinical training. Maybe there isn't a good enough measure out there, and so the league tables have nothing to go on?

But is there really much of a difference? 

To be honest, not a massive amount. The variation in average SJT score is 2.5 points (between the top and the bottom), which isn't huge. While it is true that a lot of people miss out on their first-choice foundation school by one or two points, it doesn't necessarily mean that if you had gone to the 'best' university, you would have got those extra 2.5 points. It entirely depends on the individual.
Any other variation is pretty much due to intercalated degrees. At Oxbridge, UCL and Imperial (and now Edinburgh), the students have to do an extra degree, so naturally the average score would be pushed up!

To sum up...

Does all of this mean you should ignore this table completely? I think that there is a small difference in the teaching at different medical schools. Each university and hospital trust has its own way of doing things, and it is clear that some prepare medical students a little better than others. As I am not in my clinical years yet, I can't suggest any reasons for this, but maybe I will write a follow-up post in a couple of years.

I have tried to keep this post as impartial as possible. I don't mean to offend anybody going to any particular university - I am writing this because I wanted to throw the information out there and justify to myself that this whole exercise was not a waste of time! Remember that a low score for your medical school does NOT mean you will become a rubbish doctor - it is just an average figure, and the difference isn't even that big anyway!

The bottom line is, if you are a hardworking student and continue with this same attitude in medical school, it doesn't really matter where you go! Also remember that statistics aren't applicable to the individual; they are just a numerical way to explain general trends.

This whole exercise does, however, highlight the benefit of doing an intercalated degree. The extra year may well be worth it for your foundation programme application - you may just get those few extra points and end up working in the area you want to. On top of that, it is essentially free (as the NHS pays for your 5th year onwards), and you get to further explore a subject that you really love; something you don't really get the chance to do on the standard five-year MBBS course!
