NIRF (or the National Institutional Ranking Framework) for the Uninitiated
So, the NIRF 2020 rankings were released in June and the usual chest-beating and we-are-better-than-you took place. But we are all aware that history is littered with university ranking schemes, some notorious for providing rank-for-money.
The then Director of IIT Kanpur
famously made the statement in 2013: “An amount of one lakh and fifty thousand
dollars needs to be paid to get a good ranking in such lists”, when his
institute was ranked 295 by the ‘QS World University Rankings’. It was argued
by QS that the Director had confused their rankings with their ‘Star program’ (where a “5 Star
Rating” is put next to the university rank after receiving a payment), but the
damage was done in India. Their latest rankings put IIT Bombay, at 162, as the
highest-ranked Indian institute, while IIT Kanpur is at 283. (Side note:
the “Institute ranking” page on IIT Kanpur’s official website still carries the QS
World University Rankings, as it is the highest rank among all the
different international rankings.) A Saudi Arabian university gets rank 189
on the same list; it must be a really good one. A 2009 World Bank report on “The
Challenge of Establishing World-Class Universities” had a whole page about the
IITs being centres of excellence.
Anyway, this strange and
incomprehensible world of university rankings was only part of the reason the
Ministry of Human Resource Development (MHRD) decided to initiate the NIRF,
with the first rankings list being released in 2016. The main reason was that
there was no quantitative measure of the quality of the government-funded
institutions in the country which meant that the government had no idea how
those institutions were faring. That does not necessarily mean that NIRF is the
best solution to that problem, but it is definitely a start.
Before the rush of the new IITs
and IIMs and other IIXs, a very simple rule was followed by the government: be
proud of the IIXs in media and let the rest fend for themselves (with some
exceptions made for the old and well-known ones). Also, there had been
historically no effort in comparing apples with oranges (one wonders
why…), i.e. how the technical institutes of 20th century India
compared with the British Era universities. But more importantly, the
institutions in the ignored parts of our vast nation had no way of knowing how
they fared against the ‘mainstream’ institutes. NIRF does all that and more.
So, let’s try to understand what exactly NIRF covers…
- The NIRF covers only centrally funded institutions/universities of the Government of India, which have a total of at least 1000 enrolled students.
- Since 2017, institutions have been given an overall rank as well as a discipline-specific rank.
- “Highly focussed institutions” with a single main discipline (Engineering, Medical, Law, Management, Pharmacy or UG degree colleges in Arts, Science and Commerce, etc.) with fewer than 1000 enrolled students are given only a discipline-specific rank.
- Institutions are not automatically included in the discipline-specific rankings; they need to register for them.
- Open Universities and Affiliating Universities (State/Centre approved/funded) are not registered for ranking; however, if such universities have a teaching/research campus, they are allowed to register, but only with the data related to that physical campus.
- Only institutions which have graduated at least three batches of students are considered.
The onus is on the institutions
to submit the relevant data to NIRF directly and post the data on their
official websites, both being required. Data for parameters such as Research
and Patents is taken from internationally available databases (e.g. Scopus, Web
of Science, the Indian Science Index, etc.). NIRF has even been empowered to carry
out physical checks on the institution records and audited accounts (apparently
“where needed”) to ensure that the ‘principles of ethical behaviour’ are being
adhered to.
A recent article has pointed out a flaw in the NIRF marking
scheme which, on the face of it, seems plausible.
So, NIRF builds a single score from five categories: teaching,
learning and resources (TLR), research and professional practices (RPP),
graduation outcomes (GO), outreach and inclusivity, and perception. Focusing on
these broad categories, in a university system, TLR is technically the ‘input’
and RPP and GO are ‘output’. Now, the NIRF adds the scores from all these five
heads to obtain the final score. Hence the article claims that NIRF violates
a basic principle of performance analysis: that performance is based on the
input score, while quality is based on the ratio of output to input. And
also that “outreach, inclusivity and perception relate neither to academic nor
research excellence, but these are added to the over-all score as well.”
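The disagreement between the two aggregation schemes is easy to see with a toy calculation. The sketch below uses the weights from the NIRF overall-ranking methodology (TLR 30%, RPP 30%, GO 20%, OI 10%, Perception 10%); the two sample institutes and their scores are made up purely for illustration.

```python
# Illustrative comparison of NIRF's weighted-sum aggregation with the
# output/input 'quality ratio' the cited article argues for.
# Weights follow the NIRF overall-ranking scheme; sample scores are invented.

WEIGHTS = {"TLR": 0.30, "RPP": 0.30, "GO": 0.20, "OI": 0.10, "PR": 0.10}

def nirf_score(scores):
    """NIRF-style aggregate: a plain weighted sum of all five heads."""
    return sum(WEIGHTS[head] * scores[head] for head in WEIGHTS)

def quality_ratio(scores):
    """The article's alternative: 'output' (RPP + GO) per unit of 'input' (TLR)."""
    return (scores["RPP"] + scores["GO"]) / scores["TLR"]

# Institute A: large 'input' (resources), modest 'output'.
a = {"TLR": 90, "RPP": 50, "GO": 60, "OI": 70, "PR": 80}
# Institute B: fewer resources, but more output per unit of input.
b = {"TLR": 60, "RPP": 55, "GO": 65, "OI": 70, "PR": 40}

print("A:", nirf_score(a), round(quality_ratio(a), 2))
print("B:", nirf_score(b), round(quality_ratio(b), 2))
```

With these made-up numbers, Institute A wins under the NIRF weighted sum (its large TLR 'input' is rewarded directly), while Institute B wins on the output-to-input ratio, which is exactly the kind of reversal the article is worried about.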
My personal peeve is the
“perception” score. What is this ‘perception’ all about? The NIRF document says
that the ‘Perception Score’ will be created after “a survey conducted over a
large category of Employers, Professionals from Reputed Organizations and a
large category of academics to ascertain their preference for graduates of
different institutions.” So, now academic politics enters the fray.
Furthermore, Graduation Outcomes
(GO) includes a factor directly related to the number of students passing the
university examinations (GUE) and a factor for number of Ph.D. students
graduating (GPHD). I agree with the criticism that at least GUE is definitely
an outcome that cannot be used in ranking. Again, taking the example of IIT
Kanpur of a couple of decades back, most national rankings gave it the top spot
and it even figured highly on global lists. At that time it was notoriously
difficult to get good marks in various technical courses there. It was not
uncommon for undergraduate students to have to stay an extra year.
So, had NIRF been around at that time, it would have ranked IITK much
lower.
On the other hand, there are some excellent additions to the NIRF system. The Outreach and Inclusivity (OI) score is unique and definitely required in India. It includes factors like:
- Percentage of Students from Other States/Countries (Region Diversity RD)
- Percentage of Women (Women Diversity WD)
- Economically and Socially Challenged Students (ESCS)
- Facilities for Physically Challenged Students (PCS)
For India, the above are excellent ideas to have. Someone looking at the
OI score can get a good idea of where the institute stands w.r.t. these
important social issues. Although why
PCS is not a separate score beats me.
So, what do we find in the latest
NIRF?
The original five IITs and
IISc make up the first six ranks.
Seriously?
Does that mean that decades of supporting
all other institutions has yielded little to no result? Or have the same
institutions kept up their standards over the years? If the international
university rankings are taken even on a relative basis, they tell a very
different story.
The table below shows the global rankings of the highest-ranked (only top 500) Indian institutes as per the four most popular world university rankings:
As far as trends go, there is
only one: the Indian Institute of Science in Bangalore is the top-ranked
Indian institute.
Even in the US News top
engineering colleges ranking, no Indian institute features in the top 100. IIT
Delhi, at 106, was the highest-ranked.
The inconsistency in the rankings
clearly shows that the global university ranking system may not be at all
accurate when it comes to India. It is either perception based or based on
which institute filled out the relevant forms. And why not? These global university
rankings were created for a reason: to lure international students to the top
institutions. In fact, the budgets of many US institutions depend heavily on the
fees paid by international students. The ranking companies, in turn, gain from
these institutes. But since Indian institutions are not really in that
situation, these global rankings do not focus on Indian
institutions as much. This gives a clear reason why something like NIRF was
required.
NIRF has its flaws, some of which
were briefly discussed in this blog. But any new complicated system has flaws,
and NIRF is, self-admittedly, correcting itself every year.
The current rankings also may not
reflect the good work done by many institutes other than the usual suspects
at the top, but I am hopeful that this will change.
As long as politics is kept out
of NIRF and the main goal is clearly seen to be the betterment of the academic
institutions, NIRF can really make a difference by giving an understanding of
their shortcomings as well as their strengths.
References
- http://www.iitk.ac.in/
- The Challenge of Establishing World-Class Universities, The International Bank for Reconstruction and Development / The World Bank, 2009.
- National Institutional Ranking Framework: Methodology for Ranking of Academic Institutions in India, Ministry of Human Resource Development, Government of India, 2017.
- https://science.thewire.in/education/a-flaw-in-the-nirf-rankings-and-a-fix/