How We Ranked the Top Universities for Blockchain
Last year, CoinDesk released its inaugural Blockchain University Rankings in an attempt both to recognize the role that academic research has played in the development of blockchain technology and to quantify the impact of individual schools. Our goal was simple: offer the most rigorous and nuanced window into universities’ impact on the blockchain field.
Naturally, there were limitations, and as we began working on our 2021 rankings, mitigating as many of these limitations as we could was a primary objective.
To this end, we’ve made two major changes to our methodology this year. First, we included not only more schools, but a wider variety of schools – our field has expanded from 46 U.S.-based universities to more than 200 schools (230, to be exact) from around the globe. Second, we factored in “cost of attendance” to reflect a metric of growing concern for many current and future students.
Read more: The Top Universities for Blockchain by CoinDesk 2021
Above all, we want to ensure that these rankings do what they’re intended to do: offer a holistic snapshot of the intersection between this transformative technology and institutions of higher education. We believe a transparent, intellectually defensible ranking can help condense an enormous amount of difficult-to-find information (spanning innumerable factors) into a more manageable format.
In the open-source spirit, we’d also like to reiterate our commitment to integrity and data transparency. We are more than happy to discuss and/or share our data, our methods or anything else about the project upon request.
Sample size
Our official sample for these rankings comprised 230 individual schools – far from the total number of universities that exist around the world. To determine which institutions to focus on, we added schools to the list if they met any one of three criteria.
First, we included any school listed in the top 100 of any one of the USNWR Best Global Universities (2021), the QS World University Rankings (2022), ShanghaiRanking’s Academic Ranking of World Universities (2021) or the Times Higher Education World University Rankings (2022). Second, we included every school considered in last year’s (2020) rankings, a list that was similarly built by aggregating outside rankings. This gave us a large initial sample.
This setup, however, if limited to just these two criteria, could pose a problem: What if a lower-ranked school (as judged by USNWR, QS, ARWU or THE) is doing amazing work but fails to be considered simply because a few outside sources happened to overlook it in their global rankings? This is far from a desirable outcome.
On the other hand, we simply don’t have the resources to closely examine every school in existence, especially when relatively few of them are engaged in the kind of impactful blockchain work that is likely to lead to a place on our rankings.
Chart by Shuai Hao / CoinDesk
To balance these considerations, our third criterion was a compromise: When we released our qualitative survey, we also included a call for any school, anywhere in the world, to request inclusion in our rankings. By opening our criteria but placing the burden of requesting inclusion on the schools themselves, we removed any artificial limits on which schools were considered. At the same time, we could be confident that any school taking the affirmative step of asking to be evaluated would be worth our time and resources to examine closely.
These final 230 institutions represent some of the best schools in existence today, and our final sample included a mix of large, traditionally “elite” research institutions and smaller schools – public and private, free and expensive – with every continent except Antarctica represented.
Methodology
To determine final scores, we looked at five primary categories: (1) an institution’s strength in research and academic contributions to advancing the field; (2) the existing blockchain offerings on campus, whether in the form of classes, educational centers, clubs, etc.; (3) employment and industry outcomes; (4) cost of attendance; and (5) overall academic reputation.
Each category comprises multiple subcategories, offering a holistic picture of a university’s presence in the blockchain space. For a final score, we assigned points to each institution proportional to its performance in each category, then normalized the final point totals onto a 0-100 scale.
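To make that aggregation step concrete, here is a minimal Python sketch of the general approach: min-max normalization onto a 0-100 scale, followed by a weighted sum. The specific weights shown are illustrative placeholders only – consistent with the relative emphases described below, but not our actual coefficients.

```python
# Illustrative sketch of the scoring pipeline; the weights are
# hypothetical placeholders, not the rankings' actual coefficients.

def min_max_scale(values):
    """Rescale raw category scores onto a 0-100 range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [100.0 for _ in values]
    return [100.0 * (v - lo) / (hi - lo) for v in values]

# Hypothetical category weights (sum to 1.0). Campus offerings are
# described below as the most consequential category, cost as the least.
WEIGHTS = {
    "scholarly_impact": 0.25,
    "campus_offerings": 0.30,
    "employment": 0.20,
    "cost": 0.10,
    "reputation": 0.15,
}

def final_score(category_scores):
    """Weighted sum of a school's 0-100 category scores."""
    return sum(WEIGHTS[cat] * score for cat, score in category_scores.items())
```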
1) Scholarly impact: To determine a school’s scholarly impact score, we relied primarily on the Clarivate Web of Science database. We took the total number of publications (all subjects) from each school and narrowed them to include only blockchain- or cryptocurrency-related papers published between 2019 and 2021 (including forthcoming papers slated for 2022). From this set, we generated citation reports and created subsets in which the first author of the publication was affiliated with the university in question. The resulting data gave us the key metrics of (1) total blockchain research papers published by university affiliates, (2) how often these papers were cited, and rough numbers on (3) how often the primary researcher on a paper comes from a given institution (the “first author” convention being, of course, discipline dependent).
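In rough pseudocode, that filtering pipeline looks like the sketch below. The record fields (“year,” “topic,” “first_author_affiliation”) are hypothetical stand-ins for whatever a Web of Science export actually contains; only the stemmed query terms come from our methodology.

```python
import re

# The stemmed terms from our Web of Science query, as a regex.
TERMS = re.compile(r"cryptocurrenc|blockchai|bitcoi|ethereum|stablecoi",
                   re.IGNORECASE)

def blockchain_papers(records, start=2019, end=2022):
    """Filter exported records to blockchain/crypto papers in the window."""
    return [r for r in records
            if start <= r["year"] <= end and TERMS.search(r["topic"])]

def first_author_subset(papers, school):
    """Subset in which the first author is affiliated with the school."""
    return [p for p in papers if p["first_author_affiliation"] == school]
```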
Raw numbers, however, don’t always tell the full story. A bigger school, with a larger faculty and a hefty endowment, may be putting out more blockchain research overall (while still devoting a relatively small percentage of its resources to the field), whereas a tiny school that dedicates a much more impressive share of its overall resources to blockchain research may end up with fewer papers simply due to a smaller headcount.
To account for this, we also normalized each data point (where applicable) against total institutional output. When normalized in this way, a smaller university that devotes a larger proportion of its research to blockchain is rewarded relative to a more massive university that can pump out a greater quantity of research with proportionally less investment. In recognition of the fact that both raw output and targeted output are valuable metrics, both are factored into our rankings, along with the aggregated H-index of a school’s blockchain publications. For anyone interested in reproducing our dataset, please note that you will need a) full access to Web of Science and the relevant Clarivate subscriptions, and b) our query to filter the results: "cryptocurrenc* OR blockchai* OR bitcoi* OR ethereum OR stablecoi*"
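For concreteness, the two derived metrics above reduce to a standard H-index calculation and a simple share-of-output ratio:

```python
def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for i, c in enumerate(sorted(citation_counts, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def blockchain_share(blockchain_paper_count, total_paper_count):
    """Proportion of a school's total output devoted to blockchain."""
    return blockchain_paper_count / total_paper_count if total_paper_count else 0.0

# E.g., citation counts [10, 8, 5, 4, 3, 0] yield an H-index of 4.
assert h_index([10, 8, 5, 4, 3, 0]) == 4
```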
2) Campus blockchain offerings: To arrive at a school’s blockchain offerings score, we examined multiple facets of its existing campus infrastructure. Campus course offerings are the largest single subcategory we looked at. The number of available classes (especially when spread over multiple departments, providing an opportunity for a more robust education) shows a deep investment in the space, both in the present and for the future. Faculty must be hired, curricula must be developed and administrative buy-in must be achieved. None of this is done on a whim, and such commitments are usually quite permanent.
The second-largest factor in our rankings is the presence of a dedicated blockchain research center, although we also separately considered smaller initiatives and student-run clubs. Research centers and initiatives often offer unique opportunities for students to get involved in academic work or obtain hands-on experience, and can serve as a gravity well for novel ideas and thinkers (especially when these entities take the additional step of organizing conferences, summits or other educational events). Research centers, initiatives and clubs all allow students, faculty and the larger community to connect with other enthusiasts, and tend to provide a crucial tether between academia and industry.
Lastly, to round out this category, we gathered data on the nascent but ever-growing set of universities that offer blockchain-related degrees, whether at the graduate or undergraduate level and sometimes as a concentration within another degree. As a whole, the Campus Blockchain Offerings category is the most consequential component of our methodology.
3) Employment and industry outcomes: A university’s ability to place students into relevant jobs is an important metric for two reasons: one, it says something about an institution’s cachet in the industry, whether due to name recognition, personal connections or institutional pipelines; and two, it is of particular importance to current and incoming students.
A student’s primary goal in obtaining a college education is, after all, often to secure a job in industry. To discover which schools are placing the most graduates in the blockchain field, we looked at the LinkedIn footprint of over 200 of the largest and most influential companies in the space, along with their many thousands of employees. To mitigate biases, we factored in both raw and normalized numbers. Raw numbers are useful for highlighting schools that place a high number of graduates into jobs, but larger schools in larger countries will tend to have an advantage simply because of sheer size.
Normalized numbers paint a more nuanced picture of hiring practices. To sharpen this picture, we adjusted our data in two additional ways. First, because we relied heavily on LinkedIn as a source, we found it prudent to get a sense of how accurate LinkedIn might be for different countries. To do this, we used each country’s size, higher-education levels and LinkedIn use to generate a multiplier for each university based on the expected number of hires we may have missed. Countries with lower proportionate levels of LinkedIn use received a boost to their raw numbers.
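The exact functional form of that multiplier is internal to our dataset, but a deliberately simplified sketch of the idea – boosting raw counts in inverse proportion to estimated LinkedIn penetration – might look like this (the baseline figure is purely illustrative):

```python
def linkedin_multiplier(penetration, baseline=0.60):
    """Boost raw hire counts for countries where LinkedIn is less used.

    penetration: estimated share of a country's graduates with LinkedIn
    profiles (0-1); baseline is a hypothetical reference level.
    """
    return max(1.0, baseline / penetration)

# A country at the baseline gets no boost; one where only 30% of
# graduates use LinkedIn has its raw counts doubled.
assert linkedin_multiplier(0.60) == 1.0
assert linkedin_multiplier(0.30) == 2.0
```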
Second, we also recognize that raw numbers can easily be inflated simply due to the size of a population. The University of Buenos Aires, for example, with its ~300,000 students, is much more likely to place 200 alums into blockchain jobs than someplace like Rockefeller University with its ~213 students.
The University of Buenos Aires placing 200 grads into blockchain is expected even with zero investment in the field, whereas Rockefeller placing that same number would be indicative of something closer to a school focused entirely on blockchain (highly unlikely, as Rockefeller is a well-respected biomedical sciences university). To account for this, we normalized against school size as well.
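In its simplest form, that normalization is just placements per enrolled student. Using the approximate enrollment figures above:

```python
def placements_per_student(placements, enrollment):
    """Normalize raw job placements against school size."""
    return placements / enrollment

# The same 200 placements mean very different things at the two schools.
print(placements_per_student(200, 300_000))  # ~0.0007 (Buenos Aires)
print(placements_per_student(200, 213))      # ~0.94 (Rockefeller)
```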
To gather qualitative data, we also surveyed industry stakeholders and other non-student, non-academic respondents to get a sense of how institutions are (subjectively) viewed by those outside academia. This data was quantified numerically, as was information about the number of active industry partnerships (including sponsored research) maintained by each university.
4) Cost of attendance: To calculate a school’s cost of attendance score, we looked at both overall cost and a normalized construction of overall cost of attendance. We assumed that lower tuition is preferable, with an important caveat: we considered only the base price of a university, while in practice grants, scholarships, opportunity costs and even residency status can completely change an individual’s calculation. Along similar lines, tuition is a purely student-facing concern, while we hope these rankings also find use among non-students. Because of these concerns, our cost metric is, by weight, the least consequential component of our methodology.
Two pieces of data were factored in to generate this score. The first is tuition, with one note: Whenever possible, we assumed that an attendee would be from within the country but out of state when calculating tuition costs. Some universities charge one flat fee; others charge different tuition for in-state versus out-of-state students, with yet another fee schedule for international students. To capture the largest number of likely students, we consistently applied the “out of state but not out of country” rule whenever the distinction existed.
The second piece of data is a normalized cost of attendance. To determine this, we employed both salary data for the country in which the university is located and an external cost-of-living chart as proxies to build a combined, country-specific cost-of-living index. We then ran raw tuition data against this hybrid index to assign ranked scores to each university.
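The precise blend of salary and cost-of-living data is specific to our dataset, but the shape of the calculation is simple: build a country-level index, then express raw tuition relative to it. A hedged sketch, with the ratio form chosen purely for illustration:

```python
def affordability_index(avg_salary, cost_of_living):
    """Hypothetical country index: earning power relative to living costs
    (higher means a given tuition bill is easier to bear locally)."""
    return avg_salary / cost_of_living

def normalized_tuition(tuition, avg_salary, cost_of_living):
    """Run raw tuition against the hybrid country index."""
    return tuition / affordability_index(avg_salary, cost_of_living)
```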
5) Academic reputation: In a perfect world, rankings would emphasize merit, and anonymized, quantifiable data would be sufficient to judge a university’s impact in the blockchain space. Realistically, however, the intangibles of a school have an outsized impact on everything from a student’s job prospects, to their ability to get a foot in the door of an internship, to the caliber of speaker that will spend their limited time giving a talk at any given school.
To pretend that reputation doesn’t matter, that history is insignificant, is to do a disservice to our rankings. The effect of a school’s academic reputation on our methodology, however, is dwarfed by every other category except for cost, reflecting both the recent shift away from credentialism and the greater weight that we assign to more tangible, productive metrics.
To determine an institution’s reputation score, we looked at two criteria: (a) existing, overall reputation as calculated by USNWR, THE, ARWU, and QS; and (b) reputation as determined by our own qualitative surveys, which asked both practicing academics and current students to evaluate schools. This data was split according to whether it came from a student or an academic, and quantified numerically.
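Quantifying those split survey responses is mechanically simple; a minimal sketch, assuming a 1-5 rating scale (the actual scale and aggregation are internal to our survey design):

```python
from statistics import mean

def reputation_scores(responses):
    """responses: iterable of (respondent_type, rating) pairs, with
    respondent_type in {"student", "academic"} and rating on a 1-5 scale."""
    students = [r for t, r in responses if t == "student"]
    academics = [r for t, r in responses if t == "academic"]
    return {
        "student_avg": mean(students) if students else None,
        "academic_avg": mean(academics) if academics else None,
    }
```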
Similar to last year, there are two common threads in our methodology. First, in keeping with our goal of rigor, defensibility and reproducibility, we used externally verified, quantitative data whenever such data was available, and normalized this data where appropriate to add as much nuance into our rankings as possible. When we required qualitative data, we sent out open, public, shareable surveys through all available channels and did our best not to limit participation in any way.
Second, we made every attempt to examine each data point from as many angles as possible. As is often the case, any given data point can be seen as a positive in some situations but a negative when seen through a different lens. Normalization is one tool to combat this, but so are things like common sense and a dispassionate analysis of the landscape. Data tells a story, and our goal was to let our data tell as complete a story as we could.
On rankings in general
As a final note, we’d like to echo a sentiment expressed last year and address the project of creating university rankings in a more general sense. Ordinal rankings are incredibly useful for showing very specific data or reducing large amounts of information into a digestible format, but they are also both narrow and inherently malleable.
Even small changes to the methodology can have outsized effects on the final result, as can outlier data or even researcher-introduced errors. In noting that rankings are vulnerable to criticisms of subjectivity and malleability, we don’t intend to marginalize our data or the larger project at hand; rather, we hope that by highlighting the limitations of our output, these rankings will be more useful to a greater number of individuals.
We are very willing to discuss our methodology, share data, answer questions, and address concerns. Interested readers are encouraged to contact Joe Lautzenhiser (joe.lautzenhiser [at] coindesk.com).
Finally, we hope these rankings serve as the foundation for a living, breathing resource that goes well beyond an ordered list of schools. We have started this research and will continue it, but we’re not naive enough to believe that we can build this particular monument alone.
But we believe a resource illuminating even one small corner of the blockchain universe has tremendous value – for students seeking a more traditional path into the industry, for academics hoping to collaborate with like-minded individuals and for companies wondering where specific research is being done. As a first step, we’ve started filling out profiles for some of the top universities, but we’d eventually like to have every school represented.
Students can contribute to this by checking their school(s) and having an authorized university representative (e.g., a member of the media/communications/etc. team) contact us if any information is outdated or missing, or if their schools do not yet have a profile. Individuals can help by highlighting important research and projects, or novel approaches to blockchain education. Schools can help by examining these rankings and using them as a signal for how to improve. Ultimately, the answer is simple: devote resources to educating students, faculty, and the community about blockchain technology.
Ranking | School | Score (0-100) |
---|---|---|
1 | National University of Singapore | 100.00 |
2 | Royal Melbourne Institute of Technology | 97.65 |
3 | University of California Berkeley | 93.26 |
4 | University of Zurich | 91.66 |
5 | Massachusetts Institute of Technology | 91.57 |
6 | Hong Kong Polytechnic University | 84.30 |
7 | UCL | 81.54 |
8 | Tsinghua University | 79.20 |
9 | Chinese University of Hong Kong | 75.30 |
10 | ETH Zurich | 75.04 |
11 | Nanyang Technological University, Singapore | 74.98 |
12 | Stanford University | 68.41 |
13 | UNSW Sydney | 66.29 |
14 | City University of Hong Kong | 66.13 |
15 | University of Oxford | 65.47 |
16 | Shanghai Jiao Tong University | 65.18 |
17 | Cornell University | 63.98 |
18 | Delft University of Technology | 63.85 |
19 | University of Hong Kong | 61.97 |
20 | University of Sydney | 61.48 |
21 | École Polytechnique Fédérale de Lausanne (Switzerland) | 60.78 |
22 | University of Illinois Urbana-Champaign | 60.10 |
23 | University of Cambridge | 58.69 |
24 | Hong Kong University of Science and Technology | 58.51 |
25 | University of California Los Angeles | 58.40 |
26 | Korea Advanced Institute of Science and Technology | 57.87 |
27 | Sun Yat-sen University | 57.18 |
28 | University of British Columbia | 55.80 |
29 | Peking University | 54.15 |
30 | Arizona State University | 51.86 |
31 | Technical University of Munich | 51.78 |
32 | University of Edinburgh | 51.77 |
33 | Carnegie Mellon University | 51.10 |
34 | University of Melbourne | 50.95 |
35 | Worcester Polytechnic Institute | 50.77 |
36 | Georgetown University | 50.40 |
37 | Fudan University | 49.95 |
38 | University of Southern California | 49.57 |
39 | Korea University | 48.85 |
40 | Imperial College London | 48.59 |
41 | New York University | 48.55 |
42 | Tokyo Institute of Technology | 47.37 |
43 | University of Warwick | 47.19 |
44 | Fordham University | 46.89 |
45 | Columbia University | 46.46 |
46 | Seoul National University | 45.72 |
47 | King Abdulaziz University | 45.59 |
48 | Monash University | 44.05 |
49 | Harvard University | 43.89 |
50 | Zhejiang University | 43.37 |