Can you measure esports greatness? We have a good idea of how to do that for the CS:GO and Dota 2 esports.
A few years ago, we introduced the ESL CS:GO World Ranking with the aim of expanding our toolkit for measuring the strength of professional Counter-Strike teams. We are now building on what we’ve learned from it to create the ESL Dota 2 World Ranking and bring both into a dedicated platform with an improved user experience.
We know that people will always have their own ideas about the current skill ranking among the best teams in the world. While we are always striving for the best possible ranking and will continuously improve it, it will never be THE perfect ranking. And that’s ok.
Our rankings are not meant to be the be-all-end-all esports tier list, quite the opposite: we wanted to add to the conversation, not end it. Other rankings have been widely embraced by the community and do a great job of showing the overall state of their respective games too.
However, as a tournament organizer with a history dating back to the CS 1.6 and Dota Classic days, we feel we can contribute to and improve the discussion with a very transparent, data-driven approach. This is what the ESL World Rankings have to offer:
Tournament performance above all
Our rankings were built to measure teams’ performances at tournaments first and foremost. They are not Elo-based rankings dependent on the results of specific matchups. In a nutshell, every relevant tournament on the planet that follows a rule set in line with the basic agreed rules of competitive CS:GO and Dota 2 and fair play, regardless of who organizes it, counts towards the ranking.
Each event is then graded based on two criteria:
- Size: Huge (22+ teams), Large (14+ teams), Medium (10+ teams), Small (6+ teams), Tiny (4+ teams). A minimum number of top-ranked teams for each category is also set in our rules.
- Quality Rating: TI, AAA, AA, A, B, C, D. These are determined based on the number and ranks of the Top12 teams participating.
The size determines how many points are up for grabs at the event and how they are distributed based on placement. The Quality Rating acts as a multiplier: only AAA events grant full points, with each lower category granting progressively fewer points.
The International (TI) grants an extra 50% on top of the highest category due to its special status within the Dota 2 ecosystem.
Head over to the ranking page and our complete rulebook for the full, detailed breakdown of our event grading system and points distribution.
Some practical examples:
- CS:GO: The IEM Katowice CS:GO Major 2019 was graded “AAA Huge”, so winners Astralis took home the maximum number of points possible for a single event: 1150.
- CS:GO: Later in the year, IEM Sydney 2019 was classified as “A Large”, granting winners Team Liquid 600 points. DreamHack Masters Dallas 2019, at “AA Large”, granted winners Team Liquid 800 points.
- Dota 2: The International 2019 was (automatically) classified as “TI” quality and (based on the format) as “Large” size (18 teams, 16 of them Top24), so winners OG took home the maximum number of points possible for a single event: 1500.
- Dota 2: Just 4 weeks before that, Dota Summit 10 was classified as “D Small” (6 participants, 5 of which were Top24 but none of them Top12), granting winners Alliance 105 points.
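The arithmetic behind these examples can be sketched in a few lines. Note that this is a reconstruction from the figures above only: the winner base points per size and the AA/A multipliers are inferred from the four examples, and the values for other sizes and for the B/C/D categories are not stated here, so they are left out.

```python
# Winner points per event, reconstructed from the examples above.
# Known data points: "AAA Huge" winner = 1150, "AA Large" = 800,
# "A Large" = 600, "TI Large" = 1500. Everything else is unknown here.

# Winner's share per size category (only these two can be inferred).
WINNER_BASE = {"Huge": 1150, "Large": 1000}

# Quality multipliers: TI = highest category + 50%; AA and A inferred.
QUALITY_MULTIPLIER = {"TI": 1.5, "AAA": 1.0, "AA": 0.8, "A": 0.6}

def winner_points(size: str, quality: str) -> int:
    """Points awarded to the winning team of an event."""
    return round(WINNER_BASE[size] * QUALITY_MULTIPLIER[quality])

# IEM Katowice 2019 ("AAA Huge") -> 1150 points for Astralis
# The International 2019 ("TI Large") -> 1500 points for OG
```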
Players at the heart of the ranking
In any esport, teams are susceptible to roster shuffles and transfers. This is why even though the overall ranking is based on organizations’ total points at any given moment, it is actually the players who score points at each event.
There are a few exceptions to account for extraordinary circumstances around substitutes, outlined in our rules, but normally all five players split the points earned by their organization at any given event equally: 20% each.
Points also “decay” with the passing of time. For the first six weeks after an event, points keep 100% of their value. After that, they lose 5% of their value every week until reaching 0% after 25 weeks. This means that the ranking should be seen as a measure of form over a (close to) six-month span.
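Putting the split and decay rules together, a player’s current points from a single event can be sketched as follows. This is a minimal sketch, not the official formula: the exact week boundaries are an assumption, and a strict 5%-per-week schedule starting after week six reaches 0% in week 26 rather than week 25.

```python
def player_share(event_points: float) -> float:
    """Each of the five players earns an equal 20% split."""
    return event_points * 0.20

def decay_factor(weeks_since_event: int) -> float:
    """Full value for the first six weeks, then minus 5% per week."""
    if weeks_since_event <= 6:
        return 1.0
    return max(0.0, 1.0 - 0.05 * (weeks_since_event - 6))

def current_player_points(event_points: float, weeks_since_event: int) -> float:
    """A player's decayed share of one event's points."""
    return player_share(event_points) * decay_factor(weeks_since_event)

# Example: a player on the IEM Katowice 2019 winning roster (1150 points),
# ten weeks after the event: 1150 * 0.20 * 0.80 = 184 points.
```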
The debate continues
To wrap up, we would like to reiterate that we don’t see this ranking as the ultimate, most accurate way of determining who the best teams on Earth are. We are trying to enrich the debate around these games by giving you, the community, more information to work with.
In the same open spirit that rules the CS:GO and Dota 2 scenes, we have gone the extra mile to be objective with our mathematical approach and as transparent as possible in the justifications laid out in our rules. So much so that anyone could calculate this same ranking with just our rules and a spreadsheet at hand.
We’ve been around long enough to know that esports evolve at a breakneck pace, and we will continue to monitor the scene to ensure our ranking reflects its reality. Your input is also very valuable to us as we balance the ranking, so be sure to let us know what you think through social media and Reddit threads.