Articles:Objective Score - Analyzing Which Teams Prioritize Objectives In The NA LCS

From Leaguepedia | League of Legends Wiki

Statements such as “CLG doesn’t care about objectives” or “C9 are extremely objective focused” have been echoed by LCS casters, friends, and redditors, but there has been very little statistical evidence to back up these claims. I am not saying these claims are untrue, but rather that quantifying this behavior will grant us a clearer understanding of what exactly makes these teams different. That being said, it is very hard to quantify how a team treats objectives. This article tracks my previous attempts to quantify objective priority and my proposed solution for a statistical representation of objective prioritization: Objective Score.

Past Attempts: First Blood/Towers/Dragons

Stats collected from Weeks 1-9 of the Summer Split

In my first attempt to tackle the issue of quantifying objective priority, I turned to statistics such as “First Dragon,” “First Tower,” and “First Blood.” These statistics usually give a decent overview and typically reaffirm common claims, such as Counter Logic Gaming’s low dragon priority (they took the first dragon in only 25% of games) or Cloud 9’s push strategy (they took the first tower in 71% of games).

However, while collecting these statistics, I realized how misleading they can be. Two examples come to mind: two teams trading early towers, or a team scoring First Blood from a botched level-1 invade. Neither translates into how much a team prioritizes objectives: in the tower trade, both teams valued pushing equally; one simply secured its tower a few seconds earlier. In the invade situation, it was pure happenstance that they got First Blood. This data was very interesting, but I thought there had to be better statistics out there to show how much a team values objectives.

Next Attempt: Comparing Total Dragons and Towers

As I looked for other ways to describe objective priority, I turned to the total dragons and towers each team killed in the Summer Split. I thought this would be a good measure of how teams emphasized these objectives throughout the game, not just at the first kill. However, after tallying the stats thanks to Leaguepedia, the data was extremely underwhelming.

For example, Cloud 9 had 253 tower kills while Velocity had only 122. I knew this wasn’t because C9 prioritized towers twice as much as Velocity, but because C9 won so many more games (thus securing more towers and dragons than losing teams). What I had to do was find some context for these numbers. They couldn’t be compared to each other directly due to the large disparities in win-loss record and game time; instead, they should be compared to an “average team” estimate built from win-loss averages and scaled by each team’s game time.

Solution: Comparing Total Dragons and Towers to Weighted Estimates


As seen above, I collected the data and created these averages/estimates based on win-loss ratio and game time, to compare against the teams’ actual numbers. First, I calculated the average towers/dragons taken in a win and in a loss (for example, the winning team averaged 9.5 towers while the losing team averaged 3.7), and from this created an estimated total for each team given its win-loss record. I then scaled this estimate by the team’s game time relative to the average game time (so CLG is expected to take more dragons, since their long games leave more time for dragons to respawn, while C9’s short games mean they are expected to take fewer). After charting this data, a relationship starts to form between the actual numbers taken from games and the estimates. With the relationship between these two variables, in combination with the previous data (first dragon/first tower/first blood), we can start to draw conclusions backed by concrete evidence.
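As a rough illustration, the estimate described above can be sketched in a few lines of Python. The per-game averages (9.5 towers in a win, 3.7 in a loss) are the ones quoted in the article; the example team’s record and game times below are hypothetical numbers, not real LCS data.

```python
# League-wide per-game averages quoted in the article.
AVG_TOWERS_WIN = 9.5   # average towers taken by the winning team
AVG_TOWERS_LOSS = 3.7  # average towers taken by the losing team

def estimated_towers(wins, losses, team_avg_game_time, league_avg_game_time):
    """Expected total towers for a team with this record, scaled by its game time.

    The base estimate comes from the team's win-loss record; it is then
    scaled by how long the team's games run relative to the league average.
    """
    base = wins * AVG_TOWERS_WIN + losses * AVG_TOWERS_LOSS
    return base * (team_avg_game_time / league_avg_game_time)

# Hypothetical 20-8 team whose games run 10% longer than the league average:
print(estimated_towers(20, 8, 38.5, 35.0))
```

The same formula would apply to dragons with dragon averages swapped in; a linear game-time scaling is my reading of the article’s description, not a formula it states explicitly.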

Reading the dragon chart above, we can start to see who prioritizes dragons and who doesn’t. TSM and C9 both place a very high priority on dragon: both greatly exceed their estimates and earned the first dragon in a majority of games (75% and 57% respectively). In the same chart, we can see that CLG’s dragon priority is very low, and Dig’s and CST’s is moderately low. This reaffirms the first-dragon data, as CLG and Dig have the lowest percentages at 25% and 36% respectively. A similar reading of the tower chart shows that C9, TSM, and Dig are the big pushers, while CLG, CST, and VES end up with fewer towers than average for their respective win-loss ratios.

Conclusion - “Objective Score”:

Focusing on the relationship between the estimated values (based on win-loss and game time) and the actual values gathered from games, we can create a “score” that accurately articulates a team’s objective priorities. Subtracting these values (Actual - Estimate) gives a figure I like to call “Objective Score,” which follows this scale:

  • OBJ Score 15 or higher = Very high objective priority
  • OBJ Score 5 up to 15 = High objective priority
  • OBJ Score -5 up to 5 = Neutral objective priority
  • OBJ Score -15 up to -5 = Low objective priority
  • OBJ Score below -15 = Very low objective priority
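The Actual - Estimate mapping can be sketched as a small Python function. Treating each cutoff as belonging to the higher bucket is my own reading of the scale, and the example team at the end is hypothetical.

```python
def objective_score_label(actual, estimate):
    """Map a team's Actual - Estimate objective count to the article's scale.

    Boundary handling (>= at each cutoff) is an assumption about how the
    scale's endpoints should be assigned.
    """
    score = actual - estimate
    if score >= 15:
        return "Very high objective priority"
    if score >= 5:
        return "High objective priority"
    if score >= -5:
        return "Neutral objective priority"
    if score >= -15:
        return "Low objective priority"
    return "Very low objective priority"

# Hypothetical team that took 12 more towers than its weighted estimate:
print(objective_score_label(250, 238))  # High objective priority
```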

This objective score can be easily compared on the horizontal graph seen below:


This scoring method creates a nice standardized measurement for judging teams. This evidence could be very powerful in supporting claims like “CLG’s low prioritization of objectives,” or simply in seeing how much teams outside the spotlight (such as Vulcun or, to a lesser extent, Dignitas) value objectives.

Let me know what you guys think about this article in the comments!

Written by Spellsy - @SpellsyLoL
Graphics by Eric “Vesca Violette” Womack - @VescaViolette
Edited by Alex “amagzz” Magdaleno - @amagzz
and Adel “Hype Algerian” Chouadria - @hypealgerian

Comments


51 months ago
Score 0

The graphs are really great for people who want the statistics, and overall it's a great way to sum it all up!


51 months ago
Score 0

Great graphs to go with the informative article!


51 months ago
Score 0

I think that the presentation mode is very user friendly. Perhaps this could be expanded to other regions?


51 months ago
Score 0

amazing graphics for people who love statistics!

waiting for more


51 months ago
Score 2

I find it interesting that this score gives C9 and TSM, two of the top teams in the LCS this split who tended toward shorter games, a very high objective score, and Vulcun, a similar top team who tended toward longer games, a fairly neutral one. Similarly, among the bottom 3 teams, CST and VES have slightly negative objective scores and tended toward shorter games, while CLG, who tended toward longer games, have a far more negative objective score.

I feel like the conclusion suggested by this data is that objective-focused teams tend to win more quickly, even with comparable results, and that teams who are not focused on objectives tend to lose unless the game goes on for a long duration. The general trend either suggests that objective-focused teams will be more successful, or that the formula used insufficiently accounts for the effects of victory on the number of objectives taken. Perhaps the dependence of objectives taken on game time would be more accurately modeled with a non-linear function, perhaps even a logistic function (for towers at least), as there are a finite number available to be taken. Just some mathematical speculation in response to this - curious to hear from people who agree or disagree!


51 months ago
Score 0

wow this is a very interesting idea i never looked at ! i kinda agree, it makes sense. it kinda explains why clg can make their games so consistently long, by stopping focusing on objectives it makes it less polarized (less to win cause less objective kills, but also less to lose). very interesting.


51 months ago
Score 0

Is there any chance you could look at the objective score of teams playing against each team? It might be the best way to see how game length affects the score.

Anonymous user #2

51 months ago
Score -3

this is crap

Anonymous user #1

51 months ago
Score -4

Never trust a statistic u haven't faked urself


51 months ago
Score -2

I like this article, but stats are not a good indicator


51 months ago
Score 2

Hello Spellsy, I'm a fan of your work. That said, I must agree with Mapp07 above me that I think this statistic introduces significant biases.

I'd like to suggest an alternative that may be worth considering: the number of dragons/towers/kills within the first 15 minutes, normalized for how much a team is winning/losing at that point in the game.

I believe this offers significant advantages for the following reasons:

-> By putting it at a specific point in time, it normalizes for game length without having to do anything fancy. I'm not sure if 15 minutes is the perfect number, but I chose it since it's enough time for 2 early dragons (3 in very rare circumstances), before Baron, and before almost any game will be over.

-> Since the time is set at a specified point, you only need to consider how badly one of the teams is winning at that point, not which team ultimately wins/loses. As long as this is done loosely you should have a relatively stable estimate that accounts for stomps, without splitting the data too thinly.

In terms of how to normalize for stomps, I think my first approach would be to split the data into 3 chunks: team winning by > 5k, teams within 5k, team losing by > 5k. Maybe more chunks are necessary, or slightly different splits, but this or something similar would probably do a reasonable job. In fact, looking at how a team got its gold lead might be interesting in its own right and be another statistic getting at the same idea.

The only downside to this technique is if you have edge cases where a team say gets a tower at 15:01. I can think of some ways to minimize this (such as giving partial credit that tapers over say an additional minute, or evaluating edge cases manually for exclusion), but at some point things should be good enough.

Anyways my 2 cents. Keep up the good work.


51 months ago
Score 2

Hey Spellsy, I love your stuff but I have to say I'm pretty skeptical of this stat. It strikes me that a lot of what your formula is measuring is how badly a team tends to lose by. Coast and Ves didn't just lose the most, but they also often lost the most handily. Saying they didn't get as many objectives as you would expect could very easily just mean that they lost really badly, not that they didn't prioritize objectives. Similarly, Cloud9 often stomped in their games and thus you would expect a high score for them simply because of the quality of their wins. It looks very much to me like that's all we're really getting out of this stat.

The other potential problem I see relates to CLG and Vulcun, both of whom have lower than expected scores given my hypothesis, and both of whom tended to play longer games. You didn't say exactly how you adjusted based on game length, but while it's true that a longer game means more potential dragons, it's also true that the dragon becomes less and less valuable as the game goes longer and in the really late game teams tend to more-or-less ignore it. The fact that CLG's score is so low and Vulcun's score so mediocre could be because your game-length adjustment doesn't take that into account.


51 months ago
Score 2

Good job with these graphs, guys! Very interesting to notice that Team Vulcun does not seem to have a high prioritization of objectives even though they had a 20-8 score in the Summer Split, while Team Curse have a similar OBJ Score but had less than a 50% win ratio.


51 months ago
Score 2

What a great article! Love those graphs!
