Objective Score - Analyzing Which Teams Prioritize Objectives In The NA LCS

Statements such as “CLG doesn’t care about objectives” or “C9 are extremely objective focused” have been echoed by LCS casters, friends, and redditors, but there has been very little evidence or statistics to back up these claims. I am not saying these things are untrue, but rather that quantifying this data will grant us a clearer understanding of what exactly makes these teams different. That being said, it is very hard to quantify how a team treats objectives. This article tracks my previous attempts to quantify objective priority in teams and my proposed solution for a statistical representation of objective prioritization - Objective Score.


Past Attempts: First Blood/Towers/Dragons

Stats collected from weeks 1-9 of the Summer Split


In my first attempt to tackle the issue of quantifying objective priority, I turned to statistics such as “First Dragon,” “First Tower,” and “First Blood.” These statistics give a decent overview and typically reaffirm common claims, like Counter Logic Gaming’s low dragon priority, as they take the first dragon in only 25% of games, or Cloud 9’s push strategy, as they take the first tower in 71% of games.

However, while collecting these statistics, I realized how misleading they can be. Two examples come to mind: two teams trading early towers, or a team scoring first blood off a botched level 1 invade. Neither translates into how much a team prioritizes objectives: in the tower trade, both teams valued pushing equally; one simply got its tower a few seconds earlier. In the invade, it was happenstance that they got First Blood. This data was interesting, but I thought there had to be better statistics out there to show how much a team values objectives.


Next Attempt: Comparing Total Dragons and Towers

As I was looking for other options to describe objective priority, I turned to total dragons/towers killed in the Summer Split for these teams. I thought this would be a good measure of how teams emphasized these objectives throughout the game, not just the first kill. However, after tallying the stats (thanks to Leaguepedia), the data was extremely underwhelming.

For example, Cloud 9 had 253 tower kills while Velocity had only 122. I knew this wasn’t because C9 prioritized towers twice as much as Velocity, but because C9 won so many more games (and winning teams secure more towers and dragons than losing teams). What I had to do was find some context for these numbers. They couldn’t be compared directly to each other because of the large disparities in win-loss records and game time; instead, they should be compared to an “average team” estimate built from win-loss averages and scaled by each team’s game time.


Solution: Comparing Total Dragons and Towers to Weighted Estimates

[Chart: SumSplitTower.png]
[Chart: SumSplitDragon.png]


As seen above, I collected the data and created these averages/estimates based on win-loss ratio and game time to compare against the actual numbers each team produced. I calculated the average towers/dragons taken in a win and in a loss (e.g., the winning team averaged 9.5 towers while the losing team averaged 3.7), and used this to create an estimate for each team given its win-loss record. I then scaled that estimate by the team’s game time relative to the league average (so CLG is expected to take more dragons because its longer games give dragons more time to respawn, while C9’s short games mean fewer expected dragons). After charting this data, a relationship starts to form between the actual numbers taken from games and the estimates. With these two variables, and in combination with the previous data (first dragon/first tower/first blood), we can start to draw some concrete conclusions.
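
To make the estimate step concrete, here is a minimal Python sketch of one reasonable reading of that method; the 9.5/3.7 tower averages come from the example above, while the team record and game times used below are hypothetical placeholders.

  # Sketch of the weighted-estimate idea described above. The league
  # averages are the tower figures quoted in the text; the team record
  # and game-time values in the example are hypothetical.
  AVG_TOWERS_WIN = 9.5    # average towers taken by a winning team
  AVG_TOWERS_LOSS = 3.7   # average towers taken by a losing team

  def estimated_towers(wins, losses, avg_game_time, league_avg_game_time):
      # Base estimate: what an "average team" with this record would take.
      base = wins * AVG_TOWERS_WIN + losses * AVG_TOWERS_LOSS
      # Scale by how long the team's games run relative to the league
      # average, since longer games leave more time to take objectives.
      return base * (avg_game_time / league_avg_game_time)

  # Hypothetical example: a 20-8 team averaging 38-minute games in a
  # league where games average 36 minutes.
  print(round(estimated_towers(20, 8, 38, 36), 1))  # about 231.8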

Reading the dragon chart above, we can start to see who prioritizes dragons and who doesn’t. TSM and C9 both place a very high priority on dragon, greatly exceeding their estimates and earning the first dragon in a majority of games (75% and 57% respectively). On the same chart, CLG’s dragon priority is very low, and Dig’s and CST’s dragon priority is moderately low; this reaffirms the first-dragon data, as CLG and Dig have the lowest percentages at 25% and 36% respectively. Reading the tower chart the same way, we can see that C9, TSM, and Dig are the big pushers, while CLG, CST, and VES end up with lower-than-average tower counts for their respective win-loss ratios.


Conclusion - “Objective Score”:

Focusing on the relationship between the estimated values (based on win-loss record and game time) and the actual values gathered from games, we can create a “score” that articulates a team’s objective priorities. Subtracting these values (Actual - Estimate) gives a figure I like to call “Objective Score,” which follows this scale (a small worked example appears after the list):

  • OBJ Score: 15 or higher = Very high objective priority
  • OBJ Score: 5 to 15 = High objective priority
  • OBJ Score: -5 to 5 = Neutral objective priority
  • OBJ Score: -15 to -5 = Low objective priority
  • OBJ Score: -15 or lower = Very low objective priority
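
As a small worked example of the score and the scale above, here is a Python sketch; the bands share their endpoints, so how ties at ±5 and ±15 are resolved is a judgment call, and the estimate plugged in below is hypothetical.

  # Objective Score = Actual - Estimate, mapped onto the scale above.
  def objective_score(actual, estimate):
      return actual - estimate

  def priority_label(score):
      # Ties at the shared endpoints are resolved toward the
      # higher-priority band in this sketch.
      if score >= 15:
          return "Very high objective priority"
      if score >= 5:
          return "High objective priority"
      if score > -5:
          return "Neutral objective priority"
      if score > -15:
          return "Low objective priority"
      return "Very low objective priority"

  # Hypothetical example: 253 actual towers against an estimate of 231.8.
  score = objective_score(253, 231.8)
  print(round(score, 1), priority_label(score))  # 21.2 Very high objective priority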

This Objective Score can easily be compared on the horizontal graphs seen below:

[Chart: ObjectiveScoreTowersNumbers.png]
[Chart: ObjectiveScoreDragonsNumbers.png]


This scoring method creates a nice standardized measurement for judging teams. This evidence could be very powerful in supporting claims like “CLG’s low prioritization of objectives,” or simply in showing how much objectives are valued by teams outside the spotlight (such as Vulcun or, to a lesser extent, Dignitas).

Let me know what you guys think about this article in the comments!



Written by Spellsy - @SpellsyLoL
Graphics by Eric “Vesca Violette” Womack - @VescaViolette
Edited by Alex “amagzz” Magdaleno - @amagzz
and Adel “Hype Algerian” Chouadria - @hypealgerian


