Thursday - June 19, 2008
I tried, really I did.
I wanted to do something I thought was interesting. In fact, I'm still going to do it, but it's left me frustrated. I've never really looked at ESPN's recruiting ratings much, but I wanted to look into them a bit. I wanted to see if there was a difference, a bias, a trend of any sort; is there a particular position or school or region that they are more bullish on than Scout is or than Rivals is.
Step 1 seemed simple: try to map the ESPN rating system to the same scale Rivals uses. Rivals has this scale that goes to 6.1 and lets you separate not just the 5* players from the 4* players, but the high 4* players (6.0) from the low (5.8). I figured I would take the ESPN ratings, map them to the Rivals system and then identify players with a significant difference in rating (0.2 or more).
First step, define the mapping. Should be easy: Rivals is fairly consistent in how they apply theirs. So I dumped out ESPN's ratings for 2008 and looked for the cutoffs that line up with Rivals' grades: the rating that marks off roughly the top 35 players (a Rivals 6.1), the rating for roughly the top 90 (a 6.0), and so on down the scale. Here's what I found, using the 2008 ratings:
Cutoff for 6.1: between 83 (49 players) and 84 (23 players)
Cutoff for 6.0: 82 (91 players)
Cutoff for 5.9: between 80 (240 players) and 81 (143 players)
Cutoff for 5.8: 79 (actually 365 players)
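For anyone who wants to play along at home, the cutoff hunt is easy to script. Here's a rough Python sketch; the ratings list is made up for illustration, since the real input was ESPN's full 2008 dump:

```python
from collections import Counter

def at_or_above(ratings):
    """For each rating value, how many players achieved or exceeded it."""
    by_rating = Counter(ratings)
    counts, running = {}, 0
    for r in sorted(by_rating, reverse=True):
        running += by_rating[r]
        counts[r] = running
    return counts

def cutoff_for_top(ratings, n):
    """The pair of ratings the top-n line falls between: the lowest rating
    whose at-or-above count is still <= n, and the next rating down."""
    counts = at_or_above(ratings)
    best = None
    for r in sorted(counts, reverse=True):
        if counts[r] <= n:
            best = r
        else:
            return (best, r)
    return (best, None)

# Made-up class: 2 players graded 90, 30 at 84, 60 at 83
fake = [90] * 2 + [84] * 30 + [83] * 60
print(cutoff_for_top(fake, 35))  # -> (84, 83): the top-35 line falls between them
```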
But just to make sure I wasn't using a bad mapping, I tried it on the ESPN data for 2007 and 2006, too. And here's what I found: a chart of how many players *achieved or exceeded* a given ranking in each year, according to ESPN
Think about what that means. They are saying that the 2nd best player of the 2007 class (Chris Galippo, LB) would have been, at best, the 14th best player in the country if he'd graduated in 2006. That's a pretty steep dropoff in top talent. The 11th best player in 2007 (Marvin Austin, DT) would have been no better than 35th in 2006.
But as striking as the numbers at the top are, they are at least in some strange world plausible. We've all seen NFL drafts where people are excited about 3 possible #1 picks, and others where no one wants that top pick at all. A 13-to-1 change is a bit out of the realm of reason, but it's not nearly as bad as what happens when you compare 2007 to 2008. There, you have to believe one of two things:
1. ESPN thinks that there was approximately twice as much top shelf talent in 2008 as there had been in 2007. 10 players at 88 or higher compared to 4 (a 150% increase). 91 players at 82 or higher compared to 53 (a 72% increase). 240 players at 80 or higher compared to 109 (a 120% increase). The 250th best player of 2007 would be borderline top 400 for 2008.
2. ESPN is being wildly inconsistent in how they grade players.
I think the latter is more believable. I think that if they gave a kid an 80 in 2007, they were saying he was a true stud recruit, a top 100 kid, what Rivals would call a "6.0". But if they gave a kid an 80 in 2008, they are saying something less; they are saying he is what Rivals would call a "5.9", or maybe even a "5.8" ... mid or maybe low 4*. It's hard to pay attention to ESPN's analysis when all you've got to go on is a number whose meaning changes from year to year, plus their verbal descriptions.
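For the record, the percentages in option 1 are just simple ratios of the at-or-above counts quoted above; here they are checked in a few lines of Python:

```python
# Year-over-year counts from the text: players at or above each ESPN grade,
# as (2007 count, 2008 count).
pairs = {"88+": (4, 10), "82+": (53, 91), "80+": (109, 240)}
for tier, (c2007, c2008) in pairs.items():
    pct = round(100 * (c2008 - c2007) / c2007)
    print(f"{tier}: {c2007} (2007) -> {c2008} (2008), a {pct}% increase")
```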
And while I'm at it, a couple more jabs at ESPN's ratings:
1. They have something called an "ESPN 150 Watch List". This is something other services have done before, too. Rivals used to do a "Pre-Evaluation Top 100 Watch List" which contained the names of 100 players to keep an eye on as potential candidates for the first ordered list. You'd expect ESPN's is the same, right? Except ESPN's "150 Watch List" contains a whopping 543 players. Why call it a "150 Watch List" if it's got 543 players? I know what they'll say ... "these are players that may make the Top 150", but really, *I* may make the Top 150 if I suddenly go back to high school and throw for 52 touchdowns this year. Don't give us a "watch list" with everyone on it. I could dump out the rosters of every team in 1-A and call it my "Heisman watch list", but there's really no value in it.
2. This one isn't just for ESPN, it's for anyone who has a numerical ranking scale and doesn't use it. Rankers are so focused on getting #1 vs. #2 vs. #3 right, and just generally throwing the rest into broad categories, that you often wind up with lists like ESPN's, where the first 15 points in their scale are sparsely populated, and then the rest are just crammed full. DaQuan Bowers got a 95. DJ Grant got an 85. There are 14 players between them. TJ Bryant got an 83. Keenon Cooper got an 81. There are 120 players between them. I understand that stars separate, but that's not real.
And rankers *never* start their scale at 0. I guess it offends people. Movie reviewers give 1* to movies they detest. Rivals gives an automatic 4.5 out of 6.1 to anyone who knows how to buckle a chin strap. Why would you come up with a ranking system with a bizarre top grade like 6.1 if you are going to start it at 4.5? I guess because giving a 4.5/6.1 sounds charitable, but 0/1.6 sounds mean.
Anyway, I'm going to go ahead. I'm going to do my mapping based on averages across 3 years, I guess, and report back in a couple of days. We'll see.
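To give a sense of where this is headed, the comparison step will look something like the sketch below. The mapping table and player names are invented placeholders; the real table will come from the three-year averaged cutoffs:

```python
# Hypothetical ESPN-score floors for each Rivals grade -- placeholder values,
# to be replaced by the averaged 2006-2008 cutoffs.
ESPN_FLOORS = {84: 6.1, 82: 6.0, 80: 5.9, 79: 5.8}

def to_rivals(espn_score):
    """Map an ESPN numeric grade onto the Rivals scale via the floor table."""
    for floor, grade in sorted(ESPN_FLOORS.items(), reverse=True):
        if espn_score >= floor:
            return grade
    return 5.7  # everything below the lowest mapped tier

def big_disagreements(players, min_gap=0.2):
    """players: (name, espn_score, rivals_grade) tuples. Returns the ones
    where the mapped ESPN grade and the actual Rivals grade differ by at
    least min_gap. The tiny epsilon guards float round-off on exact 0.2 gaps."""
    return [(name, to_rivals(espn), rivals)
            for name, espn, rivals in players
            if abs(to_rivals(espn) - rivals) >= min_gap - 1e-9]

# Made-up players for illustration only
sample = [("Player A", 85, 5.8), ("Player B", 80, 5.9)]
print(big_disagreements(sample))  # -> [('Player A', 6.1, 5.8)]
```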