Especially after reading this post of Kyle’s, I’ve done some more thinking about Sunday Morning Quarterback’s analysis of the correlation between certain statistical performance categories and wins/losses.
It gnaws at me a little that SMQ’s chart ranking all of the D-1 teams that finished last season with at least ten wins shows Georgia and Hawaii to be almost functionally equivalent (although it should be noted that he places more distance between the two in his chart of ranked teams), and that Kyle riffed off of it with respect to Ohio State and BYU in the premise of his post. I watched all four of those teams play at least twice last year, and my (admittedly subjective) perspective is that Georgia was a far better team than Hawaii and that Ohio State was a better team than BYU.
All of which led me to consider tweaking SMQ’s yeoman’s work in assembling his numbers. So I did, hence this post.
Where I started was with his chart of 10+ win teams. I noticed that while he compared the rankings of those schools in nine categories, those weren’t the same nine categories that he found most closely correlated with wins and losses. When I substituted in the two stats that had been omitted (third-down conversions on offense and defense), here’s the chart I got:
| School | Rush D | P/E D | Total D | 3d Dn D | T/O Mar | P/E O | Total O | 3d Dn O | Pass D |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Kansas | 8 | 9 | 12 | 31 | 1 | 7 | 8 | 9 | 49 |
| LSU | 12 | 3 | 3 | 14 | 2 | 37 | 26 | 29 | 9 |
| W. Va. | 18 | 28 | 7 | 8 | 9 | 11 | 15 | 36 | 14 |
| S. Cal. | 4 | 6 | 2 | 28 | 41 | 36 | 29 | 8 | 15 |
| Ohio St. | 3 | 4 | 1 | 17 | 76 | 12 | 62 | 14 | 1 |
| Okla. | 17 | 43 | 26 | 5 | 25 | 1 | 19 | 13 | 59 |
| Boise St. | 35 | 66 | 25 | 3 | 47 | 6 | 12 | 17 | 26 |
| Hawaii | 41 | 21 | 34 | 10 | 93 | 3 | 3 | 6 | 37 |
| BYU | 9 | 18 | 10 | 21 | 93 | 28 | 25 | 32 | 32 |
| Va. Tech | 5 | 5 | 4 | 85 | 14 | 53 | 100 | 4 | 31 |
| Georgia | 16 | 36 | 14 | 24 | 18 | 61 | 74 | 23 | 36 |
| Cincy | 19 | 34 | 50 | 66 | 6 | 8 | 16 | 26 | 89 |
| Ariz. St. | 21 | 15 | 30 | 68 | 38 | 17 | 56 | 11 | 61 |
| Missouri | 25 | 45 | 59 | 2 | 11 | 13 | 5 | 82 | 96 |
| Bos. Coll. | 2 | 23 | 19 | 44 | 30 | 59 | 33 | 47 | 88 |
| Texas | 6 | 70 | 52 | 11 | 47 | 30 | 13 | 63 | 109 |
| UCF | 69 | 27 | 49 | 71 | 54 | 67 | 45 | 55 | 69 |
| Tenn. | 73 | 66 | 70 | 56 | 27 | 29 | 54 | 72 | 73 |
| Tulsa | 108 | 94 | 108 | 6 | 92 | 4 | 1 | 62 | 108 |
Just to save you the trouble, here’s how the teams ranked on that raw data after averaging (a quick sketch of the arithmetic follows the list):
- Kansas 14.89
- LSU 15.00
- West Virginia 16.22
- S. Cal. 18.78
- Ohio State 21.11
- Oklahoma 23.11
- Boise State 26.33
- Hawaii 27.56
- BYU 29.78
- Virginia Tech 33.44
- Georgia 33.56
- Cincinnati 34.89
- Arizona State 35.22
- Missouri 37.56
- Boston College 38.33
- Texas 44.56
- UCF 56.22
- Tennessee 57.78
- Tulsa 64.78
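For anyone who wants to check the arithmetic, here’s a minimal sketch of that averaging step in Python, using the first two rows of the chart above (the other rows work the same way):

```python
# Straight average of each school's nine national rankings.
# Rankings are copied from the first two rows of the chart above.
ranks = {
    "Kansas": [8, 9, 12, 31, 1, 7, 8, 9, 49],
    "LSU":    [12, 3, 3, 14, 2, 37, 26, 29, 9],
}

for school, r in ranks.items():
    print(f"{school}: {sum(r) / len(r):.2f}")
# Kansas: 14.89
# LSU: 15.00
```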
I didn’t stop there, though. I thought that, to get a more representative measure, the averages should be weighted on the basis of how closely each statistical category correlated with wins/losses. In other words, SMQ’s work shows that rushing defense is almost twice as significant an indicator as passing defense. To me, it only makes sense to give more weight to a school finishing first in rushing defense than to one finishing first in passing defense.
The way I went about doing this was to award points on an inverse scale to finish (a team finishing first in a category received 119 points; one finishing last received 1 point) and then to multiply the points a team received by the factor that SMQ assigned to that statistical category. Thus, for example, Kansas finished eighth nationally in rushing defense. The Jayhawks received 112 points for that finish; those points were multiplied by .437 (SMQ’s factor), and that product (48.94) was the figure used in averaging Kansas’s statistical performance.
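In code, that step looks something like the sketch below. The .437 factor for rushing defense is the one SMQ reported; I haven’t reproduced his other eight factors here, so the second entry is a placeholder to be swapped for his actual numbers:

```python
N_TEAMS = 119  # D-1 teams; 1st place earns 119 points, 119th earns 1

# Correlation-based weights. Only the rushing defense factor (.437) comes
# from SMQ's work; the passing defense value below is a placeholder.
FACTORS = {
    "Rush D": 0.437,
    "Pass D": 0.25,  # placeholder -- substitute SMQ's actual factor
    # ...and so on for the other seven categories
}

def weighted_points(rank: int, factor: float) -> float:
    """Convert a national ranking to inverse-scale points, then weight it."""
    points = N_TEAMS - rank + 1  # 8th place -> 112 points
    return points * factor

# Kansas, 8th nationally in rushing defense:
print(round(weighted_points(8, FACTORS["Rush D"]), 2))  # 48.94
```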
With that, here’s the order of the schools:
- Kansas
- LSU
- West Virginia
- S. Cal.
- Ohio State
- Oklahoma
- Boise State
- Hawaii
- BYU
- Virginia Tech
- Georgia
- Cincinnati
- Arizona State
- Boston College
- Missouri
- Texas
- UCF
- Tennessee
- Tulsa
The only change that resulted from all of that was that Boston College and Missouri flipped places in the order.
The final step I took was to factor in strength of schedule. I did this by multiplying each school’s weighted average by Sagarin’s SOS rating (not ranking) and then ranking the schools by the number of points generated. Here’s the end result, with the number in parentheses being SMQ’s ranking of the averages (a sketch of this step follows the list):
- LSU (2) 2860
- West Virginia (4) 2716
- Southern Cal (3) 2691
- Kansas (1) 2562
- Oklahoma (6) 2541
- Ohio State (7) 2514
- Georgia (14) 2342
- Arizona State (11) 2322
- Virginia Tech (15) 2315
- BYU (8) 2290
- Missouri (9) 2276
- Boston College (13) 2208
- Cincinnati (10) 2115
- Texas (16) 2016
- Boise State (5) 2005
- Hawaii (12) 1906
- Tennessee (18) 1700
- UCF (17) 1489
- Tulsa (19) 1218
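Mechanically, the strength-of-schedule adjustment is just one multiplication per school. In the sketch below, both the weighted averages and the Sagarin SOS ratings are illustrative placeholders rather than the actual figures:

```python
# Multiply each school's weighted average by Sagarin's SOS *rating*
# (not his ranking). All values below are hypothetical placeholders.
weighted_avg = {"LSU": 36.5, "Hawaii": 31.0}
sagarin_sos = {"LSU": 78.4, "Hawaii": 61.5}

scores = {school: weighted_avg[school] * sagarin_sos[school]
          for school in weighted_avg}

# Rank schools by the resulting point totals, highest first.
for school, pts in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{school}: {pts:.0f}")
```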
Those numbers seem more in line with what I saw last season. In another post, I’ll try to do the same with the teams ranked in the AP poll, to see how this matches up with the results that SMQ obtained.
A few observations:
- Every time I play with team stats, LSU winds up on top. Again, over the course of the year, the Tigers look like the best team in the country.
- West Virginia was underrated.
- Ohio State may have been a little overrated, but not to the extent that so many would have you believe after the BCS title game.
- BYU looks like it should have been the school to crash the BCS party last season, not Hawaii.