Projecting the 2019 AFL Season

With the AFL season now hours away, the attention of fans and pundits everywhere turns not only to the action in week one, but also to who will sit on top of the ladder after the next 23. For the fifth year, HPN is attempting to answer the latter question, trying to work out how the ladder will shake out by the time the Home and Away season wraps up.

As per last year, HPN will take two different approaches to predict the season. One will focus on the underlying team elements and luck involved in football, and the other on the output of the players that make up each side. Think of it as balancing the systems of football against the talent underpinning it.

The team-based approach is a variation on Pythagorean expectation, popularised for sports prediction by Bill James and adapted for the AFL by Matter of Stats, with an adjustment for fixture imbalance.
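For readers who like to see the nuts and bolts, here is a minimal sketch of the core Pythagorean calculation in Python. The exponent and the scoring totals below are placeholders for illustration only, not values HPN or Matter of Stats actually fit, and the real model layers the schedule adjustments described later on top of this baseline.

```python
# Minimal sketch of Pythagorean expectation for an AFL home and away season.
# The exponent k and the points totals are illustrative placeholders only.

def pythagorean_wins(points_for: float, points_against: float,
                     games: int = 22, k: float = 3.87) -> float:
    """Expected wins implied by a season's scoring totals."""
    win_rate = points_for ** k / (points_for ** k + points_against ** k)
    return win_rate * games

# Hypothetical team: 2100 points scored, 1900 conceded across the season
print(f"Expected wins: {pythagorean_wins(2100, 1900):.1f}")  # about 13.1 of 22
```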

The talent-based approach builds player-by-player projections from PAV and mPAV into a model for projecting teams. Readers may recall our PERT model for picking games last year, which did surprisingly (but maybe not repeatably) well. This year, the player-based model will attempt to predict the entire game-by-game fixture, accounting for strength of schedule and (for the first time) interstate travel.

We’re also running PERT again, a footy tipping model based on team selections, which you can track here, via our Twitter handle, or at Squiggle.

The Team Model

Underpinning the team model, as mentioned above, is Pythagorean expectation. Pythagorean expectation is a calculation which takes the year’s results and tells us how many games a team “should” have won, so it inherently accounts for factors such as luck in close games. To that baseline projection, we add adjustments for the difficulty of each side’s opponents in 2018 and the expected strength of its opponents in 2019. If you chuck all of that together, it comes up with a ladder like the following:

For the first time in memory, this method throws up the same top 8 as last year. However, there is a strong four-team chasing pack not very far adrift, with 12 sides projected to win more than 12 games.

The order of last year’s top 8 is also rearranged, most notably with Geelong and Melbourne looking like they were a lot better than their win-loss records last year. Both sides are projected to enter the top 4, above both grand finalists. In Melbourne’s case, close losses in 2018 meant they undershot their apparent underlying quality by enough to outweigh a much harder 2019 fixture.

The Cats get a boost from the fixture as well as being projected as 3 wins stronger than their 2018 record. Sydney’s 14 wins look like one too many last year, but their extremely difficult 2018 draw isn’t repeated, and the two effects roughly balance out to leave them back at 14 wins for 2019.

Teams who jump here include St Kilda and Brisbane. Brisbane look like they were substantially better than 5 wins last year, perhaps by nearly 4 wins; it’s hard to have a win-loss record like the Lions did and still post a percentage of 89. The Saints were probably as good as a 5.5-win team rather than a 4.5-win team, but get a boost primarily because their draw looks considerably easier.

The Player Model

If the team model is the bastion of stability, the player model is happy to throw a bit of chaos into the mix.

The player model was developed by assigning a marginal Player Approximate Value (mPAV) to every player currently on an AFL list, based on previous performance, draft position and age. HPN first adjusted each team list for actual player availability – players with season-ending injuries or doping suspensions in 2019 have been removed. HPN then attempted to account for “missing games” by looking at the games played in previous years for each spot on the list. This means the best player at each club is expected to play a much higher proportion of games than the 25th player on the list, though nobody is projected as 100% available.
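As a very rough illustration of that availability weighting, the sketch below discounts each player’s projected value by an expected-games factor tied to their spot on the list. The availability curve and the mPAV figures here are invented for the example; the real weights come from historical games played by list position.

```python
# Illustrative sketch: weight each player's projected value by expected availability.
# The availability curve and the mPAV values are invented for this example.

def expected_team_value(player_mpavs, availability_by_rank):
    """Sum projected player values, discounted by expected games played."""
    ranked = sorted(player_mpavs, reverse=True)  # best player gets the highest availability
    return sum(value * avail for value, avail in zip(ranked, availability_by_rank))

# Hypothetical 38-player list: top players expected to play ~90% of games,
# with availability tapering off down the list.
availability = [0.90 - 0.015 * i for i in range(38)]
mpavs = [12.0 - 0.25 * i for i in range(38)]
print(round(expected_team_value(mpavs, availability), 1))
```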

This approach tests the depth of each list – beyond what some sides will face in the upcoming season. To create a baseline, we have used 38 expected players for each side – more than some clubs will use, fewer than some of the most unstable will field. We have then created a version of the HPN Team Ratings from the resulting values, and simulated each match of the upcoming season, accounting for Home Ground Advantage in interstate games only (some neutral venues have been excluded). Interestingly, this last step had only a minimal effect.
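For the curious, the match-by-match simulation step might look something like the sketch below. The rating scale, the logistic conversion and the size of the interstate bump are assumptions made purely for illustration, not the actual parameters behind the HPN ratings.

```python
# Sketch of simulating a fixture from team ratings, with a home advantage
# applied only when the visitor travels interstate. All parameters are assumed.
import math
import random

def win_probability(home_rating, away_rating, interstate, hga=0.3):
    margin = home_rating - away_rating + (hga if interstate else 0.0)
    return 1 / (1 + math.exp(-margin))  # assumed logistic link from rating gap to win chance

def simulate_season(fixture, ratings, sims=10_000):
    wins = {team: 0.0 for team in ratings}
    for _ in range(sims):
        for home, away, interstate in fixture:
            p = win_probability(ratings[home], ratings[away], interstate)
            wins[home if random.random() < p else away] += 1
    return {team: total / sims for team, total in wins.items()}  # average wins per simulated season

# Hypothetical two-game fixture with made-up ratings
ratings = {"Richmond": 0.6, "Geelong": 0.5, "West Coast": 0.4}
fixture = [("Richmond", "Geelong", False), ("West Coast", "Richmond", True)]
print(simulate_season(fixture, ratings, sims=1000))
```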

Enough words, more projections:

A clear break emerges between the top 8 and the rest when using the player model, and also between the top 4 and the rest of the 8. More than one and a half wins divides Collingwood in 4th and Adelaide in 5th, and over a full projected win separates GWS in 8th from Hawthorn in 9th.

This projection shows the impact of some key player losses, such as Hawthorn losing Tom Mitchell to a broken leg. Mitchell accounted for around 8% of Hawthorn’s total Player Approximate Value in 2018 and around 14% of its midfield value.

Distribution of quality across the park also matters to this projection. The midfield rating tells us about the balance of inside 50s a team should expect, meaning teams who do well there shut off opposition opportunities to score and create plenty of their own.

Sydney have the best defence in this model, with Port Adelaide also looking fairly strong, but both teams suffer from deficiencies elsewhere on the ground and are projected to struggle to get enough wins to make finals. Some decent-looking forward lines, like Fremantle’s, also look set to suffer from a lack of midfield supply.

Fixturing also plays a role. Collingwood and Geelong rate almost identically for strength, but the Cats are given a half-win head start as a result of a somewhat easier fixture.

The conclusion

There’s a fair chance that the end result will be something in the middle, like this:

Charting our two methods of projection looks like this:

As Ryan Buckland noted today in The Roar, there appear to be clear tiers this season. However, which sides fall into each tier varies by method. HPN’s two methods of projection produce different-looking tiers – the team model gives an even chasing pack, outside of the top couple of sides, all the way down to 12th, while the player model gives an elite 4 and a clear top 8. There are always shocks and surprises throughout the season, especially with the rule changes that have been brought in.

On the topic of rule changes, there was a good piece on ABC Grandstand today (OK, HPN wrote it with James Coventry, but it’s still good) that suggests the impact of some headline rule changes might be much less than expected. Check it out!

Now, bring on the season!

 
