Sunday, April 7, 2013

Why doesn't paying win like it used to?


For a while in the late 1990s it seemed like the final standings in baseball were a foregone conclusion. The relationship between payroll and winning topped out in 1998 and 1999, with correlations of 0.68 and 0.71. In 2000 and 2001 the relationship dropped to 0.32 and 0.31 as the Moneyball A’s racked up impressive win totals on a low budget. While this topic has been addressed at length, I would like to add one more dimension to the discussion: locking in free agents.
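For anyone who wants to follow along at home, here’s a minimal sketch of how a season-by-season wins/payroll correlation can be computed. The file and column names are hypothetical (one row per team-season), and this isn’t necessarily the exact method behind the figures quoted above:

```python
import pandas as pd

# Hypothetical input: one row per team-season with payroll and wins
teams = pd.read_csv("team_seasons.csv")  # columns: year, team, payroll, wins

# Pearson correlation between payroll and wins, computed within each season
corr_by_year = teams.groupby("year").apply(
    lambda g: g["payroll"].corr(g["wins"])
)
print(corr_by_year.loc[[1998, 1999, 2000, 2001]])
```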

Note that correlations differ somewhat throughout depending on whether they are calculated on opening day salary, end-of-year salary, or related variations. I am going by Dave Studeman’s figures, via the link below.

Free agency only arrived in baseball in the 1970s, so data on this phenomenon are already constrained. Owner collusion in the late 80s further clouded the issue and drove the correlation down to extremely low levels. Dave Studeman at The Hardball Times has the explanation:

In the first few years of free agency—the latter half of the 1970s—teams did take advantage of new opportunities by signing top talent to big bucks. It's no coincidence that this period coincided with the Steinbrenner Yankees' return to glory and the introduction of two bottom-dwelling, low-pay expansion teams (the Mariners and Blue Jays). These developments exacerbated the differences between the haves and have-nots.

Beginning around 1980, however, the picture changed as young, lower-paid talent began to make an impact on the pennant races. Players such as Eddie Murray and Cal Ripken in Baltimore, Rickey Henderson in Oakland and George Brett in Kansas City changed their teams' fortunes before changing their payrolls. The Mets developed a gaggle of phenomenal, "cheap" young talent. This influx of top young talent helped change the picture in the early part of the decade. At the same time, bad contracts started appearing. The Angels became the first team known for its bloated, underperforming contracts.

Something else happened in the 1980s: collusion. In 1985, 1986 and 1987, free agents such as Andre Dawson, Tim Raines, Jack Morris and many others found no market for their services. It turns out that commissioner Peter Ueberroth had convinced major league owners that they should work together to refuse expensive, long-term contracts. The owners reportedly established standards of no more than three years for position players and two years for pitchers. As a result, average payroll actually declined in 1987.

The impact on the economics of winning was stark, and the correlation between wins and payroll reached two of its lowest points in 1986 and 1987 (0.17 and 0.15, respectively). Money was losing its power and competitive balance seemed possible. Trouble was, this was illegal. In three different cases, arbitrators ruled that the owners had colluded and eventually ordered them to pay damages.

What changed?

What I’m interested in is why the post-collusion, post-strike period of high correlation broke down. After the A’s (and others) pushed correlations down in 2000 and 2001, they popped back up into the 0.50 range by 2004 and hovered there for a while. Over the last several years, however, the correlation has drifted down, reaching 0.18 in 2012.


As an Indians fan going back to the late 80s, I can remember some terrible teams and I will always have a soft spot for John Hart. As Cleveland’s GM in the early 1990s, Hart pioneered the practice of signing his young players to long-term extensions and buying out years of arbitration (and sometimes a year or two of free agency) while giving the player downside protection. This practice has evolved to the point where teams like the Rays have given extensions just a few weeks into a player’s big league career.

The next evolutionary step in this practice is keeping your own players, at least the ones you believe will do well, even in their free agency years. The player’s own team should have the best information on whether he is likely to regress, get injured or otherwise underperform. This would play out in the data as a higher proportion of elite players being with the team that controlled them prior to free agency.

The data

I pulled the last 20 years of top-ten WAR finishers for both hitters and pitchers from baseball-reference.com and layered on the contract status of each player: Pre-Arbitration, Arbitration, Free Agent (same team as arbitration), or Free Agent (other team). These categories are not perfect, but they give us a rough cut at the factors I’m looking for. Players traded during their pre-free agency years were still considered Free Agent (Same) if they subsequently re-signed with the team to which they were traded.
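As a rough illustration of the classification step (not the actual code behind the numbers below), here’s how the buckets might be assigned, assuming a hypothetical lookup of each player’s age, service time, and free agent status. Note the simple service-time cutoff ignores Super Two arbitration eligibility:

```python
import pandas as pd

# Hypothetical inputs: one row per top-ten WAR finisher per season, plus a
# lookup of each player's situation that year (all names are placeholders)
top10 = pd.read_csv("war_top10.csv")         # year, player, war, team
status = pd.read_csv("contract_status.csv")  # player, year, age,
                                             # service_years, signed_fa,
                                             # fa_same_team

def classify(row):
    """Bucket a player-season into the four contract categories."""
    if row["service_years"] < 3:          # ignores Super Two eligibility
        return "Pre-Arbitration"
    if not row["signed_fa"]:
        return "Arbitration"
    # Free agents who re-signed with the team that controlled them (or the
    # team they were traded to pre-free agency) count as "same team"
    return "FA (same team)" if row["fa_same_team"] else "FA (other team)"

merged = top10.merge(status, on=["player", "year"])
merged["contract"] = merged.apply(classify, axis=1)
```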

Once I had the data compiled, I pulled summary measures for average years of service, average age, the percent of free agents who are on other teams (%FAO), and the number (out of the top ten) of free agents of either type.
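Continuing the sketch above, those yearly summary measures might be rolled up like this (again, the column names are hypothetical):

```python
# Flag free agents of either type, then aggregate by season
merged["is_fa"] = merged["contract"].str.startswith("FA")

summary = merged.groupby("year").agg(
    avg_service=("service_years", "mean"),
    avg_age=("age", "mean"),
    n_fa=("is_fa", "sum"),  # free agents of either type, out of ten
)

# %FAO: among the free agents, the share who signed with another team
summary["pct_fao"] = (
    merged[merged["is_fa"]]
    .groupby("year")["contract"]
    .apply(lambda s: (s == "FA (other team)").mean())
)
```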


Looking at these top players should show us where elite performances were coming from and how much teams had to pay for them. Just these ten players represent between 65 and 80 wins per year out of roughly 1,000 available wins above replacement, or something like 7 to 8 percent of all the value in the league.

The outputs



For hitters, the correlations are almost pitifully weak outside of the percent of free agents who are on another team (correl = 0.49). Since age was not a factor (correl = -0.05), this strongly supports the idea that teams locking up the best free agents is a major reason for the lower correlation. In the chart it is striking how the composition changed over the 1995-2007 period (hmmm….) before reverting to something like the 1980s.
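The single-variable correlations quoted here and below can be reproduced along these lines, assuming the hitter and pitcher lists are each run through the sketches above and the corr_by_year series from the first sketch is available:

```python
# Line up each yearly summary metric against the wins/payroll correlation,
# then take the pairwise Pearson r (a sketch, not the original calculation)
aligned = summary.join(corr_by_year.rename("wins_payroll_r"), how="inner")

for metric in ["pct_fao", "avg_age", "avg_service", "n_fa"]:
    r = aligned[metric].corr(aligned["wins_payroll_r"])
    print(f"{metric}: r = {r:.2f}")
```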



For pitchers, both of the age-based metrics showed much stronger correlation (age = 0.59 and years of service = 0.56) than %FAO, which showed a correlation of only 0.12. The chart for pitchers shows the evolution of a Hall of Fame cohort (Roger Clemens appears 14 times, Randy Johnson 11, Greg Maddux 9) at the same time correlations were peaking. This may not represent anything as strong as causation, but those players were drawing pretty hefty paychecks toward the end of their careers.

When you throw all this together in a two-variable regression, the %FAO for hitters isn’t quite significant at the 90% level but it’s very close, and the age of the top pitchers is significant at the 98% level with a 0.014 p-value. The coefficient on the %FAO term is 0.13, representing how much the wins/payroll correlation would increase if 100% of the free agent hitters in the WAR top ten were with a team that did not control them in arbitration. The coefficient on pitchers’ age is 0.056, but the average age of the top pitchers only varies between 26.6 and 32.3, a range worth 0.32 in wins/payroll correlation. At the minimum average age of the top ten pitchers and with 0% of free agent hitters on other teams, the predicted correlation is 0.24. The overall regression clocks in with an R of 0.63 and an R-squared of 0.39.
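For reference, a two-variable regression along these lines can be run with statsmodels; the predictor column names here are hypothetical stand-ins for the hitter %FAO and average top-ten pitcher age by year:

```python
import pandas as pd
import statsmodels.api as sm

# aligned: hypothetical per-year frame holding the two predictors and the
# wins/payroll correlation as the response
X = sm.add_constant(aligned[["hitter_pct_fao", "pitcher_avg_age"]])
fit = sm.OLS(aligned["wins_payroll_r"], X).fit()
print(fit.summary())  # coefficients, p-values, R-squared

# Predicted correlation at 0% hitter FAO and the youngest average top-ten
# pitching cohort in the sample (26.6), per the figures in the text
point = pd.DataFrame(
    {"const": [1.0], "hitter_pct_fao": [0.0], "pitcher_avg_age": [26.6]}
)
print(fit.predict(point))
```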

According to this, the way to win as a low-payroll team (or the way the relationship between wins and payroll gets disrupted) is to selectively sign your own players, at least the ones who will be good, and develop some young pitching. Pitching dominates hitting in the data set: the impact of the range of pitcher ages on the wins/payroll correlation is roughly 2.5x that of the hitters’ %FAO range.


Maybe it’s not as easy as it sounds, but it’s interesting to see what correlates with the wins/payroll correlation. Just make sure the one salary you lock in isn’t Travis Hafner’s.
