
Measuring Changes in the Quality of Play based on the Decline Rates of Hitters

May 21, 2019
 


 

            I was trying to use hitter decline rates to evaluate the speed at which the quality of play in the majors is improving, and has improved over time; that was the theory.  It didn't exactly work.  I had a Twitter exchange with Mike Petriello in which I thought that he was making an unsupportable claim about major league players today doing things that players couldn't do years ago, and in the middle of this discussion Voros offered the thought that if the quality of play in major league baseball has improved dramatically in recent years, it would be difficult to explain how the 40-year-old David Ortiz and the 38-year-old Adrian Beltre could perform at the levels they have.  I thought "Oh, yeah, I ought to actually do that study"—"that study" being a study of the decline rates of hitters over time.  In other words, almost all hitters are better at age 27 than they are at age 37, right?  There are a few exceptions, hitters like David and Adrian and Nelson Cruz who just keep hitting as they age.  But if the quality of play improves significantly in a short period of time, that should lead to an increase in the decline rate of aging hitters, relative to the league.

Hitters reach their peak at age 27, right?  26 to 28, let's say.  In measuring "decline" in hitting performance, then, we start at the peak, at ages 26 to 28.  Referencing David Ortiz and Adrian Beltre, I decided to measure declines in ten-year increments.  My study group, then, was all hitters in the years 1920 to 2008 who had 500 plate appearances in a season at age 26, 27 or 28.  1920, rather than 1919 or before, because the game before 1920 was just too different to be relevant to the study; 2008, because a ten-year decline rate cannot be established for a player from 2009 or later.  The idea was that if hitters declined more between 1950 and 1960 than between 1940 and 1950, that would suggest that the improvement in the quality of play was greater in the 1950s than in the 1940s.
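For readers who want the selection rule spelled out, here is a minimal sketch of the study-group filter described above, assuming a hypothetical season-level table with player_id, year, age and PA columns; the column names and the use of pandas are illustrative, not the actual data setup.

```python
import pandas as pd

def study_group(seasons: pd.DataFrame) -> pd.DataFrame:
    """Hitters with 500+ plate appearances in a season at age 26, 27 or 28, 1920-2008."""
    mask = (
        seasons["year"].between(1920, 2008)
        & seasons["age"].isin([26, 27, 28])
        & (seasons["PA"] >= 500)
    )
    return seasons.loc[mask]
```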

            My intention with this study was to measure the decline rate of aging hitters over time, and to use that to see if we could chart relative increases in the quality of play over time.   As I say, that was the theory; it didn’t really work.   It didn’t work because (a) there isn’t enough comparison data to make it work, and (b) it’s actually unclear how those changes which can be measured should be interpreted.   But I’ll go ahead and report on the study; maybe the next study of the issue will find something.  (By the way, as a note, Dick Cramer studied more or less the same thing in 1975.  His method was a lot different from mine, but he did some of the same things.  Just giving credit where credit is due.)  Anyway, you can measure the decline rate of hitters between age 26 and age 36; you can measure the decline rate of hitters between age 27 and age 37, because, over time, there are enough players who are in the major leagues at ages 26, 27, 36 and 37 to make those estimates.   But you can not measure the decline rate of players in ten years of aging between 2008 and 2018, because there simply are not enough players in the major leagues aged 36 to 38 in the year 2018 to get anything remotely resembling meaningful data out of studying them. 

             

            There were 2,741 players in the study—901 26-year-olds, 938 27-year-olds, and 902 28-year-olds.   The problem is, very few of those players could be evaluated for decline rates using a straightforward approach.   Most of them, ten years later, were not in the majors; that’s all we know, they’re not there anymore.   What is the decline rate of those players?

            You can say that their decline rate is 100%, but that's misleading.  They don't have ZERO ability to hit major league pitching at age 37; they merely have INSUFFICIENT ability to hit major league pitching at age 37.  Also, if you assume that the decline rate is 100% (which I did not), it doesn't help; it simply gives you average decline rates so close to 100% that the differences are not interesting.

            There is a massive difference between ability and value.   The value of a 37-year-old player who is not in the majors is easy to see:  it is zero.   But the ability level of a player who is not in the majors is difficult to see:  it is masked by the fact that he is not in the majors at the moment.   If he were in the majors, he wouldn’t hit .000; he just wouldn’t hit enough to hold his job. 

            I first stated the ability of each hitter by his runs created per 27 outs, relative to the league ERA.  If a player created 5.00 runs per 27 outs in a league in which the ERA was 4.00, that would be a ratio of 1.25.  (Of course the league runs/game would be preferred here, but I don't have that in the data, and, as a practical matter, it makes no difference as long as you do one thing consistently.)  In the simplest case, in which a player has 500 plate appearances at age 27 and again at age 37, I compared his "ratio" at age 37 to his ratio at age 27.  Ian Kinsler in 2008, for example, created 99 runs while making 381 outs, which is 7.01 runs per 27 outs.  The American League ERA in 2008 was 4.35.  That's a ratio of 1.61 to 1 (7.01 to 4.35), so we record Kinsler at 1.61 in 2008.  He was then 26 years old.

            In 2018 Kinsler, now 36 years old, created 53 runs while making 395 outs.  That’s 3.60 runs per 27 outs.  The American League ERA in 2018 was  4.27.  That’s a ratio of .843 to 1.  Between 2008 and 2018, Kinsler declined, as a hitter, from 61% better than the league ERA to 16% worse.  That’s a decline of. . .well, it depends on whether you state the decline relative to the league ERA, or relative to Kinsler.   If you state it relative to the league ERA, it’s 77%.   If you state it relative to Kinsler’s base line, his 2008 performance level, it is 48%. 
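To make that arithmetic explicit, here is a small sketch using the Kinsler figures quoted above; the function and variable names are mine, and the small differences from the numbers in the text are just rounding.

```python
def rc_ratio(runs_created: float, outs: int, league_era: float) -> float:
    """Runs created per 27 outs, stated as a ratio to the league ERA."""
    return (runs_created * 27 / outs) / league_era

r26 = rc_ratio(99, 381, 4.35)   # 2008, age 26: about 1.61
r36 = rc_ratio(53, 395, 4.27)   # 2018, age 36: about .85 (the article's rounding gives .843)

decline_vs_league = r26 - r36          # roughly .77, stated relative to the league
decline_vs_player = (r26 - r36) / r26  # roughly .48, stated relative to Kinsler's 2008 level
```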

            The problem is that there were only six major league non-pitchers who (a) had 500 plate appearances in 2008, (b) were 36, 37 or 38 years old in 2018, and (c) had even one plate appearance in 2018.   There are Kinsler, Pujols and Granderson, who played a lot, and Adrian Gonzalez, Matt Holliday and Brandon Phillips, who played a little bit; that’s it.   There were a couple of other players in that age range who were still around although they weren’t regulars in 2008, and there were a handful of players around who were even older than that, but in terms of my organized data, there just isn’t enough that you can do anything with it.  

            I thought perhaps that I could solve this problem by substituting a "presumptive decline rate" for players who were no longer in the major leagues; in other words, a "known" decline rate for the guys like Kinsler and Pujols, who are still around, and a "presumed" decline rate for the players who have declined to the point of uselessness.   But what happens is that, since the overwhelming majority of the players get the presumed decline rate, you wind up with group averages which are some number very close to the presumed decline rate.  
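The arithmetic of that problem is easy to see with invented numbers: when most of the cohort gets the presumed figure, the group average is pinned to it almost regardless of what the survivors did.  The 90 percent, 1.00 and .45 below are illustrative, not figures from the study.

```python
presumed = 1.00        # presumed decline for players no longer in the majors
survivor_mean = 0.45   # hypothetical average decline among the survivors
share_presumed = 0.90  # hypothetical share of the cohort no longer playing

group_mean = share_presumed * presumed + (1 - share_presumed) * survivor_mean
print(round(group_mean, 3))  # 0.945 -- essentially the presumed rate, whatever the survivors did
```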

            I thought perhaps that I could work around this problem by using players' Winning Percentages based on their offense and defense, rather than just focusing on the hitting, but that just wasted more of my time before I ran into the same problem.  If you can't figure out what your data means, it is useful to look at it in some other way, but if you just don't have enough data to substantiate any conclusions, then there's no way to work around that.

            Well. . . I’ll report what I did learn.

 

(a)  Of the 901 26-year-olds in the study, all of whom had 500 plate appearances, 119 had 500 plate appearances again at age 36.   Those 119 players had an average decline in Win Shares Winning Percentage, between ages 26 and 36, of 90 points.   They had an average winning percentage of .632 at age 26, and .542 at age 36.  

 

(b)   Of the 938 27-year-olds in the study, all of whom had 500 plate appearances, 79 had 500 plate appearances again at age 37.   Those 79 players had an average decline in Winning Percentage, between ages 27 and 37, of 111 points.  They had an average Winning Percentage of .641 at age 27, and .530 at age 37.

 

 

(c)   Of the 902 28-year-olds in the study, 52 had 500 or more plate appearances at the age of 38.  Those 52 players had an average decline in Winning Percentage, between ages 28 and 38, of 153 points.   They had an average Winning Percentage of .655 as 28-year-olds, and .501 at age 38.

 

All of that information makes intuitive sense.  It is useful information which could provide the basis of future study.  If you compare the hitting performance of 38-year-olds, on average, to 28-year-olds, on average, you will find that there is little or no difference.  The reason this is true is that, as hitters decline, they drop out of the majors, thus drop out of the data pool—and because they are also losing defensive value, the offensive level at which they drop out of the pool increases with age.  This creates an anomalous situation in which, even if every player in the group is declining—which basically they are—the average does not decline.

            I wrote about this problem at length in one of the early Abstracts, I think the 1982 Abstract, illustrating the problem with ascending and descending lines showing how it was possible for this to happen.   A few of you will remember that article.   But from then until now, this issue has blocked further explanation, because we could not measure the "peak quality" of those who remained vs. those who had dropped out.   Does that make sense?  The .500 players drop out of the pool at age 30, the .520 players at age 31, the .540 players at age 32, etc.   But until now I could never see how this effect could be measured.   But I see now, as a consequence of this study, how this could, in theory, be measured, and I have actually measured a little bit of it.  The players who are still in the majors when they are 36 were .632 players when they were 26; the players who are still in the majors when they are 37 were .641 players when they were 27, and the players who are still in the majors when they are 38 were .655 players when they were 28. 
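A toy simulation, with invented numbers, shows how this works: every player in the pool declines every year, but the level needed to keep a job rises with age, so the average of the survivors never goes down.

```python
# Abilities spread evenly from .400 to .800; every survivor declines .010 a year,
# while the cutoff for keeping a job rises .015 a year.  All numbers are invented.
players = [0.400 + 0.0001 * i for i in range(4001)]
for age in range(30, 38):
    cutoff = 0.500 + 0.015 * (age - 30)
    players = [p for p in players if p >= cutoff]          # the weakest drop out of the pool
    print(age, len(players), round(sum(players) / len(players), 3))
    players = [p - 0.010 for p in players]                 # everyone who remains declines
```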

(d)   Of the 901 26-year-olds in the study, 34 played better at age 36 than they had at age 26.  The other 867 either did not play at all at age 36, or did not play as well.   96% of them declined.

 

(e)  Of the 938 27-year-olds in the study, 28 played better at age 37 than they had at age 27.   The other 910 either did not play at all at age 37, or did not play as well.   97% of them declined.

 

 

(f)    Of the 902 28-year-olds in the study, 16 played better at 38 than they had at age 28.  The other 886 either did not play at all at age 38, or did not play as well.  That’s a decline percentage of 98%--a high 98%.

 

There is a little wrinkle there that I haven’t explained, and I suppose I should.  Suppose that a player doesn’t have 500 plate appearances at the more advanced age, but plays well while he is on the field? 

            It depends on how well he plays and how much he plays.  Jim Gilliam, for example, had a very poor season in 1955, when he was 26 years old, but a much better season in 1965, when he was 36.  He had 500+ Plate Appearances at age 26, but only 432 when he was 36—but, because 432 plate appearances isn't that far from 500, and because he played significantly better, I still counted him as playing at a higher level at age 36 than at age 26.  Rick Monday, on the other hand, played better in 1982, when he was 36 years old, than he had in 1972, when he was 26.  However, since he had only 246 plate appearances at age 36 and wasn't all that much better, he counts as playing at a higher level at age 26 than at age 36.  Basically, there is a "presumptive decline" that applies to the "missing plate appearances".  I hope that makes sense, because I've run out of patience in explaining these things.
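For what it's worth, here is one plausible way to handle the "missing plate appearances", blending the observed ratio with a presumed level for the shortfall to 500 PA.  This is a guess at the mechanics, not necessarily how the study actually weighted it, and the presumed level of .75 is purely illustrative.

```python
def blended_ratio(actual_ratio: float, pa: int,
                  presumed_ratio: float = 0.75, full_season: int = 500) -> float:
    """Weight the observed ratio by actual PA and fill the shortfall to a full
    season with a presumed decline level (all parameters are illustrative)."""
    if pa >= full_season:
        return actual_ratio
    missing = full_season - pa
    return (actual_ratio * pa + presumed_ratio * missing) / full_season

# Gilliam's 432 PA leave little room for the presumption; Monday's 246 PA leave a lot.
```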

            Anyway:

(g)  The five most notable cases within this study in which a player played better at age 36 than at age 26 are (1) Ozzie Smith, (2) Eddie Joost, (3) Jim Gilliam, (4) Ellis Burks, and (5) Luke Appling.  Ozzie Smith hit .222 with a .550 OPS at age 26, but .285 with a .747 OPS at age 36.  Since four of these five players are middle infielders with no power (at age 26) but very good strikeout to walk ratios, that could provide a clue for additional study.

 

(h)  The five most notable cases in which a player played better at age 37 than at age 27 are (1) Jeff Kent, (2) Adrian Beltre, (3) Edgar Martinez, (4) Bill Bruton and (5) Gene Woodling.   Jeff Kent hit .278 with a .791 OPS at age 27, but .289 with an .889 OPS at age 37.

 

 

(i)     The five most notable cases in which a player played better at age 38 than at age 28 are (1) Ron Fairly, (2) Edgar Martinez, (3) Steve Finley, (4) Frank White, and (5) Davey Lopes.  Ron Fairly hit .220 with a .616 OPS in 1967 (aged 28), but .279 with an .827 OPS ten years later.  Even though he was in a much better hitter’s park in 1977 and had much more defensive value in 1967, that’s still a much better player at age 38 than at age 28.

 

(j)    The largest DECLINES in performance level between ages 26 and 36 were (1) Victor Martinez, 2005 to 2015, (2) Sammy Sosa, (3) Tony Oliva, (4) Todd Helton, and (5) Chase Utley. 

 

 

(k)  The largest declines in performance level between ages 27 and 37 were (1) Reggie Jackson, 1973 to 1983, (2) Albert Pujols, (3) Ivan Rodriguez, (4) George Sisler, and (5) Magglio Ordonez.

 

(l)     The largest declines in performance level between ages 28 and 38 were (1) Victor Martinez, 2007 to 2017, (2) Brooks Robinson, (3) Albert Pujols, (4) Barry Larkin, and (5) Pee Wee Reese.

 

  

Although the data is inadequate to reach the systematic conclusions that I hoped to reach, there are some things that are apparent.   Even these, however, are more troublesome than helpful.  The conclusions which can be drawn tend to undermine the method as much as they support it.

It is clear, for example, that the decline rates are unusually small at the outset of the steroid era, 1990 to 2002.  This is not unexpected, of course; we know that steroids did in fact help many players stave off the effects of aging.  The data, while very limited, is sufficient to measure an obvious truth in this case.

            But a small decline rate is supposed to indicate a small improvement in the quality of play.  Is this true?  Well, no; that's not what it indicates at all.  The steroids—while I am strongly opposed to the use of steroids by baseball players—did not make the quality of competition worse.  They damaged the GAME, but not the quality of competition.  The small decline rate in that era doesn't indicate what it is supposed to indicate, at all.

            World War II. . . there it does work.  There is a very small decline rate for 1933-1943, 1934-1944, and 1935-1945.  Obviously that IS instructive about the (notoriously poor) quality of play in that era.  There are a bunch of older players succeeding during World War II because the younger men are off fighting the war.  That checks. 

            But the expansion era.   If the theory is true that an increase in the performance of older players indicates weakness in the league, then the years 1961-1962 (compared to 1951-1952) should show low rates of decline, just as the World War II era does.   But in fact, the decline rates in those years are HIGH, not low.  

            The data, to the extent that any conclusions can be drawn from it, does not support the theory that a rapid increase in the quality of competition would be accompanied by a larger than normal decline rate for aging players.   It’s just not there. 

            I’ll open this up for comment tomorrow.  Thanks for reading.

 

 
 

COMMENTS (13 Comments, most recent shown first)

KaiserD2
I decided to investigate changes in the age distribution of regulars, using just two data points: 2018 and 1982 (picked pretty much at random). Using the AL for both years, I looked at the top 9*15 offensive players in plate appearances for the former year and 9*14 for the latter, in other words, trying for nine regulars per team. I broke the ages down into three-year bands and looked at the percentage of players in each band in each year. The results showed a change.

In 1982 4% of the players were 20-22. In 2018, 3% were.

In 1982, 17% of the players were 23-25; in 2018 it was 22%. That surprised me because I think it takes too long to get players to the majors nowadays.

For 26-28 (prime) it was 28% for 1982 and 33% for 2018. So, in 1982, adding up, 49% of all offensive regulars were 28 or under. Last year it was 58%.

There were more 29-31 year olds in 1982 (27% to 22%), more 32-34 year olds (16% to 10%), and essentially the same number over 35 (11 in 1982 with the smaller league, 12 last year). So based on this very small sample, very old players are remaining regulars at the same rate, but more mediocre players are losing their regular jobs at ages 29-34, and more players 23-28 are earning jobs. Those trends make sense given what we know about performance.

It would be interesting to add some performance data (as well as more years, obviously). I might do the latter going back further in time but I don't think I'll be adding the performance data any time soon.

David K
3:57 PM May 25th
 
evanecurb
To those who suggest using appearances, at bats, innings, etc. by age as measures: I think these numbers are skewed due to economic factors. Young players are paid less, have more potential for future development, and are a better value for the same production. That doesn't mean they are necessarily better players as a group than their older cohorts, but it does mean they are likely to get more opportunities given the same level of ability.
10:57 AM May 24th
 
willibphx
wdr1946, I am with you. If the "younger" or "newer" players are of higher quality, would this not show up in an accelerated decline in the number of ABs/IPs by players as they age? It's a much cruder measurement, but it might be enlightening to see the percentage of increase or decrease in playing time each year for each age. This would not answer a detailed player-level question about quality decline while they are in the majors, but it would measure how well they hang on to their jobs, which is a derivative of the question.
6:31 AM May 23rd
 
wdr1946
Wouldn't a simpler way of approaching this question simply be to see what percentage of at bats/innings pitched were by players over 34 (or whatever age), and how this changes over time? This would involve far less heavy lifting of the data, and, by definition, anyone in the Major Leagues is considered to be of Major League quality. My guess is that this would show that a higher percentage of players now and in the recent past are older than in the 1800s or early 1900s.
1:12 AM May 23rd
 
bjames
This is not true. This is not what the study says, at all. ALL 26 to 28 year olds were included in the study in one way or another, whether they were still playing ten years later or not.


11:36 AM May 22nd
 
KaiserD2
As Bill makes clear, because of the way he designed the study, it tells us what happens to very good players as they age. The vast majority of the players in the study were very good or great in their mid-20s which is the only reason they were still playing ten years later. His figures show they were usually only a little better than average then but that's good enough to play, particularly if you have a great reputation.

I think it would be at least as interesting (and probably pretty easy for Bill) to do a study as follows: break down 26-28 year olds in a given year by their Win Shares winning percentage (or WAA or whatever overall measurement you want to use) and then ask, for each group (superstar, star, average, below average), how many of them were still regulars 3 years later? 6 years later? 9 years later? That would be very important information for teams or fans to know.

The point about steroids is an important one and illustrates that in that era, players could become and then remain great at a much more advanced age than ever before. I wish Hall of Fame voters had noticed this, but they clearly haven't.

David Kaiser
7:50 AM May 22nd
 
evanecurb
I think the issue with the study might be the point to point methodology. Instead of measuring 2008 to 2018 in isolation, you may have to follow every 27 year old each year in order to get an accurate picture of decline.
7:39 AM May 22nd
 
FrankD
We could compare the age distribution in MLB with the age distribution in other sports. Have the age distributions of Olympians, sprinters, lugers, etc., changed with time? Many sports are 'just' physical ability whereas others are both physical ability and experience. We could compare baseball with boxing, football, etc., as to age statistics.
11:02 PM May 21st
 
FrankD
Has the age entry level of MLB changed through time? Perhaps an overall age distribution each year would yield something useful. We know that athletic performance declines with age. Would an overall shift in age distribution throughout MLB history, if it exists, show us anything?
10:55 PM May 21st
 
hotstatrat
One of the things that has helped modern players be of higher quality than their predecessors is the level of fitness they have attained. We know so much more now about strength and conditioning than we used to. These advances probably help the older athletes proportionally more than younger athletes.

And, I'm sure players today don't smoke as much as players did in the past - not that it seemed to hinder Henry Aaron's later years.

So, yeah, there would be a lot of noise in such a study, but it is a subject I am very interested in.
4:18 PM May 21st
 
formersd
I think the key challenge to this problem is survivorship bias. As Bill pointed out, it's not that the players who don't make it to 38 suddenly lose all their ability; it's just that their ability dips below the threshold that keeps them as major league players. We know that over 10 years, the number of players who can still hold major league jobs drops, so we know the aging effects are real, but quantifying those at a deeper level is difficult since we only have the survivors left and they likely don't represent the broader trend.

As for the expansion era, I think the other significant change happening at the same time as expansion is the dramatic increase in the number of black and Hispanic players. This improvement in the depth of the talent pool should offset much of the expected negative impact of an expanded player pool.
11:30 AM May 21st
 
StatsGuru
I am trying to understand the original hypothesis, that increasing quality of play increases the speed of decline due to aging. I take it that the idea is that the increase in quality comes from younger players entering the league who are better than the previous generation, say ten years before. So on top of getting old, the previous generation is also facing tougher competition than when they entered the league.

Often the increase in competition makes everyone better, so I don't see why the above would be the case. I would think players in their 30s would train harder to keep playing, and slow their declines.
8:47 AM May 21st
 
evanecurb
Hey Bill,

This is a problem worth pursuing. Would it work better if you tried five year decline rates instead of ten years? Measure the difference between performance at ages 26-28 vs. the same hitters at ages 31 to 33. It would give you a bigger sample size to work with.
7:30 AM May 21st
 
 