
I. The Strikeout Push Effect

April 30, 2012

                     Do you know why strikeouts, over time, only go up?

                Strikeouts, over time, always increase, for this reason.    Strikeout pitchers are more effective than pitchers who don’t get strikeouts, therefore teams are always looking for pitchers who can get more strikeouts, and also looking to deploy those pitchers they have in such a way that they will get the most strikeouts.  This effect would be offset by the tendency of teams to look for hitters who don’t strike out, if hitters who did not strike out were also better hitters.  However, hitters who strike out are generally not less effective than hitters who do not strike out; hitters who strike out are generally just as effective as or more effective than hitters who don’t strike out.   Thus, there is no pressure to find hitters who don’t strike out.    This asymmetry pushes strikeout totals higher over time.

                It occurred to me in January that I should be able to measure this asymmetrical effect, and thus measure the upward pressure on strikeouts at any point in baseball history.    This can be measured in the following way:

                1)  Measure the extent to which the high-strikeout pitchers are more effective than the low-strikeout pitchers.

                2)  Measure the extent to which the high-strikeout hitters are less effective than the low-strikeout hitters (if they are), or measure whatever effect there is there.

                3)   Combine these two measurements into one. 
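The three steps above can be sketched in code. This is a minimal sketch, not the actual study: the input format (dicts with a strikeout rate and a runs rate) and the field names `k_rate`, `rc27`, and `ra9` are my assumptions; only the quartile-and-difference logic follows the text.

```python
# Sketch of the three-step method. Field names are hypothetical; only
# the top-25%-vs-bottom-25% differencing follows the article.

def push_effect(seasons, rate_key, value_key, higher_is_worse=False):
    """Gap in effectiveness between the top 25% and bottom 25% of
    player-seasons, ranked by strikeout rate.

    For batters (runs created, where more is better) the raw gap is
    returned. For pitchers (runs allowed, where more is worse), pass
    higher_is_worse=True so a positive result still means the
    high-strikeout group is the more effective one."""
    ranked = sorted(seasons, key=lambda s: s[rate_key])
    q = len(ranked) // 4
    low, high = ranked[:q], ranked[-q:]
    avg = lambda group: sum(s[value_key] for s in group) / len(group)
    gap = avg(high) - avg(low)
    return -gap if higher_is_worse else gap

def combined_push(batters, pitchers):
    """Step 3: add the batter and pitcher effects together."""
    return (push_effect(batters, "k_rate", "rc27")
            + push_effect(pitchers, "k_rate", "ra9", higher_is_worse=True))
```

With the 1916-1918 figures quoted later in the article (batters 4.20 vs. 4.72 runs created, pitchers 3.09 vs. 3.73 runs allowed), this yields -0.52 + 0.64 = 0.12.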

                I’m going to jump right to the conclusion here.  I started this study in an optimistic frame of mind, thinking that the study might show us that we have finally reached an equilibrium, where there was no more upward pressure on strikeouts, or a near-equilibrium, where the upward pressure on strikeouts was minimal.


                I don’t know if you know this, but strikeouts, which generally only increase over time, have increased enormously in the last eight years.    In 2003 the major league average was 6.4 strikeouts per nine innings pitched.   In 2011 it was 7.3 strikeouts per nine innings.   Rarely or never before have strikeouts increased at such a rapid pace.

                I thought I had seen signs that this era of rapidly increasing strikeouts might have reached an end.    Unfortunately, if my study is correct, then. . .no such luck.

                Let’s go back to the beginning.   My study covered the seasons 1916 to 2011, a 96-year period.    Let’s take the hitters in the first three years of that period, the years 1916 to 1918.   In those years there were 373 batters who had 300 or more plate appearances (373 batter/seasons.)   Suppose we divide those into three groups:  the top 25% in strikeouts per at bat, the middle 50%, and the bottom 25%.

                In that era (1916-1918) the hitters who struck out the most were the least effective hitters.   There was little power in the game; the hitters who struck out the most could not compensate for that by hitting home runs, because there just weren’t very many home runs.   In that era the high-strikeout hitters created 4.20 runs per 27 outs, on average.   The mid-range strikeout hitters created 4.23 runs per 27 outs, whereas the low strikeout hitters—who included Ty Cobb, Tris Speaker, Joe Jackson, Eddie Collins and George Sisler—created 4.72 runs per 27 outs.  The "strikeout push" effect of this can be measured at negative .52 (4.20 minus 4.72).

                On the other hand, the high-strikeout pitchers of that era—again studying only those who faced 300 or more batters in a season—allowed an average of 3.09 runs per nine innings.    That, again, is the top 25% in strikeouts per inning.   The mid-range 50% allowed 3.57 runs per nine innings, and the bottom 25% in strikeout rates allowed 3.73 runs per nine innings.

                The runs allowed rates for pitchers are lower than those for hitters because we are only studying the regulars and near-regulars.    If we included the batters with 100 plate appearances and the pitchers who pitched 25 innings, the runs allowed rate for the pitchers should be the same as the runs created rate for the batters, but that’s not important right now.

                Anyway, the high-strikeout pitchers had a runs-allowed rate of 3.09; the low-strikeout pitchers had a runs-allowed rate of 3.73.   That’s a "strikeout push" effect of 0.64 (3.73 minus 3.09).   If we add together the strikeout push effect of the batters (negative 0.52) and the strikeout push effect of the pitchers (0.64), the total is 0.12.   The "Strikeout Push Effect" for the years 1916 to 1918 is 0.12. 

                That’s a very low figure; I’ll get ahead of myself and tell you that that’s a very, very low figure; it creates only minimal upward pressure on strikeouts.   If we included more factors and if we had included the part-time players and the failed pitchers, a Strikeout Push Effect of only 0.12 might actually equate to a DOWNWARD pressure on strikeouts.   I believe that it would.

                At any rate, strikeouts did go down.  In 1916 there were 9,525 strikeouts in the major leagues; in 1922, in basically the same number of games, there were 6,915 strikeouts in the majors (as best we know; the standard of record-keeping in that era is so poor that the league strikeout totals for hitters don’t jibe with the strikeout records for pitchers, often by several hundred strikeouts.)   Our method shows that in that era there was little upward push on strikeouts, and strikeouts were going down.

                OK, we’re going to say that the strikeout push effect for 1917 was 0.12, and I’m going to run through this again to make sure you understand what that means:           





High Strikeout Batters             4.20 runs created per 27 outs

Low Strikeout Batters              4.72 runs created per 27 outs

Batter Strikeout Push              -0.52

High Strikeout Pitchers            3.09 runs allowed per 9 innings

Low Strikeout Pitchers             3.73 runs allowed per 9 innings

Pitcher Strikeout Push             +0.64

Combined Strikeout Push Effect     +0.12




                This number began to creep upward.  In the years 1922 to 1924 the Strikeout Push Effect was +0.34—still very low.   In the late 1920s, however, this number exploded.   Following the example of Babe Ruth, more and more hitters began to swing hard, risking strikeouts to get some home runs.    In 1926 (1925-1927) the Combined Strikeout Push Effect was 0.52.   In 1929 (1928 to 1930) it was 1.57.

                We could have predicted, then, that strikeouts would increase, and they did.   In 1930 there were 7,931 strikeouts in the major leagues.   In 1940 there were 9,056—still less than there had been in the Walter Johnson era.

                In 1918 Ty Cobb, Joe Jackson and Tris Speaker were in the low-strikeout group, tending to push strikeouts down.   In 1930 Babe Ruth, Hack Wilson and Jimmie Foxx were in the high-strikeout group, tending to push strikeouts up.   If the best hitters are striking out a lot, you don’t worry about strikeouts, and when you don’t worry about strikeouts, strikeouts go up.

                After the 1930 era, the upward pressure on strikeouts began to abate gradually.   By 1935 the Push Effect was down to 1.13 (from its high of 1.57); in 1941 it was down to .84, and in 1947 it was back down to .44.   The strikeout curve flattened out.     In 1940 there had been 9,056 strikeouts in the major leagues; in 1949 there were 8,956.   Still less than in the Walter Johnson era.

                The three great hitters of the 1940s—Musial, DiMaggio and Ted Williams—were guys who DIDN’T strike out much.   If you look back to the 1920s and early 1930s, the league leaders in batter strikeouts were Babe Ruth, Hack Wilson and Jimmie Foxx.  In the mid-1940s they were Vince DiMaggio, Pat Seerey and Chet Laabs.

                Soon, however, another generation of muscle men/strikeout guys emerged.   Ralph Kiner led the National League in strikeouts in 1946, Hank Sauer in 1948, Duke Snider in 1949, Gil Hodges in 1951, Eddie Mathews in 1952.  Among the American League leaders were Larry Doby and Mickey Mantle.  The good hitters were striking out again.

                Breaking it down further, in 1935 the Strikeout Push Effect was +1.22 for pitchers, but negative .09 for hitters, thus +1.13 total.   In 1947 the Push Effect was +.89 for pitchers, but negative .45 for hitters, thus +0.44 combined.   By 1956 it was +.70 for pitchers, negative .09 for hitters, the net effect edging back up to +0.61.   In 1959 it was +0.71; in 1962 it was +1.04.   There was strong upward pressure on strikeouts once again.

                The redefinition of the strike zone in 1963 came in the context of a game in which strikeouts were already increasing, and there was already strong upward pressure on strikeouts.

                To break into our story with a note of caution, I have been assuming here that the Strikeout Push Effect is predictive.    I haven’t actually proven that it is predictive.   It is possible that the effects I am measuring for 1962 are not predicting what will happen in the game in 1963-1972, but reflecting what has happened in 1953-1962.  It’s possible.   I am assuming that it is predictive because

                a)  I designed it to be predictive in theory, and

                b)  It is obvious that the increases in the Strikeout Push Effect do in fact track with the changes in the strikeout rate, that when the Strikeout Push Effect goes up, strikeouts go up.

                But the chicken-and-egg question is mathematically a much harder question, and I haven’t actually gotten into that.    Just don’t want to mislead you on that issue.

                 OK, in 1962 the Strikeout Push Effect was at 1.04, the highest it had been since 1938.   In 1965 it was 1.23.    Despite the lowering of the mound after the 1968 season—or perhaps because of it—the Strikeout Push Effect continued to ascend, up to 1.30 in 1971 and 1.41 in 1977, the highest it had been since 1930.

                From the mid-1970s until the early 1990s, the Strikeout Push Effect declined, but remained high by historical standards.   By 1991 the Strikeout Push Effect was down to +0.88—but +0.88 is not a low figure.   +0.88 indicates that there is still significant upward pressure on strikeouts.

                Strikeouts per game increased only from 5.2 per game in 1977 to 5.6 in 1992, a relatively modest increase.   As baseball entered the steroid era, however, strikeouts began to increase more rapidly.   By 2001 we were up to 6.7 strikeouts per game.

                In 2004 major league baseball had 6.6 strikeouts per nine innings—but a Strikeout Push Effect of +1.64.   The Strikeout Push Effect of 2004 was at an all-time high.

                And, as would be predicted by this theory, strikeouts did in fact explode after 2004.   The last eight years have had not only historic numbers of strikeouts, but historic increases in the rate of strikeouts. 

                Where are we now?

                Well. . .it ain’t pretty.  Since 2004 the Strikeout Push Effect has dropped slightly but steadily, down to a present figure of +1.47.   In the years 2009 to 2011, high-strikeout hitters created 5.04 runs per 27 outs, while low-strikeout hitters created only 4.93 runs per 27 outs.  On the other hand, high-strikeout pitchers allowed only 3.65 runs per 9 innings, while low-strikeout pitchers allowed 5.00.

                I would have much preferred to find that the upward trend in strikeouts was leveling off.   Unfortunately, that does not appear to be the case.    High-strikeout pitchers in today’s game are dramatically more effective than low-strikeout pitchers, while high-strikeout batters are also somewhat more effective than low-strikeout batters.   We are where we have always been, only worse.   Strikeouts, in my opinion, will continue to go up.


Trayvon III

                I was thinking about something else related to the Trayvon Martin case.   Remember that guy who showed up in the middle of the circus, placing a "bounty" on George Zimmerman?   Why wasn’t that guy arrested the next day?

                It’s obviously a crime, right? . . . making a terroristic threat.  If I offered a bounty for your murder, I would assume that I would be immediately arrested, and if you offered a bounty for my murder, I would hope that you would be immediately arrested.   Why wasn’t this guy arrested?

                Two reasons, I think.   First, the authorities didn’t want to expand the scope of their problems.   They’ve already got a shitstorm on their hands; they don’t need a second one.

                But that’s not a good reason, when you think about it, because of the example it sets to ignore the threat.  We’re going to have other controversies in the future, in which people should have been arrested but aren’t.  It’s pretty obvious that the precedent set by ignoring this kind of thing is more dangerous than the problem created by expanding the parameters of the immediate shitstorm.   I don’t think that’s the main thing.

                The main thing, I think, is they didn’t want to make that guy any bigger than he was.   The guy was obviously trying to use the media and the publicity associated with the Martin/Zimmerman case to make a name for himself.   If you arrest him, then he becomes the issue.

                If you can arrest him and put him away for ten years, OK, you’d probably go ahead and do that.   But. . .again, I’m guessing here. . .but I’m guessing that if he is arrested for making a terroristic threat, it’s probably theoretically possible to put him away for ten years but it’s probably not going to happen in the real world.   In the real world if you arrest him, he probably gets $25 million worth of publicity in exchange for three months in jail, negotiates some sort of a plea, and he’s back on the streets and a bigger problem now than he was before.   I’m guessing that’s why it wasn’t done.

                History may prove that that was a bad decision.   This guy isn’t going to go away; the next time there’s an opportunity, he’s going to be back in the middle of it, making himself an issue, making the problem a little worse.   It may prove, in the long run, that it would have been better to arrest him and start the clock rolling on the long-term prison sentence that lurks at the end of the block.   We’ll see.


The Perfect Age Study

                Back to baseball.   I do studies all the time that I don’t get time to write up.   I did these two studies in January, never got time to write about either of them until now.   I have 200 unpublished studies, will never get time to catch up.

                Let us say that the perfect age for a baseball player is 27 or 28.   (By the way, in February I had a meeting with a guy from England who I gather is the world’s foremost soccer researcher.  One of the questions I asked him was what was the peak age for a soccer player.   "27", he said immediately, and then explained that it varies with the position; goalkeepers peak later because they don’t have to run as much, and certain types of players peak earlier because all they do is run around frantically, but basically. . .27.)

                Anyway, let’s say that the peak age for a baseball player is 27 or 28, and let us say that a player has an "age score" which is:

                100 if he is 27 or 28,

                4 points less than 100 for each year that he is younger than 27, and

                2 points less than 100 for each year that he is older than 28.

                A player at 25 or 32, then, has an "age score" of 92; a player at age 20 or 42 has an "age score" of 72, while a player who is 17 or 48 has an age score of 60.   Jamie Moyer has an age score of 56, meaning that he is not really at the perfect age for a baseball player.
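As a sanity check, the scoring rule above is easy to express as a function. This is just a direct transcription of the 4-point/2-point rule, assuming whole-year ages:

```python
def age_score(age):
    """The "age score" described above: 100 at the peak ages of 27-28,
    minus 4 points per year under 27, minus 2 points per year over 28."""
    if age < 27:
        return 100 - 4 * (27 - age)
    if age > 28:
        return 100 - 2 * (age - 28)
    return 100
```

Running it reproduces the examples in the text: `age_score(25)` and `age_score(32)` are both 92, and `age_score(20)` and `age_score(42)` are both 72.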

                In 1879 Monte Ward, who was 19 years old, won 47 games in the National League.   We can take this as an indicator of the quality of the league.   When you have 19-year-olds dominating the league, that indicates that the quality of the league could be a little weak.   It is among the common indictments of baseball in World War II that the game was played by teenagers and old men, and it is among the complaints about expansion that the 1963 Houston Colts had ten teenagers who played for them (two of whom were Rusty Staub and Joe Morgan, and the other eight of whom were not.)   When the quality of baseball goes down, the average age score goes down.

                This applies also to levels of baseball; if you strung out the minor leagues, the perfect age score would be higher in AAA than in AA, higher in AA than in A ball, higher in High A than in Low A, higher in Low A than in rookie ball, etc.   The Perfect Age score would be higher in college baseball than in high school.

                We can figure the perfect age score for each season in baseball history by simply multiplying each player’s plate appearances by his age score, and each pitcher’s batters faced by his age score, and finding the weighted average.   We not only could; I actually have.   Maybe I’ll start by presenting the data: 
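The weighted average just described might be sketched like this. The field names, and the idea of pooling batters (weighted by plate appearances) and pitchers (weighted by batters faced) into one list, are my assumptions about the bookkeeping, not a description of the actual study:

```python
def perfect_age_score(players, age_score):
    """League-wide Perfect Age Score: each player's age score, weighted
    by playing time (plate appearances for batters, batters faced for
    pitchers), averaged across the league."""
    total = sum(p["weight"] for p in players)
    return sum(age_score(p["age"]) * p["weight"] for p in players) / total
```

A league made up half of 27-year-olds and half of 20-year-olds, equally weighted, would score (100 + 72) / 2 = 86.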



                All of those numbers look a lot alike, don’t they?    I set up the system that way for a reason.   It is part of a system to measure the quality of a league at every level, not simply the relative quality of the majors in 2011 as opposed to 1950.    We could make a perfect age score for a league that would place major league baseball on the same scale with T-Ball.   Assuming that T-Ball kids are five, their "age score" would be 12.   A downside of doing it that way, however, is that the numbers all look alike.   We can "correct" for this by setting the third-lowest score on this chart equal to zero, and the third-highest score equal to a hundred.   That makes the patterns easier to see, and I’ll try to help that by marking "up" numbers in blue and "down" numbers in green:



                1)  The "Perfect Age Score" started out very low, and remained relatively low throughout the 19th century, although increasing steadily.

                2)  The number surged forward in 1900, when the National League disbanded four teams, cutting from a 12-team league to an 8-team league, and then dropped back down in 1901, when the American League opened for business.  

                3)  After a high in 1904 the numbers declined substantially, for reasons that I don’t understand.

                4)  The perfect age score reached its all-time peak in 1919-1921.  The players in that era were clustered more closely around ages 27-28 than at any other time in baseball history—still.  I don’t know why that happened, but it was a very dynamic situation, with the coming of Babe Ruth, the banning of the spitball, the interruption of the game due to World War I and the expulsion of the Black Sox and the other corrupt players.  

                5)    After 1920 the numbers declined steadily due, I believe, to economic forces.  Booming attendance drove salaries up, so that aging players could make more money as 35-year-olds than they had made in their prime seasons.   This kept a lot of older players in the game.

                6)  This is perhaps the key point:  the anomalies associated with World War II were tremendously overstated by the media in that era, and have been exaggerated by historians ever since.   The perfect age score did go down in World War II—barely.  The figure for 1945 was the lowest since 1913, but the general impression that baseball in the war years was played by teenagers and old men is just wrong.   The perfect age score was higher in 1943 than it was in 1928, 1929, 1930, 1931, 1932, 1933, 1940 or 1941.   It only really declined in 1945, the final year of the war, and even then the change is not all that notable.

                7)  After shooting upward after World War II, the number declined again due largely to the Whiz Kids, the Philadelphia National League champions of 1950, and the Bonus Baby Wars of the post-war era, which brought into the majors a substantial number of very young players.

                8)   The Perfect Age Score dropped sharply in 1960-1961 due to the influence of the "Kiddie Corps" in Baltimore.   In 1960 the Baltimore Orioles, managed by Paul Richards, used a starting rotation of Milt Pappas (aged 21), Steve Barber (21), Chuck Estrada (22), Jack Fisher (21), Jerry Walker (21) and one other, older pitcher.   They led the American League in ERA.   This made other managers, for some years, dramatically more willing to put 21-year-old pitchers on the mound. 

                9)  The Perfect Age score dropped further after expansion, reaching its lowest point since World War II, and then recovered in the late 1960s.

                10)  The numbers went steadily but slowly upward from the mid-1960s until the mid-1990s, reaching the highest peak in the mid-1990s since the early Babe Ruth era.

                11)  The steroid era changed aging patterns, kept older players in the game, and thus drove the numbers sharply lower, down to their lowest point since the 1960s.

                12)  Since the banning of steroids the Perfect Age Score has moved up.   The 2011 figure was the highest since the mid-1990s.


COMMENTS (13 Comments, most recent shown first)

It is interesting to consider the strikeout push effect in light of the recent theories of pitching that minimize the number of outcomes that a pitcher can control. My assumption is that high strikeout pitchers do not walk more batters than low strikeout pitchers, or yield more home runs. Therefore, the difference in effectiveness between high strikeout and low strikeout pitchers more or less adds up to their difference in strikeouts.

What you have happening is that high strikeout pitchers are more valuable than low strikeout pitchers when league-wide strikeout rates are high. Bill, you've organized your data into strikeout pitchers in the top 25% and bottom 25%, I'm sure for the sake of simplicity. But it is not true that these groups differ equally in strikeouts in different eras. Let's imagine an extreme case, where strikeouts were as rare as triples. If that were the case, the different effectiveness of the groups, the runs allowed, would depend very little on the strikeouts themselves. So, it seems to me there's a self-perpetuating element. As the skill to strikeout batters increases, it becomes a more and more important skill. And with strikeout pitchers not offering a countervailing downside, the path is set for strikeouts to be ever increasing. In case the point is not clear, the difference in raw strikeouts between high and low groups in a 7 SO per 9 IP league is very different than the difference in raw strikeouts between high and low groups in a 3 SO per 9 IP league. The former will naturally show a larger push effect.

For simplicity's sake, I said that strikeout pitchers and non-strikeout pitchers only differ and always have differed only in terms of strikeouts, but I'm sure this is not true. I'd be really interested to see the concept behind the strikeout push effect applied to walks. We all know of Ryan and Feller, who issued many walks at points in their careers. Has this model of pitcher become less frequent? You could take the top 25% in terms of strikeouts per inning, and the bottom 25% in terms of strikeouts per inning, and compare their walk rates in different years.

Another question is whether high strikeout pitchers really have worse control when they issue a higher rate of walks than low strikeout pitchers. Specifically, what is the real meaning of the strikeout to walk ratio? Looking at walks and strikeouts from the standpoint of the batter, a guy like Darryl Strawberry did not have good strike zone judgment. Yet he drew a fair number of walks because his swings and misses kept him at the plate long enough for the pitcher to occasionally throw him four balls. Power pitchers must face the same impediment, right? Just by not getting contact, they must risk more walks. I do notice that walks in high strikeout times do not seem to increase along with strikeouts, so there is presumably an opposite pull that keeps walks down even as strikeouts increase.
7:32 AM May 9th
What would be the age score for the Kids and Kubs league?
7:57 PM May 2nd
If you spread it out to a 10/5 system you lose levels of play. If you spread it out to a 10/5 system every level of baseball below college is the same; they're all zero.
6:15 PM May 2nd
Interesting how baseball before 1894 (maybe using the pitching mound distance as an era boundary has something going for it) is *always* below 70, and it is only below 70 in the 1909-1913 era afterwards. Oh yes, 1880s baseball was so major league.
9:28 PM May 1st
SB, 3B peak earlier, and HR, BB peak later. In some eras, speed is more in abundance, and in other eras HR is more in abundance. It would seem therefore that for each type of player, there's a perfect age, which is a lot of work to do. (Vince Coleman's perfect age is probably a lot younger than David Ortiz's perfect age.) Or as a first step, simply take each year's actual average age, consider that the "marketplace perfect age", and see how far off everyone is from there using the 4/2 system (or really, you can spread it out to a 10/5 system).
10:19 AM May 1st
A normal career peaks now at 27 or 28, but a normal career peaked earlier in 1880. We're talking about serious athletes who took care of themselves, using the standard nutrition and training regimen of the time, not the syphilis and delirium tremens crowd.

Your numbers suggest that a normal career peaked later in the early 2000s because of new "advances" in nutrition. I suspect that that number will rise again once we figure out how to gain the advantages of steroids without the health risks.

I'm just saying that 27 to 28 is not a fixed number. Peak performance was probably different for the Cro-Magnons, too. We're still evolving.
7:15 AM May 1st
But I'm not interested in what the ideal age was in 1876, if the "ideal age" was lowered by substandard nutrition or excessive drinking or whatever. 27-28 is the peak age if you take care of yourself and have a normal career. That's the only thing that's relevant.
9:51 PM Apr 30th
Perhaps 27 or 28 was not the ideal age for a ballplayer in the 1870s or 1880s. If nutrition has improved over the years, and training regimens have improved over the years, and the general population has grown several inches taller over the years, then it seems likely that the ideal age has changed as well. It makes sense to me that the best ballplayers were 25 or 26 then. The fact that older players dropped out of the National League sooner in 1880 is not evidence that the level of competition was lower. The thirty-year-olds were not leaving the National League because they were going to play a more competitive level of baseball elsewhere. They were leaving because they were broken down, and the kids could play better.
9:32 PM Apr 30th
Regarding the issue of whether baseball is skewing older or younger. . .since I had the data well organized to do this, I looked at that issue, and, as Trailblazer suggested, it is extremely helpful.
First, baseball started out young (1876), and got dramatically younger in its first four years. In 1876 the average age of a pitcher was 24.4 years; of a batter, 25.8. By 1880 these figures had dropped to 22.8 and 24.8. I don’t know that I fully understand these changes, but the league was small, and we could suggest that it was attracting its OWN talent for the first time, as opposed to holding on to the talent from the National Association.
From 1881 to 1907 baseball got constantly and very significantly older, the average age of both a pitcher and a hitter increasing almost every year. By 1907 the averages were 28.0 for a pitcher and 28.5 for a batter.
I mentioned that I didn’t understand the low “Perfect Age Scores” post-1904. What ESSENTIALLY happened is that the game got much older in the early part of that era, probably because salaries were increasing rapidly after the breakup of the owner’s collusion in the 1890s; when salaries increase rapidly the average age of the game generally gets older, as aging players fight hard to stay in the game. From 1901 to 1906 the average age of a player increased from 27.3 to 28.1, which is a large increase in five years even in the context of constant increases.
From 1907 to 1913 the average age dropped remarkably, from 28.20 to 26.62. I would suspect that the two things that were happening here were 1) the after-effects of the “salary/age-retention bubble” I mentioned before, and 2) it was a speed game. Teams stole 200 bases a year, which would tend to drive the average age down, perhaps. There is a REMARKABLE drop in the average age in this era, and I shouldn’t suggest that I understand it completely. There was also a bonus baby war about that time. . .you guys remember the “$10,000 beauty”?. . .which was ended by the agreements with the minor leagues signed about 1914, which allowed the minor league teams, which were independent at that time, to be the “first signers” for almost everyone entering professional baseball.
Anyway, after 1913 player ages resumed getting older, reaching a peak of 29.01 in 1927, in my opinion because of the explosion of salaries. The average age declined slightly from 1927 to 1935, reaching a low of 28.31 in 1935.
World War II skews very, very high; although there were some teen-agers in baseball, famously, almost all of the increase is in older players. The average age reached a record 29.06 in 1943, then jumped to 29.94 in 1945. 29.94 is still the record today.
After World War II baseball got continuously younger for 24 years. In 1949 the average was down to 28.6, in 1957 to 28.3, in 1961 to 27.8, in 1965 to 27.5, and in 1969 to 27.20. The figure for 1969 was—and is—the lowest since 1917. There were several things happening here. After World War II there was a generation of “replacements” –the players born 1918-1925—that was decimated by the War, and it took a long time to fill in the gaps from that missing generation, which caused the average age to hold in the 28.2-28.4 range for a long time. The effects of that were overcome in the late 1950s, causing a “young” trend, which of course was emphasized by the massive expansion from 16 teams in 1961 to 24 teams in 1969. More on this later.
After 1969 baseball got steadily older, aging from 27.20 in 1969 to 28.70 in 1985. For about ten years then the numbers held steady (slightly lower than the 1985 figure, which is a bit of an aberration), but then, in the steroid era, the average age went way up, from 28.67 (1997) to 29.27 (2005).
Since the end of the steroid era the game has gotten much younger, with a 2011 average of 28.47, the lowest since 1991 (28.42). This is the third-strongest “youth surge” in baseball history, behind only the 1907-1913 change, which I don’t fully understand, and the end of World War II.
Now let’s look at the question of batter age vs. pitcher age. From the beginning of baseball to 1924, batters were always older than pitchers—every year, without exception. This is kind of instructive, when you think about it: one reason people over-value pitchers, I think, is that pitchers are more prone to get hurt, which creates a shortage of them, so you always have to work harder to replace them, which makes them appear to be more crucial to the success of the team than they actually are.
Anyway, beginning in 1925 the PITCHERS were older than the hitters for two generations, from 1924 to 1954. Since 1954 the age ratio has shifted back and forth, but batters have GENERALLY been older. Batters were older in 1955, pitchers in 1956, batters from 1957 to 1962, pitchers from 1963 to 1966, batters from 1967 to 1973, pitchers from 1974 to 1977, batters from 1978 to 1986, pitchers from 1987 to 1993 except 1989, and batters from 1994 to the present except 2005 and 2007.
Two final points. First, I had clearly over-estimated the role that the “Kiddie Corps” in Baltimore played in the age anomalies of the very early 1960s. In fact, as the data shows, it was batters who were getting dramatically younger in those days—pitchers too, but batters more than pitchers.
Second, there was some sort of a huge fluke in 2007. In 2006—unless I have some odd data glitch—the average age of a pitcher (weighted per innings pitched, of course) was 28.56. In 2007 it was 29.47.
If you think about it, that’s almost impossible. If every pitcher pitched the same number of innings in 2007 that he had in 2006 and there were NO new pitchers, the average age would jump by 1.000 years. It jumped by .91 years. It’s hard to see how that is possible. There must have been some strong comebacks by several aging pitchers. The average age of a pitcher then dropped from 29.47 to 28.12 in two years, presumably as the Randy Johnson/Greg Maddux/Tom Glavine/Roger Clemens generation retired.
Thanks. . ..Bill

4:32 PM Apr 30th
Just fixed the big empty boxes. Sorry about that.
1:50 PM Apr 30th
To quote the legendary Crash Davis: "Strikeouts are boring ...and they're fascist." I wouldn't quite call 'em boring, but baseball with all strikeouts all the time is boring. I'm in favor of some slight rule changes about bat handle thickness, as Bill has suggested, to ease things back the other way a bit.
1:19 PM Apr 30th
I'm getting big empty boxes in the middle of the article, which makes following the details a little trickier.
12:43 PM Apr 30th
A skewness measure, that would indicate whether the age disparity was old or young, would help to understand some of the effects. 1965 and 66 have the first and third lowest scores since WWII, just as the first Baby Boomers have a chance to become major league regulars.
7:14 AM Apr 30th