What is the value of a stolen base, to each major league team?
Not all major league teams are the same, of course; there are some teams that, with a man on first and one out, should just sit back and play for a big inning, and there are some teams who might be better off trying to steal. Since we now have data on how often each major league team scores given each base/out situation—the 24 States Analysis in the Statistics section—I thought I would look at the question of whether those teams which SHOULD steal bases most often, actually DO steal bases most often.
On the two ends of the spectrum are the Detroit Tigers and the Chicago White Sox. The Tigers scored 1.09 runs per inning when they had a man on first and none out, and 1.22 runs per inning when they had a man on second and none out, so they apparently gained only .13 runs by a stolen base in that situation. They apparently gained even less by stealing second with one out, increasing their expected runs from .58 to .68, and gained virtually nothing by stealing with two out, increasing from .24 to .29. Overall, and assuming the three situations are equal, the Tigers appeared to gain only .091 runs by stealing second base, increasing the average of the three from .638 to .729.
The White Sox, on the other hand, appeared to have huge gains from getting the runner to second, increasing their expected runs from .80 to 1.38 with no one out, .62 to .80 with one out, and .23 to .36 with two out. No outs, one out, or two outs, the White Sox apparently gained far more from a stolen base than would the Tigers. Overall, again assuming the three are equal, the White Sox increased their expected runs, by stealing second, from .549 to .846, a gain of .297. The White Sox' apparent gain on a stolen base attempt was more than three times larger than the Tigers'.
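The averaging above can be sketched in a few lines. The state values are the ones quoted in this article, and the equal weighting of the three out situations is the same simplifying assumption made here:

```python
# Average run gain from moving a runner from first to second, using the
# 2008 run-expectancy figures quoted in the article, and weighting the
# none-out, one-out, and two-out states equally (a simplification).

def avg_gain(first, second):
    """Return (avg runs with man on first, avg with man on second, avg gain)."""
    gains = [s - f for f, s in zip(first, second)]
    return sum(first) / 3, sum(second) / 3, sum(gains) / 3

# (none out, one out, two out) expected runs with a man on first / on second
tigers_first, tigers_second = (1.09, 0.58, 0.24), (1.22, 0.68, 0.29)
sox_first, sox_second = (0.80, 0.62, 0.23), (1.38, 0.80, 0.36)

for team, first, second in [("Tigers", tigers_first, tigers_second),
                            ("White Sox", sox_first, sox_second)]:
    before, after, gain = avg_gain(first, second)
    print(team, round(before, 3), round(after, 3), round(gain, 3))
```

Run as written, this reproduces the article's figures: roughly .638 to .730 (a gain of .093, which rounds against the unrounded source data to the article's .091) for the Tigers, and .550 to .847 (a gain of .297) for the White Sox.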
Then we look at the COST of a stolen base attempt, should the runner be thrown out. The Tigers, losing the base runner, go from
1.09 to .30 with no one out,
.58 to .12 with one out, and
.24 to zero with two out.
An average loss of .498 runs. The Tigers, then, have an average gain of .091 runs when they steal a base, but an average loss of almost half a run when they are caught stealing. For a stolen base attempt to pay off, then, the Tigers would need to be successful about 85% of the time (.498 divided by the sum of .091 and .498).
The White Sox, on the other hand, go down when the runner is caught stealing from
.80 to .34 with no one out,
.62 to .12 with one out, and
.23 to zero with two out.
An average loss of .397 runs. The White Sox, then, have a gain of .297 runs against a loss of .397 runs, so they could apparently profit from a stolen base attempt if the attempt was successful just 57.2% of the time (.397 divided by .694).
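A quick way to see where the 85% and 57.2% figures come from: an attempt breaks even when the success rate times the gain equals the failure rate times the loss, so the break-even rate is loss / (gain + loss). A sketch using the averages above:

```python
# Break-even stolen base success rate: an attempt pays off when
#   p * gain >= (1 - p) * loss,  i.e.  p >= loss / (gain + loss).
# Gains and losses are the article's three-state averages.

def break_even(gain, loss):
    """Minimum success rate at which a steal attempt breaks even."""
    return loss / (gain + loss)

print(round(break_even(0.091, 0.498) * 100, 1))  # Tigers: 84.6, i.e. ~85%
print(round(break_even(0.297, 0.397) * 100, 1))  # White Sox: 57.2
```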
The question I was trying to get to was: Do those teams which SHOULD run most often, by this method, ACTUALLY run most often?
No. Not at all; the method simply does not work. It does not predict actual stolen base attempts, at all, at least with 2008 data.
There are obvious flaws in the method. The biggest problem is that we are assuming that these measured outcomes are real numbers: that, because the Tigers DID score 1.09 runs per inning when they had a man on first and none out, they could EXPECT to score 1.09 runs per inning when they had a man on first and none out. In reality, the season is not long enough to provide reliable data for each of the 24 states. The Tigers had 351 situations in which they had a runner on first and none out, and only 103 situations in which they had a runner on second and none out. To get a stable measurement of the actual relationship between the two, you'd have to have at least 2,000 game situations in each group.
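The sample-size point can be illustrated with a toy simulation. The run distribution below is invented (it is NOT the Tigers' actual inning-by-inning data); the only point is that an estimate of runs per inning built from 103 innings bounces around far more than one built from 2,000:

```python
# Toy simulation of sampling noise in a 24-states run-expectancy estimate.
# The outcome distribution is a made-up mix of inning results (assumption,
# not real team data); we measure how much the per-inning average varies
# across repeated "seasons" of 103 vs. 2,000 innings.
import random

random.seed(1)
outcomes = [0, 0, 0, 1, 1, 1, 2, 4]  # hypothetical runs scored in an inning

def estimate(n):
    """Mean runs per inning from a sample of n innings."""
    return sum(random.choice(outcomes) for _ in range(n)) / n

def spread(n, trials=500):
    """Standard deviation of the estimate across repeated simulated seasons."""
    means = [estimate(n) for _ in range(trials)]
    mu = sum(means) / trials
    return (sum((m - mu) ** 2 for m in means) / trials) ** 0.5

print(round(spread(103), 3))   # spread with 103 innings: noticeably large
print(round(spread(2000), 3))  # spread with 2,000 innings: far tighter
```

With a sample of 103 innings the estimate wanders by more than a tenth of a run in either direction, which is about the size of the Tigers' entire apparent .091 gain; at 2,000 innings the noise shrinks to a fraction of that.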
That’s the biggest problem, but there are others. I assumed that the three out situations (none out, one out, two out) were of equal importance, which is not EXACTLY true although it is fairly close, and I assumed that there were no other stolen base situations, which of course is not exactly true either. And by using full-season data, we’re implicitly assuming that the manager knows on opening day what his team’s performance will be by the end of the season.
Still, I would have guessed, going into this study, that there would be some relationship between how often teams SHOULD attempt to steal, measured in this way, and how often they do attempt to steal. There is none . . . well, there’s an inverse relationship, but there is no positive relationship. By this method, the four teams which should have been most willing to risk a base runner in a stolen base attempt were the White Sox, the Braves, the Diamondbacks and the Padres, in that order. In fact, those four teams finished 25th, 27th, 28th, and 30th in the majors in the number of stolen base attempts.
What do we take from this, other than the fact that our method doesn’t work? What I take from it is this: the number of stolen bases you have tends to vary with how much speed you have, rather than with how much speed you need. The Padres, for example, could have profited from stolen base attempts succeeding at just a 62% rate, which is a very low break-even point. But your base stealers are usually your shortstop and your outfielders. For the Padres, that’s Khalil Greene, Brian Giles, Jody Gerut and Chase Headley, none of whom can run. The Padres had a major-league-low 53 stolen base attempts.
Which is not necessarily a bad thing. They could have generated SOME runs by stealing bases—a few. A dozen, a couple of dozen. They missed the playoffs by approximately 250 runs. If Chase Headley develops into a hitter with a .400 on base percentage, which he may do, that’s a lot more significant than stealing a few bases.