I do not believe in using FIP, as I believe pitchers can (or at least some can) control their pitching effectiveness. Hence, I have previously used Baseball Reference's pitching WAA and WAR values.

Whitey Ford: https://www.baseball-reference.com/p...fordwh01.shtml

Zack Greinke: https://www.baseball-reference.com/p...reinza01.shtml

I noticed that in 2009 Greinke had one of his huge years, specifically a 2.16 ERA and a 205 ERA+ in 229.33 innings. He led the league in ERA+. I also noticed that in 1958 Ford had a 2.01 ERA and a 177 ERA+ in 219.33 innings (and also led the league in ERA+). Both had huge years, and I assumed Greinke would have the higher WAA, with Ford perhaps 1.5 WAA lower.

I was absolutely stunned to see that Ford had a WAA of 2.6 while Greinke had a WAA of 8.2, a figure 3.15 times Ford's. How could that be? I could not understand it. Luckily, Baseball Reference shows its math, and in the end I would be stunned again by what that math suggests.

Here are some critical pieces of information for each player:

Whitey Ford: 219.33 innings, 62 total runs allowed, 49 earned runs allowed (13 unearned)

Zack Greinke: 229.33 innings, 64 total runs allowed, 55 earned runs allowed (9 unearned)

So I decided to work out how many more runs Whitey would have needed to prevent to match Zack.

In 2009, Zack had an RAA (runs better than average) of 73, leading to a WAA of 8.2 (8.9 RAA per WAA). In 1958, Whitey had an RAA of 22, leading to a WAA of 2.6 (8.5 RAA per WAA).

To match Zack's WAA, Ford must close a gap of 8.2 - 2.6 = 5.6 WAA. At 8.5 RAA per WAA, that is 5.6 * 8.5 = 47.6 RAA, so Ford's target is 47.6 + 22 = 69.6 RAA. Alternatively, 22 RAA * 3.15 = 69.3 RAA. Either way, Ford needs to be worth about 69.5 RAA.
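The conversion above can be sketched in a few lines of Python. The figures come straight from the post; the small gap versus the 69.6 quoted above is just the rounding of 8.46 RAA/WAA up to 8.5:

```python
# Both routes to Ford's target RAA, using the post's numbers.
ford_waa, greinke_waa = 2.6, 8.2
ford_raa, greinke_raa = 22, 73

raa_per_waa_ford = ford_raa / ford_waa            # ~8.5 RAA per WAA in 1958
raa_per_waa_greinke = greinke_raa / greinke_waa   # ~8.9 RAA per WAA in 2009

# Route 1: close the 5.6-WAA gap at Ford's own exchange rate.
target_raa = ford_raa + (greinke_waa - ford_waa) * raa_per_waa_ford

# Route 2: scale Ford's RAA by the WAA ratio (algebraically the same thing).
target_raa_alt = ford_raa * (greinke_waa / ford_waa)

print(round(target_raa, 1), round(target_raa_alt, 1))  # 69.4 69.4
```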

RAA=(Innings pitched/9)*(RA9avg-RA9).

RA9avg is defined as "our best estimate of what an average pitcher would do against these opponents, with this defense and in these parks." RA9 is that pitcher's runs allowed per nine innings, i.e., ERA plus the unearned runs per nine. I have a serious issue with including unearned runs, as those are usually the fault of the defense, but that is an argument for another time.

Mathematically, RA9avg is defined as RA9avg = (PPFp/100) * (oppRA9 - RA9def + RA9role).

Here PPFp is the park factor customized for the parks the pitcher actually threw in, RA9def is the runs per 9 IP of support from the defense (Total Zone before 2003, DRS after), and RA9role is the runs-per-9-IP adjustment between starting and relieving (applied from 1960 onward). For Ford in 1958, PPFp was 92.7, RA9role was 0.00, and RA9def was 0.36 (the Yankees had a very good defense).
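The post quotes PPFp, RA9def, and RA9role for Ford's 1958 but not oppRA9, so this sketch back-solves oppRA9 from the site's 3.44 RA9avg (an inferred value, not one given in the post), then plugs it back in to confirm the formula reproduces that baseline:

```python
# Back-solve the one missing input, then verify the formula round-trips.
ppfp = 92.7     # park factor for the parks Ford actually pitched in
ra9def = 0.36   # runs/9 of defensive support (positive = good defense)
ra9role = 0.0   # starter/reliever adjustment; not applied before 1960
ra9avg_site = 3.44

# RA9avg = (PPFp/100) * (oppRA9 - RA9def + RA9role), solved for oppRA9:
opp_ra9 = ra9avg_site / (ppfp / 100) + ra9def - ra9role

ra9avg = (ppfp / 100) * (opp_ra9 - ra9def + ra9role)
print(round(opp_ra9, 2), round(ra9avg, 2))  # 4.07 3.44
```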

So a little reverse algebra on that equation: 69.5 RAA = (219.33 IP / 9) * (RA9avg - RA9). Plugging in the site's 3.44 for RA9avg and solving gives RA9 = 0.59.

That means Whitey Ford could allow only 0.59 runs per nine innings to match Greinke. 0.59 * 219.33 / 9 is about 14.4 runs in a season. Subtract the 13 unearned runs (which are assumed to be out of his control), and we find that Whitey Ford would be allowed only 1.4 earned runs all season to match Zack Greinke's "value".
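The reverse algebra, as a sketch. Carrying full precision gives ~14.3 total runs rather than the 14.4 you get from the rounded 0.59, which changes nothing about the conclusion:

```python
ip = 219.33        # Ford's 1958 innings
ra9avg = 3.44      # the site's average-pitcher baseline for his context
target_raa = 69.5  # RAA needed to match Greinke's 8.2 WAA
unearned = 13      # Ford's 1958 unearned runs

# target_raa = (ip/9) * (ra9avg - ra9), solved for ra9:
ra9 = ra9avg - target_raa / (ip / 9)   # ~0.59 runs allowed per 9 IP
total_runs = ra9 * ip / 9              # ~14.3 total runs all season
earned_runs = total_runs - unearned    # ~1.3 earned runs all season

print(round(ra9, 2), round(total_runs, 1), round(earned_runs, 1))
```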

Think about what that is saying: Whitey Ford would be a worse or less valuable pitcher than Zack Greinke if he allowed more than 1.4 earned runs in 219.33 innings.

I've ranted on asymptotic limitations in the past, and this is a perfect example. To match Zack Greinke's great WAA value, Whitey Ford would have to allow 1.4 earned runs over a full season, an ERA of about 0.06.

Please check my math to make sure I'm not wrong. If I'm not, I'm convinced that pitching WAA and WAR are complete bunk, and I'll never use them again for anything other than perhaps comparing values within a single season. This is such a seriously outrageous imbalance in the ratings that I have a hard time believing no one has ever looked at the math and the asymptotic issues before. Pitchers from low-scoring eras literally cannot mathematically match the WAA and WAR of pitchers fortunate enough to be born in high-scoring eras. No wonder guys from the 1900s and 1970s are so underrated, and pitchers from hitting eras so overrated, by WAA and WAR.

wOBA counts unintentional walks but not intentional ones.

I don't understand the logic for doing so.

Doesn't an intentional walk have at least some value?


Think of it like a teenager whose wallet can only hold ten bills. In an ideal world, that unemployed student would want ten $100 bills in there, but $100 bills are hard to come by. If the kid were paid $10 per hour in cash, he would obviously prefer to be paid 4 x $20 at the end of each day rather than 80 x $1, since he can only carry ten bills at any one time. In this scenario, the higher-denomination bills are worth more to the kid than merely their face value. That's sort of what I'm looking at here.

Since the government can print whatever denominations it wants (including the creation of new ones), perhaps a better analogy would be a man with a horse and cart, hauling bricks of precious metals. Each brick weighs the same - humor me - but he would obviously rather have more platinum, gold and silver bricks on the cart than he would bronze, copper or tin(?). Think of that cart, or that wallet, as more than the sum of the parts.

In the same way, I am trying to brainstorm a method of valuing individual player seasons in which higher-value seasons receive an additional boost to account for the scarcity of such a season happening.

My problem, then, becomes how to properly value each "level" of performance relative to the others. So far, the best way I can figure is a dollars-per-WAR conversion. I had considered the Fibonacci sequence, but I believe that spirals unrealistically high very quickly. I also didn't want the marginal value from one level to the next to be uniform and incremental, and dollars-per-WAR may be too close to that, but I dunno.
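As a brainstorming aid, here is a sketch comparing how fast three candidate weighting schemes escalate. All three functions are hypothetical illustrations, not established methods; the convex exponent is an arbitrary free parameter. It makes the Fibonacci objection concrete and shows a mild power curve as a middle ground between flat dollars-per-WAR and runaway growth:

```python
# Three hypothetical ways to "scarcity-boost" a season's value.
def linear(war):
    # Plain dollars-per-WAR: every marginal win is worth the same.
    return war

def convex(war, k=1.5):
    # Mild superlinear boost; the exponent k is a free parameter to tune.
    return war ** k

def fibonacci(war):
    # The rejected option: weight the nth WAR level by the nth Fibonacci number.
    a, b = 0, 1
    for _ in range(war):
        a, b = b, a + b
    return a

for war in (2, 4, 6, 8, 10):
    print(war, linear(war), round(convex(war), 1), fibonacci(war))
```

By 10 WAR the Fibonacci weight (55) has already dwarfed the convex weight (~31.6), which is the "spirals unrealistically high" problem in numbers.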

Does anyone have any suggestions for ways to look at this problem?

At the same time, strikeouts are also at record highs. If the season ended today, this year would already rank fourth all-time in total strikeouts, and barring a very strange final week, the single-season record will fall as well. MLB players are striking out 8.25 times per game, 2.7% higher than last year (the old record), which was itself 4.2% higher than 2015. In fact, the strikeouts-per-game record has been broken every season since 2008, when the record stood at 6.77; it has gone up 22% since.
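That 22% figure checks out against the two quoted per-game rates (a quick sanity check, not new data):

```python
# Cumulative growth in strikeouts per game since 2008, per the rates above.
k_now = 8.25   # strikeouts per game, this season
k_2008 = 6.77  # strikeouts per game, 2008 (the then-record)

growth = k_now / k_2008 - 1
print(round(growth * 100, 1))  # 21.9
```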

None of this is news to anyone on this board, I'm sure. I'm just wondering: what are the thoughts here about why?

The Save Ruined Relief Pitching. The Goose Egg Can Fix It.