ERA was defined the way it was for a reason, and it wasn't just that starters threw a lot of complete games. If a guy comes in, pitches three innings, and allows 1 earned run, his ERA is 3.00. Why? Because his contribution, prorated out to nine innings, works out to 3 earned runs for the entire game. If everyone else pitched exactly as well as he did, the team would allow 3 earned runs for the game, so if they allow something other than 3 runs, we know where to assign credit or blame. Since the total number of earned runs allowed for the game is what's ultimately important, it makes sense that a pitcher's earned-run rate would be scaled to the length of the game.
Originally Posted by winningtheweapon
1 ER over 3 IP would give an ERA7 of 2.33. In words, that means allowing 1 earned run over 1/3 of the game is, on a per-inning basis, equivalent to allowing 2.33 earned runs over 7/9 of the game. Why exactly do you think that's more intuitive than ERA (allowing 1 earned run over 1/3 of the game is, on a per-inning basis, equivalent to allowing 3 earned runs over the whole game)?
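The prorating in both cases is the same one-line calculation, just scaled to different game lengths. A minimal sketch (the `era` helper and its `game_length` parameter are my own names, not anything from this thread):

```python
def era(earned_runs, innings, game_length=9):
    # Prorate earned runs allowed to a full game of `game_length` innings.
    # Standard ERA uses 9; the proposed ERA7 would use 7.
    return earned_runs * game_length / innings

# 1 earned run over 3 innings pitched:
print(era(1, 3))                        # standard ERA: 3.0
print(round(era(1, 3, game_length=7), 2))  # ERA7: 2.33
```

Either way the per-inning rate being measured is identical (1/3 of a run per inning); only the scale factor changes.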
"Hall of Famer Whitey Ford now on the field... pleading with the crowd for, for some kind of sanity!"