The earned run average is one of the most venerable and cited figures in baseball, a sport renowned for its love of statistics. The ERA dates back to the earliest days of the sport, though 19th century pitchers, standing much closer to home plate, had low, low ERAs that today’s hurlers can only dream about. Simply put, to calculate a pitcher’s ERA, multiply the total number of earned runs he has allowed by 9, and then divide by the total number of innings he’s pitched in the current season.
Some Runs Don't Count
The earned run average represents the number of earned runs a pitcher gives up, on average, over nine innings on the mound. Only runs that score because of the batting team's actions count as earned runs. Any run attributable to an error by the fielding team or a catcher's passed ball doesn't count against a pitcher's ERA, though interestingly, a run that scores on the pitcher's own wild pitch is earned. A pitcher who gave up four earned runs over seven innings in one game, and then one earned run over nine innings in the next, would have an ERA of (4 + 1) x 9 / (7 + 9) = 2.81.
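The calculation above can be sketched in a few lines of Python; the function name era is just for illustration and is not a standard library routine:

```python
def era(earned_runs, innings_pitched):
    """Earned run average: earned runs allowed per nine innings pitched."""
    return 9 * earned_runs / innings_pitched

# Worked example from the text: 4 earned runs over 7 innings in one game,
# then 1 earned run over 9 innings in the next.
print(round(era(4 + 1, 7 + 9), 2))  # 2.81
```

Note that innings are totaled across appearances before dividing, rather than averaging each game's ERA separately.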
Statistics Don't Tell Everything
Though the ERA is designed to allow comparisons of pitchers across teams and leagues, several factors beyond the pitcher's control can influence his average. For example, fielders who are slower than their peers, or who cover less ground, will allow more hits (not charged as errors), leading to more runs and a higher ERA for the pitcher.