- Posts: 26
- Joined: Mon Dec 03, 2012 10:30 am
I would love to see a tweak to the game's formula for calculating "games missed" once a player is injured. It appears to me that the length of time a player misses is based on his ABs, not his games played or total plate appearances, either of which I believe would be a better and more realistic metric. The problem shows up when a player's raw AB total is low because he walked a bunch in a particular year. Case in point: I am currently playing the 1977 season and I have Reggie Smith. Reggie has an "injury 1" rating and played 148 of the Dodgers' 162 games in 1977, but he had only 488 ABs because he walked 104 times that year. In other words, he actually missed only 14 games in 1977 while accumulating 592 plate appearances. My 1977 league has played 96 games of our schedule at the time of this post, yet Reggie Smith has already been injured twice and will miss a total of 25 games, when he missed only 14 during the entire, REAL 1977 season.
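To make the gap concrete, here is a minimal sketch of the two approaches using Reggie Smith's 1977 line from above. The AB benchmark of 600 is purely an assumption for illustration (the game's actual internal formula is not published); the games-played version simply uses his real availability.

```python
# Reggie Smith, 1977 (figures from the post above)
TEAM_GAMES = 162
GP, AB, PA = 148, 488, 592

# Hypothetical AB-based durability: scale ABs against an assumed
# "full season" AB benchmark of 600. This is a guess at how an
# AB-driven engine might see him, not the game's real formula.
FULL_SEASON_AB = 600
ab_availability = AB / FULL_SEASON_AB      # ~0.81

# Games-played-based durability, as proposed in this post.
gp_availability = GP / TEAM_GAMES          # ~0.91

def expected_games_missed(availability, schedule=TEAM_GAMES):
    """Project games missed over a full schedule at a given availability."""
    return round(schedule * (1 - availability))

print(expected_games_missed(ab_availability))  # 30 games under the AB model
print(expected_games_missed(gp_availability))  # 14 games, matching real 1977
```

Under the assumed AB benchmark, an AB-driven model treats a high-walk player as roughly twice as injury-prone as he really was, which is consistent with the 25 games Smith has already lost in just 96 league games.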
If this change is made, one tweak that would need to be factored in is for the "period" leagues (90's, 80's) that include players' cards from strike-shortened seasons. I had a similar problem playing one season with Gorman Thomas' 1981 card. He played in 95% of the Brewers' games that year, but he was constantly getting hurt and missed substantial time (43 games) for me because, I believe, the computer was looking only at his raw ABs from that year when doling out time missed.
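The strike-season tweak above amounts to normalizing by the team's actual games that season rather than a fixed 162. A hedged sketch, with illustrative numbers (the post only says Thomas played 95% of the Brewers' games; the 109-game team schedule and 104 appearances below are hypothetical round figures, not verified stats):

```python
def availability(games_played, team_games):
    # Fraction of the team's actual schedule the player appeared in.
    # In a strike year, team_games is well under 162, so dividing by
    # 162 (or by a full-season AB benchmark) understates durability.
    return games_played / team_games

# Hypothetical 1981 line: 104 appearances in a 109-game team schedule,
# i.e. roughly the 95% availability the post describes.
thomas_1981 = availability(104, 109)        # ~0.95

# Projected over a full 162-game replay season:
missed = round(162 * (1 - thomas_1981))
print(missed)  # about 7 games, versus the 43 he missed in the sim
```

The key design point is that the denominator comes from that team's season, so a 1981 card is judged against 1981's shortened schedule instead of being penalized for ABs it never had the chance to accumulate.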
Hope that helps, thanks for reading.