adam77 wrote:
Avg-shots-per-kill is non-linear with respect to probability-of-kill (i.e. armour value). That's how 9% (increase in avg-kills-per-shot) turned into 25% (reduction in avg-shots-per-kill).
Isn't the linear scale more useful for intuitive comparisons of armour?
No. "Lies, damn lies and statistics" as Mark Twain put it. Intuitive doesn't make it right. The in-game effect of saving throws is, in fact, non-linear.
How much fire a unit can shrug off is not a function of the 100% scale that covers all possible results; it is a function of the inverse of the chance of a failed save. As the chance to fail drops, taking the same amount off the overall average cuts the chance to fail by a greater proportion - you're taking the same slice out of a smaller pie - so the inverse grows by a greater proportion as well.
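In formula form (my notation, not from the thread): write p for the per-shot chance of an unsaved wound and d = 1/6 for the step between adjacent save values.

```latex
% Shots-per-kill is the reciprocal of the per-shot kill chance p.
% Stepping p down by a fixed d = 1/6 multiplies shots-per-kill by p/(p - d),
% and that factor grows without bound as p shrinks toward d.
\[
  \text{shots per kill} = \frac{1}{p},
  \qquad
  \frac{1/(p - d)}{1/p} = \frac{p}{p - d},
  \quad\text{e.g. } p = \tfrac{1}{2},\ d = \tfrac{1}{6}:\
  \frac{1/2}{1/3} = 1.5 \text{, i.e. 50\% more fire.}
\]
```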
6+ versus no save is a 16.67-percentage-point difference in average kills per shot, but it takes 20% more firepower for the same number of kills.
5+ versus 6+ is another 16.67 points, but it takes 25% more firepower.
4+ versus 5+ is 16.67 points, but it takes 33% more firepower.
3+ versus 4+ is 16.67 points, but it takes 50% more firepower.
2+ versus 3+ is 16.67 points, but it takes 100% more firepower.
1+ versus 2+ is 16.67 points, but even an infinite amount of firepower cannot produce a kill.
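Here's a quick sanity check of the figures above, as a minimal Python sketch (mine, not from the original post):

```python
# Recompute, for each save value, the chance a wound gets through,
# the average kills per shot, and the average shots needed per kill.
from fractions import Fraction

def unsaved_chance(save):
    """Chance a d6 fails a save of 'save'+ (use 7 for 'no save')."""
    return Fraction(save - 1, 6)   # a 1+ save would give 0, hence infinite shots per kill

saves = [("none", 7), ("6+", 6), ("5+", 5), ("4+", 4), ("3+", 3), ("2+", 2)]

prev_shots = None
for label, save in saves:
    p = unsaved_chance(save)       # average kills per shot
    shots = 1 / p                  # average shots per kill (the reciprocal)
    extra = "" if prev_shots is None else f", {float(shots / prev_shots - 1):+.0%} more firepower"
    print(f"save {label:>4}: kills/shot = {float(p):.3f}, shots/kill = {float(shots):.2f}{extra}")
    prev_shots = shots
```

It prints 20%, 25%, 33%, 50% and 100% for the successive steps, matching the list above.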
If you are facing an enemy with a 3+ save instead of a 4+ in an actual game, what's more important? Do you care about the 16.67 points, or the fact that you have to put 50% more fire on the target to kill it? Obviously, the 50% more firepower is the important point - the only point, really.