3d10 + 0
When you roll three ten-sided dice, the result will likely be between 12 and 21 (usually around 17). The result will rarely be below 7 or above 26.
Obviously, there’s a bit of math involved in the calculator above, and I want to show you how it works. After that, I want to show you one application of the tool for D&D that’s gotten me pretty excited—the “Killable Zone”. First…
When you roll multiple dice at a time, some results are more common than others. For example, with 3d6, there’s only one way to get a 3, and that’s to roll all 1s. In contrast, there are 27 ways to roll a 10 (4+3+3, 5+1+4, etc). As it turns out, the more dice you add, the more the distribution of totals tends to resemble a normal distribution. This means that if we convert the dice notation to a normal distribution, we can easily create ranges of likely or rare rolls.
In case you don’t know dice notation, it’s pretty simple. It follows the format AdX + B, where A is the number of dice being rolled, X is the number of sides on each die, and B is a number you add to the result. So, if you’re rolling three ten-sided dice and adding zero, that makes A = 3, X = 10, and B = 0, or 3d10 + 0.
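If you want to play with this in code, a notation parser is only a few lines. Here’s a sketch (`parse_dice` is my own illustrative name, not anything from the calculator’s actual source):

```python
import re

def parse_dice(notation):
    """Parse 'AdX + B' dice notation into (num_dice, sides, modifier).

    Accepts an optional '+ B' or '- B' modifier, with or without spaces.
    """
    m = re.fullmatch(r"\s*(\d+)d(\d+)\s*(?:([+-])\s*(\d+))?\s*", notation)
    if not m:
        raise ValueError(f"not valid dice notation: {notation!r}")
    a, x = int(m.group(1)), int(m.group(2))
    b = int(m.group(4) or 0)
    if m.group(3) == "-":
        b = -b
    return a, x, b

print(parse_dice("3d10 + 0"))  # → (3, 10, 0)
```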
In order to find the normal distribution, we need to find two things: The mean (μ), and the standard deviation (σ).
The mean is the center of the distribution. It’s the value that rolls cluster around, because the totals closest to it have the most possible ways to come up. Here’s how to find the mean of a given dice formula:
mean = μ = (A × (1 + X)) / 2 + B = (3 × (1 + 10)) / 2 + 0 = 16.5
The standard deviation measures how spread out the results are. Roughly speaking, it’s the typical amount by which any given roll differs from the mean. Here’s how to find the standard deviation of a given dice formula:
standard deviation = σ = √(A × (X² − 1)) / (2 × √(3)) = √(3 × (10² − 1)) / (2 × √(3)) ≈ 4.975
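In code, both formulas are one-liners. Here’s a sketch (note that √(A × (X² − 1)) / (2 × √3) simplifies to √(A × (X² − 1) / 12)):

```python
from math import sqrt

def dice_stats(a, x, b=0):
    """Mean and standard deviation of AdX + B, per the formulas above."""
    mean = a * (1 + x) / 2 + b
    sd = sqrt(a * (x**2 - 1) / 12)  # same as sqrt(A*(X^2-1)) / (2*sqrt(3))
    return mean, sd

mean, sd = dice_stats(3, 10)  # 3d10 + 0
print(mean)          # → 16.5
print(round(sd, 3))  # → 4.975
```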
Now, you could put the mean and standard deviation into Wolfram|Alpha to get the normal distribution, and it will give you a lot of information. We don’t have to get that fancy; we can do something simpler.
Due to the 68–95–99.7 rule, for normal distributions, there’s a 68.27% chance that any roll will be within one standard deviation of the mean (μ±σ). So, for the above mean and standard deviation, there’s a 68% chance that any roll will be between 11.525 (μ−σ) and 21.475 (μ+σ).
Furthermore, there’s a 95.45% chance that any roll will be within two standard deviations of the mean (μ±2σ). Again, for the above mean and standard deviation, there’s a 95% chance that any roll will be between 6.550 (μ−2σ) and 26.450 (μ+2σ).
As you can see, it’s really easy to construct ranges of likely values using this method. If you’re rolling 3d10 + 0, the most common result will be around 16.5. About 2 out of 3 rolls will take place between 11.53 and 21.47. Only about 1 in 22 rolls will take place outside of 6.55 and 26.45.
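If you want to sanity-check those ranges yourself, a quick Monte Carlo sketch (plain Python, no dependencies) rolls 3d10 a hundred thousand times and counts how many totals land in each band:

```python
import random
from math import sqrt

random.seed(1)  # fixed seed so the sketch is reproducible
a, x, b = 3, 10, 0
mean = a * (1 + x) / 2 + b      # 16.5
sd = sqrt(a * (x**2 - 1) / 12)  # ~4.975

rolls = [sum(random.randint(1, x) for _ in range(a)) + b
         for _ in range(100_000)]
within_1sd = sum(mean - sd <= r <= mean + sd for r in rolls) / len(rolls)
within_2sd = sum(mean - 2 * sd <= r <= mean + 2 * sd for r in rolls) / len(rolls)
print(f"within one sd:  {within_1sd:.1%}")   # close to the predicted 68%
print(f"within two sds: {within_2sd:.1%}")   # close to the predicted 95%
```

The counts won’t match 68.27% and 95.45% exactly, both because of sampling noise and because dice only approximate a normal distribution, which brings us to the weirdness below.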
There are two bits of weirdness that I need to talk about.
First, I’m sort of lying. Only sums of 3 or more dice actually approximate a normal distribution.
For two dice, it’s more accurate to use the correct distribution—the triangular distribution. I’m using the normal distribution anyway because, eh, close enough. The results for μ±σ seem fine, even if the results for μ±2σ aren’t.
For one die, we’re dealing with the discrete uniform distribution, and all of these results are stupid. Maybe the mean is useful—maybe—but everything else is absolute nonsense.
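You can see that weirdness directly by enumerating every possible roll and checking how often the total lands within one standard deviation of the mean. Brute force is fine at this size (`exact_coverage` is just my illustrative name):

```python
from itertools import product

def exact_coverage(a, x):
    """Exact probability that AdX lands within one standard deviation
    of the mean, computed by enumerating all X**A outcomes."""
    mean = a * (1 + x) / 2
    sd = (a * (x**2 - 1) / 12) ** 0.5
    totals = [sum(dice) for dice in product(range(1, x + 1), repeat=a)]
    hits = sum(mean - sd <= t <= mean + sd for t in totals)
    return hits / len(totals)

for a in (1, 2, 3):
    print(f"{a}d10: {exact_coverage(a, 10):.1%} within one sd")
```

For d10s this prints 60.0% for one die, 70.0% for two, and 67.0% for three, against the 68.27% a true normal distribution would give—so one die really is a poor fit, while two and three dice are close enough for μ±σ.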
Secondly, I’m ignoring the “Round Down” rule on page 7 of the D&D 5e Player’s Handbook. I’m using the same old ordinary rounding that the rest of math does. This means that things (especially mean values) will probably be a little off. It might be better to round it all down to be more consistent with the rest of 5e math, but honestly, if things might be off by one sometimes, it’s not the end of the world.
This tool has a number of uses, like creating bespoke traps for your PCs. But, I want to show you the reason I made this in the first place:
In stat blocks, hit points are shown as a number and a dice formula. Most DMs just treat that number as “that’s how many hit points that creature has,” but there’s a more flexible and interesting way to do this.
The killable zone is defined as (μ−σ) – (μ+σ).
If your creature has 3d10 + 0 HP, the killable zone would be 12 – 21. Instead of a single static number that corresponds to the creature’s HP, it’s a range of likely HP values.
Once your creature takes 12 points of damage, it’s likely on death’s door, and can die. Most creatures have around 17 HP. The sturdiest of creatures can take up to 21 points of damage before dying.
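Putting the pieces together, the killable zone is only a few lines of code. Here’s a sketch (`killable_zone` is my name for it, and it uses ordinary rounding rather than 5e’s round-down rule, as discussed above):

```python
from math import sqrt

def killable_zone(a, x, b=0):
    """Killable zone for a creature with AdX + B hit points:
    (mu - sigma, mu + sigma), rounded to the nearest whole HP."""
    mean = a * (1 + x) / 2 + b
    sd = sqrt(a * (x**2 - 1) / 12)
    return round(mean - sd), round(mean + sd)

print(killable_zone(3, 10))  # → (12, 21)
```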
This allows you, as the DM, to easily adjust combat encounters on the fly, but in a rules-as-intended way. Combat going a little easy? A little too hard? This lets you know how much you can nudge things without it getting weird.
There’s a bunch of other things you can do with this, such as time when your creatures die for the best dramatic impact, or make a weaker-than-normal creature (or stronger) for RP reasons.
For example, let’s say you have an encounter with two worgs and one bugbear. Using this technique, you could RP one of the worgs as a bit sickly, and kill off that worg as soon as it enters the killable zone. The other worg you could kill off whenever it feels right for combat balance. And, you could RP the bugbear as hating one of the PCs, and when the bugbear enters the killable zone, you can delay its death until that PC gets the killing blow.
In closing, the Killable Zone allows the DM to quantify the amount of nonsense that can take place in the name of story without sacrificing the overall feel or tension of the encounter. This allows for a more flexible combat experience, and helps you avoid those awkward moments when your party’s rogue kills the cleric’s arch-rival. It can also be used to shift the spotlight to characters or players who are currently out of focus. To me, that seems a little bit cooler and a lot more flavorful than static HP values.