Consider this a horrifying primer in basic probability theory.
Since January 2013, there have been 39 shootings at K-12 schools in the U.S. If we assume a ten-month school year, that span covers 1.6 school years, or about 24 shootings per school year. According to the NCES, there were 132,183 K-12 institutions as of 2009 – 2010 (the last year for which data are available), which means the percentage of schools that experience a shooting annually is 0.018% (note that this is a percentage, so for the mathy types, p = 0.00018). The probability of not experiencing a shooting in a given year is 99.982% (p = 0.99982).
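For the spreadsheet-averse, here's a quick sketch of that arithmetic in Python, using the figures quoted above (rounding will wiggle the last digit a bit):

```python
# Back-of-the-envelope check of the annual per-school shooting probability.
shootings = 39        # K-12 shootings since January 2013
school_years = 1.6    # assuming a ten-month school year
schools = 132_183     # NCES count of K-12 institutions, 2009-2010

per_year = shootings / school_years   # about 24 shootings per school year
p_shooting = per_year / schools       # about 0.00018, i.e., 0.018%
p_no_shooting = 1 - p_shooting        # about 0.99982

print(f"{per_year:.0f} shootings/year, p = {p_shooting:.5f}")
```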
Suppose a family has two kids three years apart. This would mean their kids (one or both) would be in the K-12 system for sixteen years. The probability that this family would not experience a shooting during those sixteen years is 0.99982^16, or 99.71%. Put the other way around, the chance that they would experience one is about 0.29%, or roughly 3 in 1,000.
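And the same kind of sketch for compounding that annual probability over sixteen years of exposure:

```python
# Probability the hypothetical two-kid family gets through sixteen
# years of K-12 without their school experiencing a shooting.
p_no_shooting = 0.99982
years = 16

p_family_clear = p_no_shooting ** years   # about 0.9971, i.e., 99.71%
p_family_hit = 1 - p_family_clear         # about 0.0029, roughly 3 in 1,000

print(f"{p_family_clear:.2%} clear, {p_family_hit:.2%} not")
```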
Now, I’m not a complete idiot. Shootings aren’t randomly distributed: Georgia and Tennessee, relative to their populations, are fucking free-fire zones. There are probably a bunch of factors that make shootings non-randomly distributed, though income doesn’t seem to be one of them. Hopefully, the frequency of shootings drops on its own (clearly, we’re not going to do anything directly about it). So let’s say I’m off when it comes to your family (which, of course, means someone else’s family…), and it’s only 1 in 1,000 instead of 3 in 1,000. It’s still too damn high.
Let’s conclude with some Charles Pierce:
There are more flowers in more places and there is no peace in sight, because we have chosen as a country to slake our appetite for it with blood. The dead are not honored in this war. Only the instrumentality of their murder is, god help us all.
More like God damn us all, but your mileage may vary.