This post is the first in what will be an occasional series trying to identify trends in the statistical data I’ve gathered in the course of doing this blog. In today’s post, I answer some frequently asked questions.
1. Is it easier for small towns to achieve a high live release rate than for big cities? I did a statistical analysis of this question on my previous blog, comparing the average population of the successful 90%+ communities to the average population of U.S. jurisdictions overall, and found that the 90%+ communities I listed actually had higher populations than the national average. What people often overlook is that big cities are vastly outnumbered by small cities and towns, so if population size had no influence at all on shelter success, we would expect to see many more successful small communities than large ones. A rough sketch of that logic follows below.
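For readers who want to see the logic spelled out, here is a minimal sketch in Python; the jurisdiction counts and the size of the successful list are invented for illustration, not taken from my data:

```python
# Illustrative sketch only: the jurisdiction counts and list size below are
# made-up placeholders, not the blog's actual data.

# Hypothetical number of U.S. jurisdictions in each population bucket.
# The real point is simply that small communities vastly outnumber large ones.
jurisdictions = {"under 25k": 17000, "25k-100k": 1800, "over 100k": 300}
total = sum(jurisdictions.values())

# If population size had no effect on shelter success, a list of N successful
# communities should roughly mirror these proportions.
n_successful = 200  # hypothetical size of a 90%+ list
for bucket, count in jurisdictions.items():
    expected = n_successful * count / total
    print(f"{bucket}: expect ~{expected:.0f} of {n_successful} successful communities")

# Under the no-effect assumption, the overwhelming majority of successful
# communities should be small towns, so a list whose average population is
# higher than the national average suggests big cities are not at a
# built-in disadvantage.
```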
2. How long will it be before all the shelters in the U.S. get to a 90% live release rate? I had to laugh when I saw a recent article stating, apparently in all seriousness, that at the current rate of success it would be 500 years before all shelters in the U.S. achieved the 90%+ mark. The author of the article, Pat Dunaway, tried to extrapolate the future date from the existing number of 90%+ shelters and the years since the first 90%+ shelter came into existence, but she made one crucial mistake: she assumed that the number of shelters achieving 90%+ grows as a straight line over time, which is wildly inaccurate. The growth is not quite exponential, but it is far faster than linear. The truth is that if one wanted to extrapolate a date for all shelters to be at 90% or more based on historical data, one would have to fit a growth curve to that data; simple adding, subtracting, multiplying, and dividing aren’t going to do it. We would also have to have identified all the 90%+ shelters currently in existence, which we aren’t even close to having done yet. File this one under “Embarrassing Mistakes Made By People Who Are Out Of Their Depth.”
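To make the difference concrete, here is a rough sketch comparing a straight-line projection with a constant-percentage-growth projection; the yearly counts and the total number of shelters are invented for illustration only:

```python
# Illustrative sketch only: the yearly counts below are invented to show why a
# straight-line extrapolation wildly overstates the time to reach all shelters.
import numpy as np

years = np.arange(2001, 2013)                                    # hypothetical reporting years
counts = np.array([1, 1, 2, 3, 5, 8, 13, 20, 32, 50, 78, 120])   # hypothetical 90%+ shelters
target = 3000                                                    # hypothetical total shelters

# Linear extrapolation: fit counts ~ a*year + b and solve for counts == target.
a, b = np.polyfit(years, counts, 1)
linear_year = (target - b) / a

# Exponential extrapolation: fit log(counts) ~ c*year + d, i.e. counts grow by
# a roughly constant percentage each year, and solve for counts == target.
c, d = np.polyfit(years, np.log(counts), 1)
exp_year = (np.log(target) - d) / c

print(f"Linear fit reaches {target} shelters around {linear_year:.0f}")
print(f"Exponential fit reaches {target} shelters around {exp_year:.0f}")
# With growth anywhere near exponential, the projected date is decades away,
# not centuries, which is why a 500-year figure doesn't hold up.
```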
3. Speaking of historical data, what municipal shelter in the U.S. was the first to achieve a 90%+ live release rate, and when did it occur? The short answer is that we don’t know, because shelters often don’t announce it to the world when they are successful, and even when they do report, we do not generally have independent verification. Among all the shelters I’ve studied where data is available, though, the first shelter to report a 90%+ save rate appears to have been Otsego County, Michigan, where credible reports give 1999 as the year they achieved a 90%+ live release rate. A related question is which shelter has the longest winning streak, meaning the longest period of time up to the present with a documented live release rate of 90% or more. Otsego County is also a contender in this category, with a documented streak going back to 2007 (they may very well have been at 90% or above every year since 1999, but were not able to supply me with statistics from 2001 through 2006 when I inquired). The Charlottesville-Albemarle SPCA, which has full statistics posted on its website, also shows a 90%+ live release rate going back to 2007.
4. Why do so many highly successful shelters hide their data? I’ve been told several times by officers of highly successful shelters that they do not want to publicize their success because they are afraid that people from other jurisdictions will drop off animals in their community or try to surrender animals to their shelter. The unfortunate effect is that many highly successful shelters are not getting the recognition they deserve.
5. Is there one secret to success used by all the shelters listed on the blog? The short answer is “no.” Some shelters achieve high live release rates through high adoption rates, while others depend almost entirely on out-of-state transfers. Some shelters work independently, while others could not succeed without their rescue partners. I frequently speak with shelter officials, and it’s rare for one of them to tell me that the shelter follows any particular program. On the other hand, many of the shelters I write about have characteristics in common, including a hardworking, dedicated director and lots of community engagement.
6. Why should we trust statistics that the shelters themselves report? My answer is that we should not blindly trust any statistics a shelter provides, because at this point there is no industry standard for how shelter statistics are collected and presented, nor any way to independently verify their accuracy. We should instead treat the statistics a shelter provides as just one tool for evaluating that shelter. That’s why it concerns me when I see people mischaracterize my blog as a list of “no kill” communities or shelters. It is no such thing. It is a list of shelters that report saving 90% or more of intake.
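As an illustration of how much the lack of a standard matters, here are two formulas that are both commonly described as a “live release rate”; the numbers are hypothetical and neither calculation is an official standard:

```python
# Illustrative sketch only: two commonly seen ways of computing a "live release
# rate." Neither is an official industry standard, which is exactly the problem.

def lrr_of_intake(live_outcomes: int, intake: int) -> float:
    """Live outcomes as a share of all animals taken in."""
    return live_outcomes / intake

def lrr_of_outcomes(live_outcomes: int, total_outcomes: int) -> float:
    """Live outcomes as a share of all outcomes (excludes animals still in care)."""
    return live_outcomes / total_outcomes

# Hypothetical year: 1,000 animals taken in, 940 live outcomes, 30 still in
# care at year end, and 30 euthanized or died in care.
print(f"vs. intake:   {lrr_of_intake(940, 1000):.1%}")   # 94.0%
print(f"vs. outcomes: {lrr_of_outcomes(940, 970):.1%}")  # ~96.9%

# Same shelter, same year, two different "live release rates" -- which is why
# a reported percentage is only one tool for evaluating a shelter.
```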
7. What relationship does the list on this blog have to the list posted on the No Kill Advocacy Center website? The NKAC cites my blog (with some additions and deletions) as the source for a list of communities that they call the “90% Club.” I am not in contact with the NKAC and I’m not responsible for any claims that the NKAC and its officers make about the number of successful communities in the U.S. If you doubt that there are 500 cities and towns saving 90% or more of shelter animals, take it up with them, not me! That said, there are certainly more communities at a 90%+ live release rate (possibly far more) than I’ve identified so far.