A Winter Wildfire, Really?

By Kolby Hoagland | January 24, 2013

In a Jan. 22 article from USA Today, “Drought and heat triggers pre-wildfire season jitters,” I was surprised to learn that a wildfire that ignited in October continues to smolder in northern Colorado, despite a layer of snow blanketing the area. Last year’s fire season and this rare winter fire have “experts across the country bracing for what could be yet another smoke-filled summer” in 2013.

Historically, consecutive years of drought in the Rocky Mountain region and, more broadly, the western U.S. have led to an uptick in the acreage destroyed by wildfires. Since 1960, while the annual number of wildfires has decreased, the total acreage burned each year has on average increased (1). Wildfires are becoming larger and more devastating (2).

In the interactive map and graphs seen below and linked here, fire seasons from 2002 to 2012 are broken down by the location, size and cause of the wildfires. Depending on the prevailing climatic conditions, the number of wildfires and total acreage burned vary from one fire season to the next; hotter, drier years produce more destructive fires. One trend, however, is consistent across fire seasons: the leading cause of wildfires. From 2002 to 2012, lightning strikes were the greatest cause of wildfires in every fire season, igniting on average four times as many fires as campfires, the second-leading cause.

Forest fires are a natural and important part of our forest ecosystem; however, their destruction has become greater in recent years. Understanding the threat of wildfires, and where forest restoration efforts could effectively mitigate the potential for colossal damage, is important not only to the communities in these areas but also to bioenergy developers, for the potential feedstock that forest restoration would produce.


[Interactive map and graphs: 2002-2012 Fire Seasons]