Furthermore, the electric grid itself has large temporal variations by hour, day of the week, and season. Some plants run continuously as "baseload" generation, while others come on-line only as needed to meet peak demand. In some locations the baseload generation is relatively "clean" (hydro and nuclear) and the peak load is relatively dirty (coal); in other locations it is exactly the opposite. Wind, solar, geothermal, and gas also play their roles to varying degrees.
The largest uncertainties in estimating GHG emissions from buildings come from the differences between design assumptions and actual building use and energy use intensity/performance; when those differ substantially, the error can easily be on the order of a factor of 2. For existing buildings facing retrofit or operational decisions, those uncertainties are far smaller, and the calculation of GHG emissions from electricity production is then probably the next largest source of uncertainty, although I have no data to support that assertion.
The time- and weather-dependent performance of the electric grid and of buildings requires a great deal of manipulation of available data, some of which is of poor accuracy and low reliability. Matching historical data to the weather and time when the electricity was generated is non-trivial. Electric production is reported on an hourly basis in the U.S., but carbon emissions are reported on a monthly basis and are mostly based on calculation rather than measurement. There are huge discrepancies in the available reported data from the more than 4700 electricity generators in the U.S. These data are massaged by EPA to produce the annual average values that are used in the Energy Star program and for the U.S.'s international reporting.
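To make the hourly-vs-monthly mismatch concrete, here is a minimal sketch of the kind of pro-rating that matching these datasets forces on you: a month's reported CO2 total gets spread across that month's hourly generation at a single average rate. The function name and all numbers are illustrative assumptions, not real plant data or anyone's actual method.

```python
# Sketch: pro-rating monthly CO2 totals (reported monthly) onto hourly
# generation (reported hourly). The within-month rate is forced to be
# constant, which is exactly the limitation described above.

def hourly_emission_rates(hourly_mwh, monthly_tons):
    """Allocate a month's reported CO2 (tons) across its hourly MWh,
    assuming a constant emissions rate within the month."""
    total_mwh = sum(hourly_mwh)
    rate = monthly_tons / total_mwh     # tons CO2 per MWh, month-average
    return [rate] * len(hourly_mwh)     # same rate every hour of the month

gen = [120.0, 150.0, 90.0, 140.0]       # hourly MWh (illustrative)
rates = hourly_emission_rates(gen, 250.0)
print(round(rates[0], 3))               # -> 0.5 tons/MWh for every hour
```

The point of the sketch is what it cannot do: any real hour-to-hour variation in the generation mix is flattened out by the monthly reporting granularity.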
Building use and operational assumptions are also important sources of uncertainty, and there are some efforts to address those issues in trying to improve the accuracy of building simulation models.
The only existing GHG emissions calculation tool that comes even close to giving accurate numbers reflecting the temporal and climate dependencies of building operations and of GHG emissions from the grid is the California tool. If you haven't seen it, please do check it out -- it's free and downloadable. While it only works for California, the concept of using a dispatch model can be replicated anywhere in the U.S., or elsewhere where future planning is reasonably reliable out to some number of future years. The GHG Tool for Buildings, which E3 has been developing with the help of Martha Brook, is now available on the E3 website (the first link): http://www.ethree.com/E3_Public_Docs.html.
The alternative is to use historical data and to massage them either to create a representative year or to create a program that allows modeling under any assumed or projected set of climate and time-schedule scenarios. If you look at the Synapse report, you will see that there are huge differences over the course of the year and even substantial differences over the course of the day. Based on analysis of data from the year 2005, Synapse showed that there can be differences as large as 60% from average annual values for GHG emissions/MWh of electricity generation in some regions of the country. The report is titled "Analysis of Indirect Emissions Benefits of Wind, Landfill Gas, and Municipal Solid Waste Generation" and can be downloaded from
The potential range of emissions in tons GHG/MWh is very large. The distribution of hourly average emissions from the grid in New England in 2005 (from the Synapse report) clearly shows the huge variations and why annual average values can be very misleading in making design or operational decisions. Similar variations occur in many other regions, and the inter-regional differences are also quite large (>2X in some cases).
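The gap between hourly rates and the annual average can be sketched in a few lines. The hourly rates below are made-up numbers chosen only to illustrate the shape of the problem; they are not New England data from the Synapse report.

```python
# Illustrative only: how an annual-average emissions rate can mislead when
# hourly rates vary widely. These rates are invented, not measured data.

hourly_rates = [0.35, 0.42, 0.55, 0.70, 0.62, 0.40]   # tons CO2/MWh by hour
avg = sum(hourly_rates) / len(hourly_rates)

for h, r in enumerate(hourly_rates):
    pct = 100.0 * (r - avg) / avg
    print(f"hour {h}: {r:.2f} tons/MWh ({pct:+.0f}% vs average {avg:.2f})")
```

A load that happens to fall in the high-rate hours would have its emissions badly underestimated by the average; a load in the low-rate hours, overestimated.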
The ideal, and this was the Statement of Work that came out of our ASHRAE GHG calculation tool committee's efforts last year, is to look at both the results of calculations based on a dispatch model (like the E3 tool) and on an adjusted historical-data-based model (let's call it a "Synapse plus tool") and determine whether there are significant differences -- let's say more than 5% to 10%. There are plenty of other sources of uncertainty, so cutting it too fine in this kind of modeling does not make sense. But getting a sense of the magnitude of the differences would help us design the "ideal" tool construct.
Jeff Haberl of Texas A&M has worked with the available data in Texas, and he has helped me understand the complexities of getting these things right even when and where the data are available. Many of these complexities can be overcome where good historical data and dispatch models are available, but that will take some well-focused work. Either of the past ASHRAE GHG Tool PC contractors, E3 or Synapse, is capable of doing this kind of work, and Jeff is also very knowledgeable about how to meet some of the challenges. There is also a need to be clear about site energy use intensity and source energy intensity, something for which Mike Deru at NREL has published reasonably good data (NREL Technical Report NREL/TP-550-38617, June 2007).
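The site-vs-source distinction amounts to applying fuel-specific multipliers to metered energy. The sketch below shows the arithmetic; the factor values are placeholders in the spirit of the tables in the NREL report, not numbers taken from it.

```python
# Sketch of site-to-source energy conversion. Source factors account for
# generation, extraction, and delivery losses upstream of the meter.
# These factor values are illustrative placeholders, not NREL's published ones.

SOURCE_FACTORS = {
    "electricity": 3.1,     # large: reflects generation + T&D losses
    "natural_gas": 1.09,    # small: mostly extraction/delivery losses
}

def source_energy(site_use):
    """site_use: {fuel: site kBtu at the meter}. Returns total source kBtu."""
    return sum(SOURCE_FACTORS[fuel] * kbtu for fuel, kbtu in site_use.items())

use = {"electricity": 100.0, "natural_gas": 200.0}
print(round(source_energy(use), 2))   # electricity dominates despite less site use
```

The asymmetry of the factors is the whole point: an all-electric and a gas-heated building with identical site energy use can have very different source energy (and emissions) footprints.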
The stakes are too high to be making bad or poorly informed decisions any more, so let's get something robust developed that can move us ahead toward well-informed design, retrofit, and operational decisions for buildings.