
Most calculations in the U.S. and throughout the world are based on an average annual value for the grid region, sub-region, or nation as a whole. Looking at the annual reporting under the UNFCCC, there is a huge range of values used for converting electricity consumption to GHG emissions, and some of the values are clearly highly inaccurate. Even where reasonably accurate annual average values are used, they do not reflect the variations in building operation in response to weather and over the course of the day, week, and year.

Furthermore, the electric grid itself has large temporal variations by hour, day of the week, and season. Some plants are always on as "baseload," while others come on-line as needed. In some locations the baseload generation is relatively "clean" (hydro and nuclear) and the peak load is relatively dirty (coal); in other locations it is exactly the opposite. Wind, solar, geothermal, and gas also play their roles to varying degrees.

The largest uncertainties in estimating GHG emissions from buildings come from the differences between design assumptions and actual building use and energy use intensity/performance; these could easily be on the order of a factor of 2. For existing buildings, where retrofit or operational decisions are being made, those uncertainties are far smaller, and the calculation of GHG emissions from electricity production is then probably the next largest source of uncertainty, although I have no data to support that assertion.

The time- and weather-dependent performance of the electric grid and of buildings requires intensive manipulation of a large amount of available data, some of which is of poor accuracy and low reliability. Matching historical data to the weather and time when the electricity was generated is non-trivial. Electric production is reported on an hourly basis in the U.S., but carbon emissions are reported on a monthly basis and are mostly based on calculation rather than measurement. There are huge discrepancies in the available reported data from the more than 4,700 electricity generators in the U.S. These data are massaged by EPA to produce the annual average values that are used in the Energy Star program and for the U.S.'s international reporting.
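The mismatch between hourly generation reporting and monthly emissions reporting means hourly carbon intensity has to be reconstructed. A minimal, naive sketch of one such data-massaging step, with made-up numbers: apportion a month's reported CO2 to hours in proportion to each hour's generation. A real analysis would instead weight by the actual plant mix dispatched in each hour, which is exactly why the naive method falls short.

```python
# Naive sketch of reconstructing hourly emissions from monthly reports.
# All numbers are hypothetical, not real grid data.

hourly_generation_mwh = [900, 850, 1200, 1500]   # a few sample hours
monthly_co2_tons = 2000.0                        # reported for the whole month

total_gen = sum(hourly_generation_mwh)

# Simplest allocation: each hour's share of CO2 equals its share of
# generation, which implicitly assumes a constant fleet-average emission rate.
hourly_co2 = [monthly_co2_tons * g / total_gen for g in hourly_generation_mwh]

# Under this naive method the per-hour emission rate (tons/MWh) is constant,
# so the real hour-to-hour variation in the dispatched plant mix is lost --
# the limitation described above.
rates = [c / g for c, g in zip(hourly_co2, hourly_generation_mwh)]
```

The sketch shows why hourly generation data alone cannot recover the hourly emission rate: some model of which plants ran in each hour is still needed.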

Building use and operational assumptions are also important sources of uncertainty, and there are some efforts to address those issues in trying to improve the accuracy of building simulation models.

The only existing GHG emissions calculation tool that comes even close to giving accurate numbers reflecting the temporal and climate dependencies of building operations and of GHG emissions from the grid is the California tool. If you haven't seen it, please do check it out -- it's free and downloadable. While it only works for California, the concept of using a dispatch model can be replicated anywhere in the U.S. or elsewhere, wherever future planning is reasonably reliable some number of years out. The GHG Tool for Buildings, which E3 has been developing with the help of Martha Brook, is now available on the E3 website (the first link): http://www.ethree.com/E3_Public_Docs.html.

The alternative is to use historical data and massage them either to create a representative year or to create a program that allows modeling under any assumed or projected set of climate and time-schedule scenarios. If you look at the Synapse report, you will see that there are huge differences over the course of the year and even substantial differences over the course of the day. Based on analysis of data from the year 2005, Synapse showed that there can be differences as large as 60% from the average annual values for GHG emissions per MWh of electricity generation in some regions of the country. The report, titled "Analysis of Indirect Emissions Benefits of Wind, Landfill Gas, and Municipal Solid Waste Generation," can be downloaded from
http://www.synapse-energy.com/Downloads/SynapseReport.2008-07.EPA.EPA-Indirect-Emissions-Benefits.06-087.pdf.

The potential range of emissions in tons GHG/MWh is very large. The distribution of hourly average emissions from the grid in New England in 2005 (from the Synapse report) clearly shows the huge variations and why annual average values can be very misleading for design or operational decisions. Similar variations occur in many other regions, and the inter-regional differences are also quite large (>2X in some cases).

The ideal, and this was the Statement of Work that came out of our ASHRAE GHG calculation tool committee's efforts last year, is to look at the results of calculations based on both a dispatch model (like the E3 tool) and an adjusted historical-data model (let's call it a "Synapse-plus" tool) and determine whether there are significant differences -- let's say more than 5% to 10%. There are plenty of other sources of uncertainty, so cutting it too fine in this kind of modeling does not make sense. But getting a sense of the magnitude of the differences would help us design the "ideal" tool construct.
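The comparison just described can be sketched in a few lines. This is purely illustrative: the two input values stand in for annual emissions estimates from a dispatch-model tool and an adjusted-historical-data tool, and the threshold is the 5% to 10% band suggested above.

```python
# Hypothetical sketch of the dispatch-model vs. historical-data comparison.
# The two inputs stand in for results from the two modeling approaches.

def relative_difference(dispatch_result, historical_result):
    """Relative difference between the two estimates, as a fraction
    of their mean."""
    baseline = (dispatch_result + historical_result) / 2
    return abs(dispatch_result - historical_result) / baseline

def significant(dispatch_result, historical_result, threshold=0.10):
    """True if the two tools disagree by more than the chosen threshold
    (5% to 10% is the range suggested in the text)."""
    return relative_difference(dispatch_result, historical_result) > threshold

# Example with made-up annual emissions estimates (tons CO2e):
# 1200 vs. 1100 differs by about 8.7%, inside a 10% threshold.
print(significant(1200.0, 1100.0))
```

If the two approaches agree within the threshold, the simpler historical-data tool may suffice; if not, the difference itself tells us something about where the dispatch model matters.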

Jeff Haberl of Texas A&M has worked with the available data in Texas, and he has helped me understand the complexities of getting these things right even when and where the data are available. Many of these complexities can be overcome where good historical data and dispatch models are available, but doing so will take some well-focused work. Either of the past ASHRAE GHG Tool PC contractors, E3 or Synapse, is capable of doing this kind of work, and Jeff is also very knowledgeable about how to meet some of the challenges. There is also a need to be clear about site energy use intensity versus source energy use intensity, something for which Mike Deru at NREL has published reasonably good data (NREL Technical Report NREL/TP-550-38617, June 2007).
The stakes are too high to be making bad or poorly informed decisions any more, so let's get something robust developed that can move us ahead toward well-informed design, retrofit, and operational decisions for buildings.

HHS agencies support research to determine health effects of the increasingly popular use of e-cigarettes, also known as “vaping.”

It’s not uncommon these days to see people using electronic cigarettes (e-cigarettes) in restaurants, bars and parks, all while huge plumes of aerosol swirl around them. Also known as “vaping,” the use of these hand-held devices has become common, and some teenagers, according to the CDC and FDA, are their biggest fans: More than 2 million middle and high school students use the products, which come in assorted flavors and forms, from devices that resemble regular cigarettes to those that resemble pens or flash drives.

According to preliminary data from the National Youth Tobacco Survey, the number of high-school age children reporting use of e-cigarettes rose by more than 75 percent from 2017 to 2018; and use among middle-school children increased nearly 50 percent. In a recent Washington Post op-ed, HHS Secretary Alex Azar and FDA Commissioner Scott Gottlieb called this an epidemic.

E-cigarettes are the most commonly used tobacco product among youth in the United States. Given their popularity among youth, health officials see the fast-growing use of e-cigarettes as cause for concern. E-cigarettes come with a small battery that heats a liquid that may contain nicotine, transforming it into an inhalable aerosol. Most liquids also feature flavors, including some kid-friendly flavors like bubblegum, gummy bear, and cotton candy, which can broaden their appeal to youth.

Yet, while e-cigarettes are less harmful than regular combustible tobacco products—and a possible pathway to tobacco-smoking cessation for adults—the evidence on the effectiveness of these products for helping adult smokers quit completely is still uncertain. Additionally, questions remain about the long-term health impact of e-cigarettes, including respiratory outcomes.  Smoking tobacco, for example, can cause chronic obstructive pulmonary disease, or COPD, the fourth leading cause of death in the United States.  However, it’s uncertain what impact e-cigarette aerosol exposure may have on respiratory health.



US EPA BASE study data now available on the web

Do you want to know how your building is doing? EPA collected extensive indoor air quality data from 100 randomly selected public and commercial office buildings in 37 cities and 25 states. You can compare measurements made in your own building to those in this massive, scientifically-based study to identify how your building compares.

Air inside public and commercial office buildings contains a wide variety of pollutants that can build up and possibly affect the health of people working there.  Indoor air problems have the potential to affect the health of many people and significantly reduce productivity.

Early research of the indoor air quality, or IAQ, of office buildings in the United States focused on evaluating problem buildings where occupants had significant complaints about the IAQ.  However, such problem buildings could not be compared because there was very little baseline IAQ information about typical buildings.

To fill this data gap, EPA conducted the Building Assessment Survey and Evaluation (BASE) study.  The BASE study used a standardized protocol to collect extensive indoor air quality data from 100 randomly selected public and commercial office buildings in 37 cities and 25 states.

This website, outlined below, describes in detail the BASE study, data, highlighted analyses, original data for independent examination, and more.

Contents:

Basic Information and Overview: This section provides a basic description of the study, its goals and objectives, and the information collected.

Methodology: The BASE study was conducted using a standardized protocol. A description of the seven basic activities performed for each of the study buildings is provided here.

Summarized Data: Summaries of select information collected from the 100 BASE buildings studied are available here.

Highlighted Analysis: The data collected provide normative information on several parameters which can be used for further assessment and analysis of IAQ-related issues.  Highlighted here are summaries and results of assessments and analyses performed on the BASE data.

Frequent Questions: We have provided answers to some of the questions we expect will be frequently asked regarding this study.

How to Obtain Data: We have provided the raw data collected as part of the BASE study to allow independent examination and analysis and to support hypothesis development.

Publications: The protocol, quality assurance plan, and other supporting study documentation as well as publications describing the study and summarizing select study results are available here.

Glossary: We have provided definitions for some BASE-specific terminology used throughout the study.

The oil crises of 1973 and 1979 resulted in far more emphasis on energy conservation and designs to minimize energy use. Most of California’s licensed architects at that time had been educated in an era when energy seemed unlimited. In the late 1960s and even in 1973, nuclear power was expected to be so economical that it wouldn’t even need to be metered; forty-one nuclear power plants were authorized that year. The U.S. was not a net oil importer, and the limits to oil reserves were not commonly considered. Single-pane glazing was standard in California until the California Energy Commission adopted regulations in its Title 24 that required double glazing, beginning around the time of Dean’s guide.

Dean, then a professor in the Department of Architecture at the University of California, Berkeley, both wrote and illustrated this 85-page guide that lays out the basic knowledge about energy and architecture. While it was written 30 years ago, the principles have not changed, and the guide is a very handy reference for students and professional architects alike.

You can download the guide (~11MB) from BuildingEcology.com, or you can request a hard copy from the California Energy Commission, 916-654-4287. The publication is now out of print, but staff have recently provided copies upon request.

California Greenhouse Gas Tool for Buildings

California's Greenhouse Gas (GHG) Tool for Buildings is now available on the web. The tool is publicly available and free to download.

GHG Tool for Buildings in California

This is a major step forward: the first tool that provides time- and weather-resolved GHG emissions calculations. It is based on a dispatch model, that is, a model of which generation resources will be on the grid over time through the year 2020. Since electricity demand and consumption, as well as grid performance, are highly dependent on weather and time of day or week, an annual average value for GHG emissions at a location or for a portion of the regional or sub-regional electric grid is not accurate.

Other commonly used and well-known GHG calculation tools use annual average emissions for a grid region or sub-region. In some countries, a national average is used. In fact, we know of no other tool that addresses the time of use and weather impacts on building energy use and associated GHG emissions. The other available tools can greatly distort the impact of a building on GHG emissions and result in very poor design and operational decisions.

The distortions can lead to mistaken results from analysis of a building's energy use, whether from metered use data or from simulation models. Designers and building operators need more accurate, time- and weather-resolved data to make informed decisions intended to affect GHG emissions.

Design always involves trade-offs. Designers are now focused on reducing energy use. But grid-generated electricity has different carbon implications, depending on the time when it is used and the simultaneous inventory of electricity generators in the grid region or sub-region. For example,

  • To understand whether load shifting by using thermal storage or increasing building envelope insulation is more effective in reducing GHG emissions, one needs to model building performance with a time- and weather-resolved model and GHG emissions data.
  • Is it more effective to produce electricity on-site with solar PV or to invest the same amount in high performance glazing?

Questions like these can only be answered with a tool that considers the time of use and concurrent grid performance. Use of an annual average can distort the result of a comparison of alternative designs by as much as 60% in some grid regions in the U.S., while in other grid regions the annual average is relatively accurate. Buildings are the largest electricity user in the U.S. (70%) and in most of the world. The potential to reduce GHG emissions in buildings is huge, largely because buildings are currently quite inefficient. Reducing electricity consumption will have a major impact on GHG emissions; note that site energy use intensity, the usual metric for building energy use, is roughly only 1/3 of source energy use and the associated GHG emissions.
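The distortion from annual averaging can be seen in a small numeric sketch. All numbers below are invented for illustration: two buildings with identical annual consumption but different load timing receive identical scores under an annual-average emission factor, while hour-matched accounting separates them.

```python
# Illustrative sketch: hour-matched vs. annual-average GHG accounting.
# All numbers are hypothetical, not real grid or building data.

# Hourly grid emission factors (tons CO2e per MWh) for one representative day:
# clean baseload overnight, dirtier peaking plants on-line mid-day.
grid_factor = [0.3] * 8 + [0.8] * 8 + [0.3] * 8   # 24 hours

# Two buildings with the same daily consumption (24 MWh) but different timing.
flat_load = [1.0] * 24                             # constant load
peaky_load = [0.5] * 8 + [2.0] * 8 + [0.5] * 8     # load concentrated at peak

annual_avg_factor = sum(grid_factor) / len(grid_factor)

def emissions_avg(load):
    """Annual-average method: total MWh times one average factor."""
    return sum(load) * annual_avg_factor

def emissions_hourly(load):
    """Hour-matched method: each hour's MWh times that hour's factor."""
    return sum(l * f for l, f in zip(load, grid_factor))

# The average method cannot distinguish the two buildings at all, while the
# hour-matched method penalizes the peak-coincident load (11.2 vs. 15.2 tons
# for this made-up day).
```

This is the mechanism behind the 60% distortions reported for some grid regions: where the hourly factors vary strongly, load timing dominates the comparison, and an annual average erases it.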

Addressing climate change in buildings has huge potential not only to reduce the environmental impacts of buildings but also to be highly profitable to building owners. An estimate in the IPCC 2007 Nobel Prize-winning report on climate change places buildings as having the greatest potential to reduce GHG emissions and to do so largely at a negative cost to building owners. More than 80% of the estimated potential reduction in buildings' GHG emissions can be accomplished while saving money, according to Chapter 6 of the Working Group III report of the IPCC.

A study by Synapse Energy Economics of Cambridge, Massachusetts, prepared for the U.S. Environmental Protection Agency, quantifies the differences by hour of the day and day of the year. Using grid performance data from 2005, Synapse looked at the impacts of various strategies on GHG emissions for each grid sub-region. The differences among regions are dramatic, as can be seen in Synapse's color plots of GHG emissions hour-by-hour for the entire year. The Synapse report is titled Analysis of Indirect Emissions Benefits of Wind, Landfill Gas, and Municipal Solid Waste Generation.

The Synapse report clearly illustrates that the contrast within and among regions is dramatic at various times of day and year. The lead on the tool's development, Amber Mahone of Energy and Environmental Economics (E3) of San Francisco, was the consultant to the Project Committee for the ASHRAE GHG emissions tool development project, and the concept for the California tool came out of E3's work helping define a concept for ASHRAE's tool. I encouraged Mahone to write a proposal to develop a tool for California and requested (through Martha Brook) that the California Energy Commission fund it. The funding came, and the result is now available for download, along with a User's Manual, at the E3 web site. Please check out the tool, give some feedback, and spread the word. I would also appreciate hearing your comments on the tool after you have had a chance to look it over.

Small-Ion and Nano-Aerosol Production During Candle Burning: Size Distribution and Concentration Profile with Time

Matthew D. Wright, A. Peter Fews, Paul A. Keitch, and Denis L. Henshaw
H. H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol, United Kingdom

Burning of paraffin wax produces small ions and aerosols in the diameter range 0.4 nanometers (nm) to 1.1 micrometers (μm).* The study summarized here investigated the characteristics of particles formed by burning paraffin tea-light candles.  Peaks were observed in the number concentration of particles at diameters of 10–30 nm and 100–300 nm, consistent with “normal” and “sooting” burn modes. The researchers also saw a smaller mode in the size range 2.5–9 nm and interpreted it as a "soot precursor." When they placed a fan behind the burning candle, they saw a “modified small-ion” signal of particles at sizes 1.1–2.0 nm. This small size was "...not observed without the fan present or when a lamp chimney was used. During burning, aerosol concentration was elevated and small-ion counts were low. However after extinction of the flame, this trend was reversed and the number of small-ions increased to levels higher than those observed prior to burning, remaining so for several hours."

The researchers observed that "...although not the major source of indoor air pollution in most environments, a vastly increased inhaled dose of nano-aerosols from combustion could be received by anyone present in a typical domestic room containing a burning candle. Further work is required to characterize the charge state of the smallest particles produced in candle flames and to determine the extent of such increased deposition of combustion particles in the lung." However, regardless of the "charge state" [ion characteristics] of the particles, it is highly likely that particles in the nano-particle size ranges measured coming off the candle will be deposited in the human lung. This could have very serious health implications, suggesting both further research and caution on the part of those who are frequently exposed to burning tea-light candles.

* 1 micrometer (μm) = 1,000 nanometers (nm). A nanometer is one billionth of a meter, or about 39 billionths of an inch. A human hair is about 5 x 10^-5 meters (50,000 nanometers) in diameter.

Reference: Matthew D. Wright, A. Peter Fews, Paul A. Keitch, and Denis L. Henshaw, 2007. "Small-Ion and Nano-Aerosol Production During Candle Burning: Size Distribution and Concentration Profile with Time." Aerosol Science and Technology, 41:475–484.