
Moving Beyond TVOC – Reasons to avoid the use of TVOC as Pass/Fail criterion for assessing VOC emissions from products

by Al Hodgson, Co-founder and Research Director, Berkeley Analytical Associates

Total Volatile Organic Compounds (TVOC) has a long history as a metric for determining the acceptability of the emissions of VOCs from building products and furnishings. The first significant program to rely on a TVOC criterion was the Carpet & Rug Institute's (CRI) Green Label Program, which evolved out of the Carpet Policy Dialog between the carpet industry and the US EPA. The TVOC criterion was later incorporated into the U.S. Green Building Council's LEED rating systems and was adopted by the commercial furniture industry. More recent VOC emission test method and acceptance standards have focused instead on individual VOCs that may pose health hazards to individuals at low concentrations. Examples of such programs in North America are the California Department of Health Services' Standard Practice (a.k.a. Section 01350), which recently was revised to Standard Method Version 1.1, and CRI's Green Label Plus program. TVOC values are still reported, but pass/fail determinations are based on the emission levels of individual compounds of concern. There is an urgent need to expand such determinations of acceptability beyond a select number of individual VOCs to encompass the broader range of chemical emissions that may impact health. TVOC is again being proposed to fill this gap and may be appealing to many because of its presumed simplicity. In my opinion, we should avoid this temptation and move on to the more difficult, but certainly achievable, task of focusing on the toxicity of individual compounds. The following are my primary arguments against the use of TVOC as a Pass/Fail metric.

TVOC measurements may be grossly inaccurate and therefore the TVOC concept is unsuitable as a PASS/FAIL metric. Individual compounds' instrumental responses relative to toluene, the surrogate standard of choice, vary dramatically. Some common VOCs have an order of magnitude lower response per unit mass than toluene. Other compounds have higher response ratios. Even within a class of compounds (e.g., alkane hydrocarbons), the response per unit mass can vary substantially depending upon their chromatographic retention times, with early eluting compounds having lower response ratios than late eluting compounds. Individual VOCs also are measured with very different levels of precision. Thus, there is no way to determine the accuracy and precision of TVOC measurements made across different mixtures of VOCs characteristic of the broad range of products and materials being assessed. This problem with TVOC is well recognized by true experts. In particular, ISO 16000-9, the emission test method most widely used in Europe and other regions outside of the US, clearly states: "The sum of emitted compounds, TVOC, should be regarded only as a factor specific to the product studied and only to be used for comparison of products with similar target VOC profiles." One of the big changes that is needed in the reporting of VOC emissions is to include estimates of uncertainty. In fact, reporting of uncertainty is required under ISO/IEC 17025 quality management systems if requested by the customer. The use of TVOC moves the process in the completely opposite direction, toward unknowable uncertainty.
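To make the arithmetic concrete, here is a minimal sketch, in Python, of how summing toluene-equivalent responses can misstate the true emitted mass. The concentrations and response factors are hypothetical placeholders chosen only for illustration, not data from any particular instrument, compound list, or product.

```python
# Illustrative sketch (not a measurement protocol): toluene-equivalent TVOC
# versus the true total mass when per-compound response factors differ.
# All values below are hypothetical.

true_mass_ug_m3 = {          # assumed chamber concentrations, ug/m3
    "toluene": 20.0,
    "low_response_compound": 80.0,   # e.g., a heavier, low-response VOC
    "hexanal": 40.0,
}

relative_response = {        # assumed instrument response per unit mass,
    "toluene": 1.0,          # relative to toluene (= 1.0)
    "low_response_compound": 0.3,
    "hexanal": 0.7,
}

true_total = sum(true_mass_ug_m3.values())

# TVOC as usually reported: summed response converted with toluene's calibration
tvoc_toluene_equiv = sum(
    true_mass_ug_m3[c] * relative_response[c] for c in true_mass_ug_m3
)

print(f"True total VOC mass:     {true_total:.0f} ug/m3")
print(f"Reported TVOC (tol-eq.): {tvoc_toluene_equiv:.0f} ug/m3")
# With these placeholder factors the reported TVOC understates the true total
# by roughly half; a different mixture could overstate it instead.
```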

Product certification programs can, and should, be progressive with respect to public health concerns.  TVOC may be a useful tool for such certification programs.  For example, the monitoring of TVOC for a specific product over time (in keeping with the ISO 16000-9 precaution) may provide useful information on manufacturing variations within or among production facilities, assuming the VOC profiles are similar.  However, this is not a substitute for assessing the potential impacts of the individual compounds comprising these emissions.  There are many different lists of toxic chemicals that can be used by certification programs as the basis for such assessments.  The fact that a publicly available method and guideline document only contains a relatively short list of chemicals of concern should not be a limiting factor.  MBDC's Cradle-to-Cradle program is one example of a proactive certification program that considers the environmental and human health issues associated with chemicals used in the manufacturing of products.  It should be noted that a significant downside to this particular program is the lack of transparency with respect to how the toxicology judgments are made.  It also might be argued that the success of Greenguard's Children & Schools program in the marketplace is, in part, related to their use of an expanded list of individual chemicals of concern.

Assuming there was a more accurate and precise measure of the quantity of total VOCs emitted by a product, there still is a need to establish an acceptable level.  The Greenguard Indoor Air program uses a guideline of 500 µg/m3 modeled to a small room.  The Greenguard Children & Schools program uses a guideline of 220 µg/m3 modeled to a typical school classroom.  The 500-µg/m3 value has some historical precedent, but in reality these numbers are simply 'pulled out of a hat.'  The chemicals used in manufacturing products are undergoing rapid change.  When the TVOC metric was first implemented in the Carpet & Rug Institute Green Label program in 1989, the chemicals used in manufacturing included aromatic and chlorinated hydrocarbon solvents.  Today in the 21st century, most products do not use these traditional solvents because of concern regarding their toxicity.  Instead, we have an increasing emphasis on 'Green Chemistry' and widespread use of water-based solvent systems. Generally, these chemicals have lower toxicity than the solvents they are replacing, but they also have lower vapor pressures.  Due to their low vapor pressures, the off-gassing of these solvents occurs more slowly than for aromatic solvents, for example.  Thus, total VOC emissions will be higher, but in many cases toxicity can be presumed to be lower.  The use of a TVOC metric may, therefore, penalize products and inhibit government's and industry's efforts to switch to more sustainable chemistry.  These efforts are better served by focusing on the toxicity of the individual compounds.
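For readers unfamiliar with what "modeled to a room" means, here is a minimal sketch of the standard steady-state, well-mixed mass balance used to convert a measured emission factor into a room-air concentration. The emission factor and room parameters below are illustrative assumptions, not the specific Greenguard or CDPH scenario definitions.

```python
# Minimal sketch, assuming a steady-state, well-mixed room.
# All input values are hypothetical examples.

emission_factor_ug_m2_h = 250.0   # measured areal emission factor, ug/(m2*h)
exposed_area_m2 = 12.0            # assumed area of product installed in the room
room_volume_m3 = 30.0             # assumed small room volume
air_changes_per_h = 0.5           # assumed outdoor air exchange rate

ventilation_m3_h = room_volume_m3 * air_changes_per_h
steady_state_conc = emission_factor_ug_m2_h * exposed_area_m2 / ventilation_m3_h

print(f"Modeled room concentration: {steady_state_conc:.0f} ug/m3")
# 250 * 12 / 15 = 200 ug/m3 -- this modeled concentration is what gets
# compared against a guideline value such as 500 or 220 ug/m3.
```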

US proponents of TVOC have repeatedly pointed to European product testing methods and certification programs as a precedent for the use of TVOC.  While it is true that many European programs do contain a TVOC requirement, the values are often considerably higher than the values the proponents would like to impose on the US.  The most widely used European assessment document, the German AgBB scheme (http://www.umweltbundesamt.de/building-products/agbb.htm), relies mainly on criteria for a list of about 190 individual chemical substances.  The AgBB TVOC criterion at 3 days is 10,000 µg/m3, or 20 times a proposed 500 µg/m3 value measured at 7 or 14 days (note that a direct comparison is complicated by different testing methods and modeling assumptions, but the magnitude of the difference is approximately correct).  Clearly the dominant European assessment criteria focus on individual VOCs, NOT on TVOC.

Proponents of TVOC argue that there are tens of thousands of individual chemicals emitted by building products and furnishings that may be affecting our health, and due to this overwhelming number only a metric like TVOC is practical.  This is far from the truth.  There are many hundreds of chemicals in petroleum distillate fractions, e.g., Stoddard solvent.  Over the years, there has been a shift away from these solvent mixtures to simpler, manufactured mixtures with better controlled volatility and elimination of compounds that are of particular concern because of their toxicity.  The chemicals that are frequently emitted by building products and furnishings probably number several hundred.  If this universe of chemicals can be identified (not difficult), it is a much more manageable task to evaluate the toxicology data to see which chemicals are of real concern for the general population and at what levels.

Proponents of TVOC also argue that there are many potential synergistic relationships among VOCs and that, again, only the use of TVOC can guard us against this danger. Such arguments regarding synergism are not founded on fact.  For example, while the hedonic value of odor response can vary depending upon the mixture of chemicals, odor receptors are very specific for particular chemical functionality and size.  The mammalian sensory perception (trigeminal) system is much more generalized.  However, the effects for VOCs with low reactivity (i.e., most of the VOCs that are measured by conventional methods) have been shown to be additive in both animal and human studies.  If there is a highly reactive VOC in the mixture, the sensory response is controlled by the reactive chemical, not the mixture.

Al Hodgson, Co-founder and Research Director, Berkeley Analytical Associates

Cement Sequesters CO₂: Wouldn’t it be lovely?

Cement is one of the most carbon-emissions-intensive parts of today's buildings and, more often than not, one of the most widely used materials by mass per unit of floor area. Cement manufacturing is estimated to be responsible for 5% of global CO₂ emissions.

California has placed the reduction of carbon emissions from concrete high on its agenda to meet its ambitious CO2 emission reduction goals. Wouldn't it be lovely if concrete could actually store CO2 instead of being responsible for so much CO2 emission?

Next to one of the largest fossil fuel-fired power plants in the United States, at Moss Landing on the Monterey Bay, Calera is capturing CO2 from the power plant and using it to make cement. Calera founder Brent Constantz claims that each ton of Calera cement contains half a ton of CO2 transformed into an essential ingredient of cement. Constantz says his process is probably the best carbon capture and storage technique available.

Calera bubbles the CO2 through seawater to make calcium carbonate. The resulting water has the calcium and magnesium removed, making it even more suitable for desalination. Agriculture in the region around Moss Landing is responsible for overdrafting the groundwater, so a desalination plant is also an attractive option in conjunction with the electric power and cement plants. A pilot plant is being built in nearby Santa Cruz to address water shortages during drought years.

As the plant produces only ten tons of cement daily and its product’s structural performance still must be tested, it is too early to say the climate crisis is solved. But the technology has the promise of contributing substantially to dramatic reductions in greenhouse gases attributable to buildings. Seventy percent of the electricity produced in the U.S. goes to buildings, and electric power production is responsible for more than half of all GHG emissions. It would be lovely if Calera’s process turns out to be as economically and environmentally attractive as it appears to be so far.

 

You can read more about Calera in an August 7 on-line article on Scientific American's web site, where the promise of Calera cement is described in more detail.

IAQ and Plants

The idea that plants clean indoor air is a sad, continuing saga fed by bad science, commercial interests, and wishful thinking.

I published an article in the Indoor Air Bulletin on the subject in 1992 (available on this web site) that provides some details.

Take home message:

1.   Don't use plants to improve IAQ. They don't. If anything, they pose risks to good IAQ.

2.   There is no credible scientific evidence that plants improve IAQ. The planting medium has been hypothesized to be responsible for pollutant removal in some studies. The planting medium alone can be expected to contribute to a limited reduction in some airborne chemical concentrations.

3.   Most advocates of indoor plant use have been funded by or are themselves providers of plants or supporting systems.

4.   If plants are used indoors for aesthetic reasons, there should be extra care to avoid moisture problems or problems with fertilizers and pesticides, all known sources of indoor air quality and health problems.

If you do have plants indoors, don't do it to improve indoor air quality. The pollutant removal effect is negligible and, as far as the science has shown, is not due to the plants but is due to adsorption on the soil and, possibly, uptake by the organisms in the root area of the plant. So, you could just put the planting mix in the space and use fans to move air through it. In one study, charcoal was added to the planting mix with fans moving the air, demonstrating that it was not the plants but the planting mix that was doing the removal.

The rate of removal by plants, even if you use the data from the one NASA research project ever done on it, is smaller than the removal of pollutants through the air exchange that takes place in a very tight building due to leakage through the envelope. If you filled a house with three layers of the plants recommended by the advocates, the removal rate would be equal to 1/10th of an air change per hour (ach). Buildings with mechanical ventilation generally have a minimum ventilation rate of 0.5 air changes per hour. Offices using typical ASHRAE design values have about 0.8 ach.
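A back-of-the-envelope comparison makes the point. The house volume and the plants' effective "clean air" delivery below are illustrative assumptions, chosen only to reproduce the rough magnitudes described above, not measured values.

```python
# Illustrative comparison: pollutant removal by plants expressed as an
# equivalent air change rate versus ordinary air exchange.
# All numbers are assumed for the sake of the example.

house_volume_m3 = 350.0          # assumed typical house volume
plant_removal_m3_h = 35.0        # hypothetical aggregate removal from a very
                                 # dense planting (three layers of plants)

equivalent_ach_plants = plant_removal_m3_h / house_volume_m3   # ~0.1 ach
infiltration_ach_tight = 0.2     # assumed leakage alone in a tight envelope
mech_ventilation_ach = 0.5       # typical minimum mechanical ventilation rate

print(f"Plants (dense planting): {equivalent_ach_plants:.2f} ach equivalent")
print(f"Tight-house leakage:     {infiltration_ach_tight:.2f} ach")
print(f"Mechanical ventilation:  {mech_ventilation_ach:.2f} ach")
# Even with an implausibly dense planting, removal is a small fraction of
# what air exchange already provides.
```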

The one often-quoted NASA research project was done in static chambers, i.e., sealed chambers with no air exchange. This is not a scientifically sound way to investigate the removal rate of pollutants. A dynamic test involving an air change rate equal to those in real buildings and achieving steady-state conditions is a far more relevant test. In a static chamber, a test over the time period in the NASA study would be dominated by sink effects, that is, removal from the air by adsorption to surfaces of the chamber and the plants. This does not give any idea about the removal rate obtained by plants in a real environment or even in a chamber over a normal period of ongoing occupancy.

More recently published studies have been characterized by the use of static chambers or carelessness in the measurements of the environmental parameters. A paper presented at Indoor Air 2008 in Copenhagen last month actually showed a decrease in research subjects' task performance when plants were present.

The use of plants indoors, especially the "living wall" concept or other extensive use requiring periodic addition of moisture, creates substantial risks of moisture, mold, and bacteria problems in the air. The scientific evidence points more strongly to moisture than to mold as the relevant association in buildings with higher rates of asthma or allergy among the occupants. There are also risks from the use of fertilizers and pesticides, if required, in the indoor environment. We generally try to steer people away from plantings that require frequent irrigation, fertilizer, or pest control immediately around buildings, especially if there are operable windows.

Most of the favorable publicity around the use of plants comes from folks whose business it is to provide plants and/or the systems to support them. Try to check out your sources and the sources of funding for any study that they cite.

You can read a more extended discussion of plants and indoor air quality in an article posted on this web site under articles, "Can house plants solve indoor air quality problems?" It was originally published in my old newsletter, Indoor Air Bulletin, in 1992.

Because providers of plants and supporting systems have a financial interest in promoting them, there continue to be many individuals innocently advocating the use of plants to improve indoor air quality. This is a problem that doesn't seem to go away because of the appeal of indoor plants and the myth that everything natural is good. Remember that many chemicals found in nature are poisonous, and that many plants are poisonous and even deadly (e.g., digitalis) to humans and other living beings.

Natural insecticides such as those derived from chrysanthemums (pyrethrins) are allergenic to many people and are toxic to insects and, it appears likely, to humans.

The Wikipedia listing for pyrethrin says: "In humans, pyrethrin irritates the eyes, skin, and respiratory systems, and it may cause other harmful effects. One study suggested a link between maternal pyrethrin use and autism in children. The study indicated that mothers of autistic children were twice as likely to have washed a pet dog with a flea shampoo containing pyrethrin while they were pregnant."

By the way, I have a few plants around me as I sit here typing,  but they are mostly orchids and cacti, not intended to or expected to clean the air. I tend to underwater them and rarely fertilize them. Of course they don't bloom as often as I'd like, but that's the trade-off for ensuring better IAQ.

Calculating Greenhouse Gas Emissions from Buildings

Most calculations in the U.S. and throughout the world are based on an average annual value for the grid region, sub-region, or nation as a whole. Looking at the annual reporting under the UNFCCC, there is a huge range of values used for conversion of electricity consumption to GHG emissions, and some of the values are clearly highly inaccurate. Even where reasonably accurate annual average values are used, they do not reflect the variations in building operation in response to weather and over the course of the day, week, and year.

Furthermore, the electric grid itself has large temporal variations by hour, day of the week, and season. Some plants are on-line as "baseload" generation and are always on, while others come on as needed. In some locations, the baseload generation is relatively "clean" (hydro and nuclear) and the peak load is relatively dirty (coal); in other locations it is exactly the opposite. Wind, solar, geothermal, and gas also play their roles to varying degrees.

The largest uncertainties in estimating GHG emissions from buildings come from the differences between design assumptions and actual building use and energy use intensity/performance. These differences can easily be on the order of a factor of 2. For existing buildings making retrofit or operational decisions, those uncertainties are far smaller, and the accuracy of the calculation of GHG emissions from electricity production is then probably the next largest source of uncertainty, although I have no data to support that assertion.

The time- and weather-dependent performance of the electric grid and of buildings requires an incredible amount of intensive manipulation of available data, some of which is of poor accuracy and low reliability. Matching historical data to the weather and time when the electricity was generated is non-trivial. Electric production is reported on an hourly basis in the U.S., but carbon emissions are reported on a monthly basis and are mostly based on calculation rather than measurement. There are huge discrepancies in the available reported data from the more than 4700 electricity generators in the U.S. These data are massaged by EPA to produce the annual average values that are used in the Energy Star program and for the U.S.'s international reporting.

Building use and operational assumptions are also important sources of uncertainty, and there are some efforts to address those issues in trying to improve the accuracy of building simulation models.

The only existing GHG emissions calculation tool that comes even close to giving accurate numbers reflecting the temporal and climate dependencies of building operations and of GHG emissions from the grid is the California tool. If you haven't seen it, please do check it out -- it's free and downloadable. While it only works for California, the concept, using a dispatch model, can be replicated anywhere in the U.S. or elsewhere where future planning is reasonably reliable out into some number of future years. The GHG Tool for Buildings, which E3 has been developing with the help of Martha Brook, is now available on the E3 website (the first link): http://www.ethree.com/E3_Public_Docs.html.

The alternative is to use historical data and to massage them either to create a representative year or to create a program that allows modeling under any assumed or projected set of climate and time schedule scenarios. If you look at the Synapse report, you will see that there are huge differences over the course of the year and even substantial differences over the course of the day. Based on analysis of data from the year 2005, Synapse showed that there can be differences as large as 60% from average annual values for GHG emissions per MWh of electricity generation in some regions of the country. The report, titled "Analysis of Indirect Emissions Benefits of Wind, Landfill Gas, and Municipal Solid Waste Generation," can be downloaded from
http://www.synapse-energy.com/Downloads/SynapseReport.2008-07.EPA.EPA-Indirect-Emissions-Benefits.06-087.pdf.

The potential range of emissions in tons GHG/MWh is very large. The distribution of hourly average emissions from the grid in New England in 2005 (from the Synapse report) clearly shows the huge variations and why annual average values can be very misleading in making design or operational decisions. Similar variations occur in many other regions, and the inter-regional differences are also quite large (>2X in some cases).
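A simple illustration of why this matters for a load that is concentrated in particular hours follows. The emission factors and load shape are hypothetical, chosen only to show the mechanism; they are not values taken from the Synapse report or any particular grid region.

```python
# Illustrative sketch: why an annual-average emissions factor can misstate
# the GHG impact of a load concentrated in particular hours.
# All factors and load values are hypothetical.

factor_peak_hours = 0.80      # assumed marginal factor, afternoon peak (t CO2e/MWh)
factor_offpeak_hours = 0.40   # assumed marginal factor, night hours
annual_average_factor = 0.55  # the single number many tools would apply

# A cooling-dominated load: 300 MWh/yr, 80% of it during peak hours
load_peak_mwh, load_offpeak_mwh = 240.0, 60.0

time_resolved = (load_peak_mwh * factor_peak_hours
                 + load_offpeak_mwh * factor_offpeak_hours)
annual_avg_estimate = (load_peak_mwh + load_offpeak_mwh) * annual_average_factor

print(f"Time-resolved estimate:  {time_resolved:.0f} t CO2e")
print(f"Annual-average estimate: {annual_avg_estimate:.0f} t CO2e")
# 216 vs 165 t CO2e for this load shape -- roughly a 30% difference; the
# error can be larger, smaller, or reversed depending on region and load.
```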

The ideal, and this was the Statement of Work that came out of our ASHRAE GHG calculation tool committee's efforts last year, is to look at both the results of calculations based on a dispatch model (like the E3 tool) and the adjusted historical-data-based model (let's call it a "Synapse plus" tool) and determine whether there are significant differences -- let's say more than 5% to 10%. There are plenty of other sources of uncertainty, so cutting it too fine in this kind of modeling does not make sense. But getting a sense of the magnitude of the differences would help us design the "ideal" tool construct.

Jeff Haberl of Texas A&M has worked with the available data in Texas, and he has helped me understand the complexities of getting these things right even when and where the data are available. Many of these can be overcome where good historical data and dispatch models are available, but they will take some well-focused work. Either of the past ASHRAE GHG Tool PC contractors, E3 or Synapse, is capable of doing this kind of work, and Jeff is also very knowledgeable about how to meet some of the challenges. There is also a need to be clear about site energy use intensity and source energy intensity, something for which Mike Deru at NREL has published reasonably good data (NREL Technical Report NREL/TP-550-38617, June 2007).
The stakes are too high to be making bad or poorly informed decisions any more, so let's get something robust developed that can move us ahead toward well-informed design, retrofit, and operational decisions for buildings.

Behind the Logos: Understanding Green Product Certifications

A great review of environmental labels available for building products has been published by Environmental Building News and can be viewed on-line at their web site.

Defining Environmentally Sustainable Building Budgets

Determining whether a building is sustainable requires a benchmark based on scientific knowledge of the earth’s carrying capacity. Environmental budgets or targets can be used to evaluate or compare building designs or performance. We propose a method for deriving targets based on global population projections through the year 2100 to allocate resource consumption and pollution emission budgets equally to all the earth’s inhabitants.

This article appeared in the newsletter from the Committee on the Environment, American Institute of Architects, AIA COTE notes, Summer 2006.

By Hal Levin

Background
We propose a method for developing budgets based on calculation of fossil carbon emissions, an indicator of carbon-equivalent greenhouse gas emissions, contributors to climate change. We use scientists' calculations of the capacity of the earth's atmosphere to balance the heat entering and leaving the atmosphere as a result of all forces, including but not limited to human activities. The overwhelmingly dominant human contribution to climate change appears to be the human impact on the global carbon dioxide concentration. For 400,000 years before industrialization, the global average concentration never exceeded 300 ppm. Since industrialization, it has risen from 285 parts per million (ppm) CO2 to approximately 380 ppm, with a steep climb in the past few decades. Climate scientists predict that our present rate of growth in carbon emissions will result in a global average of 700 ppm CO2. Most climate scientists agree that we should stabilize the concentration between 450 and 550 ppm by the year 2100 in order to limit the increase in global average temperature to 2 degrees Celsius above the current climate. A substantial amount of science is available regarding the potential impacts of this amount of warming, and the consequences appear rather significant but are hoped to be tolerable. In fact, a recent European Commission report says that even limiting global CO2 to 450 or 550 ppm will result in a 25 to 75 percent risk that global average temperatures will increase by more than 2 degrees Celsius.

Some experts argue that we can’t wait until 2100 and that we should shoot for 2050 or sooner to stabilize atmospheric CO2. From a practical perspective, that does not appear to be achievable at present. Neither the United States nor the developing countries have binding emission reduction commitments under the Kyoto Protocol. The extremely rapid diffusion of technology in many developing nations and their substantially higher rate of population growth compared with the developed countries suggest that they will have to radically alter their current path. But until developed nations set an example and develop the necessary technology and the policy instruments necessary to effect the change, it is difficult for leaders of developed countries to request that developing countries curtail their growth in the distribution of higher standards of living through appliances, automobiles, and other consumption that is energy intensive. The prudent approach is to reduce anthropogenic carbon emissions as much as possible as quickly as possible—probably considerably faster than contemplated under the Kyoto Protocol and the most advanced current planning in Europe.

The approach proposed here involves a number of assumptions that are subject to revision as we obtain new and better data in the next decade or two. It also involves some choices that warrant further discussion and revision to improve their fairness to all affected parties as well as their feasibility. The basic approach was first described in a 1992 report by the Dutch government agency Advisory Council for Research on Nature and Environment (RMNO). That report, Ecocapacity as a Challenge to Technological Development, was borrowed and applied in reports produced by Friends of the Earth Netherlands that estimated the carrying capacity of the Netherlands; these became a model for several European countries and finally for a report on Sustainable Europe. These reports translated “budgets” for resource consumption and pollution emissions into national targets against which national reporting could be compared. These activities contributed significantly to the current model for carbon emissions trading that has become widely accepted in Europe and that functions in the marketplace the way emissions trading of various air pollutants operates in the United States.

The Process
There are five simple steps in the process as follows.

Define the capacity of the resource or sink in question. In the case of fossil carbon emissions, this is based on the best available models of the impact of carbon emissions on global climate and uses the assumption of a 450 to 550 ppm global average CO2 concentration.

Translate the total emissions that are believed “sustainable” into a per capita budget. In the case of carbon emissions, to achieve 450 to 550 ppm global average CO2 concentrations would allow emissions of 1 to 2 kilograms of carbon per person per day (kg C/p-d) by the year 2100 with an expected population of about 8.5 billion people—the latest UN population projection. Compare this to the current global average of 3 kg C/p-d and the U.S. average of around 17 kg C/p-d. Of course various sources of energy have different implications, with electricity from coal emitting roughly twice that derived from natural gas. Hydropower is closer to carbon neutral, although there are some emissions related to the development and maintenance of hydropower electricity sources. Solar photovoltaic can also be close to zero on a life cycle basis.

Calculate the portion of total emissions attributable to buildings. Using the latest Department of Energy data on the distribution of energy consumption by sector and our own data on components of building-related energy attributed to industry, transportation, and agriculture, we estimated that building-related energy consumption (including “plug loads”) is about 40 percent of total energy consumption. Thus, each individual’s building-related emissions must be 0.4 to 0.8 kg C/p-d as a “sustainable” budget. Currently 5 to 7 kg C/p-d are associated with building energy use. This estimate could be refined but is not likely to change more than about 5 to 10 percent. It includes construction, use, operation, maintenance, renovation, and demolition or recycling of buildings.

Determine the portion of total building use attributable to each building type. Again, based on DOE data on the shares of total energy used by each building type, we allocated the building-sector budget according to the present share of each building type. This allocation could be refined by analysis of the degree of conservation and efficiency already applied and the amount of further reductions deemed reasonably feasible and achievable in each type. Energy consumption per square foot covers a wide range, with health care and food retail at the high end and public safety, public assembly, and storage at the low end.

Finally, to derive a target for a specific building, the budgets of its users are applied. For example, for a school, divide the number of students, teachers, and staff who study or work at the school by the total number at all schools at the same grade levels in the country. For offices, the value could be based on workers or work stations; for a library, it could be based on daily average users; for a retail establishment, on the number of customers or customer hours; etc.
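A minimal worked example of the five steps is sketched below, using the per capita figures given above and purely illustrative values for the national population, the school-type share of building energy, and the national and local occupant counts; none of those placeholder values comes from the DOE or UN data cited in the text.

```python
# Hypothetical worked example of the five-step derivation.
# Population, school-type share, and occupant counts are placeholders.

per_capita_budget_kgC_day = 1.5      # step 2: within the 1-2 kg C/p-d range
building_share = 0.40                # step 3: buildings ~40% of total energy
per_capita_building_kgC_day = per_capita_budget_kgC_day * building_share  # 0.6

population = 300e6                   # illustrative national population
national_building_budget_kgC_day = population * per_capita_building_kgC_day

school_type_share = 0.10             # step 4: assumed share of building energy
all_school_occupants = 60e6          # step 5: assumed national total at this grade level
this_school_occupants = 600          # students, teachers, and staff at this school

school_budget_kgC_day = (
    national_building_budget_kgC_day * school_type_share
    * this_school_occupants / all_school_occupants
)

print(f"Per-occupant building budget: {per_capita_building_kgC_day:.2f} kg C/day")
print(f"This school's carbon budget:  {school_budget_kgC_day:.0f} kg C/day "
      f"(~{school_budget_kgC_day * 365 / 1000:.0f} t C/yr)")
# Modeled or monitored emissions for the design are then compared to this target.
```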

The proposal presented here is to compare modeling data for building designs or data from monitoring of built structures with the carbon emission budget targets to determine their “sustainability” with respect to carbon emissions. Similar budgets can be prepared, as was done by the Dutch in the reports mentioned above, for consumption of various renewable and non-renewable resources as well as for various pollution emissions and for land encroachment. Targets are set for biodiversity loss, ozone depletion, copper consumption, cadmium releases, etc.

An elaborated version of the derivation and a number of relevant references are part of a paper I presented in Tokyo last September at the Sustainable Buildings 2005 conference and two papers presented at Healthy Buildings 2006 in Lisbon, Portugal, in June 2006. These papers can be downloaded from www.buildingecology.com.

Other resources include:
EnergiePortal: Climate change: https://www.energieportal.nl/english

ASHRAE web site pages for sustainability: http://www.engineeringforsustainability.org/

IAQ: US EPA BASE study data available on the web

Do you want to know how your building is doing? EPA collected extensive indoor air quality data from 100 randomly selected public and commercial office buildings in 37 cities and 25 states. You can compare measurements made in your own building to those in this massive, scientifically-based study to identify how your building compares.


Air inside public and commercial office buildings contains a wide variety of pollutants that can build up and possibly affect the health of people working there.  Indoor air problems have the potential to affect the health of many people and significantly reduce productivity.

Early research on the indoor air quality, or IAQ, of office buildings in the United States focused on evaluating problem buildings where occupants had significant complaints about the IAQ. However, such problem buildings could not be compared because there was very little baseline IAQ information about typical buildings.

To fill this data gap, EPA conducted the Building Assessment Survey and Evaluation (BASE) study.  The BASE study used a standardized protocol to collect extensive indoor air quality data from 100 randomly selected public and commercial office buildings in 37 cities and 25 states.

This website, outlined below, describes the BASE study in detail, including the data, highlighted analyses, original data for independent examination, and more.

Contents:

Basic Information and Overview: This section provides a basic description of the study, its goals and objectives, and the information collected.

Methodology: The BASE study was conducted using a standardized protocol. A description of the seven basic activities performed for each of the study buildings is provided here.

Summarized Data: Summaries of select information collected from the 100 BASE buildings studied are available here.

Highlighted Analysis: The data collected provide normative information on several parameters which can be used for further assessment and analysis of IAQ-related issues.  Highlighted here are summaries and results of assessments and analyses performed on the BASE data.

Frequent Questions: We have provided answers to what we consider some of the probable frequently asked questions regarding this study.

How to Obtain Data: We have provided a means to obtain the raw data collected as part of the BASE study, allowing independent examination and analysis of the data and hypothesis development.

Publications: The protocol, quality assurance plan, and other supporting study documentation as well as publications describing the study and summarizing select study results are available here.

Glossary: We have provided definitions for some BASE-specific terminology used throughout the study.

Energy Principles in Architectural Design

The oil crises of 1973 and 1979 resulted in far more emphasis on energy conservation and designs to minimize use of energy. Most of California's licensed architects at that time had been educated in the era of mythical unlimited energy. In the late 1960s and even in 1973, nuclear power was expected to be so economical that it wouldn't even need to be metered. Forty-one nuclear power plants were authorized that year. The U.S. was not yet heavily dependent on imported oil, and the limits to oil reserves were not commonly considered. Single-pane glazing was standard in California until the California Energy Commission adopted regulations in its Title 24 that required double-glazing beginning around the time of Dean's guide.

Dean, then a professor in the Department of Architecture at the University of California, Berkeley, both wrote and illustrated this 85-page guide that lays out the basic knowledge about energy and architecture. While it was written 30 years ago, the principles have not changed, and the guide is a very handy reference for students and professional architects alike.

You can download the guide (~11MB) from BuildingEcology.com, or you can request a hard copy from the California Energy Commission, 916-654-4287. The publication is now out of print, but the staff has recently been providing copies upon request.

Sunshine and Natural Ventilation, lots of it, as the cure for the flu?

An article appearing in the American Journal of Public Health describes a number of approaches to dealing with pandemic flu outbreaks, including the devastating global pandemic of 1918 as well as some more recent ones. It praises the approach of placing diseased patients in "open-air" environments, focusing on the benefits of exposure to plentiful fresh air and sunlight, without ignoring the importance of "scrupulous standards of hygiene" and the use of reusable face masks. As the H1N1 infection spreads now, the advice is not without relevance.

The article is available on line at http://ajph.aphapublications.org/cgi/reprint/AJPH.2008.134627v1.

Our sources suggest that the sunlight may be important for the synthesis of Vitamin D, which is often deficient, especially during the darker portion of the year when sunlight exposure is limited. Some suggest that in northern latitudes there simply is not enough sun intensity for the necessary exposure. Recently published findings suggest that Vitamin D may be important for proper immune system functioning and could play a role in resistance to infections such as the influenza virus.

Hobday RA and Cason JW: The Open-Air Treatment of Pandemic Influenza. American Journal of Public Health 2009;99(Suppl 2):S236-S242.

ABSTRACT: The H1N1 "Spanish flu" outbreak of 1918-1919 was the most devastating pandemic on record, killing between 50 million and 100 million people. Should the next influenza pandemic prove equally virulent, there could be more than 300 million deaths globally. The conventional view is that little could have been done to prevent the H1N1 virus from spreading or to treat those infected; however, there is evidence to the contrary. Records from an "open-air" hospital in Boston, Massachusetts, suggest that some patients and staff were spared the worst of the outbreak. A combination of fresh air, sunlight, scrupulous standards of hygiene, and reusable face masks appears to have substantially reduced deaths among some patients and infections among medical staff. We argue that temporary hospitals should be a priority in emergency planning. Equally, other measures adopted during the 1918 pandemic merit more attention than they currently receive. (Am J Public Health 2009;99:S236-S242. doi: 10.2105/AJPH.2008.134627)

California Greenhouse Gas Tool for Buildings

California's Greenhouse Gas (GHG) Tool for Buildings is now available on the web. The tool is publicly available and free for download.

GHG Tool for Buildings in California

This is a major step forward, the first tool that provides time- and weather-resolved GHG emissions calculations. It is based on a dispatch model, that is, a model of how the grid's generation inventory will evolve over time through the year 2020. Since electricity demand and consumption as well as grid performance are highly dependent on weather and time of day or week, an annual average value for GHG emissions at a location or a portion of the regional or sub-regional electric grid is not accurate.

Other commonly used and well-known GHG calculation tools use annual average emissions for a grid region or sub-region. In some countries, a national average is used. In fact, we know of no other tool that addresses the time of use and weather impacts on building energy use and associated GHG emissions. The other available tools can greatly distort the impact of a building on GHG emissions and result in very poor design and operational decisions.

The distortions can lead to mistaken results from analysis of a building's energy use, whether from measured use data or simulation models. Designers and building operators need more accurate, time- and weather-resolved data to make informed decisions intended to affect GHG emissions.

Design always involves trade-offs. Designers are now focused on reducing energy use. But grid-generated electricity has different carbon implications, depending on the time when it is used and the simultaneous inventory of electricity generators in the grid region or sub-region. For example,

  • To understand whether load shifting by using thermal storage or increasing building envelope insulation is more effective in reducing GHG emissions, one needs to model building performance with a time- and weather-resolved model and GHG emissions data.
  • Is it more effective to produce electricity on-site with solar PV or to invest the same amount in high performance glazing?

Questions like these can only be answered with a tool that considers the time of use and concurrent grid performance. Use of an annual average can distort the result of a comparison of alternative designs by as much as 60% in some grid regions in the U.S., while in other grid regions the annual average is relatively accurate. Buildings are the largest electricity user in the U.S. (70%) and in most of the world. The potential to reduce GHG emissions in buildings is huge, largely because buildings are currently quite inefficient. Reducing electricity consumption will have a major impact on GHG emissions, and site energy use intensity, the usual metric for building energy use, is roughly only 1/3 of source energy use and the associated GHG emissions.
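To illustrate the first trade-off listed above with numbers, here is a hedged sketch comparing load shifting with thermal storage (same kWh, different hours) against an envelope improvement (fewer kWh, same hours) under time-resolved versus annual-average emission factors. All of the factors and load values are hypothetical; they are not outputs of the California tool or the Synapse analysis.

```python
# Hedged sketch: two measures compared under time-resolved vs annual-average
# grid emission factors. All numbers are hypothetical.

factor_peak, factor_offpeak, factor_annual = 0.80, 0.40, 0.55   # t CO2e/MWh

baseline_mwh = 100.0   # assumed cooling load, falling entirely in peak hours

# Option A: thermal storage shifts the same kWh to off-peak hours.
# Option B: a better envelope cuts the load 20% but leaves it in peak hours.
emissions = {
    "baseline": {"hourly": baseline_mwh * factor_peak,
                 "annual": baseline_mwh * factor_annual},
    "storage":  {"hourly": baseline_mwh * factor_offpeak,
                 "annual": baseline_mwh * factor_annual},
    "envelope": {"hourly": baseline_mwh * 0.8 * factor_peak,
                 "annual": baseline_mwh * 0.8 * factor_annual},
}

for method in ("hourly", "annual"):
    base = emissions["baseline"][method]
    print(f"{method:>6}: storage saves {base - emissions['storage'][method]:.0f} t, "
          f"envelope saves {base - emissions['envelope'][method]:.0f} t")
# With hourly factors the storage measure saves far more (40 t vs 16 t);
# with the annual average it appears to save nothing (0 t vs 11 t), so the
# annual-average comparison would point to the wrong measure in this example.
```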

Addressing climate change in buildings has huge potential not only to reduce the environmental impacts of buildings but also to be highly profitable to building owners. An estimate in the IPCC 2007 Nobel Prize-winning report on climate change identifies buildings as having the greatest potential to reduce GHG emissions and to do so largely at a negative cost to building owners. More than 80% of the estimated potential reduction in buildings' GHG emissions can be accomplished while saving money, according to Chapter 6 of the Working Group III report of the IPCC.

A study by Synapse Energy Economics of Cambridge, Massachusetts, conducted for the U.S. Environmental Protection Agency, quantifies the differences by hour of the day and day of the year. Using grid performance data from 2005, Synapse looked at the impacts of various strategies on GHG emissions for each grid sub-region. The differences among regions are dramatic, as can be seen in Synapse's color plots of GHG emissions hour-by-hour for the entire year. The Synapse report is titled Analysis of Indirect Emissions Benefits of Wind, Landfill Gas, and Municipal Solid Waste Generation.

The Synapse report clearly illustrates that the contrast within and among regions is dramatic at various times of day and year. The lead on the tool's development, Amber Mahone of Energy and Environmental Economics (E3) of San Francisco, was the consultant to the Project Committee for the ASHRAE GHG emissions tool development project. The concept for the project came out of E3's work helping to define ASHRAE's GHG tool development. I encouraged Mahone to write a proposal to develop a tool for California and requested (through Martha Brook) that the California Energy Commission fund it. The funding came, and the result is now available for downloading, along with a User's Manual, at the E3 web site. Please check out the tool, give some feedback, and spread the word. I would also appreciate hearing your comments on the tool after you have had a chance to look it over.