Watt Buster Final Report
AEA Metering Project #AEA 10-012
Version 1.0
October 29, 2010
Table of Contents
Executive Summary .................................................................................................................................... 1
1. Background ........................................................................................................................................... 3
2. Residential Component ....................................................................................................................... 3
2.1. Effectiveness of Building Energy Monitors....................................................................................... 4
2.1.1. Methodology ........................................................................................................................... 4
2.1.1.1. Devices Used ............................................................................................................... 4
2.1.1.2. Recruitment and Selection Process ............................................................................. 4
2.1.1.3. Distribution and Training ............................................................................................... 5
2.1.1.4. Energy Efficiency Education ......................................................................................... 6
2.1.1.5. Participant Surveys ....................................................................................................... 6
2.1.1.6. Control Groups ............................................................................................................. 7
2.1.2. Results.................................................................................................................................... 8
2.1.2.1. Change in Consumption ............................................................................................... 8
2.1.2.2. Significant Results for Different Groups or Classes of Subjects .................................. 9
2.1.2.3. Participant Feedback from Surveys ........................................................................... 11
2.1.3. Conclusions .......................................................................................................................... 12
2.2. Comparison/Assessment of Building Energy Monitors .................................................................. 13
2.2.1. Methodology ......................................................................................................................... 13
2.2.2. Devices Used ....................................................................................................................... 13
2.2.3. Results.................................................................................................................................. 14
2.2.4. Conclusions .......................................................................................................................... 15
2.3. Use of Appliance Power Meters ..................................................................................................... 15
2.3.1. Methodology ......................................................................................................................... 15
2.3.2. Results.................................................................................................................................. 15
2.3.3. Conclusions .......................................................................................................................... 15
2.4. Recommendations for the Future .................................................................................................. 16
3. Commercial Component .................................................................................................................... 17
3.1. BEM Used ...................................................................................................................................... 17
3.1.1. Monitors ................................................................................................................................ 17
3.1.2. Background Data .................................................................................................................. 18
3.1.3. Energy Savings Assessment Report .................................................................................... 19
3.2. Methodology ................................................................................................................................... 20
3.2.1. Customer Recruitment ......................................................................................................... 20
3.2.2. Initial Assessment Meeting................................................................................................... 21
3.2.2.1. Confirming Study Participation ................................................................................... 21
3.2.2.2. Gathering Background Information ............................................................................ 21
3.2.2.3. Identifying Monitor Locations ...................................................................................... 22
3.2.2.4. Scheduling Deployments ............................................................................................ 22
3.2.3. On-site Monitoring ................................................................................................................ 22
3.2.4. Assessment Reports ............................................................................................................ 22
3.2.4.1. Reviewing Reports ..................................................................................................... 23
3.2.4.2. Reporting Findings ..................................................................................................... 23
3.2.5. Surveys................................................................................................................................. 23
3.2.6. Customer Education ............................................................................................................. 23
3.3. Results ........................................................................................................................................... 24
3.3.1. Additional Research ............................................................................................................. 24
3.4. Conclusions .................................................................................................................................... 25
3.5. Recommendations for the Future .................................................................................................. 25
Appendix A: Residential Component Communications ..................................................................... A-1
Appendix B: Residential Component Educational Materials ............................................................. B-1
Appendix C: Residential Component Surveys..................................................................................... C-1
Appendix D: Residential Component Data ........................................................................................... D-1
Appendix E: Commercial Component Solicitation ...............................................................................E-1
Appendix F: Summarized Commercial Data ......................................................................................... F-1
Appendix G: Commercial Building Energy Assessment Reports ..................................................... G-1
Appendix H: Commercial Component Surveys ................................................................................... H-1
Executive Summary
Thirty-two commercial customers and 96 residential customers participated in the Watt Buster research
project between February and October 2010. The project was conducted to determine whether building
energy monitors – or in the case of commercial customers, an energy assessment – would result in
energy efficiency improvements.
Residential Component
Test participants in the residential component installed and used the Tendril home system, which
provides close to real-time feedback on household electrical consumption. The Tendril system includes a
counter-top display unit and a web portal, allowing participants both at-a-glance information and more
detailed data on the portal. The Tendril system was in place between three and four months, depending
on when participants installed it.
The project included three participant surveys: baseline, mid-project and closing. The surveys elicited
valuable information about participants’ energy behaviors and perceptions. The relationship between a
BEM and changes in electric consumption was derived from the following: comparison of the test group’s
actual consumption in the test period to the same months the previous year; comparison of actual test
period consumption of test group subsets to control groups; and changes perceived by the test group.
The residential component also compared the weaknesses and merits of the Tendril home system with
EnergyHub and OpenPeak, two other residential BEM systems, and surveyed users of appliance power
meters Kill A Watt and Watt’s Up?
The aggregated consumption data did not reveal significant reductions in electric consumption among
test participants. However, several subsets of the test group did reduce their household electric
consumption significantly: participants who thought they had already done as much as they could to make their household energy efficient in fact found more ways to reduce their consumption. Households with electric heat (a very small percentage of the total group) reduced their electric consumption far more than
any other subset. A likely explanation is that milder weather had a more significant impact on those with
electric heat. Based on decade built, homes built in the 1980s were the only subset to show reduced
electric consumption from the previous year.
Key findings of the residential component include the following:
• Customers found the BEMs helpful, but it is not clear they would pay for one themselves
• Building energy monitors increase knowledge and interest in energy efficiency
• Certain types of information, about appliances in particular, would help customers be more
energy efficient
• Perceptions of comfort and convenience are significant barriers to energy efficiency
improvements
• Time-of-day pricing may be the key to gaining consumer attention and behavior changes
• BEM technology is still evolving
Commercial Component
The commercial portion of the BEM study consisted of BuildingAdvice energy assessments of 32
commercial buildings: 21 from Chugach’s service area and 11 from Municipal Light & Power’s (MLP)
service area. The commercial energy assessments modeled a building’s total energy use, based on one
week of monitoring data and background building data, and generated a report with recommendations for
energy savings. The project team then met with each participant to review the report’s findings and
recommendations. Participants then answered a brief survey about any plans to implement the
recommendations, as well as their perceptions about the project.
The commercial survey was complemented by a telephone survey of 121 of Chugach Electric’s largest
commercial customers.
Key findings of the commercial component include the following:
• Commercial customers are taking steps to become more energy efficient. The most common
steps are energy-efficient lighting, increased employee awareness, and streamlined operations and scheduling.
• Commercial customers’ decisions about energy efficiency improvements are driven by the
potential for energy savings and cost.
• There is a disconnect between perceived and actual electric consumption. Commercial
customers believe their energy consumption has increased over the past three years (by a mean
increase of 9.5%). In fact, consumption has declined just over 3% per year. This disconnect
between perceived and actual consumption should be an important element in future education
efforts.
• Commercial customers are clearly driven by financial considerations: return on investment,
capital outlay required, and impacts to their bottom line.
• Many of the easiest and least expensive measures are not obvious to owners and property
managers. Chugach could help commercial customers make significant improvements by helping
identify the low-hanging fruit of simple, low- and no-cost changes.
1. Background
Chugach Electric conducted a research project, called Watt Buster, on the impacts of residential and
commercial building energy monitors (BEMs) on energy efficiency and conservation. This project was
funded in part by a matching grant up to $75,000 from the Alaska Energy Authority under AEA Metering
Project #AEA10-012.
The BEM research project had two distinct goals:
• To identify whether, and under what conditions, the deployment of BEMs is most effective for
reducing energy usage
• To identify the best equipment available and which features are most helpful to consumers and
utilities.
The project included both residential and commercial components. The original project scope also included rural customers, but Chugach's numerous attempts to engage rural partners were unsuccessful; the prospective partners were unresponsive. These efforts were described in previous reports to AEA, which concurred that this portion of the project could be dropped.
For the commercial component, energy assessments were conducted at 32 commercial sites. The
commercial energy assessments modeled a building’s total energy use, based on one week of monitoring
data and background building data, and generated a report with recommendations for energy savings.
Commercial participants were surveyed to determine whether they planned to implement any of the
recommendations.
2. Residential Component
The residential component of the Watt Buster project included three pieces:
• Effectiveness of BEMs, using the Tendril Home system
• Comparison/assessment of BEMs, using Tendril, OpenPeak, and EnergyHub BEMs
• Use of power meters, using Kill A Watt and Watt’s Up? meters
Note: Chugach Electric intended to achieve both goals with a study in which test subjects used one of
three BEMs: Tendril, OpenPeak, or EnergyHub. However, only Tendril was available in the time
frame needed so the plan was amended to conduct the evaluation and comparison later in the
year.
2.1. Effectiveness of Building Energy Monitors
To evaluate the effectiveness of BEMs in residential buildings, Chugach used the Tendril Home system,
which test participants installed and used from March through June, 2010.
2.1.1. Methodology
The following sections describe the Tendril system and the methodology Chugach used for this portion of
the research project.
2.1.1.1. Devices Used
The Tendril Home system consists of three devices and access to a Web portal:
• Translate—The Translate device receives data from the customer’s electric meter and
communicates the data to the other two devices: the Transport and Insight. The Translate, which
must be located close to but not directly in line with the electric meter, receives data at a
frequency of anywhere from every few seconds to every 10 minutes. It transmits the data to the
Web portal every 15 minutes.
• Transport—The Transport receives data from the Translate and communicates with the Internet.
It is plugged into the customer’s router or modem.
• Insight—The Insight is a counter-top device that displays data from the Translate. It provides
information on energy consumption, including projected billing amount and cost per hour, which
was provided by Chugach. It also displays short messages from Chugach Electric.
• Vantage Web Portal—The Vantage Web Portal provides current and historical data about the
customer’s energy consumption and other information in a website. Chugach provided Tendril
with one year of historical data for each participant, which enabled the participants to monitor and
compare current energy usage with historical consumption.
Each test member also received an antenna for the Transport; Ethernet and power cables; and a card
with the test customer’s user name, password, and the Web address for the Vantage Web Portal. The
combination of the Translate, Transport, and Insight forms a Home Area Network.
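To make this data flow concrete, the following is a minimal sketch of how a Translate-style device might buffer frequent meter reads and roll them up into one record per 15-minute portal upload. It is illustrative only, not Tendril's software; the class and field names, the 10-second read cadence, and the averaging approach are assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MeterRead:
    timestamp_s: int   # seconds since the start of the upload interval
    watts: float       # instantaneous demand reported by the meter

@dataclass
class TranslateBuffer:
    """Illustrative stand-in for a Translate-style device: buffers frequent
    meter reads and produces one averaged record per 15-minute upload."""
    reads: List[MeterRead] = field(default_factory=list)

    def on_meter_read(self, read: MeterRead) -> None:
        self.reads.append(read)

    def flush_to_portal(self) -> dict:
        # Average demand over the interval; energy = average kW x 0.25 h
        avg_w = sum(r.watts for r in self.reads) / len(self.reads)
        record = {"avg_kw": avg_w / 1000, "kwh": (avg_w / 1000) * 0.25}
        self.reads.clear()
        return record

# Example: reads arriving every 10 seconds for one 15-minute interval
buf = TranslateBuffer()
for t in range(0, 900, 10):
    buf.on_meter_read(MeterRead(timestamp_s=t, watts=1200.0))
print(buf.flush_to_portal())   # {'avg_kw': 1.2, 'kwh': 0.3}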
Set Point Thermostats. Tendril’s product line includes a Set Point thermostat, which ties into the Home
Area Network and can be controlled remotely. Chugach purchased approximately 20 of these “smart”
thermostats and offered them to the test group. However, the Tendril thermostats require a compatible
heating system, which not all prospective users had. Chugach installed Tendril thermostats in nine
homes, and the response was quite positive. Users reported they liked the remote control and being able
to monitor their heat consumption on the Tendril system. Testers were encouraged to keep the
thermostats after the project ended.
Volt. Tendril also offers the Volt, a device that measures electrical consumption of household appliances
and displays the data in the Home Area Network. Chugach Electric did not purchase the Volt.
2.1.1.2. Recruitment and Selection Process
On January 22, 2010, Chugach sent an email to the 5,844 residential members for whom it had email
addresses. The email explained the Watt Buster research project, the initial requirements, and included a
link to the application form on the SmartPowerAK website (www.smartpowerak.com). In addition,
Chugach announced the project in the February issue of Chugach’s member newsletter, the Outlet, and
in a print advertisement in the Anchorage Daily News in early March. (Refer to Appendix A for copies of
these materials.)
In order to participate in the study, members were required to meet the following criteria:
• Have and use a home computer and the Internet
• Have an open port on their router, or be willing to purchase a router
• Be willing to use the monitor as directed and participate in surveys
• Reside in the home for the duration of the project
• Have been a Chugach residential customer for at least three years
In response, 324 residential members volunteered to participate in the project. Prospective volunteers
were asked to provide information about their home size, building type, heat source, and number of
persons in the household. Chugach sorted applicants by home size, building type, and fuel source, and
then selected 96 participants at random from within those groups in numbers to reflect a cross-section of
the entire residential membership. Table 1 lists the number of participants within each classification,
based on the self-reported information in the application.
Note: When analyzing data (refer to 2.1.2 Results), Chugach used CAMA data from the Municipality of
Anchorage, which may differ slightly from the self-reported information.
Table 1: Classification of Residential Participants
Heat Source
  Electric: 5
  Gas: 89
Home Type
  Mobile: 3
  Duplex: 7
  Condo: 5
  Single Family: 81
Home Size (Square Feet)
  ≤ 1,000: 6
  1,001 – 1,799: 22
  1,800 – 2,499: 32
  ≥ 2,500: 32
Because the test group consisted of self-selected volunteers, they did not necessarily represent the
average Chugach residential customer. Even so, this research project generated valuable information
about the strengths and limitations of BEMs, as well as motivations and attitudes of customers with an
interest in energy efficiency.
2.1.1.3. Distribution and Training
Chugach personnel, including several customer service representatives who were direct contacts for
participants during the project, received on-site training from Tendril on February 9 and 10.
After participants were selected, Chugach reviewed each participant’s meter. Most of the test participants
had older meters, which required replacement for the Tendril system to function. Chugach replaced all
needed meters before participants were allowed to pick up devices, which caused some delay in the
project start.
Selected participants were instructed via email on February 19 to pick up their devices from Chugach
between February 22 and March 5. When test participants picked up their equipment, a lead service
representative met with them to review the equipment, installation, and registration.
The pick-up, installation, and registration process took longer than Chugach anticipated. The initial group
of test participants picked up their equipment between February 22 and March 16. Despite enthusiasm for
the project, test participants had to be reminded, usually by email, to pick up their devices, install, and
register them. Test participants who were particularly slow to respond received direct phone calls.
Volunteers who decided not to participate or failed to pick up the Tendril equipment by mid-March were
replaced by other volunteers, who picked up their equipment between March 26 and March 30.
Originally, Chugach planned to conduct training sessions for the test participants. However, upon
consultation with Tendril and based on Chugach staff’s own training, Chugach determined that formal
training sessions with the test participants would not be necessary. Directions provided by Tendril were
clear, and both Tendril and Chugach provided support and assistance on request.
Chugach established a dedicated email address (WattBuster@chugachelectric.com) and phone number
for Watt Buster questions. Key Chugach staff stayed in contact with participants to answer questions and
monitor connections. In addition, supporting materials—Frequently Asked Questions, a Tendril User
Guide, an explanation of the Home Area Network—were posted on Chugach’s SmartPowerAK website.
Note: Refer to Appendix B for copies of these materials.
Most volunteers had few problems installing and using the Tendril devices. Problems encountered were
primarily from data input errors (such as an incorrect address), signal pick-up problems, or faulty
Translate devices. Most of the problems were rectified quickly. Several participants failed to follow
through and never really participated in the project. The majority of complaints and problems were addressed by Chugach personnel as they emerged. In a few instances, participants contacted Tendril
directly. Chugach personnel also referred problems to Tendril when necessary.
2.1.1.4. Energy Efficiency Education
Throughout the test period, education on energy efficiency was provided on SmartPowerAK, Chugach’s
energy efficiency website. In addition, Chugach posted a series of weekly messages through Tendril,
which were displayed on the Insight and the Vantage Web Portal.
Additional energy efficiency education was sent to all Chugach residential customers in a special mailing
and in articles in the Outlet, Chugach’s member newsletter.
Note: Refer to Appendix B for copies of these educational materials.
2.1.1.5. Participant Surveys
During the Tendril portion of the project, three surveys were conducted: a baseline survey, a mid-project
survey, and a closing survey. All three surveys were circulated in draft among the Chugach team to
ensure thorough review and comment. When finalized, the survey was entered into SurveyMonkey
(www.surveymonkey.com), an online survey tool. Chugach then emailed each test participant a unique
link to the survey, and participants completed the survey online.
The intent of the baseline survey was to gather as much early information as possible regarding test participants' habits, expectations, and motivations, as well as some additional household characteristics. It
was drawn in part from one conducted by Cape Light Compact in Massachusetts. Test participants were
asked to take the survey after they had installed the Tendril system in order to get an immediate
assessment of the installation process. The first baseline survey was received March 5; the final response
was submitted April 14, 2010. The response rate was 94%.
The objective of the mid-project survey was to take the pulse of how the project was going and to flag any
major changes that might affect the results, such as a change in the number of household residents or any extended
vacancies. The link for this second survey was emailed May 14. The final response was received June
18, 2010, with a total response rate of 78%.
The closing survey was designed to capture test participants’ perceptions of the project: whether their
behaviors and attitudes had changed as a result of it and their views about the path toward increased
energy efficiency. The closing survey, conducted between July 1 and August 4, had a 90% response rate.
Note: Refer to Appendix C for copies of these surveys.
2.1.1.6. Control Groups
To evaluate whether the BEM affected energy use during the test period, Chugach created two control
groups. Each control group was selected to reflect the typical Chugach residential customer with the
following specific attributes: year home was built, land use description, residential style, heat fuel, heat
type, total number of rooms, number of bedrooms, number of bathrooms and square feet of living area.
Two control groups were selected to reflect the two different heat types: forced air and hot water
baseboard heat. Overall, Chugach customers are split about evenly between the two heat types.
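As an illustration of this kind of attribute matching, the sketch below builds a control group by filtering customer records on the same attributes used here. It is a minimal sketch, not the actual selection query; the record field names, the sample data, and the square-footage tolerance are assumptions.

from typing import Dict, List

# Hypothetical CAMA-style customer records; field names and values are illustrative.
customers: List[Dict] = [
    {"account": 1001, "land_use": "Single family home", "heat_type": "Hot Water",
     "heat_fuel": "Natural Gas", "year_built": 1975, "bedrooms": 4, "sq_ft": 2300},
    {"account": 1002, "land_use": "Single family home", "heat_type": "Forced Air",
     "heat_fuel": "Natural Gas", "year_built": 1991, "bedrooms": 4, "sq_ft": 2450},
]

def matches_profile(rec: Dict, profile: Dict, sq_ft_tolerance: int = 400) -> bool:
    """True if a record matches the target profile on categorical attributes
    and is within a square-footage tolerance."""
    return (rec["land_use"] == profile["land_use"]
            and rec["heat_type"] == profile["heat_type"]
            and rec["heat_fuel"] == profile["heat_fuel"]
            and abs(rec["sq_ft"] - profile["sq_ft"]) <= sq_ft_tolerance)

hot_water_profile = {"land_use": "Single family home", "heat_type": "Hot Water",
                     "heat_fuel": "Natural Gas", "sq_ft": 2300}

control_group = [c for c in customers if matches_profile(c, hot_water_profile)]
print(len(control_group))   # 1 in this toy data set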
The control groups were compared to subsets of the test group, selected to match the attributes. The “Hot
Water” control group (refer to Table 2) consisted of 1,097 residential customers who matched attributes of
21 participants, who comprise the Hot Water Test Group.
Table 2: Hot Water Group Characteristics
Attribute Hot Water Control Group Hot Water Test Group
Number in Group 1,097 21
Land use Single family home Single family home
House type Bi-level/two-story/split-level Bi-level/two-story/split-level
Average year built 1974 1973
Total # rooms 8 8
Bedrooms 4 4
Bathrooms 2.5 2.5
Living area 2,348 sq ft 2,284 sq ft
Heat type Hot Water Hot Water
Heat fuel Natural Gas Natural Gas
The “Forced Air” Control Group consisted of 811 residential customers who matched attributes of 23
participants (refer to Table 3).
Table 3: Forced Air Group Characteristics
Attribute Forced Air Control Group Forced Air Test Group
Number in Group 811 23
Land use Single family home Single family home
House type Bi-level/two-story/split-level Bi-level/two-story/split-level
Average year built 1990 1992
Total # rooms 8 8
Bedrooms 4 4
Bathrooms 2.5 2.5
Living area 2,399 sq ft 2,450 sq ft
Heat type Forced Air Forced Air
Heat fuel Natural Gas Natural Gas
2.1.2. Results
The following section describes the results of this portion of the research project.
2.1.2.1. Change in Consumption
Chugach evaluated the change in consumption
• During the test period between control and test groups
• From 2009 to 2010 for the test groups
Note: For these evaluations, subsets of the participants were used to compare with the control groups,
as described above. Potentially significant variables such as weather and energy price were not
ascertained for purposes of this evaluation.
Test Group vs. Control Group. As shown in Table 4, during the months of January and February 2010,
before the project began, the Hot Water Test Group was using 13% more electricity than the
corresponding Control Group. Once the project had begun, the Test Group reversed this as they
decreased consumption to 4% less than the Control Group.
Table 4: Hot Water Group—Test vs. Control
Month | Test Group (avg. usage) | Control Group (avg. usage) | Usage Difference (Test vs. Control) | Percent Difference
Pre-Tendril Deployment
  January 2010 | 1,100 | 983 | +117 | +11.9%
  February 2010 | 1,056 | 921 | +135 | +14.6%
  Average | 1,078 | 952 | +126 | +13.2%
During Tendril Deployment
  March 2010 | 872 | 838 | +34 | +4.1%
  April 2010 | 757 | 782 | -25 | -3.2%
  May 2010 | 678 | 724 | -46 | -6.3%
  June 2010 | 578 | 678 | -100 | -14.7%
  Average | 721 | 755 | -34 | -4.5%
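For clarity, the Percent Difference column in these tables is simply the test-group average relative to the control-group average. A minimal sketch, using the March-June averages from Table 4:

def percent_difference(test_avg: float, control_avg: float) -> float:
    """Percent by which the test group's average usage differs from the control group's."""
    return (test_avg - control_avg) / control_avg * 100

# March-June 2010 averages from Table 4 (Hot Water group)
print(round(percent_difference(721, 755), 1))   # -4.5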
As shown in Table 5, the Forced Air Test Group was using 2% less than the corresponding Control Group
in January and February. Once the project had begun, the test group widened the margin as their
consumption dropped to 4% less than the control group.
Table 5: Forced Air Group—Test vs. Control
Month | Test Group (avg. usage) | Control Group (avg. usage) | Usage Difference (Test vs. Control) | Percent Difference
Pre-Tendril Deployment
  January 2010 | 1,069 | 1,100 | -31 | -2.8%
  February 2010 | 989 | 1,014 | -24 | -2.4%
  Average | 1,029 | 1,057 | -28 | -2.6%
During Tendril Deployment
  March 2010 | 863 | 924 | -61 | -6.6%
  April 2010 | 777 | 833 | -56 | -6.7%
  May 2010 | 711 | 773 | -62 | -8.1%
  June 2010 | 721 | 673 | +48 | +7.1%
  Average | 768 | 801 | -33 | -4.1%
Test Group vs. Previous Year. When the Test Group’s 2010 usage during the test period was compared
with their usage during the same period in 2009,
• The Hot Water Test Group consumed 1.9% more in 2010
• The Forced Air Test Group consumed 2.5% less in 2010
For the entire participant group, there was no change in consumption from 2009 to 2010 during the
March-June test period.
Important: The data was adjusted to account for changes in the number of billing days. Additionally, both of these statistics fall within the margin of error from a practical standpoint. Contributing factors not taken into consideration, but potentially having significant impact, include price elasticity, occupant ages, number of household occupants, and weather. Further analysis would be required to determine why the test group's consumption increased over the previous year. One possible explanation is that gas prices, and hence electricity prices, were high in 2009. Consumers tend to increase and decrease electric consumption based on price. It may be that the test group consumed more in 2010 in response to the lower prices.
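One common way to adjust billed usage for differing billing-period lengths is to normalize each bill to average daily usage before comparing periods. The sketch below illustrates that idea only; it is not necessarily the exact adjustment procedure used in this study, and the kWh and billing-day figures shown are hypothetical.

def daily_usage(kwh: float, billing_days: int) -> float:
    """Average daily consumption for a billing period."""
    return kwh / billing_days

def year_over_year_change(kwh_2010: float, days_2010: int,
                          kwh_2009: float, days_2009: int) -> float:
    """Percent change in average daily usage, 2010 vs. 2009."""
    return (daily_usage(kwh_2010, days_2010) / daily_usage(kwh_2009, days_2009) - 1) * 100

# Hypothetical March bills: 30 billing days in 2010 vs. 33 in 2009
print(round(year_over_year_change(870, 30, 940, 33), 1))   # 1.8 (percent)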
2.1.2.2. Significant Results for Different Groups or Classes of Subjects
Note: The results in this section are for the entire participant group.
Chugach also compared the following variables against change in usage between March – June 2009
and March – June 2010:
• Home size
• Age of home
• Heat Type
• Heat Fuel
• Energy Efficiency Efforts Prior to Study
Home Size. The biggest drops in consumption were in homes of 3,001+ sq ft (2.5%) and homes of 1,501 – 2,000 sq ft (2.0%).
Table 6: Consumption Change by Home Size
Home Size % Change
(2010 vs. 2009)
Up to 1,500 sq ft +1.3%
1,501 – 2,000 sq ft -2.0%
2,001 – 2,500 sq ft +3.2%
2,501 – 3,000 sq ft -1.3%
3,001+ sq ft -2.5%
Age of Home. Homes built in the 1980s were the only group that used less electricity. Homes built in all
other decades increased consumption in 2010, with the oldest homes having the largest increases.
Table 7: Consumption Change by Year of Construction
Year of Construction % Change
(2010 vs. 2009)
1950s/1960s +5.1%
1970s +2.5%
1980s -5.1%
1990s +0.7%
2000s +0.8%
Heat Type. Regardless of heat type, the test homes consumed slightly more in 2010 than in 2009.
Table 8: Consumption Change by Heat Type
Heat Type % Change
(2010 vs. 2009)
Forced Air +0.4%
Hot Water +0.3%
Heat Fuel. Test households that heat with natural gas, comprising the vast majority of the total test group,
increased their consumption 0.3% in 2010. Households heated with electricity consumed 25.1% less in
2010 than in the same period the previous year.
Table 9: Consumption Change by Heat Fuel
Heat Fuel % Change
(2010 vs. 2009)
Natural Gas +0.3%
Electric -25.1%
Energy Efficiency Efforts Prior to Test. In the initial application, the following was posed as a yes or no
question: “I have already done just about everything I can to make my home energy efficient.” This
question was included on the assumption that households who answered “yes” were unlikely to realize
additional efficiencies. However, test participants who answered “yes” reduced their consumption 6.5%,
compared to a 1.7% increase in consumption among those who had not done all they could.
Table 10: Consumption Change by Previous Energy Efficiency Efforts
Previous Energy
Efficiency Efforts
% Change
(2010 vs. 2009)
No +1.7%
Yes -6.5%
2.1.2.3. Participant Feedback from Surveys
Through the three surveys during the study, the following key information was learned:
• Test participants were already engaging in energy efficient practices before the project began.
Most participants had already replaced at least some of their incandescent bulbs with CFLs and
usually or always turned off lights in unoccupied rooms. About three-quarters already ran their
washer, dryer, and dishwasher only with a full load.
• In the closing survey, test participants were asked what, if any, changes they had made as a
result of participating in the project. Almost two-thirds of the respondents (63.2%) reported
changes in their household energy use. Table 11 lists changes cited, in descending order of
frequency. Yet at least 85% of them had been engaging in energy efficient practices before the
project began. Did participation motivate them into stepping up their efforts even more? Or might
there be some overlap of past efforts and recent efforts?
Table 11: Reported Changes in Energy Efficiency Practices
Change Percent
Turn off lights in unoccupied rooms 79.3%
Completely turn off or unplug electronic devices not in use 65.5%
Run dishwasher, washing machine and dryer only when full 56.9%
Turn down thermostat in winter, when leave house 41.4%
“Other” 27.6%
Installed a programmable thermostat 24.1%
Turn down temperature of hot water heater 22.4%
• As a result of the project, participants became more knowledgeable about and interested in
household energy use and energy efficiency.
• Participants plan to purchase energy-efficient appliances in the future.
• Over the course of the test period, participants’ responses on several issues remained constant:
information that would help households be more efficient; barriers to becoming more efficient; and
what others would pay for a BEM.
• Participants believe the following types of information would help customers become more energy
efficient:
• Which appliances and equipment are least efficient
• Habit changes that wouldn’t impact their lifestyle
• Information about the most energy-efficient product models
• Participants considered the cost of new appliances and (in a distant second) the desire for
comfort at home as barriers to energy efficiency.
• Consumers are willing to make some behavior changes, especially changes that do not
inconvenience them or impact their comfort level.
Note: Refer to Appendix C for copies of the complete survey results.
2.1.3. Conclusions
Based on the results of this project, it is not completely clear whether BEMs result in reduced energy
consumption. Although test participants reported they did adopt more energy efficient practices, the hard
data is not conclusive.
As described in section 2.1.2.2 (Significant Results for Different Groups or Classes of Subjects), Chugach
compared 2009 and 2010 usage based on several characteristics of the buildings. Based on this data,
BEMs may be most effective for those individuals
• With electric heat. Test participants with electric heat consumed 25.1% less energy during the
test period compared to the same months the year before. They account for a very small
percentage of the total participants, so the impact of those savings was diluted in the aggregate.
A possible explanation is that households heated with electricity would be affected
disproportionately by mild temperatures.
• Who are interested in energy efficiency. One of the unexpected findings was that test
participants who had already taken steps to be more energy efficient in fact reduced their
consumption significantly more than others.
• With homes built in the 1980s. Homes built in the 1980s were the only energy savers; their
consumption dropped 5.1%. Test homes built in all other decades increased energy consumption
in the 2010 test period.
The size of home may also impact the effectiveness of BEMs, but these results were mixed, with the largest homes (more than 3,000 square feet) reducing the most, followed by homes of 1,501 to 2,000 square feet. However, there was essentially no difference in energy savings based on heating type, whether forced air or hot water.
Perceptions of inconvenience and quality of life impacts are probably significant barriers to energy
efficiency among the general population. This conclusion is based on the significant rating of these as
barriers among our highly motivated Watt Buster participants. If this is an issue for them, it is likely a
bigger issue for the general public.
In addition, greater efforts should be devoted to effective information and education about the energy use
of specific appliances and household behaviors. Also, a table of low-medium-high energy consumption for
households with multiple variables could provide customers a context for how they compare to similar
households.
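As a minimal sketch of how such a low/medium/high comparison might be derived (illustrative only; the tertile thresholds, sample values, and grouping approach are assumptions, not a method used in this project), households could be bucketed by the tertiles of usage among similar homes:

from statistics import quantiles

# Hypothetical monthly kWh for households similar to the customer (same size
# class and heat fuel); values are made up for illustration.
similar_household_kwh = [540, 610, 660, 700, 745, 790, 830, 905, 980, 1120]

# Tertile cut points split the similar households into thirds.
low_cut, high_cut = quantiles(similar_household_kwh, n=3)

def usage_band(kwh: float) -> str:
    """Classify a household's usage as low, medium, or high relative to peers."""
    if kwh <= low_cut:
        return "low"
    if kwh <= high_cut:
        return "medium"
    return "high"

print(usage_band(620))   # low
print(usage_band(800))   # medium
print(usage_band(1000))  # high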
Time-of-day pricing would likely be a significant incentive for energy efficiency. In addition, although
participants found the BEMs useful, under the status quo—with no time-of-day pricing—customers are
probably unwilling to pay the actual costs of BEMs.
2.2. Comparison/Assessment of Building Energy Monitors
The original plan included a single project that would both assess the effectiveness of BEMs and identify the most effective BEM features. However, when Chugach learned that the EnergyHub and OpenPeak BEMs
would not be available for the four-month research period, the project team concluded that the only
practical approach would be to recruit a smaller group of volunteers for the comparison and assessment
component of the project.
2.2.1. Methodology
On June 22, Chugach sent an email to all 96 Watt Buster participants, explaining the comparison
component and asking for volunteers (refer to Appendix A for a copy of this email). The only requirement
was that continuing volunteers had to have completed all three of the Watt Buster surveys. Chugach
accepted the first 10 participants who responded.
When the Tendril testing ended June 30, the 10 continuing test participants picked up the OpenPeak
system and were asked to use it for one month. Their actual usage ranged from three weeks to more than two months, and most of the participants kept it for two months or more. During this period, one volunteer
became frustrated and opted out of the remainder of the project. The plan was for the volunteers to return
the OpenPeak after one month and install the EnergyHub. However, testing by Chugach revealed
numerous problems with EnergyHub, which delayed delivery to the volunteers.
Participants returned their OpenPeak devices between August 7 and September 17. The EnergyHub
systems were picked up between September 9 and September 20. Chugach asked participants to use the EnergyHub for at least one week and to return it by September 24. Perhaps because of the delay,
only three of the remaining nine volunteers actually installed the EnergyHub.
Chugach then posted a comparison survey of the three BEM systems on SurveyMonkey and sent links to
the nine remaining participants on September 24. All nine participants completed the survey.
Note: Refer to Appendix C for a copy of the survey.
2.2.2. Devices Used
OpenPeak’s OpenFrame 7E consists of a base unit, an Ethernet cable, power adapter, and cleaning
cloth. The unit can be used with either a wireless or an Ethernet connection. Once programmed, the OpenPeak device receives kWh consumption information from the consumer’s electric meter. Applications, such as YouTube, games, and Google Maps, may be added to the base unit. OpenPeak had intended to solidify an agreement with Google so that consumers could use Google Power Meter to track their power consumption online. However, the agreement was not in place in time for Chugach testers to use this feature.
Note: Refer to Appendix B for instructions Chugach prepared to complement the OpenPeak Quick Start
Guide.
The EnergyHub System consists of a dashboard display, a power strip, three sockets, a power cord, a
user manual, and an installation checklist. The EnergyHub, like the Tendril, requires a wireless
connection. The power strip and sockets permit the consumer to monitor and track energy consumption
of specific appliances. (Tendril offers an add-on for appliance monitoring—Volt—which Chugach did not
purchase.) EnergyHub also promised the Google Power Meter option, but it did not materialize during the
project time frame.
2.2.3. Results
Through the survey, the following key information was learned:
• For many, EnergyHub was too complicated. Testers who were not intimidated liked it very much,
in particular the appliance monitoring feature. It was rated the most difficult to use.
• Tendril had several strengths over the others, such as the ability to compare current and past weeks and months, the ability to compare with other households, and the availability of a Web portal. It was rated the easiest to use and provided the most valuable information.
• The OpenPeak was simple to use, with a pleasing design, but it didn’t offer as many energy
efficiency features as the other two. It was rated easiest to install.
Table 12 summarizes what the respondents liked and disliked about the different systems.
Table 12: BEM Comparisons
Device: OpenPeak
Positives:
• Simple to connect and view
• Clock feature
• Looks contemporary
• Usage graph, smaller time increments
Negatives:
• Not as detailed in usage
• No thermostat control
• No ability to monitor individual appliances
• No web browser
• No remote access

Device: Tendril
Positives:
• Thermostat
• Excellent wireless connectivity
• Good base display
• Able to compare week to week
• Liked hourly reading
• Simplicity and small footprint
• Usage graph
• Liked the brief energy efficiency messages
• Output clear and understandable
Negatives:
• Didn’t like having three devices
• Son kept unplugging the transmitter
• Information compounded and not easy to analyze
• No information about specific appliances
• Looks old school
• Hard to see and read compared to others
• Want usage broken into smaller time increments
• Difficult to hook up and equipment was bad, had to be replaced
• Required the monitor to be near the receiver

Device: EnergyHub
Positives:
• Ability to monitor individual appliances (mentioned repeatedly)
• More user friendly
• Easy to operate
• Gives good information
• Monitor is nice, communicates with SmartMeter
Negatives:
• Overwhelming, too complicated
• Wireless connection spotty at best
• System would drop out for no reason
• Took a long time to install and couldn’t reach the appliances I was most interested in (water heater)
• Look and style
• A hassle; too many parts
Note: Refer to Appendix C for a copy of the complete survey results.
2.2.4. Conclusions
The technology of building energy monitors is still very much in flux. All three companies fell short on
some of their promises or over-stated the functions and capabilities of their systems. Chugach believes
the EnergyHub system has the most potential for value to both the customer and the utility, but its usefulness
also makes it more complex for the customer.
2.3. Use of Appliance Power Meters
Chugach supplemented the BEM research project by gathering information about the use of appliance
power meters, which is described in this section.
2.3.1. Methodology
In October 2009, Chugach began offering consumer power meters on loan to customers. Two different
meters are available: the Kill A Watt and the Watt's Up?. Both models allow the consumer to measure
power consumption of household appliances and determine the cost of power consumed. The appliance
is plugged into the meter, which is then plugged into an outlet. The consumer inputs the electric rate,
selects a cost projection period (e.g., hour, day, week, month, year), and selects a power measurement.
The Chugach customer may keep the appliance meter for up to two weeks. Customers borrowing the
device receive a briefing on how to use it and a copy of the operating manual.
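As a simple illustration of the cost projection these meters perform (a sketch only; the wattage, hours of use, and electric rate below are hypothetical, not Chugach rates), the calculation is the measured watts converted to kilowatts, multiplied by hours of use and the rate per kWh:

def projected_cost(watts: float, hours_per_day: float, rate_per_kwh: float,
                   days: int = 30) -> float:
    """Estimated cost of running an appliance over a projection period."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

# Hypothetical example: a 150 W appliance running 8 hours/day at $0.15/kWh
print(round(projected_cost(150, 8, 0.15), 2))   # 5.4 (dollars for the month)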
In April, Chugach publicized the availability of these meters in emails and in the Outlet. Beginning in April,
borrowers were asked to complete a brief survey when they returned the device. A Chugach customer
service representative called borrowers who neglected to return the survey. Between April 1 and
September 30, 114 Chugach members borrowed an appliance meter; 84 of them completed the survey,
in whole or in part.
Note: Refer to Appendix C for a copy of the survey.
2.3.2. Results
Through the survey, the following key information was gleaned:
• Two-thirds of the respondents borrowed the Kill A Watt device.
• The most frequently cited reasons for borrowing the meter included
• Lower their electric bill (73.1%)
• Desire to reduce energy use (67.9%)
• Curiosity (65.4%).
• Nearly 80% found the meter easy to use.
• Close to two-thirds (63.1%) had not made any changes based on what they learned, but about
the same amount (62.3%) did plan to make changes.
Note: Refer to Appendix C for a copy of the complete survey results.
2.3.3. Conclusions
Appliance power meters are useful tools that help customers become more aware of how various appliances use energy. Although using a meter did not usually result in immediate changes, the meters appear to motivate customers to plan energy efficiency changes in the future. Chugach should continue to loan out these power meters but needs to remind customers periodically that they are available. Interest in the meters spiked after an article appeared in the Outlet, but then dropped again.
2.4. Recommendations for the Future
Based on the information learned through the residential component of this research project, Chugach
recommends the following:
• Focus education efforts on painless habit changes. Consumers are willing to change their energy
consumption behavior if those changes do not inconvenience them or impact comfort level.
• Provide clear information comparing the energy consumption of specific appliances and models.
• Provide an easy way for consumers to compare their household energy consumption with other
similar households. Tendril’s comparison was not adequate since there was no meaningful
breakdown of household attributes, such as home size and number of household members.
• Consider proposing time-of-day pricing to encourage energy efficiency at peak times.
• Design and conduct a longer term project using just the EnergyHub. Although it is more
complicated to install, it has tremendous potential value to both customers and the utility.
• Continue to loan appliance power meters to customers, reminding them periodically that they are
available. The meters are useful tools that help customers become more aware of how various
appliances use energy. Although using a meter did not often result in immediate changes, the meters appear to motivate customers to plan energy efficiency changes in the future.
3. Commercial Component
The commercial portion of the BEM study consisted of BuildingAdvice energy assessments of 32
commercial buildings: 21 from Chugach’s service area and 11 from Municipal Light & Power’s (MLP)
service area. Unlike the BEMs used in the residential portion of the study, which allowed occupants to
monitor electric usage in near real-time over a multi-month period, the commercial energy assessments
modeled a building’s total energy use (based on one week of monitoring data and background building
data) and generated a report with recommendations for energy savings. The project team then reviewed
the report’s findings and recommendations with each participant. Following the report, test participants
completed a survey about changes they made or plan to make because of what they learned.
3.1. BEM Used
For the commercial component, the project team used the BuildingAdvice system (www.airadvice.com).
This modeling system assesses how a building uses energy based on the following data:
• Real-time measurements of temperature, humidity, carbon dioxide, and lighting from monitors placed throughout the building
• Background data
  • Utility bills
  • Building information (e.g., construction type and usage profile)
  • HVAC system information (e.g., system type and efficiency, control systems type, schedules, and set points)
• Weather data
Using this data, the BuildingAdvice software models how the building uses energy and generates an
energy savings assessment that includes information about current use, estimated potential savings, and
general recommendations on improvements.
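The details of the BuildingAdvice model are proprietary, but weather-normalized modeling of this general kind is often illustrated with a simple regression of monthly energy use against heating degree days. The sketch below is purely illustrative and is not the BuildingAdvice algorithm; all figures are made up.

# Simple least-squares fit of monthly kWh against heating degree days (HDD):
# usage is approximated as baseload + heating_slope * HDD. Illustrative only.
hdd = [1500, 1300, 1000, 600, 300, 100, 50, 150, 500, 900, 1200, 1450]   # per month
kwh = [1180, 1100, 980, 820, 700, 620, 600, 640, 760, 930, 1050, 1160]

n = len(hdd)
mean_x = sum(hdd) / n
mean_y = sum(kwh) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(hdd, kwh)) / \
        sum((x - mean_x) ** 2 for x in hdd)
baseload = mean_y - slope * mean_x

print(f"baseload {baseload:.0f} kWh/month; slope {slope:.2f} kWh per degree day")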
Each of these BuildingAdvice elements is described in more detail in the following sections.
3.1.1. Monitors
A BuildingAdvice system includes 10 monitors—each approximately 6”x5”x3” in size—and a gateway. The monitors are placed throughout the building and measure ambient temperature, relative humidity, ambient light, and carbon dioxide¹ in 2-minute increments. Each monitor has a power supply and uses the Zigbee™ wireless mesh protocol (IEEE 802.15.4) to communicate readings to a central Communication Gateway. This Gateway transfers the monitor readings to the AirAdvice data center using cellular (GSM) communications. In addition, to prevent data loss if wireless communication is not available, the monitoring system can also store up to one month of data.
At the beginning of the study, the project team had one BuildingAdvice system available. However, to complete all the needed monitoring within the study period, the team added a second set of monitors at the end of April. Although this second set measures the same data, it is not a BuildingAdvice system: the monitors record the data but do not transmit it to a data center. Instead, the Control Contractors representative manually downloaded the data after each deployment.
¹ Only 5 of the 10 monitors record carbon dioxide.
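As a rough check on the one-month local storage capability described above (a back-of-the-envelope sketch; the 30-day month is our assumption), the number of readings a single monitor accumulates at a 2-minute cadence is:

minutes_per_day = 24 * 60
readings_per_day = minutes_per_day // 2          # one reading every 2 minutes
readings_per_month = readings_per_day * 30       # assuming a 30-day month

print(readings_per_day)    # 720
print(readings_per_month)  # 21600 readings per sensor channel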
3.1.2. Background Data
In addition to the monitor data, the BuildingAdvice modeling software also incorporates background data
about the building. For this study, the following building information was gathered for each building:
Table 13: Commercial Background Data Gathered
Category Information
Utility billing history • Electric usage and demand (kWh/kW and total cost) for 12 months
• Fuel usage and cost for 12 months
Building information
• Building usage
• Square footage (total, gross floor area for various areas)
• Number of stories
• Construction type
• Weekly operating hours
• Number of workers on main shift
• Number of PCs
• Offices heated (percentage)
• Offices cooled (percentage)
HVAC system information
• Heating system type
• Domestic hot water type
• Economizer (yes/no)
• Cooling system type
• Humidification system (yes/no)
• Demand control ventilation system (yes/no)
• Dehumidification system (yes/no)
• Temperature control schedule (time, heating set point, cooling set point,
whether occupied)
Controls system information
• Controls type
• Schedule type
• Air delivery method
• Air-flow modulation
In addition, the BuildingAdvice model uses weather data (for the monitoring period and for the period of the utility history provided) from the weather station nearest the building’s zip code. The model also incorporates ENERGY STAR data for comparable buildings².
² Because Alaska-specific data is limited, it is assumed that these comparable buildings include many from outside Alaska.
3.1.3. Energy Savings Assessment Report
Using the monitor and background data, the BuildingAdvice software generates an assessment report for the building³. The report includes the sections described below; refer to Appendix G for copies of each report.
• Executive Summary: The building’s calculated energy use (in kBtu/sq ft), ENERGY STAR rating, cost (in dollars/sq ft), and carbon footprint. In addition, the building is compared against other similar buildings in the ENERGY STAR database. At the bottom of the summary, the report provides estimated potential savings from raising the building’s ENERGY STAR rating. All of these values are calculated using the utility billing history and the ENERGY STAR Portfolio Manager⁴.
• How Does Your Building Use Electricity Today?: The building’s historical electricity usage (and demand, if the data was provided) compared to outside temperatures.
• How Does Your Building Use Fuel Today?: The building’s historical gas usage compared to outside temperatures.
• Building Comfort and Ventilation Analysis: Summary of the ambient temperature, humidity, and carbon dioxide monitoring results.
• Temperature: Detailed information about the ambient temperature readings from the monitors, including the temperature spread during occupied times and the percentage of time the temperature was outside the desired range. The bottom of the page includes a graph of the readings from each monitor, which helps identify changes in temperature and when those changes occurred. The page also provides temperature-related recommendations for saving energy.
• Relative Humidity⁵: Detailed information about the humidity readings from the monitors, as well as recommendations for improvement and a graph of the meter readings.
• Carbon Dioxide: Detailed information about the carbon dioxide readings from the monitors, including recommendations and a graph of readings. The carbon dioxide readings help identify how much fresh air is being brought into the building: the more fresh air, the lower the carbon dioxide. During times with cold or hot outside temperatures, using less outside air reduces energy usage because the HVAC systems are heating or cooling less air. However, during moderate temperatures, bringing in outside air can reduce the use of the HVAC systems.
• Lighting: Detailed information about ambient light readings from the monitors. As with the other detailed pages, the lighting page includes a graph of the monitor readings and recommendations on energy savings opportunities.
• Outdoor Conditions: Temperature and dew point during the monitoring period.
• Building and Monitor Placement Information, Building Description, Building Controls Information, and Building Utility Information: Lists of the data provided by the participant, included in the report as a means of validating the report inputs.
³ Although one set of monitors used was not a BuildingAdvice system, the same software was used to generate the assessment report.
⁴ Additional information about Portfolio Manager is available at http://www.energystar.gov/index.cfm?c=evaluate_performance.bus_portfoliomanager
⁵ Humidity is a component of air quality, not energy efficiency. Also, in Alaska, it is normal to see very low humidity readings.
3.2. Methodology
The following sections describe the methodology used for the commercial component.
3.2.1. Customer Recruitment
For the study, the project team needed 30 commercial customers, including both small and large
commercial customers. To recruit the needed commercial customers, representatives from Chugach and
MLP selected a pool of prospective commercial participants based on their knowledge of the customers.
The following aspects were considered when selecting the customers:
• Building square footage (minimum of 10,000 sq. ft.)
• Industry/Building Use
• Demonstrated level of interest in energy efficiency
• Past responsiveness to the utility
• Utility service area
After identifying a potential pool, the utility representatives contacted a building representative via email,
phone, or fax with information about the project. Initially, the utilities contacted potential participants one
at a time, as the project was ready for a new participant. However, because this approach exacerbated
scheduling challenges, in early April Chugach sent an email (refer to Appendix E for a copy) to the
remaining pool of approximately 50 potential participants, telling them about the study and asking them to
participate.
After this email was sent, approximately 15 customers contacted Chugach, indicating they were
interested in participating in the study. To fill the remaining slots (as well as two other openings when
participants opted out after the process began), the utilities identified other potential participants and
contacted them directly until they found a total of 30 participants. All participants were confirmed by the
end of May.
Note: Although Chugach planned to include only 30 buildings in the project, changes over time resulted in the project team completing assessments of 32 buildings.
The final participant list included the following building usage/size:
Table 14: Commercial Participants
Square Footage Building Usage
Grocery Office School Retail Warehouse Hotel Total
10,000-29,999 1 10 1 1 13
30,000-49,999 6 6
50,000-69,999 3 2 5
70,000-89,999 2 2
90,000-199,999 2 2 4
200,000+ 2 2
3.2.2. Initial Assessment Meeting
After a commercial customer indicated interest in participating in the study, Chugach scheduled a meeting
with the customer. The meeting was attended by the following people:
• Chugach representative
• MLP representative (if the customer was in the MLP service area)
• Control Contractors representative
• Customer representative(s) in the following roles
• Management representative (e.g., COO)
• Facility manager
• Facility maintenance personnel
Note: For some customers, the roles of management representative, facility manager, and facility
maintenance personnel were filled by one person; for other customers, these roles were filled by
multiple people.
This initial assessment meeting typically lasted between 45 minutes and 2.5 hours, depending on the size
of the building, complexity of the systems, age of the systems, and questions from the occupants.
Twenty-nine of the assessment meetings were completed by mid-June; the last assessment meeting was
combined with the monitor deployment because of the building’s location.
3.2.2.1. Confirming Study Participation
During this initial meeting at the customer’s location, the utility and Control Contractors representatives
explained the study and assessment process in greater detail, including the customer’s responsibilities:
• Provide building and historical energy usage data
• Assist the project team in identifying locations to place monitors
• Have a representative available when monitors are deployed and picked up from the building (10
monitors in place for 1 week)
• Meet with the project team to review the report and recommendations
• Complete a brief follow-up survey
After providing this introductory information, the commercial customer could decide not to participate in
the study.6
3.2.2.2. Gathering Background Information
If the customer committed to participating in the study, the project team gathered basic building
information that the participant representatives would know without research, such as building square
footage. After the preliminary information was gathered, the project team walked through the participant’s
building with facility representatives to record additional information about the building, such as
specifications on HVAC systems.
If all of the needed information was not readily available, the project team requested that the participant
research the information. In most cases, the utility representatives assisted in gathering utility usage by
pulling the data from the utility’s databases.
6 Only one customer opted to not participate after this meeting.
3.2.2.3. Identifying Monitor Locations
During the walk-through, the project team also identified the best locations to place the monitors. When
identifying these locations, the project team particularly looked for the following:
• Populated areas through which people walk (e.g., reception areas)
• Areas in each of the building’s four corners
• Areas about which the facility manager receives the most complaints (e.g., too hot or too cold)
In addition, the project team tried to ensure that the locations were in areas that would be used during the
week the monitors were in place (e.g., verify that the person in an office was not going to be on vacation
during the deployment week).
3.2.2.4. Scheduling Deployments
The project team also worked with the participant to schedule a time for the monitors to be deployed.
Initially, the team allowed participants to select dates that worked best for them. However, this approach
was difficult to manage because of the number of schedules to be coordinated; such ad hoc scheduling
also did not always use the monitors efficiently (e.g., there were gaps between deployments).
For deployments beginning at the end of April, the project team created a deployment schedule with
pre-defined slots from which the participants could select. Each week included two slots: one from Tuesday
afternoon to the following Tuesday morning and one from Wednesday afternoon to the following
Wednesday morning.
3.2.3. On-site Monitoring
On the scheduled deployment date, the project team brought the monitors to the location at the pre-
arranged time (typically 2:00 PM). The team then placed the monitors in the pre-selected areas
throughout the building. When a monitor was placed in an area, the team described to occupants the
purpose of the monitors and requested that they not unplug or move the monitor. In addition, the team left
documentation with each monitor to explain its purpose and request that it not be moved or unplugged.
In spite of these precautions, a few monitors were unplugged, either for short periods (e.g., while
janitors were vacuuming the space) or for extended periods (e.g., left unplugged for the remainder of
the week). When a monitor was unplugged, it did not record any data for that period; if the no-data
period was long enough, the project team excluded that monitor's results from the report.
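A minimal sketch of how such an exclusion rule might be applied appears below. The 15-minute sampling interval and the 25% missing-data threshold are assumptions for illustration; the report does not state the project team's actual criteria.

# Minimal sketch (assumed thresholds, not the project's actual criteria): decide whether
# a monitor is missing too much of its one-week deployment data to be included in the report.
from datetime import timedelta

MAX_MISSING_FRACTION = 0.25  # hypothetical cutoff for "long enough" no-data periods

def should_exclude(readings, expected_count):
    """Return True if the monitor logged too few samples over the deployment week."""
    missing_fraction = 1 - len(readings) / expected_count
    return missing_fraction > MAX_MISSING_FRACTION

# Assuming a 15-minute sampling interval, a week-long deployment yields 672 samples.
expected = int(timedelta(days=7) / timedelta(minutes=15))
print(should_exclude([0] * 600, expected))  # False: ~11% missing, keep the monitor
print(should_exclude([0] * 300, expected))  # True: ~55% missing, exclude from the report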
At the end of the monitoring week, the Control Contractors representative returned to the location and
gathered all the monitors. If necessary, he downloaded the data from the previous week and reset the
monitors before deploying them at the next location.
All on-site monitoring was completed by the end of July.
3.2.4. Assessment Reports
Before preparing the energy assessment report, the project team ensured all data was available and
worked with the participant as needed to gather the data. Once all the data was available, Control
Contractors entered the data in the BuildingAdvice software and generated the energy savings
assessment.
3.2.4.1. Reviewing Reports
Before giving the report to the participant, the project team reviewed the report for anomalies,
investigated any that were found, and made adjustments as needed. For example, in some cases the
project team determined that data provided by participants was incomplete or inaccurate (especially
historical usage data). In these cases, the project team worked with the participants to correct the data
and reran the reports.
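As an illustration of this kind of screening, the sketch below flags months of historical usage data that are missing or implausible. The median-based check and the 50% tolerance are assumptions for illustration, not the logic actually used to generate the assessments.

# Minimal sketch (assumed logic): flag months of historical utility usage that look incomplete
# or inaccurate so they can be confirmed with the participant before rerunning a report.

def flag_anomalies(monthly_kwh, tolerance=0.5):
    """Return months whose usage is missing, non-positive, or far from the median month."""
    values = sorted(v for v in monthly_kwh.values() if v and v > 0)
    if not values:
        return list(monthly_kwh)
    median = values[len(values) // 2]
    flagged = []
    for month, kwh in monthly_kwh.items():
        if not kwh or kwh <= 0 or abs(kwh - median) / median > tolerance:
            flagged.append(month)
    return flagged

# Example: a month entered as zero and an implausibly high month are both flagged.
usage = {"Jan": 42_000, "Feb": 40_500, "Mar": 0, "Apr": 39_800, "May": 120_000}
print(flag_anomalies(usage))  # ['Mar', 'May']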
3.2.4.2. Reporting Findings
Once the assessment report was ready, the utility representatives scheduled a review meeting with the
participant. This meeting, which typically lasted from 1 to 3 hours, was attended by the project team (i.e.,
utility representatives and a Control Contractors representative) and participant representatives.
The project team encouraged participants to have someone with decision-making authority—such as the
facility manager, CEO, or COO—attend the meeting. For smaller buildings, usually the owner and a
facility representative attended. In addition, sometimes participants included external parties, such as the
company managing the building control systems, the property manager, or the building’s administrative
manager.
During the meeting, the project team explained the report and its findings, reviewing and explaining each
page with the participants. Although meeting attendees usually understood some aspects of the report
(e.g., lighting), other sections—such as control systems—required additional explanation. For each area
within the report, the project team also identified recommendations on areas the participants could
investigate further.
While discussing the findings, the participants were often able to identify activities in their buildings
that correlated with the data, such as changing out boilers to reduce gas usage.
3.2.5. Surveys
After the assessment review meeting, the project team gave participants a follow-up survey to assess the
usefulness of the report and whether they planned to implement any of the energy efficiency
recommendations (refer to Appendix H for a sample of the survey).
The project team posted the survey on SurveyMonkey and initially sent emails to participants shortly after
the assessment review meeting. However, because of a low response rate, the project team began also
giving participants a hard-copy version of the survey at the end of the assessment review meeting and
personally contacting participants to remind them to complete the survey. Although participants were very
slow to complete the survey, requiring multiple follow-up contacts by the project team, all participants did
complete a survey before the end of the research project.
3.2.6. Customer Education
As alluded to throughout this Methodology section, customer education in the commercial portion of the
study was very individualized. Utility and Control Contractors representatives met with each participant at
least twice. During these meetings, the energy assessment program was described in detail, including
information about elements such as ENERGY STAR, potential energy savings, and factors affecting
energy efficiency. In addition, the project team was able to discuss specifics about the participant’s
building and systems and answer participants’ questions.
3.3. Results
The majority of facilities took advantage of the BEM project to assess their current energy efficiency
and to plan future efforts. The few buildings with high ENERGY STAR ratings used the assessment to
validate measures they had already taken. The survey responses indicate participants are very receptive
to recommendations for simple, low-cost measures that can be taken before, or instead of, major
changes to the facility.
Participants whose buildings scored well in ENERGY STAR ratings (75-100) were aware that with small
changes they could achieve ENERGY STAR designation. At least two locations plan to pursue the
changes needed to achieve that designation.
The assessments met participants’ expectations, and they were satisfied with the results and
recommendations. A strong majority (80%) of the participants plan to pursue one or more of the
recommendations. Their decisions about whether to implement recommendations are driven largely by
the potential energy savings and cost.
Information deemed most helpful to the participants centered on the following themes:
• How customers compare with others
• Where the savings potential is
• Where minimal effort will yield significant savings
• Distinctions between gas and electric use
The vast majority (29 of the 30) would recommend the assessment to others. It should be noted,
however, that no reference was made to the actual cost of conducting an assessment, which was free of
charge to the test participants.
Note: Refer to Appendix F for a summary of key findings from each location; Appendix G includes
copies of the full assessment reports, each of which includes findings and recommendations for
that location. Refer to Appendix H for results from the survey.
3.3.1. Additional Research
In part to complement the commercial component, Chugach commissioned Ivan Moore Associates to
conduct a phone survey of Chugach’s largest commercial customers. One hundred twenty-one customers
responded to the survey. It is important to note that unlike the Watt Buster research, these participants
were not self-selected; they were called at random from lists provided by Chugach. Key results of this
survey include the following:
• Taken as a whole, respondents believe their energy consumption has increased over the past
three years (by a mean of 9.5%). However, consumption has actually declined 3% per year,
which compounds to a decrease of roughly 9% over three years. This disconnect between
perception and actual usage will be an important element in future education efforts.
• Commercial customers (77.9% of respondents) are taking steps to become more energy efficient.
The most common steps were installing energy efficient lighting (92.6%), increasing employee
awareness (85.3%), and streamlining operations and scheduling (56.8%).
• Of the respondents, 70.5% have done all they plan to do to improve energy efficiency. The 29.5%
who do plan to make energy efficiency improvements are most influenced by energy prices and
by improvements that would impact their bottom line.
Note: Refer to Appendix H for a copy of the survey and results.
3.4. Conclusions
Based on the results of the BuildingAdvice project and the commercial survey, the following conclusions
can be drawn:
• How decisions are influenced. While the terms used may vary, commercial customers are
clearly driven by financial considerations: return on investment, capital outlay required, and
impacts to their bottom line.
• Low-hanging fruit. Many of the easiest and least expensive measures are not obvious to owners
and property managers. Chugach could help commercial customers make significant
improvements by identifying the low-hanging fruit of simple, low-cost, and no-cost changes.
Lighting retrofits, for example, are relatively inexpensive and yield significant savings.
• Interest in energy efficiency. Many commercial customers are taking steps to become more
energy efficient. Yet the phone survey indicates 70% have done all they plan to do. Is this an
immoveable roadblock—or a reflection of customers’ lack of awareness? Chugach suspects it is
the latter. An energy assessment, or similar process, would probably motivate customers to make
additional energy efficiency improvements.
• Comparisons with others. Most commercial customers are aware of ENERGY STAR ratings
and are generally interested in how they compare to similar facilities.
3.5. Recommendations for the Future
Based on the information learned through the commercial component of this research project, Chugach
recommends the following:
• Develop an information and education program including but not limited to the following
components:
• Real-life examples of specific energy efficiency improvements and the money saved
• Publicity for energy-efficient role models, i.e., commercial customers who have made significant
improvements
• Examples from the research project as pointers for others
• Development and publicity of resources to help commercial customers become more
energy efficient
• Work with Enstar to develop a coordinated approach to commercial customers
• Take maximum advantage of Chugach’s ENERGY STAR partnership to help commercial
customers achieve ENERGY STAR ratings
• Offer historical use profiles to reduce the disconnect between perceived and actual use (if this is
already available, it could be more heavily marketed)
• Continue to monitor BEM technology and consider repeating the research project in middle
schools of the Anchorage School District.