Maximizing the Value of Data Exchange (Part 1): Laying the Foundation
Utilities and regulators often perceive tools that help customers share their energy usage with service providers as “nice to haves” rather than critical for day-to-day customer needs. In the first post of this three-part series, I break down how data access is not just cost-effective but essential to energy affordability.
Load growth, aging grid assets, and supply chain constraints are creating hot spots where energy bills are rising, and customers are taking notice. Citing higher energy costs as well as impacts from power outages, J.D. Power recently reported that residential and commercial customer satisfaction with electric utilities is becoming strained.
Dozens of public utility commissions are now taking a closer look at what tools they have to help manage the impending costs. In Massachusetts, the Department of Public Utilities opened a proceeding to identify changes to rate structures, which could include cost caps. The District of Columbia Public Service Commission is seeking comments on whether it has the authority to consider affordability as a factor in rates.
Commissions are also exploring how load flexibility can reduce system costs, particularly focusing on virtual power plants (VPPs) as a tool to reduce or shift load to avoid the need for new power supply and other infrastructure. For example, the Maryland Public Service Commission recently directed electric utilities to be more ambitious with the size of their virtual power plant pilots, given affordability and resource adequacy pressures.
When energy costs increase, customers redouble their efforts to manage their bills. They replace old equipment and invest in distributed energy resources to stabilize their bills and avoid losing food or medicine during outages. VPPs and time-varying rate programs encourage these actions, enabling customers to contribute to peak reductions. VPPs hint at a world where people can save money on their own bills while making their neighbors’ bills cheaper too.
But for these actions to be successful, customers need information. Specifically, they need to get information about how and when they use power to the analysts and contractors who can help them conserve or shift their usage. Unfortunately, throughout much of the country, this process remains clunky and time-consuming. This means that we have a massive opportunity to unlock affordability for utility customers just by doing the boring work of cleaning up data transactions. This is an issue that tends to get buried in sprawling regulatory proceedings.
This post makes four assertions:
- To manage their energy, customers have to jump through hoops that waste time and money.
- Utilities and customers can save money by deploying data exchange platforms that serve many different data uses and users at once.
- Data exchange platforms can be deployed faster and cheaper than ever.
- Data exchange platforms punch above their weight because of the utility integration work they require, not in spite of it.
In the sections that follow, we’ll quantify the problem and discuss how best to solve it using policy tools that commissions already have.
To manage their energy, customers have to jump through hoops that waste time and money.
Say I am a homeowner who wants bids from three contractors to upgrade my gas furnace to an electric heat pump. I’ve spent some time figuring out how to use my utility’s web portal, so I can find and download the 12 months of bills I need to send each contractor fairly easily. It takes me about 15 minutes. Each contractor who receives the bills has to transcribe my usage data and check it for accuracy so they can calculate what size heat pump to recommend. Say it takes each of them about 45 minutes to enter and check that data. Collectively, we’ve spent 2.5 hours processing basic data that comes from utility bills.
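That 2.5-hour figure becomes the per-project unit cost for everything that follows, so it’s worth writing out. A quick sketch, using only the estimates from the paragraph above:

```python
# Figures from the heat pump bid example above (all are rough estimates).
homeowner_minutes = 15     # homeowner finds and downloads 12 months of bills
contractor_minutes = 45    # each contractor re-enters and checks the data
contractors = 3

total_minutes = homeowner_minutes + contractors * contractor_minutes
print(total_minutes / 60)  # 2.5 hours of collective effort per project
```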
What are customers, and more often their contractors, spending this time doing? What they’re NOT doing is giving up on getting the data they need. Instead, they’re engaging in repetitive, inefficient, error-prone work:
- Instructing utility customers on how to log into their web portal and download 12-24 months of historical bill documents, and sometimes sitting on the phone with customers as they do this.
- Digging up their utility web portal username and password and emailing it to their preferred contractors, so they don’t have to log in and download paperwork themselves.
- Building web scrapers to collect customer log-ins and pull data from utility web portals, and then updating web scrapers when utility web portals break, get replaced, or add new log-in steps.
- Emailing or faxing signed letters of authorization to utility staff, sometimes breaking them into groups of 5 or 10 attachments to prevent requests from being rejected. Then, cross-checking data sent by utility staff by email to confirm it’s associated with the correct utility customer, since meter transposition errors can occur on either end.
- Copying and pasting data from emailed PDF bills or manually entering data from mailed bills or non-searchable PDFs.
- Interpreting and cleaning data points–figuring out what various bill line items mean (riders, program names), converting data to different units, filling in estimates if data is incomplete.
Now extrapolate this level of effort to a VPP, whose power comes from its ability to scale participants and capacity. According to the Brattle Group, a test of the Demand Side Grid Support program found that about 100,000 residential behind-the-meter battery systems could deliver a 1.9% reduction in CAISO’s net peak demand and contribute between $28 million and $206 million in net benefits over three years. If 100,000 customers and their prospective battery vendors spent 2.5 hours processing utility data to get project bids and enroll in the program, we would collectively lose over 28 years just to boring, basic data processing. (I had to triple-check that number because it’s so preposterous.)
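You can triple-check it yourself. Scaling the 2.5-hour per-project estimate to the 100,000-participant scenario:

```python
# Scaling the per-project estimate to the Brattle DSGS scenario.
participants = 100_000   # residential behind-the-meter battery systems
hours_each = 2.5         # data processing per enrollment (from the bid example)

total_hours = participants * hours_each          # 250,000 hours
calendar_years = total_hours / (24 * 365)
print(round(calendar_years, 1))  # ≈ 28.5 calendar years of collective time
```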
The costs of doing this work get passed on to the customer through other means, and their disproportionate effect on residents and small businesses is one of the reasons Wood Mackenzie cited for why VPPs are having a hard time scaling to that audience despite their market potential.
And that’s just residential customers scaling for one state’s program. The costs of utility bill management are massive for business customers, individually and collectively: large retailers, small businesses, manufacturers, local governments, universities, military bases. In a recent study for New Hampshire, Dunsky Climate + Energy estimated that streamlining utility data-sharing could save these customers about 6 hours per year in energy tracking and 4 hours per building per year in ongoing energy monitoring, at around $250 per hour. In my home state of Florida, FP&L has around 700,000 commercial customers. If commercial utility customers spend only 1 hour per year on bill management on average, at a cost of $150/hour, that is equivalent to $105 million for FP&L businesses alone, a cost they bear on top of their utility bill payments.
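The Florida figure is simple multiplication, but worth writing out since both inputs are deliberately conservative assumptions rather than measured values:

```python
# Back-of-envelope cost for FP&L's commercial customers, using the
# assumed inputs from the text (1 hour/year at $150/hour).
customers = 700_000      # approximate FP&L commercial customer count
hours_per_year = 1       # conservative bill-management time per customer
hourly_rate = 150        # assumed blended $/hour for staff time

annual_cost = customers * hours_per_year * hourly_rate
print(annual_cost)  # 105000000, i.e. $105 million per year
```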
In my experience, the figures put forward by Dunsky are extremely conservative on this point, particularly for entities like local governments. Cities might have entire staff or consulting contracts dedicated to processing and cleaning utility bill data for hundreds of distinct buildings (the number of bills can be voluminous once street lighting gets involved). They might have to dedicate staff to regulatory engagement or utility partnerships to try to obtain this data more smoothly. Working for a small city is how I got my start in the niche field of utility data access, but New York City, with its 7,500 premises, is one of the most vocal champions of improving utility billing.
These costs are big, and they’re borne unequally. Walmart has been a huge advocate for streamlined data access so it can manage its budgets and bills. Large companies can fund staff out of the savings from better bill management; compare that to GRID Alternatives, a nonprofit that wants easier access to utility data so it can better meet its mission of providing low- or no-cost rooftop solar to low-income residents.
Additionally, let’s not forget that utility staff are the ones dealing with these outside data requests. According to the IT Panel witnesses in a recent rate case filing by ConEd, the time for customer service representatives to manually pull data just once for each of the 70,000 customers who authorized data-sharing to third parties was equivalent to $1.4 million in operations costs. ConEd avoided these costs through software investments and added that “the potential cost savings achieved through continuous automated API access may be many multiples of that $1.4 million estimate.” Indeed, it would cost around $511 million annually in operations costs to have customer service representatives manually transfer interval data for those customers on a daily basis, which could be the norm for Order 2222 compliance in some markets.
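To see where that $511 million figure comes from, here is a quick reconstruction from the two numbers in ConEd’s filing, a sketch that assumes the per-pull cost stays flat at scale:

```python
# Reconstructing ConEd's numbers: $1.4M for one manual pull per customer
# implies roughly $20 per pull (about 30 minutes of CSR time).
customers = 70_000
one_time_cost = 1_400_000
cost_per_pull = one_time_cost / customers   # $20.00 per request

# If each customer's interval data had to be pulled manually every day:
annual_cost = customers * 365 * cost_per_pull
print(int(annual_cost))  # 511000000, i.e. about $511 million per year
```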
What’s more, ConEd’s estimate of 30 minutes per data request might have been conservative. Managing manual letters of authorization (here’s an example from Colorado) requires utility staff time to review documents, verify customers, pull and reconcile data from different systems, and double-check the data to prevent privacy problems like releasing the wrong customer’s information. On top of that, staff are (hopefully) maintaining a system to track authorizations, so it’s possible to know when an authorization ends (usually 2-3 years in the future) or to process a revocation request if one comes in earlier.
Across utilities, customers, and customers’ vendors, this is years of analyst time and hundreds of millions of dollars to process basic billing and usage information.
This brings me to my second point, explored in my next post: how data exchange platforms are in fact critical tools for reducing operational costs.