Modeling Energy Strategies for the Cable Industry

by quantellia 14. June 2013 16:11
Cable operators, like many companies, are facing the prospect of steep increases in the cost of energy in the coming years. In response, they are looking at alternative energy sources. However, the transition to this new world carries hidden dangers, so an evidence-based modeling approach can make a big difference.
 
Energy in the United States is changing: the cost of on-grid, coal- and petroleum-generated energy is expected to rise rapidly, while the availability of renewable and distributed power sources such as solar, wind, and hydrodynamic energy is increasing. These changes are altering the landscape for energy-intensive industries across the world. Among them: U.S. cable operators, who face a sea change in the profile of energy costs, as subscribers grow ever more hungry for video and bandwidth at the same time that the cost of power to support a typical cable operator's plant is expected to grow steeply.
 
Cable operators have an opportunity to “get ahead of the curve” through structured decision analysis of choices about energy usage. Quantellia’s Mark Zangari and Tim McElgunn, Chief Analyst of Bloomberg BNA’s Broadband Advisory Services, presented on this topic at the Society of Cable Telecommunications Engineers (SCTE) Smart Energy Management Initiative (SEMI) forum earlier this year in Atlanta. Through a combination of data provided by Bloomberg BNA and modeling provided by Quantellia, Zangari and McElgunn were able to demystify some of the important factors involved in this complex decision-making process. McElgunn also authored an accompanying analysis report, which is available to Bloomberg BNA subscribers.
 
Says Bloomberg BNA’s McElgunn, "Within the next five years, the cost of energy to power the U.S. cable industry will become the single largest network cost component.” As shown in the accompanying video and as we’ll discuss below, operators who make good decisions stand to benefit substantially.
 

Background – The Energy Cost of Success for Broadband Providers

As explained by Zangari and McElgunn, in recent years, U.S. cable operators have steadily expanded their service offerings to both consumers and businesses in such a way that energy demand and associated costs will continue to increase. Bloomberg BNA research shows, for example, that over three quarters of U.S. internet service customers subscribe to “bundles” of two or more services, which may include pay TV, home phone, and wireless service in addition to high-speed data.
 
Cable operators are responding to this increasing demand by building new data centers and expanding existing ones. These data centers contain equipment with ever-lower energy footprints, in some cases as low as 3 watts per server blade. However, the total growth in energy demand exceeds the savings from these technology improvements.
 
For these reasons, cable operators face difficult decisions in a complex environment. Where should facilities be located? How should existing local and/or regional infrastructure be leveraged? How should cooling and heating be planned? Should operators consider alternative energy sources to utility-delivered power?
 
Says McElgunn, “The U.S. cable industry has effectively transitioned to an all-digital video plant, and is preparing for an eventual transition to an all-IP network.” A digital network is less costly to operate than an analog one; however, HDTV affects this trend in important ways. “By the end of 2014, essentially all new television sales in the U.S. will be HDTV-compatible, increasing the number of HD-fed sets per home,” says McElgunn. “Bloomberg BNA estimates that approximately 40 million U.S. cable households will subscribe to HD tiers in 2017, with 1.5 HDTVs connected per home. A large number of these HDTV sets will be displaying HD linear channels, as well as video-on-demand (VOD).” A single HD video stream consumes approximately 8 Mbps, and multiple such streams are delivered simultaneously to essentially every video household for some portion of each day. “This leads to massive demands on the network,” says McElgunn. “This will impact both the distribution and access plant and the resources required for content generation and formatting.”
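To give a sense of the scale these figures imply, here is a back-of-envelope calculation using the numbers quoted above. The peak-concurrency factor is our own illustrative assumption, not a Bloomberg BNA estimate.

```python
# Back-of-envelope estimate of aggregate HD video demand using the figures
# quoted above. PEAK_CONCURRENCY is an illustrative assumption only.

HD_HOUSEHOLDS = 40_000_000   # projected U.S. cable HD-tier households, 2017
SETS_PER_HOME = 1.5          # HDTVs connected per subscribing household
MBPS_PER_STREAM = 8          # approximate bitrate of one HD video stream
PEAK_CONCURRENCY = 0.5       # assumed fraction of sets streaming at peak

peak_streams = HD_HOUSEHOLDS * SETS_PER_HOME * PEAK_CONCURRENCY
peak_demand_tbps = peak_streams * MBPS_PER_STREAM / 1_000_000  # Mbps -> Tbps

print(f"Peak simultaneous HD streams: {peak_streams:,.0f}")
print(f"Aggregate peak demand: {peak_demand_tbps:,.0f} Tbps")
```

Even under this rough concurrency assumption, aggregate peak demand runs to hundreds of terabits per second, which is what drives the always-on power profile described next.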
 
In addition, U.S. cable operators offer services to businesses, which place additional demands on networks and on the energy required to operate them. However, McElgunn and Zangari write that “Current architectures and software designs require that generation and transmission resources that are preparing and delivering all of that video operate at full power virtually 24 hours a day every day.” This applies to multiple parts of the cable plant: headends, data centers, and outside plant facilities, which together will require substantial growth in the energy needed to power them.
 

Modeling a Solution

Quantellia’s Mark Zangari worked with Bloomberg BNA to prepare a model that integrates forecasting data with a systems model. The result allows cable operators to quickly and precisely determine whether a proposed energy usage strategy exposes them to unacceptable financial or technical risk, and to visualize how adjusting specific decision parameters affects desired outcomes.
 
Among the variables included in the model are the expansion of Network DVR, the trade-off between increasing server efficiency and equipment replacement costs, evolutionary network and facilities design changes, and energy price changes over time.
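The Quantellia/Bloomberg BNA model itself is not public, but a much-simplified sketch shows how variables like these can be wired together so that a decision-maker can change one assumption, for example the rate of energy price increases, and see the effect on total cost. Every parameter value below is a placeholder, not a figure from the model.

```python
# Much-simplified, illustrative projection combining a few of the variables
# listed above. All numbers are placeholder assumptions, not values from the
# Quantellia / Bloomberg BNA model.

YEARS = 10
demand_growth = 0.20        # annual growth in compute/network workload
efficiency_gain = 0.10      # annual reduction in watts per unit of workload
energy_price = 0.10         # $/kWh in year 0
energy_price_growth = 0.06  # assumed annual increase in energy price
capex_per_kw = 4_000.0      # facility capital cost per kW of load
amortization = 0.15         # annualized fraction of that capital cost

load_kw = 10_000.0          # starting facility load
total_cost = 0.0
for year in range(YEARS):
    energy_cost = load_kw * 24 * 365 * energy_price       # kWh/yr * $/kWh
    capital_cost = load_kw * capex_per_kw * amortization  # annualized capex
    total_cost += energy_cost + capital_cost
    # Workload grows, efficiency improves, and energy prices rise.
    load_kw *= (1 + demand_growth) * (1 - efficiency_gain)
    energy_price *= 1 + energy_price_growth

print(f"Ten-year cost under these assumptions: ${total_cost:,.0f}")
```

Rerunning a projection like this with different values for the price-growth or efficiency parameters is the kind of “what-if” exercise the full model supports at a much finer level of detail.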
 
Some findings were as follows:
  • Capital costs are a substantial component: Each additional watt of power consumption adds incremental cost because of the additional energy the operator must supply. However, each additional watt also increases infrastructure needs, and the annualized capital costs resulting from higher power consumption can be significantly greater than the effect on power supply charges (a worked per-watt comparison appears after this list).
  • Moore’s Law effects are mitigated by power issues: Although the cost of delivering a given unit of computing performance has been dropping rapidly over the years, the cost of the power required to deliver that performance has not been falling at the same rate.
 
  • Timing is everything: Operators who wait too long to restructure their energy contracts will see their margins shrink as costs rise, leaving them at a competitive disadvantage.
  • Sleep mode versus peak shaving: Policies that put idle equipment into "sleep mode" can reduce energy usage, but capital-intensive infrastructure must still be built to support the "peaks" of usage, so "peak shaving" is also an important cost containment strategy.
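The first finding can be made concrete with a simple per-watt comparison. The figures below are illustrative assumptions rather than values from the report, but they show how the annualized capital component of one additional watt of continuous load can rival or exceed the energy charge itself.

```python
# Illustrative cost of one additional watt of continuous IT load, split into
# the energy bill and the annualized capital cost of the infrastructure
# needed to supply and cool it. All figures are placeholder assumptions.

energy_price_per_kwh = 0.10   # $/kWh
pue = 1.8                     # power usage effectiveness (cooling/overhead)
capex_per_watt = 15.0         # facility capital per watt of IT load, $
amortization = 0.15           # annualized fraction of that capital

energy_per_watt_year = (1 / 1000) * 24 * 365 * energy_price_per_kwh * pue
capital_per_watt_year = capex_per_watt * amortization

print(f"Energy charge per watt-year:      ${energy_per_watt_year:.2f}")
print(f"Annualized capital per watt-year: ${capital_per_watt_year:.2f}")
```

Under these placeholder assumptions the annualized capital component exceeds the annual energy charge, which is why capital costs deserve a central place in any energy strategy.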

Organizational Challenges

 
Despite the cost benefits from making good decisions about energy usage, says Quantellia’s Zangari, “most U.S. cable operators have separate organizations with independent budgets for the establishment and operation of facilities, and the operation of the network equipment housed in those facilities.  While power and its supply and assurance is managed as a facilities function, network equipment and its performance falls under the operational and financial responsibility of the Network organization.”  Therefore, there is no single department within a typical cable operator with a holistic view of energy management. 
 
A model like the one discussed here provides this kind of cross-organizational view and equips decision-makers with access to all the factors relevant to measuring, understanding, forecasting, and minimizing the cost of energy.
 
About Bloomberg BNA's Broadband Advisory Services (BAS)
The task of transforming energy supply and consumption is complex and depends on a huge number of known and unknown variables. These complexities create significant financial and operational risk for multiple-system operators (MSOs).
Beginning with our May 2013 report, "Cable Industry Energy Management Strategies", Bloomberg BNA’s Broadband Advisory Service is adding service provider energy strategy to our areas of focus. We will provide ongoing analysis of developments in this space and deliver recommendations based on our research and incorporating data and insight from other divisions within Bloomberg.
In future reports, we will investigate activities spearheaded by the cable industry’s CableLabs research and standards-setting consortium in the areas of energy efficiency for set-top boxes, DVRs, cable modems/EMTAs, and other CPE. We will also assess advances in fleet management and technology and assess the state of the industry’s efforts to approach zero-landfill equipment sourcing and replacement.
Broadband Advisory Services subscribers have full access to these reports, along with our extensive portfolio of reports and databases covering the U.S. broadband services market.
Information on subscribing to the BAS library is available at http://www.bna.com/broadband-advisory-services-p12884902148/.
You can also visit http://www.quantellia.com to learn more.

Tags:

Business and Management | Modeling | Politics | Cable | Sustainability | Telecommunications

Big decisions, small data

by quantellia 4. June 2013 16:04
Conquering the most important problems faced by complex organizations often requires great models, not “big data”: in many situations, better decisions can be made with imperfect or incomplete data.

As storage and processing costs plummet, high-bandwidth networks become cheaper, and data analysis methods become mainstream, organizations have mined a rich store of information—“big data”—primarily about the behavior of their customers and the details of their operations. However, outside of these arenas, big data often looks like an answer in search of a question. Paradoxically, big data is most effective for the smallest decisions that an organization makes, but that it makes very often and must therefore be able to make at almost zero cost. Examples: “What discount plan should I offer this customer?” or “What other product should I recommend to this customer who’s just purchased this book?”

 

By contrast, big decisions have different characteristics that are often not as well-supported by big data.  Here, agility and human insight are often more important than automation.  Smart organizations know that smaller, well-crafted data sets, along with powerful models, are the basis for the best decision outcomes.

 
  
Avoiding the dangers of big data over-analysis
  
Organizations must recognize the difference between situations that require massive amounts of highly detailed information (requiring time-consuming migration and cleansing efforts), and those situations where agility is at a premium, and where imperfect or incomplete data sets are a better choice.  Without such a distinction, we can fall prey to “gratuitous data cleansing”: an expensive, risky, and time-consuming exercise.   Omitted from the brochure of “big data” management products is a dirty secret: the enormous cost—often based on highly manual effort— that gathering, cleansing, and unifying data requires as a precondition before the benefits of big data can be realized.
  
“The decision is only as good as the data that supports it” is an often-misunderstood claim.  For example, a pollster can reliably predict an election result based on interviews with a tiny percentage of the electorate.  And gathering data on eye color is unlikely to improve the accuracy of a credit score. There is a law of diminishing returns on the quantity of information, and not every piece of information is critical to every decision.  Without a good model, we can succumb to a costly fixation on data:  focusing on excessive or unimportant information at the cost of agile and effective decision making. The good news: understanding how to make great decisions with small data requires just a few simple principles, as I’ll explain below.
  
How did we get here?
 
To understand the road forward beyond big data, it helps to understand its history. The cost per megabyte of storage dropped from close to $200 around 1984 to roughly 122 megabytes for a penny by 2010, a decrease of more than a factor of two million. Since it is so cheap to store and transmit data, governments, the internet, and even machines are generating huge data sets. The new Boeing 787, for instance, generates half a terabyte of data per flight, and the US Federal Reserve alone produces 73,000 economic statistics. This explosion of information has, in turn, driven new analytics technology to make sense of it.
  
The business value of big data is like the song of the mythical sirens, whose sweet voices lured sailors to their doom. It has created a focus on the parts of a business in which data is plentiful. However, the places where data is easiest to come by do not necessarily correspond to an organization’s most important business problems. This is why strategic decision-makers often “pore over the data, then set it aside to argue” instead of reaching a decision through a more structured, systematic process. On the flip side, expensive data management projects often produce little business value.
 
Small Data Principles
The answer, simply put, is that smart organizations have a secret: They know how to use small data to drive their most important decisions.  As a result, they produce great outcomes quickly and cheaply.  Here’s how.
 
  1. Recognize the difference between operational and analytical data, and treat them differently.  For our purposes, with operational data every byte matters, while analytical data can be imperfect or incomplete.  Examples: the size of a bolt on an airplane, or the costs on your phone bill: operational. The likelihood of civil war in a country given its poverty level: analytical, which permits a different—and often much less expensive and time-consuming—approach to data management than if every byte mattered.
  2. Understand that great decisions can be made from uncertain data.  If you’re sure that your customers will prefer your product as long as your price is at least $5 less than your average competitor’s, then you don’t need to know whether that competitor charges $6 more or $600 more: huge uncertainty in the data can still lead to a highly confident decision.  Many situations are like this.
  3. Understand the sensitivity of your decision to key assumptions.  Often, decisions rest on a number of “key assumptions”: data elements that are both uncertain and to which the decision is highly sensitive.  If you’re launching a new product into Europe, your competitor’s launch there would substantially change your prospects, and you’re not sure whether they will launch: that’s a key assumption.  Focus your data gathering and cleansing effort on these key assumptions—often a “small” set of data compared to big datasets—and deemphasize the rest (a small sketch of this idea appears after this list).
  4. Don’t overlook the value of human expertise.  Somehow, big data hype has led us to ignore our most important asset: the one between the ears.  We swing between two extremes: projects either ignore this asset or rely exclusively upon it, ignoring the data altogether.  But a structured decision modeling approach can produce the best of both worlds, where human expertise substitutes for data when it is missing, and vice versa.
  5. Promote your modelers.  A good modeler is an employee who understands your business, understands your data (along with its limitations), and has a good head for seeing the pattern in the noise.   They’re worth their weight in gold, and can help you to overcome the shortage of data scientists.  Nate Silver predicted the outcome of the most recent presidential election in all 50 states.  Probably the world’s first “celebrity modeler”, Silver used a small data set, not a big one.
  6. Use visual analysis.  Many modelers come from a quantitative background and so are most comfortable with math and spreadsheets, yet non-modelers often have to work very hard to understand them.  Encourage a cross-disciplinary team to create visual models that make the invisible visible and can be shared in an aligned, collaborative way.
  7. Understand systems. Core to the modeler’s toolkit is an understanding of systems dynamics: feedback loops, winner-take-all dynamics, and the like.  The distinction between being caught in a downward spiral and enjoying the “invisible engine” of a positive network effect matters far more than specific data about your situation.
  8. Be agile.  Near-term data is often the most accurate, and least costly to obtain.  When long term information is suspect, or in short supply, you can substitute for it by moving from “Titanic” to “white water rapids” mode: steer your business frequently as new information becomes available. 
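As a minimal sketch of principles 2 and 3, consider a hypothetical product-launch decision. Everything here, including the toy profit model and all the numbers, is invented for illustration; the point is that the decision is insensitive to the uncertain market size but flips on the competitor assumption, which is therefore the key assumption worth researching.

```python
# Hypothetical illustration of principles 2 and 3: a decision that is robust
# to one uncertain input (market size) but sensitive to another (whether a
# competitor enters). All numbers are invented for illustration.
import random

def launch_profit(competitor_enters: bool, market_size_m: float) -> float:
    """Toy profit model, in $M, for launching a product in a new region."""
    share = 0.10 if competitor_enters else 0.25
    return market_size_m * share - 25.0   # 25 = fixed launch cost, $M

def decide(competitor_prob: float, trials: int = 10_000) -> str:
    """Launch if expected profit is positive over sampled scenarios."""
    total = 0.0
    for _ in range(trials):
        competitor = random.random() < competitor_prob
        market = random.uniform(150.0, 250.0)  # uncertain market size, $M
        total += launch_profit(competitor, market)
    return "launch" if total / trials > 0 else "hold"

# The wide uncertainty in market size never changes the answer, but the
# competitor assumption does: that is the key assumption to pin down.
for p in (0.1, 0.5, 0.9):
    print(f"P(competitor enters) = {p:.1f} -> {decide(p)}")
```

The same pattern applies whatever the domain: spend the data-gathering budget on the few assumptions that actually move the decision.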
Underneath the big data hype is an important assumption: that the future is like the past.  Modelers know different: they understand the limitations of historical data, the mistakes we make when we don’t understand those limitations, and how to assemble historical data together with human expertise to build models to drive business value in a rapidly changing future.
 
By building an accurate model of a decision, and “baking in” expert knowledge, we can create a framework that guides us towards the data with the highest value for decision making and helps us avoid wasting time and effort on data gathering, storage, migration, and cleansing projects that don’t matter.  By doing this, we can transcend the often high cost of “big data” and move towards an organization-wide discipline of better decision making.

Want to know more about how great models can drive value for your organization?  Drop me a line at lorien.pratt@quantellia.com and/or visit http://www.quantellia.com to learn more.

Tags:

Business and Management | Modeling

Disclaimer
The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2023 The World Modeler Blog