Quantellia announces release of The System Dynamics of Aid: Understanding the Carter Center's CJA Program in Liberia

by Lorien Pratt 28. January 2014 00:44

In the fall of 2012, Quantellia was approached by The Carter Center to study a breakthrough program in Liberia. The Carter Center was using Community Justice Advisors (CJAs)—legal paraprofessionals from local communities—to support plaintiffs and defendants as they navigated two legal systems in Liberia: the "formal", or Monrovia-based state-run system, and the "customary" system used in more rural areas.

Since the ultimate goal of this program was peacebuilding, Quantellia built a systems model to explore how spending dollars on this approach, which had substantially lower infrastructure costs than more centralized programs, could help the country to "do more with less", and to move from a "vicious cycle" of conflict to a "virtuous cycle" of economic prosperity and peace.

This paper, and the model shown in the video above, contain our findings. Interestingly, the model showed that a consistent injection of aid along a number of fronts was necessary to overcome the "energy" barrier in state space and move this complex system from one phase to another. After this injection, the system can become self-sustaining under certain conditions. However, if the aid is not timed correctly, the system enters a third state of "catch-up", in which government money is exhausted on peacekeeping and is not available for other purposes as the democracy grows. This proof-of-concept model demonstrates these and other dynamics, indicating that this might be the reality in Liberia; extensions include connecting the model to accurate data and cause-and-effect connections.
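The tipping-point dynamic described above can be illustrated with a toy simulation. This is not the model built for The Carter Center; the single "stability" stock, the threshold, and every rate below are invented for illustration. The sketch shows why only a sustained injection of aid produces a self-sustaining state:

```python
# Toy phase-transition sketch, not the Carter Center model: a single
# "stability" level with a tipping point. All numbers are hypothetical.

def simulate(aid_end, aid_rate=0.08, threshold=0.5, stability=0.3, steps=60):
    history = []
    for t in range(steps):
        # Reinforcing feedback: above the threshold, stability grows on its
        # own (virtuous cycle); below it, it erodes (vicious cycle).
        drift = 0.05 * (stability - threshold)
        aid = aid_rate if t < aid_end else 0.0
        stability = min(1.0, max(0.0, stability + drift + aid))
        history.append(stability)
    return history

sustained = simulate(aid_end=10)  # aid continues past the tipping point
sporadic = simulate(aid_end=2)    # aid withdrawn too soon
# The sustained run becomes self-sustaining; the sporadic run slides back
# into the vicious cycle.
```

The two runs differ only in how long aid persists, echoing the timing dynamic the model revealed: the same per-period spending either crosses the threshold or is wasted.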


Development | Economics | Modeling | Politics | Sustainability | Systems analysis | World Modeler

Modeling Energy Strategies for the Cable Industry

by quantellia 14. June 2013 16:11
Cable operators, like many companies, face the prospect of steep increases in the cost of energy in the coming years. In response, they are looking at alternative energy sources. However, navigating the transition to this new world holds hidden dangers, so an evidence-based modeling approach can make a big difference.
Energy in the United States is changing: the cost of on-grid, coal- and petroleum-generated energy is expected to rise rapidly, while the availability of renewable and distributed power sources such as solar, wind, and hydroelectric energy is increasing. These changes are altering the landscape for energy-intensive industries across the world. Amongst them: U.S. cable operators, who face a sea change in the profile of energy costs, as subscribers grow hungrier for video and bandwidth at the same time that the cost of power to support a typical cable operator's plant is expected to grow steeply.
Cable operators have an opportunity to “get ahead of the curve” through structured decision analysis of choices about energy usage. Quantellia’s Mark Zangari and Tim McElgunn, Chief Analyst of Bloomberg BNA’s Broadband Advisory Services, presented on this topic at the Society of Cable Telecommunications Engineers (SCTE) Smart Energy Management Initiative (SEMI) forum earlier this year in Atlanta. Through a combination of data provided by Bloomberg BNA and modeling provided by Quantellia, Zangari and McElgunn were able to demystify some of the important factors involved in this complex decision-making process. McElgunn also authored an accompanying analysis report, which is available to Bloomberg BNA subscribers.
Says Bloomberg BNA’s McElgunn, “Within the next five years, the cost of energy to power the U.S. cable industry will become the single largest network cost component.” As shown in the accompanying video and as we’ll discuss below, operators who make good decisions stand to benefit substantially.

Background – The Energy Cost of Success for Broadband Providers

As explained by Zangari and McElgunn, in recent years, U.S. cable operators have steadily expanded their service offerings to both consumers and businesses in such a way that energy demand and associated costs will continue to increase. Bloomberg BNA research shows, for example, that over three quarters of U.S. internet service customers subscribe to “bundles” of two or more services, which may include pay TV, home phone, and wireless service in addition to high-speed data.
Cable operators are responding to this increasing demand by building new data centers and expanding existing ones. These data centers contain equipment with ever-lower energy footprints – now as low as 3 watts per server blade. However, the total growth in energy demand exceeds the benefit from these technology improvements.
For these reasons, cable operators face difficult decisions in a complex environment. Where should facilities be located? How should existing local and/or regional infrastructure be leveraged? How should cooling and heating be planned? Should operators consider alternative energy sources to utility-delivered power?
Says McElgunn, “The U.S. cable industry has effectively transitioned to an all-digital video plant, and is preparing for an eventual transition to an all-IP network.” A digital network is less costly to operate than an analog one; however, HDTV affects this trend in important ways. Says McElgunn, “By the end of 2014, essentially all new television sales in the U.S. will be HDTV-compatible, increasing the number of HD-fed sets per home. Bloomberg BNA estimates that approximately 40 million U.S. cable households will subscribe to HD tiers in 2017, with 1.5 HDTVs connected per home. A large number of these HDTV sets will be displaying HD linear channels, as well as video-on-demand (VOD). A single HD video stream consumes approximately 8 Mbps, and multiple such streams are delivered simultaneously to essentially every video household for some portion of each day. This leads to massive demands on the network, impacting both the distribution and access plant and the resources required for content generation and formatting.”
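As a sanity check, the figures McElgunn quotes imply an enormous aggregate load. The worst-case arithmetic below assumes every HD set draws one 8 Mbps stream at the same moment, which overstates real concurrency:

```python
# Worst-case aggregate HD demand implied by the quoted figures.
hd_households = 40_000_000  # Bloomberg BNA estimate for 2017
sets_per_home = 1.5         # HDTVs connected per subscribing home
mbps_per_stream = 8         # approximate bitrate of one HD stream

peak_demand_tbps = hd_households * sets_per_home * mbps_per_stream / 1_000_000
print(peak_demand_tbps)  # → 480.0 terabits per second
```

Even discounted for realistic concurrency, demand on this scale is what drives the distribution, access plant, and content-formatting loads described above.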
In addition, U.S. cable operators offer services to businesses, which place additional demands on networks and on the energy required to operate them. However, McElgunn and Zangari write that “current architectures and software designs require that generation and transmission resources that are preparing and delivering all of that video operate at full power virtually 24 hours a day every day.” This applies to multiple parts of the cable plant: headends, data centers, and outside plant facilities, which together will require substantial growth in the energy needed to power them.

Modeling a Solution

Quantellia’s Mark Zangari worked with Bloomberg BNA to prepare a model that integrates forecasting data with a systems model to allow cable operators to quickly and precisely determine if a proposed energy usage strategy exposes the operator to unacceptable financial and/or technical risks, and to visualize how adjusting specific decision parameters impacts desired outcomes.
Among the variables included in the model are expansion of Network DVR, balancing increasing server efficiency versus equipment replacement costs, evolutionary network and facilities design changes, and energy price changes over time.
Some findings were as follows:
  • Capital costs are a substantial component: Each additional watt of power consumption adds incremental costs due to the increased quantity of energy the operator needs to supply. However, each additional watt also increases infrastructure needs, and the annualized capital costs resulting from higher power consumption can be significantly greater than the effect on power supply charges.
  • Moore’s Law effects are mitigated by power issues: As illustrated below, although the cost of delivering a given unit of computing performance has been dropping rapidly over the years, the cost of power required to deliver that performance has not been dropping at the same rate.
  • Timing is everything: Operators who take too long to change their energy contracts face shrinking margins as costs increase, and with them a steep competitive disadvantage.
  • Sleep mode versus peak shaving: Policies that invoke "sleep mode" for idle equipment can reduce energy usage, but it's important to keep in mind that capital-intensive infrastructure must be built to support the "peaks" of usage, so "peak shaving" is also an important cost containment strategy.
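The first finding above can be made concrete with a small back-of-envelope comparison. All figures here are hypothetical placeholders, not values from the Bloomberg BNA model, but they show how the annualized capital cost triggered by an extra watt of load can exceed the energy charge for that watt:

```python
# Hypothetical comparison of the two costs an extra watt of load incurs.
# None of these figures come from the Bloomberg BNA model.

def annual_energy_cost(watts, dollars_per_kwh=0.10):
    # Direct utility charge for running the load all year.
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * dollars_per_kwh

def annualized_capital_cost(watts, capex_per_watt=15.0, lifetime_years=10):
    # Power and cooling infrastructure must be sized for the load;
    # straight-line depreciation over its service life.
    return watts * capex_per_watt / lifetime_years

extra_watt_energy = annual_energy_cost(1)       # utility charge per year
extra_watt_capex = annualized_capital_cost(1)   # capital charge per year
# With these assumptions, the capital component exceeds the power-supply charge.
```

This is also why peak shaving matters: capital is sized to the peak watt, not the average one.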

Organizational Challenges

Despite the cost benefits from making good decisions about energy usage, says Quantellia’s Zangari, “most U.S. cable operators have separate organizations with independent budgets for the establishment and operation of facilities, and the operation of the network equipment housed in those facilities.  While power and its supply and assurance is managed as a facilities function, network equipment and its performance falls under the operational and financial responsibility of the Network organization.”  Therefore, there is no single department within a typical cable operator with a holistic view of energy management. 
A model like the one discussed here provides this kind of cross-organizational view and equips decision-makers with access to all the factors relevant to measuring, understanding, forecasting, and minimizing the cost of energy.
About Bloomberg BNA's Broadband Advisory Services (BAS)
The task of transforming energy supply and consumption is complex and depends on a huge number of known and unknown variables. These complexities create significant financial and operational risk for multiple-system operators (MSOs).
Beginning with our May 2013 report, "Cable Industry Energy Management Strategies", Bloomberg BNA’s Broadband Advisory Service is adding service provider energy strategy to our areas of focus. We will provide ongoing analysis of developments in this space and deliver recommendations based on our research and incorporating data and insight from other divisions within Bloomberg.
In future reports, we will investigate activities spearheaded by the cable industry’s CableLabs research and standards-setting consortium in the areas of energy efficiency for set-top boxes, DVRs, cable modems/EMTAs, and other CPE. We will also assess advances in fleet management and technology and assess the state of the industry’s efforts to approach zero-landfill equipment sourcing and replacement.
Broadband Advisory Services subscribers have full access to these reports, along with our extensive portfolio of reports and databases covering the U.S. broadband services market.
Information on subscribing to the BAS library is available at http://www.bna.com/broadband-advisory-services-p12884902148/.
You can also visit http://www.quantellia.com to learn more.


Business and Management | Modeling | Politics | Cable | Sustainability | Telecommunications

Big decisions, small data

by quantellia 4. June 2013 16:04
Conquering the most important problems faced by complex organizations often requires great models, not “big data”: in many situations, better decisions can be made with imperfect or incomplete data.

As storage and processing costs plummet, high-bandwidth networks become cheaper, and data analysis methods become mainstream, organizations have mined a rich store of information—“big data”—primarily about the behavior of their customers and the details of their operations. However, outside of these arenas, big data often looks like an answer in search of a question. Paradoxically, big data is most effective for the smallest decisions an organization makes: decisions it makes very often and must therefore make at almost zero cost. Examples: “what discount plan should I offer this customer?” or “what other product should I recommend to the customer who just purchased this book?”


By contrast, big decisions have different characteristics that are often not as well-supported by big data.  Here, agility and human insight are often more important than automation.  Smart organizations know that smaller, well-crafted data sets, along with powerful models, are the basis for the best decision outcomes.

Avoiding the dangers of big data over-analysis
Organizations must recognize the difference between situations that require massive amounts of highly detailed information (requiring time-consuming migration and cleansing efforts), and those situations where agility is at a premium, and where imperfect or incomplete data sets are a better choice.  Without such a distinction, we can fall prey to “gratuitous data cleansing”: an expensive, risky, and time-consuming exercise.   Omitted from the brochure of “big data” management products is a dirty secret: the enormous cost—often based on highly manual effort— that gathering, cleansing, and unifying data requires as a precondition before the benefits of big data can be realized.
“The decision is only as good as the data that supports it” is an often-misunderstood claim.  For example, a pollster can reliably predict an election result based on interviews with a tiny percentage of the electorate.  And gathering data on eye color is unlikely to improve the accuracy of a credit score. There is a law of diminishing returns on the quantity of information, and not every piece of information is critical to every decision.  Without a good model, we can succumb to a costly fixation on data:  focusing on excessive or unimportant information at the cost of agile and effective decision making. The good news: understanding how to make great decisions with small data requires just a few simple principles, as I’ll explain below.
How did we get here?
To understand the road forward beyond big data, it helps to understand its history. The cost of storage has dropped from close to $200 per megabyte to about 122 megabytes for a penny—a factor of more than 2.4 million—from about 1984 to 2010. Since it’s so cheap to store and transmit data, governments, the internet, and even machines are generating huge data sets. The new Boeing 787, for instance, generates half a terabyte of data per flight, and the US Federal Reserve alone produces 73,000 economic statistics. This explosion of information has, in turn, driven new analytics technology to make sense of it.
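The storage-cost arithmetic is easy to verify:

```python
# Verify the storage-cost improvement factor quoted in the text.
old_cost_per_mb = 200.0        # dollars per megabyte, circa 1984
new_cost_per_mb = 0.01 / 122   # a penny buys about 122 megabytes, circa 2010

improvement_factor = old_cost_per_mb / new_cost_per_mb
print(round(improvement_factor))  # → 2440000
```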
The business value of big data is like the song of the mythical sirens, whose sweet voices lured sailors to their doom. It has created a focus on the parts of a business in which data is plentiful. However, the places in an organization where data is easiest to obtain do not necessarily correspond to its most important business problems. This is why strategic decision makers often follow a pattern where they “pore over the data, then set it aside to argue”, instead of reaching a decision through a more structured or systematic process. On the flip side, it is often the case that expensive data management projects produce little business value.
Small Data Principles
The answer, simply put, is that smart organizations have a secret: They know how to use small data to drive their most important decisions.  As a result, they produce great outcomes quickly and cheaply.  Here’s how.
  1. Recognize the difference between operational and analytical data, and treat them differently. For our purposes, with operational data, every byte matters, but analytical data can be imperfect or incomplete. Examples: the size of a bolt on an airplane, or the costs on your phone bill: operational. The likelihood of civil war in a country given its poverty level: analytical, which permits a different—and often much less expensive and time-consuming—approach to data management than if every byte mattered.
  2. Understand that great decisions can be made from uncertain data.  If you’re sure that your customers will prefer your product as long as your price is at least $5 less than your average competitor, then you don’t need to know if your competitor charges $6 more or $600 more: huge uncertainty in the data that leads to a nonetheless highly confident decision.  Many situations are like this.
  3. Understand the sensitivity of your decision to key assumptions.  Often, decisions are made based on a number of “key assumptions”: those data elements that both have uncertainty as well as to which the decision is highly sensitive.  If you’re launching a new product into Europe, and your competitor’s launch there would substantially change your prospects, and you’re not sure if they will: that’s a key assumption.  Focus all of your data gathering and cleansing effort on these key assumptions—often a “small” set of data compared to big datasets— and deemphasize the rest.
  4. Don’t overlook the value of human expertise. Somehow, big data hype has led us to ignore our most important asset: the one between our ears. Projects swing between two extremes: they either ignore this asset or rely exclusively upon it, ignoring the data altogether. But a structured decision modeling approach can produce the best of both worlds, where human expertise substitutes for data when data is missing, and vice versa.
  5. Promote your modelers.  A good modeler is an employee who understands your business, understands your data (along with its limitations), and has a good head for seeing the pattern in the noise.   They’re worth their weight in gold, and can help you to overcome the shortage of data scientists.  Nate Silver predicted the outcome of the most recent presidential election in all 50 states.  Probably the world’s first “celebrity modeler”, Silver used a small data set, not a big one.
  6. Use visual analysis. Many modelers come from a quantitative background, and so are most comfortable with math and spreadsheets. Yet non-modelers often have to work very hard to understand these artifacts. Encourage a cross-disciplinary team to create visual models that make the invisible visible and shared in an aligned, collaborative way.
  7. Understand systems. Core to the modeler’s toolkit is an understanding of systems dynamics: feedback loops, winner-take-all dynamics and the like.  The distinction between your organization being in a downward spiral versus enjoying the “invisible engine” of a positive network effect, matters much more than specific data about your situation.
  8. Be agile.  Near-term data is often the most accurate, and least costly to obtain.  When long term information is suspect, or in short supply, you can substitute for it by moving from “Titanic” to “white water rapids” mode: steer your business frequently as new information becomes available. 
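Principle 3 in particular lends itself to a simple sketch. The toy profit model and ranges below are invented for illustration; the technique is to rank inputs by how much the outcome swings across each one's plausible range, so that data-gathering effort lands on the true key assumptions:

```python
# Hypothetical profit model: rank inputs by how much the outcome swings
# when each varies over its plausible range. The high-swing inputs are the
# "key assumptions" worth real data-gathering and cleansing effort.

def profit(units, price, unit_cost, fixed_cost):
    return units * (price - unit_cost) - fixed_cost

baseline = dict(units=10_000, price=25.0, unit_cost=12.0, fixed_cost=40_000)
ranges = {
    "units":      (8_000, 12_000),
    "price":      (24.0, 26.0),
    "unit_cost":  (11.5, 12.5),
    "fixed_cost": (38_000, 42_000),
}

def swing(name):
    # Outcome change when only this input moves across its range.
    lo, hi = ranges[name]
    return abs(profit(**{**baseline, name: hi}) - profit(**{**baseline, name: lo}))

ranked = sorted(ranges, key=swing, reverse=True)
print(ranked)  # inputs ordered from most to least decision-sensitive
```

In this hypothetical case, the unit-volume assumption dominates: cleansing the fixed-cost data would be the "gratuitous data cleansing" warned about above.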
Underneath the big data hype is an important assumption: that the future is like the past.  Modelers know different: they understand the limitations of historical data, the mistakes we make when we don’t understand those limitations, and how to assemble historical data together with human expertise to build models to drive business value in a rapidly changing future.
By building an accurate model of a decision, and “baking in” expert knowledge, we can build a framework that guides us toward the data that has the highest value for decision making, and avoid wasting time and effort on data gathering, storage, migration, and cleansing projects that don’t matter. By doing this, we can transcend the often high cost of “big data” and move towards an organization-wide discipline of better decision making.

Want to know more about how great models can drive value for your organization?  Drop me a line at lorien.pratt@quantellia.com and/or visit http://www.quantellia.com to learn more.


Business and Management | Modeling

Understanding the US 2012 Presidential Election Result: How Models Triumphed over Big Data

by quantellia 17. November 2012 16:04

Today, opinions of what happened in the 2012 presidential election are beginning to settle.  Since history is often written from the perspective of the winners, many commentators have attributed the American right’s pre-election false confidence to a sort of blindness or group-think.  But there’s another truth.  Given what was known ahead of the election, the prediction that Romney would win was more defensible than the prediction that Obama would be re-elected.
The video above shows why.
The reason is simple. Take a look at this graph, reflecting data published by the Social Science Data Analysis Network (SSDAN).
As you can see, the percentage of African Americans who voted in 2008 is a clear departure from the historical trend. It’s reasonable to assume that Republicans concluded that the pattern would revert to historical levels, which, given the overwhelming amount of data going back over 30 years, is a completely rational, “big data” conclusion. (Read Nobel laureate Daniel Kahneman’s “Thinking, Fast and Slow” for a clear and deep analysis of this phenomenon, which is called regression to the mean.)
An alternative explanation, of course, is that in 2008, something exceptional happened. And that pattern continued into 2012.  Predicting the future requires an additional ingredient, beyond simply projecting historical trends:
The Future = Historical Trends + Unique events and decisions happening today
Nassim Taleb defines a “black swan” as an event that 1) is a surprise, 2) has a big effect, and 3) history rationalizes with hindsight: “of course the right was ignoring the obvious”. The reason black swans happen: ignoring the second half of the above equation. Taleb describes dozens of historical events like this one, where misunderstanding exceptions to historical rules has led to disaster.
Simply put, the future is not always like the past.  It is essential to know when it is, and isn’t.
Obama’s win qualifies on all three black swan criteria. Because African American turnout remained at its exceptional 2008 level, the assumption that it would revert to trend misled many Republicans into believing Romney’s election was guaranteed, and perhaps even contributed to the loss by leading to an inappropriately small focus on the needs of African Americans and other minorities. What’s missing is the ability to clearly analyze the right-hand side of the equation: how do we know when something new is unique enough that it changes the course of history?
It is clear that the Democrats were masters of this “secret sauce”.  They built an on-the-ground voter turnout operation to bump the curve upward from its expected value in exactly the places that would affect the Electoral College result.  By being able to focus their resources on exactly where they would contribute most towards achieving victory, the Democrats effectively out-modeled the Republicans. 
Before the book is closed on this election, all who chronicle its history should ensure that the modelers on both sides are appropriately acknowledged.  Because from now on, close elections will be won by campaigns who base their strategic decisions on models that are better than those of their competitors.

Want to know more about how great models can drive value for your organization? Drop me a line at lorien.pratt@quantellia.com and/or visit http://www.quantellia.com to learn more.



General | Politics

Tax Policy and the Economy: Towards Better Understanding and More Productive Discussion

by quantellia 3. November 2012 16:13
It's exhausting: understanding the economy in the midst of this election season is incredibly complex. And political economics tends to be, well, adversarial. Trying to understand the likely outcomes of opposing claims about economic policy—what does each mean for the future?—is maddening. There are SO many moving parts and incomplete facts, not to mention misleading commentary. And now, here we are, at a critical point in time, at an election that could shape the economic destiny of the United States for many years to come. Yet we have few aids, other than history lessons and common sense, to cut through the complexity and grasp the real consequences.
We need something to help. A way to clearly and simply decipher and weigh the alternatives and implications. How can subjective assertions be made more objective? How can relevant details be defined, tested, and considered, say from the perspective of various “what if” scenarios? Could opposing claims be compared and contrasted much more easily?
The answer is yes, there IS something! If interested, read on.
First, some background: Democratic societies are founded on the idea that government is “by the people, for the people.” But today, most are far more complex than their founders could have conceived. Let’s face it, it’s difficult to determine, for example, which tax policy will really lower unemployment, reduce the deficit, or raise GDP. It seems we’re left captive to the opinions of pundits and other “experts” who may have lost their objectivity in partisanship.
The good news: Using the approach suggested here, any reasonable person can recover the power to independently assess political claims. As a result, our society as a whole can recover the kind of transparent and participative democracy that keeps politicians accountable to The People as rightful authority.
An invitation: After watching the video below, join the discussion. Together we can take on many tricky issues facing the world today—starting with, as in this example, economic policy. Only an informed electorate can make the sound judgments necessary to ensure a free society. Help us in enabling a greater understanding of our complex world through the use of advanced technology.
Moving forward: beyond the dull thud of conflicting ideology
How do we create a common ground for those who are passionate both about advancing society, and yet also bring different points of view to the debate? How do we create a level playing field where we can explore different positions, where our goal is shared understanding and achievement of improved outcomes for everyone?
Before proceeding, we’ll confess that we don't have a comprehensive answer. What we do offer is a way to change the way we discuss these topics. In particular, an assertion that "doing X will cause Y", or that "doing A will have benefit B" and "will not cause a negative outcome C", should be something that we can test, in such a way that we'll agree on the result.
We should certainly be able to shed more light on questions like this than the dull thud that we often feel from futile collisions of conflicting ideologies.
The right tools can help us argue constructively about complex things
Here’s an analogy. Let’s imagine two engineers arguing about the best design for a new airplane wing. One engineer thinks it should have an upward curving shape. Another thinks the wing should have a winglet at the end. Both agree that the goals are better aerodynamics, fuel efficiency, and safety. They won’t spend much time on ideology; rather, at some point both will agree that a wind tunnel simulation will help them reach common ground, and that once it exists, the wind tunnel model can both serve as the arbiter of the disagreement and bring into sharp focus the precise points on which they disagree.
With model in hand, the discussion shifts from whether upward curves are better than winglets per se, and towards a more constructive analysis of the details of both proposals, and whether they (by themselves or in combination) produce the expected results.   A key dynamic within this argument is both sides’ recognition that the model is imperfect, yet nonetheless informative. It provides a forum to understand the implications of the two choices on aerodynamics, fuel efficiency, and safety.   Modeling makes explicit assumptions that each side has about the others’ proposal, and provides a forum to test and correct those assumptions. 
Altogether, this approach shifts the tone of the argument from loudly proposing one position or the other to a collaborative exercise where the model is tweaked to achieve a goal.
Our economy is far more complex than that airplane wing, and we have far more at stake.  We propose that we can learn from the airplane engineers, using the same sort of tools and reasoning to overcome the complexity inherent in understanding how policy is likely to affect the macro economy.
Agreeing to goals and levers
So how do we go about providing this small incremental, yet valuable, improvement?  We suggest the following approach.
First, both sides must be able to agree on what is a favorable outcome.  As an example, let us consider what we are trying to achieve through a tax policy.  Consider the following:
  1. A successful tax policy will result in an improvement in GDP
  2. A successful tax policy will reduce unemployment
If both sides can agree to these goals, then we have already made progress. Furthermore, we have a better basis for moving forward rather than making loose assertions about who is a job creator, what is fair, and how the economy has reacted in the past when we tried different ideas.
Second, both sides should agree upon what courses of action are available to them.  Let us consider the shape of the progressive tax scale (the tax rate levied on income earners at different income levels) as a “lever” of this system, over which we have control.
Now for the hard part.  Both sides make assertions about how changing tax policy will affect the outcomes that we have agreed to care about.  On the table today are two general ideas: policies where higher-income earners pay higher taxes, or a “trickle down” approach where higher-income earners pay less.   Ultimately this will result in a decision: to choose a policy that leans towards one or the other approach.  
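The "lever" described above can be sketched as code. The bracket boundaries and rates below are invented for illustration and are not any actual tax schedule; the point is simply that moving the top marginal rate changes what high earners pay while leaving other taxpayers untouched:

```python
# Hypothetical progressive tax scale: a list of (lower_bound, marginal_rate)
# brackets in ascending order. Not an actual tax schedule.

def tax_owed(income, brackets):
    owed = 0.0
    for i, (lower, rate) in enumerate(brackets):
        upper = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if income > lower:
            # Tax only the slice of income that falls inside this bracket.
            owed += (min(income, upper) - lower) * rate
    return owed

flatter = [(0, 0.10), (50_000, 0.20), (250_000, 0.25)]
steeper = [(0, 0.10), (50_000, 0.20), (250_000, 0.35)]  # top rate raised

middle_income = tax_owed(60_000, flatter), tax_owed(60_000, steeper)
high_income = tax_owed(500_000, flatter), tax_owed(500_000, steeper)
# Moving the top-rate lever leaves middle incomes untouched and changes
# only what high earners pay.
```

A shared, executable definition like this is what lets both sides argue about the lever's settings rather than about what the lever is.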
Instead of leaving these discussions to the exclusive domains of economists, how can intelligent non-experts nonetheless form opinions and engage in informed, participatory discourse?  This is our challenge.
Our goal: improving discourse through better understanding
We propose that a solution is to work towards a common understanding of how the macro economy behaves, involving both sides in the discussion. The goal: to achieve consensus such that both sides a) understand how the system behaves, and can then b) apply the policy decisions to the model to observe how it affects outcomes.
Of course, this is ambitious to say the least.  The United States economy is a vastly complex system, with more variables than we could possibly hope to capture, along with interactions more tangled than we can easily represent or understand.  As if that's not difficult enough, there's also a considerable dose of random behavior in this system. All of this leads to a situation where it is impossible to predict the future.
Understanding the system, not predicting the future
So we’re not proposing that we build a precise model of the economy with any reasonable predictive ability. What we are proposing is that we use these techniques to move towards a much more sophisticated discussion, with greater granularity than we have seen so far. It is time, right now, to end empty debates as to whether rich people are "job creators" or whether a given tax policy is "fair". Rather, let's discuss whether a given tax policy is likely to stimulate demand within a given income bracket, and how businesses will react. What force will such a policy exert, either upward or downward, on employment or GDP?
It’s not about the numbers
Since models always appear to present precise numbers, it is tempting, if not irresistible, to disregard a model whose numbers don't match reality. Indeed, if our goal were to score a prediction bulls-eye, then any difference between numbers and reality would be a drawback. But this is not our purpose.
Again, simply observing how something simple like taxation can ripple through an economy is vastly more enlightening, and is likely to lead to overwhelmingly better decision-making than adhering to ideological positions.
The World Modeler simulation
We have built a simulation in World Modeler which is by no means a precise model of the United States or, for that matter, any other economy. However, its structural features are broadly correct.
The direction the modeled economy takes under different kinds of policy decisions broadly matches the direction the US economy, or other industrialized economies, typically take when similar measures are applied.
Over time, if such a model is allowed to live and grow, and if it absorbs constructive criticism, ideas from experts, comparisons to history, and continuously improving data, then, for the reasons mentioned earlier, it will still never be an accurate predictor of an economy. It can, however, evolve to the point where it provides substantive guidance on how to create economic policies that benefit everyone.
It accomplishes this through:
  1. Alignment on all sides
  2. A much more sophisticated level of discussion
  3. A common reference point that all parties agree to use as an arbiter
The details
So let us continue with a concrete example of the kind of analysis we are proposing. As discussed above, we have established agreement on both sides that policy decisions about the national economy should attempt to increase GDP while lowering unemployment. As a next step, let's add a new element: the idea that the economy consists of three major groups of actors: businesses, consumers, and the government.
These actors have complex microeconomic internals but broadly interact with each other as follows:
  1. The government collects revenue from businesses and consumers through taxes. Some of that revenue is paid to consumers for their labor, and some to businesses for their goods and services.
  2. Businesses consume labor from consumers and raw materials from other businesses, and provide goods and services that the government, consumers, and other businesses exchange for money.
  3. Finally, consumers receive money in exchange for labor, some of which they pay to the government in taxes, some of which they save or invest, and some of which they spend, consuming goods and services from businesses.
In the kind of discussion we propose, "the economy" is the equilibrium these three components reach as the system finds its natural operating level through their interaction. The discussion about tax policy then concerns which forces the proposed changes would exert on that equilibrium.

We can run the analysis at this level, with just these three interactions.  Let's suppose this three-part system is in equilibrium at a given unemployment rate, and at a given GDP.  What will be the impact of lowering taxes on high-income earners?  Any consumer is characterized by the degree to which they consume and the amount they save.  High-income earners have generally reached the peak of their consumption, so giving them more money is unlikely to significantly increase consumption.  It will, however, increase their ability to save and invest.

On the business side, this has little direct effect. Business activity is driven largely by consumption, so more money in the pockets of high-income earners will have little impact. Greater savings, however, will lead to greater investment, and when that investment flows into businesses, it tends to produce greater efficiencies and reductions in employment.

From the government's perspective, decreasing taxes on high-income earners decreases government revenue, which in turn decreases its ability to employ.

Just this simple analysis, right or wrong (correctness is not our focus here, as noted above), begins to provide a mechanism by which we can assess the value of various policies and move beyond ideological adherence to one position or the other.
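
The analysis above can even be sketched in a few lines of arithmetic. The following is only a toy illustration, not the World Modeler model itself: the marginal propensities to consume (`MPC_HIGH`, `MPC_LOW`) and the $100B tax cut are made-up parameters chosen purely to show the mechanism.

```python
# Toy, single-round sketch of the tax-cut analysis above.
# All numbers are illustrative assumptions, not data.

def demand_effect(tax_cut, mpc):
    """Extra consumer spending generated by a tax cut, given the
    recipients' marginal propensity to consume (MPC)."""
    return tax_cut * mpc

# Assumed MPCs: high earners spend little of a marginal dollar
# (their consumption has peaked), low earners spend most of it.
MPC_HIGH, MPC_LOW = 0.10, 0.90

tax_cut = 100.0  # billions, hypothetical

spending_if_high = demand_effect(tax_cut, MPC_HIGH)  # roughly 10 of 100 spent
spending_if_low = demand_effect(tax_cut, MPC_LOW)    # roughly 90 of 100 spent

# The government loses the same revenue either way, reducing its own
# spending and employment capacity; the difference lies in how much
# private demand replaces it.
for label, spending in [("high earners", spending_if_high),
                        ("low earners", spending_if_low)]:
    net_demand = spending - tax_cut  # vs. government spending forgone
    print(f"Cut for {label}: +{spending:.0f} private demand, "
          f"{net_demand:+.0f} net vs. forgone public spending")
```

The point is not the specific numbers, but that once the assumption (the MPC of each bracket) is written down explicitly, it becomes something the two sides can measure and debate, rather than a slogan.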

Our first-order model
To take this to the next step, we've built a simple first-order model of an industrial economy, as illustrated in the video and model snapshots below.  Using World Modeler, we can run a simulation of this situation, apply various tax changes, and allow a computer to do the hard work of analyzing how a tax policy affects the various factors. The video below shows results of some initial experiments.

It is important to emphasize, again, that the point of this model is not to assert that one policy is better than another.  Rather, our goal is to provide a forum in which these issues can be understood and evaluated at a level of clarity, precision, and granularity that allows those with different points of view to find common ground over time.
If there’s enough interest, we’ll go through the details of these sub-models in future blog postings. Please post in the comment fields to vote for a more detailed treatment. We can also send you a trial version of the software with the model included, which we are offering on an open source basis. Sign up at http://www.worldmodeler.com or email us directly.
Crowdsourced political intelligence
As a final aspirational statement, social networks and the concept of crowdsourcing provide an opportunity for models that begin as rough-hewn approximations to benefit from the incremental input of experts' and others' opinions over time.
We envisage a movement for sharing models similar to open source software, where small movements originally built basic tools that, through the selfless and passionate work of expert individuals, evolved over time into complex packages rivaling the best the commercial software sector has to offer.
A challenge and a vision
The challenge proposed here is to make this level of analysis accessible to a population that does not consist solely of actuarial analysts and econometricians. To do this, we must make quantitative analysis more visual and intuitive than it is typically presented in those fields.
This article is, therefore, a call to action to take what is arguably one of our most important national debate topics, and to make it as accessible as possible to people who are affected by the outcome.

We welcome any and all suggestions as to how to move forward.  If you’ve read this far, you’re in a precious minority of those with the energy to make it happen.
What does success look like in this endeavor? At some point in the future, the graphics that media outlets use will move beyond pictures of donkeys and elephants butting heads, even beyond clever data visualizations. Instead, they will provide a dynamic yet understandable representation of how the policies proposed by each side of a debate affect different aspects of the United States economy.


Under the Hood: How to reach Agreement in Tax Policy and other Important Presidential Matters

by quantellia 30. October 2012 15:59
Instead of "he said, she said" or failing to convince the opposing side through appeals to history or other countries, we can now have an intelligent conversation about the "engine" of the economy: what makes it tick, and the best way to fix it.

A few weeks back, my sister, visiting from Michigan, was driving my car here in Denver. It's an old blue Ford Contour, with mileage well into the six figures. The car began to hesitate. "Something's wrong for sure," I said. "No," my sister reassured me, "everything's OK."


The Contour sputtered to a halt on the side of Route 6, and we were left arguing. "I think there's something very wrong here, probably the fuel pump," I said. "I've felt this before." "No, I think it's the spark plugs," said my sister.


We're smart enough that we didn't debate much further, and left it to the experts at my local shop to open the hood and diagnose the problem.


As we waited for the tow, Faith and I chatted about the economy.  That first argument about the spark plug and fuel pump felt like some political debates we've heard lately: lots of discussion about what's under the hood, without actually taking the time to understand the engine.  Because of that, we end up going around in circles.


Under the Hood


What do I mean by the "engine" when we're talking about political disagreement? Well, the discussion is often about the economy, tax policy, unemployment, the deficit, and the like. Usually, we assume this is too complex to understand, and we leave the details to the experts: economists and others. Not so: with modern tools, which go beyond mind mapping to full simulations on the desktop, we can be more informed, and we can understand the underlying mechanisms.


To begin, we can start by agreeing with our opponents on some basic parts of the systems that drive our world: how do businesses affect governments?  How do governments impact spending?  Where is the fan belt?  How's the starter motor doing?  How is it all connected?



I think that we can use the World Modeler software and the Decision Engineering approach to mapping complex systems to take these arguments to the next level.  Instead of "he said, she said" or failing to convince the opposing side based upon an appeal to history or other countries, we can now have an intelligent conversation about the "engine" of the economy: what makes it tick, and the best way to fix it.


For an easy introduction, watch the video below:


Useful Arguments


So here's the process:


  1. Build a simulation model. Once a world restricted to economists, this is now much easier and faster using modern tools. Sign up for a free trial copy of the tool at http://www.worldmodeler.com .
  2. Include your favorite policy in your model. Show how it works.
  3. Send it to your friends and adversaries.  If they disagree, ask them to change the model and the data to show how they see things working instead. And, by the way, this is where "big data", analytics, and machine learning can make their greatest contribution: in helping us to understand the mechanisms inside complex systems like the US economy.  Without understanding the system, focusing on data alone is like diagnosing a car by watching the pattern of leaking fluid on the pavement.
  4. Repeat step (3) until you reach agreement, or at least until the disagreement has been clarified to an assumption that can be tested (for example, my model might assume a certain level of spending amongst wealthy people: something we can measure). 


As for the model itself, the video shows three basic parts of an economy:

  1. Businesses
  2. The government
  3. Consumers

Connecting the Pieces

At the simplest level, these are connected in the following ways:

  1. The government receives revenues through taxes, and loses money through spending.  Taxes come from businesses and consumers.  Spending is money that goes to businesses (as the government buys things) and consumers (some of whom it hires).
  2. Consumers earn money through investments, government sources (like Medicaid), and wages.  They spend money on goods and services provided by businesses.
  3. Businesses lose money through taxes, through buying parts from other businesses, and through wages.  They earn money by selling to consumers and the government.

Using a simulator, we can experiment, for instance, with the impact of various tax policies.  These play out over time and impact important factors like unemployment and GDP. 
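
The three sets of flows listed above can be wired into a minimal bookkeeping loop, letting a computer carry each period's transfers forward. This is only a sketch under assumed, uncalibrated rates (every parameter below is invented for illustration), not the World Modeler simulation itself:

```python
# Minimal bookkeeping sketch of the three-sector flows listed above.
# All rates are illustrative assumptions, not calibrated data.

def step(gov, biz, con, tax_biz=0.20, tax_con=0.15,
         gov_spend=0.50, wages=0.40, consumption=0.60):
    """Advance the toy economy one period. Each rate is the fraction
    of the payer's balance transferred along that flow."""
    # 1. Government: taxes in; spending out, split between businesses
    #    (purchases) and consumers (hiring and transfers).
    t_in = biz * tax_biz + con * tax_con
    g_out = gov * gov_spend
    # 2. Businesses: pay taxes and wages; sell to consumers and government.
    w_out = biz * wages
    sales = con * consumption + g_out / 2
    # 3. Consumers: pay taxes; earn wages plus half of government spending;
    #    spend on goods and services.
    gov = gov + t_in - g_out
    biz = biz - biz * tax_biz - w_out + sales
    con = con - con * tax_con + w_out + g_out / 2 - con * consumption
    return gov, biz, con

gov, biz, con = 100.0, 300.0, 200.0  # hypothetical starting balances
total0 = gov + biz + con
for _ in range(50):
    gov, biz, con = step(gov, biz, con)

# Every flow only moves money between the three balances, so the total
# is conserved while the balances settle toward an equilibrium.
print(round(gov + biz + con, 6), round(total0, 6))
```

Changing one rate, say `tax_con`, and re-running the loop shows the direction in which the equilibrium shifts; that directional comparison, not the raw numbers, is the kind of experiment described above.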


This is just a starting point.  I invite you to join an important new discussion. Sign up for more information at http://www.worldmodeler.com.



Economics | General | Politics

The Tyranny of Single Corporate Metrics

by quantellia 29. September 2012 15:53

Investors, academics, and the media are increasingly complaining about corporations that maximize shareholder value at the expense of all other factors.  Indeed, there is an emerging consensus that this thinking is flawed, from both a legal and a practical perspective.  Lynn Stout, a Cornell University law professor, published "The Shareholder Value Myth" earlier this year, in which she demonstrates that there is, in fact, no such legal requirement.  Stout and a host of others, such as Joe Nocera in the New York Times and Steve Denning in Forbes, also describe the damage done to corporate America by a single-minded focus on this metric.


Yet if not shareholder value, then what?  Denning and Nocera both chronicle the emergence of a focus on customers as a good substitute: if you delight the customer, the thinking goes, shareholder value will be pulled along for the ride, but not the other way around.  Denning goes on to warn that any approach seeking to achieve multiple goals is simply not practical for an organization of substantial scale.


We shouldn't throw in the towel so easily.  Historically, companies have found it difficult to manage to multiple goals.  Yet customer delight as a single metric has its own flaws. And new technology can expand our capacity to work effectively in an organization without the "garbage can" anarchy described by Denning.


As illustrated below, we have reached a watershed, where our need to understand how decisions will play out in complex scenarios has increased, while technology has become easy enough to use that we can substantially expand our ability to think clearly about difficult situations.  The need to move beyond single metrics like shareholder value or customer experience is a great example.



Blue Jeans and Lear Jets


Denning quotes the following story from Jim Sinegal at Costco:


We were selling Calvin Klein jeans for $29.99, and we were selling every pair we could get our hands on. One competitor matched our price, but they had only four or five pairs in each store, and we had 500 or 600 pairs on the shelf. We all of a sudden got our hands on several million pairs of Calvin Klein jeans and we bought them at a very good price. It meant that, within the constraints of our markup, which is limited to 14 percent on any item, we had to sell them for $22.99. That was $7 lower than we had been selling every single pair for.


Of course, we concluded that we could have sold all of them (about four million pairs) for that higher price almost as quickly as we sold them at $22.99, but there was no question that we would mark them at $22.99 because that’s our philosophy.


I use that as an example because it would be so tempting for a buyer to go with the higher price for a very quick $28 million in additional profits, but ours didn’t. That’s an example of how we keep faith with the customer.
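
As a quick aside (our check, not part of Sinegal's account), the quoted figure is internally consistent: about four million pairs times the $7.00 per pair left on the table is roughly $28 million.

```python
# Checking the arithmetic in the Costco story above.
pairs = 4_000_000              # "about four million pairs"
forgone = 29.99 - 22.99        # $7.00 per pair left on the table
print(round(pairs * forgone))  # -> 28000000, the quoted $28 million
```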


Denning goes on to describe a new approach: "Customer Capitalism", where the goal is to use this new focus on customers instead of shareholder value. Under this model, the thinking goes, it's OK to sell blue jeans less expensively because of the longer-term loyalty that results.  The situation is, however, substantially more complex, as we'll show in the video below.


Even more problematic is the fact that an exclusive focus on customer delight leads to its own issues.  One: the "Lear Jet problem": if customer delight is your sole focus, then there's an easy way to win every time: give every customer a Lear Jet with their purchase. Problem solved.


Of course, that's not the solution. We must balance customer delight against cost, revenues, and other business factors. And every department will make that judgment differently, so we're back in the garbage can. As simple metrics go, customer delight is probably better than shareholder value, but the fact remains that we don't need to be so simple-minded.  The good news: we can balance cost with revenues, customer delight with price, to more deeply understand how the pieces fit together so they hum along in a coherent system. 


The tougher news: we must.  Our business competitors in other countries are often better at systems thinking: at understanding unintended consequences, nonlinear patterns, and feedback effects.  In a now-classic article in Studies in Intelligence, Josh Kerbel describes the West's systematic failings in this realm.  Against this backdrop, managing a company through a single metric stands out as simplistic in the extreme.  Furthermore, international analysis of business practices leads us to conclude that we can move beyond single metrics, with substantial benefits.
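
The "Lear Jet problem" described above can be made concrete with a toy calculation. Everything here is hypothetical (the prices, delight scores, unit profits, and the weight of 40 are all invented for illustration): a single delight metric always picks the giveaway, while a score that also weighs profit does not.

```python
# Toy illustration of the "Lear Jet problem": all numbers are hypothetical.

options = {
    # price: (customer delight score, unit profit in dollars)
    19.99: (0.95, -3.00),   # below cost: delighted customers, bleeding company
    22.99: (0.85, 2.00),
    29.99: (0.60, 9.00),
}

# Single metric: delight alone. The giveaway price always wins.
best_by_delight = max(options, key=lambda p: options[p][0])

# Balanced metric: delight weighted against unit profit. The weight (40)
# encodes a judgment call -- exactly the trade-off that each department
# would otherwise resolve differently on its own.
best_balanced = max(options, key=lambda p: 40 * options[p][0] + options[p][1])

print(best_by_delight)   # 19.99
print(best_balanced)     # 22.99
```

The interesting part is the weight: making it explicit turns the "garbage can" of departmental judgment calls into a single, debatable assumption, which is precisely what a systems model is for.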


Thinking more clearly


The video below illustrates this point.  As you watch, you'll gain—in under eight minutes—a deep understanding of the complex and sometimes subtle dynamics at play as cost, price, customer loyalty, and competition interact in the Costco blue jeans pricing situation described above.



Using tools like World Modeler, as shown here, can help companies move beyond simplistic metrics like customer value.  We can finally engage in an agile ballet, balancing markets, customers, technology, and, yes, shareholders.  Taking this approach will allow us to live up to our full business potential, enhancing our companies and the societies they serve.


Please sign up at worldmodeler.com to learn more.


The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2023 The World Modeler Blog