Needed: A Computer-Driven Economy
A computer-driven economy is an idea whose time has come. In fact, it is long overdue. The reasons are numerous: our economy is too large, too diverse, and too important to all of us to let it flounder at suboptimal performance levels, fall into unnecessary recessions, or fall victim to terrorists. If GDP output is just 3 percent below optimum, the cost is roughly $450 billion in goods and services and 4.2 million jobs. The economy likely operates far from optimum nearly all the time.
Billions of transactions also take place in the economy daily, and the data take time to collect and analyze, so we never know the state of the economy at any given moment. Even after the data are analyzed, more time passes before discussions about what to do can begin. Add another six months or so before any decisions are possible, and the changes finally decided on could be almost the opposite of those needed to correct matters. New ways to collect and analyze the data could speed up this process so that better and timelier decisions are possible.
Here is a perspective on how a 3 percent change in GDP compounds over time. An extra 3 percent of annual growth doubles GDP in about 24 years. Now go back to 1988. If GDP had grown by an additional 3 percent annually using optimal techniques, it would be at $30 trillion today, not $15 trillion. And if incomes were distributed proportionally, they would all be doubled today. Go back to 1968 under the same assumptions, and GDP and incomes would be roughly 4 times as high as they are today, ceteris paribus (holding everything else the same). That’s a small difference we could all live with!
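The compounding claim above is easy to verify. The sketch below uses the essay’s own figures (a $15 trillion baseline and a 3 percent growth differential); it is an illustration of the arithmetic, not fresh data.

```python
# Verify the essay's compounding arithmetic: an extra 3 percent of
# annual GDP growth roughly doubles output in 24 years.
# The $15 trillion baseline is the essay's figure, not current data.

def years_to_double(rate_percent: float) -> float:
    """Rule-of-72 estimate of doubling time at a given growth rate."""
    return 72 / rate_percent

def compounded(base: float, rate: float, years: int) -> float:
    """Value of `base` after growing at `rate` (e.g. 0.03) for `years` years."""
    return base * (1 + rate) ** years

print(years_to_double(3))                              # 24.0 years
print(round(compounded(15e12, 0.03, 24) / 1e12, 1))    # ~30.5 trillion (since 1988)
print(round(compounded(15e12, 0.03, 44) / 1e12, 1))    # ~55 trillion (since 1968)
```

Note that 44 years of extra growth yields roughly a 3.7-fold increase, which is the “roughly 4 times” in the text.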
But our economic decision-making procedures are also flawed. Giving Congress the power to make policy changes to the economy is much like giving it the power to direct a brain surgeon’s hands, from far away, while the operation is underway. Or it is analogous to taking control away from a pilot flying 300 passengers through a severe storm and letting Congress (or anyone else) make all the decisions. Even if its members were qualified surgeons or skilled pilots, the outcomes would be disastrous. Likewise, they are neither economists, nor mathematicians, nor programming experts, so how can we expect them to make sound economic decisions? Many of them know their own limitations, so the fact that they rely on beliefs, ideologies, and seat-of-the-pants analyses in economic decision making should surprise no one. We need to do a lot better.
Fortunately, we can do a lot better. These issues are solvable.
Faster data collection, analysis, evaluation, and decision making, even when done with the best of intentions, cannot assure good decisions unless we also improve the options available to those who make the final choices. In other words, there are times when economic decisions must be both sound and timely. Miss the mark because of outdated data and political squabbling, and a $1 trillion error could occur. Think of the consequences of 9/11: the total cost of that attack to the U.S. economy was about $2 trillion.[i]
That total could have been significantly reduced with a fast and accurate assessment of all the linkages, done by a supercomputer.
Imagine other attacks in which our cities are bombed, transportation systems are disrupted, and water supplies are poisoned. Imagine also that we had a room full of top scientists and polled them for answers. They could likely come up with good countermeasures in terms of what is salvageable and where to direct resources. It is unlikely, however, that they would know where the resources were located, how quickly they could be accessed and moved, what they would cost, or how best to bring all of them to bear on the matter. But a supercomputer that processes a trillion bits of data per second, running a network of smart algorithms that react to rapidly changing data, could provide an array of solutions showing optimal and suboptimal courses of action. All of it could be based on up-to-the-minute data, analyzed with proven algorithms, with the solutions best suited to the facts and to our economic and financial welfare, and perhaps even to our military needs and options.
Now magnify this “what if” scenario into a thousand others with each requiring fast, accurate responses. Do we have the capability today to bring the power of computers to bear on most contingencies?
The answer is an unqualified “no.”
The Political Barrier
Politics and economics do not mix. Although analyzing and predicting economic activity is difficult, there is a consensus among enlightened economists about how the economy works. We know, for example, that the money supply times its turnover rate (the velocity of spending) equals GDP. And more GDP means more jobs, more tax revenues, and more wealth. Moreover, an equitable tax policy is one in which we all carry our fair share of the burden. Assertions, opinions, and conjectures about these matters that are filtered through narrow ideologies are not substitutes for facts, data, and good analyses. Opinions to the contrary, a legislator’s job is to act in the public interest; it is not to turn the country and all who live here into a hegemony of power elitists dancing to the tune of “morally superior” beliefs, nor to use the economy as a personal tool for achieving political ends.
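The money-times-turnover relationship mentioned above is the classical equation of exchange. A minimal sketch, using round illustrative numbers rather than official statistics:

```python
# Equation of exchange: money stock M times its velocity V equals
# nominal GDP (strictly, M * V = P * Q; GDP stands in for P * Q here).
# The figures below are illustrative round numbers, not official data.

def nominal_gdp(money_stock: float, velocity: float) -> float:
    """GDP implied by a money stock turning over `velocity` times per year."""
    return money_stock * velocity

# A $10 trillion money stock turning over 1.5 times a year implies
# the $15 trillion GDP figure the essay uses.
gdp = nominal_gdp(money_stock=10e12, velocity=1.5)
print(gdp / 1e12)  # 15.0 (trillion)
```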
The best way to take politics out of an effective, growing, and competitive economy is to take the essential and time-critical economic decisions away from politicians and put them into an algorithm run by a supercomputer. Such a computer can do a trillion accurate calculations in one second; we don’t know for certain whether Congress can make an accurate decision at all when mathematics, speed, and an unbiased outcome are required.
We need a public discussion about how to get politics out of the way of progress in matters dealing with emergencies as well as economic well-being and prosperity. Put another way, perhaps the discussion should focus on how to bring politics into the light of openness and honesty, and how to use the power of computers to help everyone make faster, more accurate, and more effective use of our resources.
Simulation: How it Works
A simulation model of the U.S. economy is not a new idea. Many companies, government agencies, banks, and universities have them. Nearly all are based on equations describing economic processes. The data are updated periodically, as are the predictive outcomes. Some models are very complex, with hundreds of equations that must be solved simultaneously. They are all “top down” mathematical models that use assumptions about large aggregates to develop predictions for smaller ones. Like chains, they are no stronger than their weakest link. But most are considerably better than guesses, hunches, or no predictions at all.
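A toy example shows what “top down, solved simultaneously” means. The two-equation model below (a textbook Keynesian-cross system) is invented for illustration; real forecasting models have hundreds of such equations, and an error in any assumed coefficient (the weak link) propagates through every prediction.

```python
# Toy "top down" macro model: two simultaneous equations.
# All coefficients are invented for illustration only.
#
#   C = a + b * Y        (consumption depends on income)
#   Y = C + I + G        (income identity: consumption + investment + govt)
#
# Substituting the first into the second gives Y = (a + I + G) / (1 - b).

def solve_income(a: float, b: float, investment: float, government: float) -> float:
    """Solve the two-equation system simultaneously for equilibrium income Y."""
    return (a + investment + government) / (1 - b)

Y = solve_income(a=2.0, b=0.6, investment=3.0, government=2.5)
C = 2.0 + 0.6 * Y
print(Y, C)  # 18.75 13.25 -- and C + I + G = 13.25 + 3.0 + 2.5 = 18.75 = Y
```

Note how the assumed propensity to consume `b` drives everything: nudging it from 0.6 to 0.65 changes equilibrium income by over 14 percent, which is the “weakest link” problem in miniature.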
Micro simulation is different. Advances in massively parallel techniques for solving complex modeling problems, coupled with new sources of micro-level data, have made rapid and thorough simulation a viable economic analysis tool. About 20 years ago, a group of scientists (economists and computer programmers) began work on a micro simulation (“bottom up”) model of the U.S. economy called the “Aspen Model.” The work was carried out at Sandia National Laboratories, on Kirtland Air Force Base near Albuquerque, New Mexico.
Their model developed and tested economic segments using agent-based algorithms. Instead of using fixed equations with occasionally changing data, the Aspen Model fed the most current data into “smart” algorithms; that is, the algorithms learned from past behavior and produced different results based on how consumer behavior was changing and how it would likely change tomorrow. Running a simulation of the entire economy meant calculating trillions of transactions again and again, hence the need for a supercomputer to handle the massive amounts of data. The output tended to track and predict economic data more accurately, in a more timely fashion, and with greater specificity.
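The "bottom up" idea can be sketched in a few dozen lines. Everything below is invented for illustration, and is far simpler than Aspen: one agent type, one adaptive rule. The point is that the aggregate number is not assumed by an equation; it emerges from many individual decisions, each of which adapts to recent conditions.

```python
import random

# Minimal agent-based ("bottom up") simulation sketch. Invented for
# illustration: real models such as Aspen track households, firms, and
# banks with far richer behavioral rules and vastly more agents.

random.seed(0)  # fixed seed so the run is reproducible

class Consumer:
    def __init__(self, income: float):
        self.income = income
        self.propensity = 0.80          # fraction of income spent each period

    def spend(self, last_aggregate: float, expected: float) -> float:
        # Adaptive ("smart") rule: spend a bit more when the economy last
        # outperformed expectations, a bit less when it underperformed.
        if last_aggregate > expected:
            self.propensity = min(0.95, self.propensity + 0.01)
        else:
            self.propensity = max(0.50, self.propensity - 0.01)
        return self.income * self.propensity

# 1,000 agents with randomly drawn incomes (a real model would use
# micro-level survey or transaction data here).
agents = [Consumer(income=random.uniform(30_000, 90_000)) for _ in range(1_000)]
expected = 0.80 * sum(a.income for a in agents)   # naive aggregate expectation
aggregate = expected

for _ in range(12):                                # simulate one year, monthly
    aggregate = sum(a.spend(aggregate, expected) for a in agents)

# Aggregate spending emerges from individual decisions, not from a
# fixed top-down equation.
print(round(aggregate))
```

Scaling this up to millions of heterogeneous agents and recalculating every period is what drives the supercomputer requirement described in the text.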
Sandia Labs now has an even newer model called “N-ABLE.” This model can specifically address what-if questions such as:
1. What economic sectors are most vulnerable to infrastructure disruptions and interdependencies?
2. How do different sectors depend on the energy, transportation, financial, and communication sectors?
3. What short-run economic changes affect infrastructure performance?
4. Which firms are most affected? Which ones do well, or poorly?
5. How do firms, individuals, and other economic components respond, over time and over regions?
6. What economic mechanisms do national, state, and local governments have to assist firms and other economic sectors in their regions?
Once completed, the objective is to have a tool that can simulate outcomes from real or hypothetical data. Analysts could pose many “what if” scenarios, from different kinds of terrorist attacks to natural disasters to more remote events such as unexpected currency swings, and receive results that easily outperform the best human judgment. By comparison, it would be like pitting IBM’s “Watson” computer against the best Jeopardy![ii] champions. They were no match for the accuracy and recall speed of Watson.[iii]
What will follow from Sandia Labs’ development of N-ABLE? Today, further public information about N-ABLE is very limited. It now falls under the auspices of Homeland Security. Its physical security, as well as the security of its simulations, is probably better left under the cloak of national secrecy, at least for now. When more information about its development becomes available, I will amend this essay to bring you up to date.
[i] Estimate from the Institute for the Analysis of Global Security.
[ii] Jeopardy! is a trademark name owned by Jeopardy Productions, Inc.
[iii] Named for IBM’s first president, Thomas J. Watson.