by Rob Cunningham, November 1976
The synergy concept was further developed by psychologist Abraham Maslow, who found that in the psychologically mature person there is no dichotomy between self-interest and kindness toward others. A "society of healthy individuals," Maslow conjectured, "would almost surely be a (philosophically) anarchistic group" and would afford "much more free choice than we are used to." [2]
By "synergy" (in a social context) we mean the increased power to realize his own goals that each individual obtains by acting together with others. Because synergic action is directed toward individual, self-chosen goals, it can arise only in a voluntary, non-coercive social context. The political correlate of the concept of synergy is the free-market society, in which each individual would be free to live and act as he chose, so long as he did not intrude on the lives and property of others. In such a society people would obtain mutual benefits by pursuing their own self-interest, either in voluntary cooperation or independently. The common premise of the free-market concept and Maslow's "synergy" is that people's rational interests are normally harmonious rather than antagonistic.

Many persons would probably agree that a society based on freedom and natural synergic forces is a highly desirable ideal. But in an unregulated environment, according to a familiar argument, unbridled selfishness would induce some men to abuse their freedom and seek power and advantage over others. Consequently, harmonious synergic relationships would break down, and strife and class conflict would ensue. In this paper it will be contended that a synergic society would contain internal forces tending to maintain its socioeconomic equilibrium and to preclude the degeneration just described.

Two basic kinds of synergy may be observed in social relationships. The more obvious and widely recognized kind is synergy of means. Generally, this term refers to the increased power and productivity often realized by individuals who allocate their means in a cooperative manner to achieve agreed-upon ends. Synergy of means is the universal basis for cooperative agreements and for the pooling of resources and energies for the attainment of common interests. The second and more subtle form of synergy can be called information synergy.
This term denotes the increased flow of information, and the consequent improvements in efficiency, that are realized when individuals function independently or, on a broader scale, when decision-making is decentralized throughout a social system. Because it favors individuals and small, independent units over large, bureaucratic organizations, information synergy might even be labeled "dissociative synergy," in contrast to the "associative" synergy of means.

At first glance the notion of dissociative synergy might seem paradoxical, but a simple example can clarify it. Two scientists, after years of unsuccessful joint research in pursuit of a miracle drug, decide to pursue their investigations independently. They find they can now explore a wider range of approaches to the problem simultaneously. Moreover, each finds that his work is invigorated by the opportunity to compare his findings with those achieved by the contrasting methods of his former associate. Thus each scientist progresses more quickly and surely by working independently, rather than in explicit cooperation, and in this sense he benefits from "dissociation." Since these benefits would not occur if each worked in total isolation, their relationship is at the same time truly synergic.

In economics, the advantages of associative synergy were understood early, finding explicit recognition in Ricardo's famous "law of association" (or law of comparative cost). This law showed in mathematically rigorous fashion the almost universal mutual gains in productivity that individuals could achieve by cooperative allocation of their efforts and resources. Ricardo's law demonstrated the immense increases in general productivity and living standards that people could realize from specialization, division of labor, unhampered trade, and other forms of economic cooperation. These benefits, it was shown, would extend to all men and countries, regardless of relative productive ability.
Moreover, this law dispelled fears that the unhampered free market might degenerate into an atomistic arena of conflicting petty interests or into a struggle of strong against weak. For individuals could naturally be expected to seek to maximize their utilities through associative advantage, so that the pursuit of selfish interests would lead to the growth of mutually beneficial, or "synergic," relations.

Eventually, however, an exclusive concentration on associative advantage led some economists to question the long-run efficiency of the unregulated market. As firms derived more and more associative benefits from mergers, pooling of assets, and other agreements, it was feared, powerful monopolies and cartels might come to dominate the market to such an extent that the vital process of competition would be disrupted.

The full impact of this argument becomes evident when we consider the central role of competitive bidding in free-market theory. Bidding serves first to keep the demand and supply of every good in constant balance. Moreover, competitive bidding for various consumer goods brings their relative prices into line with their varying marginal utility to consumers. Finally, bidding by capitalists for land, labor, and capital goods tends to establish for each factor a price reflecting its (anticipated) marginal contribution to the production of consumer goods (discounted for the time factor and for risk). [3] Since these prices ultimately determine how resources will be allocated, the bidding process is essential to the market's success in rationally allocating means to ends, and hence in best satisfying consumers. If that competitive process is hampered, it is argued, resources may be allocated in a sub-optimal manner. Conceivably, for example, a monopolistic firm could reap advantages from inelasticities in the demand curve for its product by raising its price above the equilibrium level. [4]
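The equilibrating role of bidding described above can be illustrated with a toy simulation. This is a sketch under my own assumptions: the linear demand and supply schedules are hypothetical, not drawn from the article. Excess demand bids the price up, excess supply bids it down, and the price converges to the level at which the two quantities balance.

```python
def demand(p):
    # quantity demanded falls as the price rises (hypothetical schedule)
    return max(0.0, 100.0 - 2.0 * p)

def supply(p):
    # quantity offered rises with the price (hypothetical schedule)
    return max(0.0, 3.0 * p - 20.0)

price = 10.0
for _ in range(1000):
    # excess demand bids the price up; excess supply bids it down
    price += 0.01 * (demand(price) - supply(price))

# analytic equilibrium: 100 - 2p = 3p - 20, i.e. p = 24
print(round(price, 2))  # -> 24.0
```

No auctioneer sets the equilibrium here; it emerges solely from repeated local adjustments to the imbalance between bids and offers, which is the point of the passage above.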
The fundamental weakness in this argument, as we shall see shortly, is that it fails to grasp the implications of the scarcity of information. In this and many other conventional models, supply, demand, and economic conditions in general are regarded as given data, automatically known to buyers and sellers (or central planners). Typically, such models assume that people not only will pursue their self-interest but will maximize their long-run utilities like automatons, with such clairvoyant precision that their activities and the resulting economic aggregates can be mechanistically predicted by mathematical equations. Values and costs are all assumed to be known in advance, and economic planning thus reduces to the technical problem of achieving the most valuable ends by the least costly means.

In practice, of course, men do not always act rationally in the sense of using the means that will eventually prove most appropriate to the ends sought. Rather, men act in accordance with their ultimate ends only to the degree that they correctly anticipate the effects of their actions, given their context of limited understanding. The assumption of full knowledge is remote from a reality in which the lack of full information imposes restrictions and costs on human action.

Now the effects of imperfect knowledge might be largely insignificant for economic analysis if they were known to be independent of the structure of the economic system. For then we might reasonably assume that deviations arising from human error were essentially random and would have no significant or predictable aggregate effect on economic development. Relatively little study has been given, however, to the question of possible connections between information needs and the basic nature of the economic system. Specifically, do certain kinds of economic systems promote greater rates of information flow and corresponding gains in efficiency?
Conversely, do information requirements impose any conditions or tendencies on economic systems? To investigate these questions we need to supplement conventional economic theory with a consideration of economies as information systems.

The data in economic systems have two salient characteristics. First, despite their abundance and complexity, the available data are always incomplete. Observation of the concrete choices made by individuals in economic activity reveals only a limited sample of the valuations that make up underlying subjective value scales. For example, if we observe the demand for a consumer product at the prevailing price, we can draw only certain limited qualitative conclusions about the marginal utility of that product to consumers. In particular, we cannot deduce how much of the product would be traded at a higher or lower price. In the second place, economic data are constantly changing, owing to variation both in personal value scales and in external physical conditions. Thus the supply and demand curves of the past are at best only approximations to those of the present.

In order to function efficiently in such a dynamic, undetermined environment, an individual or firm must have constant access to fresh and varied experience. As a consumer, an individual can discover what goods and services are most beneficial to him (or her) only if he is free to experiment with a wide range of alternatives. In an open, competitive market, he determines how to fulfill his personal wants most efficiently through direct experience. He becomes familiar not only with the particular goods offered in the market but also with the very process of valuing and making decisions. To the extent that the open market is replaced by central planning authorities, his decision-making experience is diminished.
Even if such central planning is controlled by democratic means, his discretionary power becomes largely limited to occasional participation in the electoral process, where his personal vote is diluted by thousands of others. In general, his values and choices therefore tend to be inferior to those he would develop in a decentralized market system.

The producer's need for experiential data is best understood by considering the case of a monopolistic firm. In order to make its production and pricing decisions, such a firm needs information regarding the demand curve for its product. One possible source for this information might be subjective opinions based on customer surveys or on insight into customer psychology. Such assessments, it should be clear, are generally much less reliable than the objective, observable data of concrete marketplace decisions. Alternatively, the firm could test the demand curve at various selling prices over a period of time. Yet the "optimum" prices and production levels established by such a time-consuming trial-and-error process would necessarily reflect past economic conditions only and would therefore be highly unreliable. In order to circumvent this time lag, the firm could experiment with several prices and marketing approaches simultaneously, thus simulating a competitive environment of many sellers. Such a decentralized approach, however, would also be needed in order to make many other decisions: the optimum price to be paid each supplier, the most profitable allocation of internal resources, choices of production methods, and so on. If decentralization were applied to all of these problems, the firm would cease to operate as an integrated entity and would become equivalent to a network of competing firms. Ultimately, a firm's most reliable information source is a competitive market in which a number of buyers and sellers are simultaneously and repeatedly testing supply and demand levels at various prices. [5]
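The time-lag problem just described can be sketched in a toy simulation. The profit function, the upward drift in demand, and the price grid are all hypothetical assumptions of mine, not the article's: a lone firm testing one price per period accumulates observations made under conditions that have since changed, while many sellers testing prices simultaneously all observe the same, current conditions.

```python
def profit(price, best):
    # single-peaked profit around the currently most profitable price
    return -(price - best) ** 2

candidates = [11, 10, 9, 8, 7, 6, 5]   # prices available for testing

# Sequential trial and error: one observation per period,
# while the most profitable price drifts upward between tests.
best = 8.0
observations = []
for p in candidates:
    observations.append((profit(p, best), p))
    best += 0.5                        # demand shifts before the next test
choice_sequential = max(observations)[1]

# Parallel testing: seven sellers try all seven prices in a single period,
# so every observation reflects the same (current) demand conditions.
choice_parallel = max((profit(p, best), p) for p in candidates)[1]

print(choice_sequential, choice_parallel)  # -> 9 11
```

The sequential tester settles on a price that was best under conditions several periods old; the parallel observations pick out the price nearest the current optimum.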
Evidently, then, the market is not just a system for producing goods, but also a system for generating essential information regarding values and costs. From this perspective bidding is a communication system by which buyers and sellers become aware of profitable and economical actions. Similarly, profit and loss are seen here not merely as incentives to productivity, but also as information feedback signaling success or failure in satisfying consumer wants. [6] As an information-producing process, competition is synergic because it fulfills the information needs of all competitors and thereby enables them to operate more profitably. This conclusion can be summed up in a "law of dissociation": by retaining their independence of function, individuals or firms can create a mutually desirable (synergic) increase in the information flow needed for advantageous operations.

The efficiency of the decentralized, competitive market as an information system can be more clearly understood in the light of a comparative analysis of other information-processing systems. In The Computer and the Brain (1958), John von Neumann compared man-made computers with the nervous system and noted several striking differences. Artificial automata, he observed, are composed of fewer and larger components, whereas the human nervous system is a relatively decentralized network of at least 10 billion tiny neurons. [7] Furthermore, the nervous system processes its data in a "highly parallel" manner, in contrast to the "serial" methods of the computer. [8]
Finally, in contrast to the precise algorithmic control systems in machines, "the message-system used in the nervous system...is of an essentially statistical character." This statistical system, von Neumann believed, "leads to a lower level of arithmetical precision but to a higher level of logical reliability," due largely to the greater tendency of serial computations to compound and amplify initial errors [9]. We can see here a likely explanation for the continuing marked inferiority of digital machines to men in tasks such as pattern recognition [10], which involve large bodies of incomplete and often ambiguous data. As one informed observer has concluded, such tasks require not sequential examination or trial-and-error procedures but rather a "global" grasp of information [11].
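Von Neumann's contrast between serial and statistical processing can be sketched numerically. In this minimal sketch the parameters are my own assumptions, not von Neumann's figures, and a deterministic alternating error pattern stands in for statistical noise: a serial chain compounds its per-stage errors, while averaging the same errors across parallel units largely cancels them.

```python
N = 100
true_value = 1.0
# alternating +1% / -1% errors, a deterministic stand-in for random noise
errors = [0.01 if i % 2 == 0 else -0.01 for i in range(N)]

# Serial: each noisy stage feeds its result into the next,
# so the errors compound multiplicatively and the result drifts from the truth.
serial = true_value
for e in errors:
    serial *= 1.0 + e

# Parallel: the same noisy readings are taken independently and averaged,
# so the errors largely cancel one another.
parallel = sum(true_value * (1.0 + e) for e in errors) / N

print(abs(serial - true_value) > abs(parallel - true_value))  # -> True
```

Even with errors that sum to zero, the multiplicative serial chain ends up measurably below the true value, while the parallel average recovers it almost exactly; this is the "higher level of logical reliability" of the statistical arrangement.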
In the decentralized, competitive marketplace, the data of human value scales and external conditions are processed in a parallel, statistical manner through the bidding of buyers and sellers. The combined experience of many competitors, who are continually and simultaneously testing market preferences, generates prices and determines levels of allocation. This system is clearly more akin to the processing network of the nervous system, especially when contrasted with the sequential trial-and-error methods to which central planners and monopolies are restricted. Moreover, only such a decentralized market network is capable of detecting and responding to essential economic patterns.

The urgent economic need for market data is particularly striking in the case of the "vertically integrated" firm. Vertical integration, as defined by economist Murray Rothbard, "occurs when a firm produces not only at one stage of production, but over two or more stages." [12] For example, a vertically integrated widget-manufacturing firm might be engaged in the production not only of widgets for consumer use but also of the machines and tools it requires for widget production. In the open market, the price of any factor of production tends toward its anticipated marginal productivity. [13] The vertically integrated firm, however, must determine the implicit values of its own intermediate products in order to calculate its profit or loss from each phase of operation. As Dr. Rothbard demonstrates, such values can be accurately determined only by referring to external market prices. [14] Thus, as a highly simplified example, suppose that our widget firm must decide whether to produce widgets by manual labor or by building Type I or Type II widget-producing machines in order to maximize its profit.
Before it can rationally invest in one of these three alternatives, the firm must first estimate the relative output it would realize from each type of production, as well as the relative time (a costly factor) required for each process. Clearly, such information could be most easily gleaned from observations of competitors. Let us assume, however, that technical formulas exist by which these parameters can be predicted. The firm must also project the relative costs involved at each stage under the alternative plans. For example, manual widget workers, Type I machine operators, Type II operators, and the workers used in assembling the various machines may all require different wage rates. To a large extent, these pay differentials may reflect psychological preferences that cannot be accurately predetermined except by reference to existing, well-developed labor markets. Thus our widget firm is dependent upon competitive market data, not only in its pricing decisions when negotiating with customers and suppliers, but even in calculating how to allocate its resources internally. In the absence of competitors, the vertically integrated firm tends to be inefficient, unprofitable, and hence economically unstable (if not protected by governmental interference). As Rothbard concludes, "this economic law sets a definite maximum to the relative size of any particular firm on the free market." [15]

The information problems of the vertically integrated monopoly are also faced, on a much vaster scale, by the planners of a state-controlled economy (as under socialism or fascism). Ludwig von Mises was the first to point out the impossibility, in a state-dominated economic system, of solving the central problem of economics: the rational allocation of means to ends. [16] Partly in response to Mises's challenge, socialists developed various mathematical systems and econometric methods that purport to solve the problem of optimal allocation.
Soviet economists in particular have put much faith in computers, cybernetics, and linear programming as planning tools for the socialist society of the future. [17]

Optimizing methods such as linear programming begin by postulating a set of variables (perhaps representing allocation levels of factors) that are under the control of the planner. The technical constraints imposed upon the planner by the scarcity of resources are then described as a series of mathematically expressed "feasibility" conditions. Next, a function is defined representing the "objective," or utility, that the planner hopes to maximize. Optimization theory then seeks to determine values of the variables for which the objective function attains its highest possible value, subject to the feasibility limitations. Formal algorithms are known for solving this problem in the linear case and under certain other very restrictive conditions.

Now let us assume for a moment that the mathematics of optimality is fully amenable to computer solution. Let us suppose further that the socialist planner has at his disposal full knowledge of the technological formulas and the physical and human resources that he needs to determine his feasibility inequalities or equations. Clearly, any economically reasonable formulation of his objective function will have to take into account the personal utilities and disutilities of consumers and laborers. [18] At any particular time consumers will desire certain goods and services more urgently than others. These varying relative utilities must be determined and evaluated in terms of a common unit of measure. But marginal preferences among goods find their only concrete expression in the voluntary marketplace, where they are measured in terms of the common denominator of money.
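The linear-programming formulation just described can be made concrete with a two-good toy problem. All coefficients here are hypothetical, and a brute-force grid search over integer allocations stands in for a formal algorithm such as the simplex method; the point is only the shape of the problem the planner faces.

```python
# Maximize utility = 3x + 2y          (objective-function coefficients:
#                                      exactly the numbers a planner
#                                      without market prices cannot know)
# subject to:  x + y   <= 40          (labor-hours available)
#              2x + y  <= 60          (raw material available)
#              x, y    >= 0

best = (float("-inf"), 0, 0)
for x in range(0, 61):
    for y in range(0, 61):
        if x + y <= 40 and 2 * x + y <= 60:
            best = max(best, (3 * x + 2 * y, x, y))

utility, x, y = best
print(utility, x, y)  # -> 100 20 20
```

The feasibility side of the problem is purely technical; the objective coefficients (3 and 2) are the part of the formulation that, as the text argues, can only be discovered through voluntary market exchange.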
Furthermore, the relative subjective costs or disutilities associated with different types of labor must be measured, again requiring an open market in which workers are free to bid for wages and choose among different kinds of work. Also, the use of land (natural factors) may involve subjective welfare considerations that can be accurately assessed only in the market. [19] Finally, the voluntary marketplace is needed not only to determine the general structure of prices and wages but also to make detailed decisions concerning the distribution of goods and tasks among particular individuals. For example, in order to decide who will be poets and who will be engineers, the planner needs to evaluate individual preferences that can be measured only by market choices. In short, since the central planner has already ruled out reliance upon markets, he cannot know the personal value scales upon which the coefficients of his objective function must be based. [20]

The dilemma facing state planners is thus in essence not mathematical but epistemological. Socialist economists have been misled here by a seeming paradox: certain economic solutions may be technically feasible, in terms of the resources available, and yet may not be feasible in practice, because the information necessary to determine those solutions, or even to gauge success in achieving them, is not available. The determination of personal utilities and costs is a task requiring not the mathematical methods of central computers and planners but rather the flexible, decentralized information-gathering capacity of the voluntary market.

In actual practice, as many observers have noted, socialist planners have been obliged to rely upon decentralized, quasi-market processes in order to establish economic guidelines. [21] In addition, socialist economies benefit from a remote effect of information synergy by observing the prices established in (relatively free) foreign and international markets. [22]
As Rothbard notes:
Of course, the prices established abroad reflect a different set of cultural and geographical conditions and can therefore provide only approximate guidelines for planners. This "remote" synergy is clearly inferior to the synergy of a domestic market. In any case, the inefficiency and impoverishment observed in socialist economies can be attributed in large part to their relatively low levels of information synergy.

The synergic disadvantage suffered by concentrated, quasi-monopolistic corporations in a relatively free market is shown empirically by the research of Marxist historian Gabriel Kolko. In The Triumph of Conservatism, Kolko examines some popular myths about the so-called "progressive era" of American history, which began with the relatively unregulated economy of the late nineteenth century and evolved into a system of corporate statism dominated by government and big business. The relatively free market at the beginning of this era favored the growth of competition and the dispersal of economic power: "the dominant tendency in the American economy at the beginning of this century was toward growing competition." [24] Efforts of some large businessmen to increase profits by means of mergers and cartels "brought neither greater profits nor less competition. Quite the opposite occurred. There was more competition, and profits, if anything, declined." [25] As we have already seen, such attempts to consolidate power lead to losses in information flow, efficiency, and economic stability. Eventually, these business interests were able to obtain monopolistic control only by means of restrictive federal regulation. [26]

While large size may confer certain associative advantages upon a firm, it also imposes a critical cost in the form of decreased information flow.
In the free, unhampered market, firms would therefore tend to approach a certain optimum size, which would be determined jointly by the law of association and the "dissociative" principle of information synergy. Beyond this optimum point, the marginal losses in efficiency due to reduced information input could be expected to outweigh the associative benefits of further growth. In short, information synergy and synergy of means would operate as counterbalancing forces, which together would maintain the equilibrium and efficiency of the market system. Thus the mechanism of information synergy would not only provide the continuing flow of experiential data needed in a dynamic economy in which human valuations and external conditions are constantly changing: it would also play a major role in insuring the long-term stability and continued effective functioning of a free, synergy-based society.
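The counterbalance described in this closing argument can be expressed as a toy optimization. The functional forms below are my illustrative assumptions, not the author's: associative gains grow with firm size at a diminishing rate, information losses grow at an increasing rate, and the net benefit therefore peaks at an interior optimum size.

```python
import math

def net_benefit(size):
    associative = 10.0 * math.sqrt(size)   # synergy of means: diminishing returns
    information_loss = 0.05 * size ** 2    # information synergy forgone: accelerating
    return associative - information_loss

# the optimum size is where the marginal associative gain
# is just offset by the marginal loss of information flow
optimum = max(range(1, 101), key=net_benefit)
print(optimum)  # -> 14
```

Any concave gain curve paired with a convex loss curve yields the same qualitative result: growth is profitable up to a point, and beyond it further consolidation destroys more value than it creates.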
2. Abraham H. Maslow, Motivation and Personality, 2nd ed. (New York: Harper & Row, 1970), 179, 277.
3. For detailed analysis see Murray Rothbard, Man, Economy, and State (Los Angeles: Nash Publishing, 1970), ch. 2-7.
4. Note that this argument implicitly attributes omniscience to the monopolist with regard to the shape of the demand curve. It will be argued below that such a monopoly ordinarily could not acquire the essential information needed to preserve its position in an open market, much less exploit its demand curve in the supposed fashion. Cf. Rothbard, vol. 2, 566 ff.
5. Because it lacks this essential source of data, a monopoly or quasi-monopoly ordinarily cannot satisfy market demand in the most economically efficient manner. In the unhampered market this inefficiency creates exceptionally profitable and attractive opportunities for potential new competitors.
6. To the theorist who assumes a framework of known values and costs, entrepreneurial profit and loss seem to serve no useful function. In an information-generating model of the market, however, their function becomes obvious. Profits and losses in the free market are determined largely by a firm's ability to penetrate the unknown, i.e., to discover less costly production processes and to anticipate market values correctly.
7. John von Neumann, The Computer and the Brain (New Haven: Yale University Press, 1958), 48.
8. Von Neumann, 50-51.
9. Von Neumann, 78-80.
10. See Hubert L. Dreyfus, What Computers Can't Do: A Critique of Artificial Reason (New York: Harper & Row, 1972).
11. Dreyfus, 12-18, 24-32.
12. Rothbard, 545.
13. On the relationship between productivity and cost, see Rothbard, 301-8.
14. Rothbard, 544 ff.
15. Rothbard, 548.
16. See, for example, Mises, Human Action, 3rd ed. (Chicago: Henry Regnery, 1963), ch. XXVI. Cf. Rothbard, 548 ff. As Rothbard observes, rational calculation is seriously impeded by interventionist measures even in the less socialized Western economies. See Rothbard, Power and Market (Menlo Park: Institute for Humane Studies, 1970), 137-38.
17. See Michael Ellman, Soviet Planning Today (Cambridge: Cambridge University Press, 1971).
18. We assume here, of course, that the planner is motivated by a broad-minded desire to maximize the interests of all persons affected by his decisions. Now human action provides no basis for comparing or summing the personal utilities of different individuals. (See Rothbard, Power and Market, 111.) Consequently, social utility should be conceived not as a single quantity but rather as a vector or array of individual utilities. Such an array can then be regarded as optimal if any other feasible array (such as might be attained by redistributing goods and services or tasks) would result in a relative loss of utility for at least one individual. (Some economists regard this "Pareto condition" as only a minimum requirement for optimality.) The free interaction of individual utilities in the unhampered market tends constantly toward such an optimum. In a centrally controlled economy, on the other hand, the absence of information synergy prevents the discovery of optimal solutions, even in the minimal "Pareto" sense.
19. See Edwin G. Dolan, TANSTAAFL: The Economic Strategy for Environmental Crisis (New York: Holt, Rinehart and Winston, 1969).
20. In the competitive market each firm can estimate its objective function (or expected profit) in monetary terms from prevailing prices and wages. Moreover, if the firm is small in comparison with the market, the relevant costs and returns are apt to be roughly linear. Thus, ironically, the linear-programming methods in which Soviet economists have specialized are most applicable to the competitive capitalist firm.
21. Even socialist theory has made concessions to market procedures. Thus Polish socialist Oskar Lange, responding to Mises's argument, proposed that planners could allocate factors by using prices aimed "to equilibrate demand with supply." Thus they would vary prices through "a series of successive trials," continuing "until equilibrium is finally reached." The determination of prices (and hence wages and incomes) by means of supply and demand represents a significant departure from pure socialism ("from each according to his ability, to each according to his need"). Final equilibrium, however, is neither achievable nor desirable in the real world of constantly varying value scales, technology, and external conditions. An ongoing, decentralized process is required for rapid adjustment to such a dynamic environment. See George R. Feiwel, The Soviet Quest for Economic Efficiency (New York: Frederick A. Praeger, 1967), 14-15. Cf. Rothbard, Man, Economy, and State, 274-80.
22. See Mises, Planned Chaos (Irvington-on-Hudson, NY: Foundation for Economic Education, 1970), 84.
23. Rothbard, Power and Market, 138.
24. Gabriel Kolko, The Triumph of Conservatism (New York: Free Press of Glencoe, 1963), 4. Cf. 26-54.
25. Kolko, 4-28.
26. "Big business led the struggle for the federal regulation of the economy." (Kolko, 58-59.) They were successful in this effort largely because the political climate was favorable to statism; as Kolko comments, "the history of the relationship between business and government until 1900 was one that could only inspire confidence in the minds of too many businessmen." (Kolko, 59.)