In his stories and essays, Hillsdale College President George Roche celebrates the old frontier of the American West, with its values of freedom, family, faith, and courage. But he also writes of the new frontier in high technology which is leading the worldwide revolt against centralization and tyranny.

What is the connection between the old and new frontiers, between the coming million-fold advance in the efficiency of computers and the grinding struggle for food and warmth on the desolate reaches of the Oregon Trail?

It is those selfsame values of freedom, family, faith and courage. Today, America is the world leader in technological innovation not because of its government or its Pentagon funding or its accumulated wealth or its natural resources or its continental scope—or any of the blessings to which U.S. success is often ascribed in textbooks—but because of an older ethic of freedom and sacrifice. In short, America’s triumphs spring from the very moral codes and disciplines, the religious commitment and faith, which George Roche champions on his own new frontier at Hillsdale College.

Why Secularism Is Not Enough

Too many people, scientists among them, are prone to believe that technology springs from a culture of secular rationalism in which each generation of children invents anew the codes and disciplines of civilization. But left on their own, children could not even figure out how to tie their shoelaces, let alone temper their appetites or suppress their immediate pleasure in order to fulfill long-term future needs. And we have learned that the more we banish religion and morality from the classroom, the more ill-equipped and credulous our children become. If this century teaches any clear lesson, it is Chesterton’s: When people stop believing in God, they do not believe in nothing; they believe in anything. After decades of remorseless secular schooling, recent polls show that 55 percent of all Americans now believe in astrology, up from some 30 percent twenty years ago.

What produces technological progress is certainly not astrology, nor is it “modern values.” And as Hillsdale recognizes, progress does not spring from the corrosive creeds of modern materialism; it comes instead from traditional spiritual values and from dogged work and discipline devoted to the fruits of the distant future.

There is no doubt that teaching mathematics and science is crucial to technological progress. But teaching these subjects is far from sufficient to evoke technological advances. Outside of costly hothouse projects for the military, the USSR is a technological wasteland, despite the fact that its schools are far better than ours at teaching mathematics and science to a far greater proportion of the population. Ronald Reagan made the key point in an eloquent speech to the students at Moscow State University: “Even as we explore the most advanced reaches of science, we’re returning to the age-old wisdom of our culture. As Genesis affirms, in the beginning was the spirit, and it was from this spirit that the material abundance of creation issued forth.”

Mikhail Gorbachev recently echoed Reagan: “The Soviet Union is suffering from a spiritual decline. We were among the last to understand that in the age of information technologies the most valuable asset is knowledge, which springs from individual imagination and creativity. We will pay for our mistake for many years to come.”

The Myth of America’s Decline

Contrary to the trade statistics and misleading productivity data used by many economists, the U.S. has just undergone a phenomenal upsurge of innovation and growth. Even the economic data show that during the 1980s the U.S. increased its share of global exports, manufacturing output, and GNP. In order to comprehend such success, however, it is necessary to understand the technological and business dynamics of the last two decades.

Today, the “experts” in the academy and the media tell a story of decline and decay that makes recent history incomprehensible. Is it really likely that the capitalist triumph of recent years was achieved through a collapse of growth and innovation? Unlike most of these experts, however, Gorbachev got the message.

The chief development was the microchip, the computer etched on a tiny sliver of silicon the size of a fingernail. Beginning with the computer industry, the impact of the chip reverberated across the entire breadth of the U.S. economy and galvanized the electronics industry into a force with revenues today greater than all U.S. automobile, steel, and chemical manufacturers combined. Quite simply, the microchip—and the personal computer industry it inspired—is the central driving force of global economic growth.

In 1980, the U.S. dominated the computer industry, controlling more than 80 percent of the world market. Most of these revenues were produced by fewer than ten companies: the IBM Corporation plus “the BUNCH,” as it was called—Burroughs, Univac, NCR, Control Data, and Honeywell. However, all of these firms, including IBM, lost ground during the ensuing decade, even though the computer industry grew fivefold in size and its cost effectiveness improved some ten thousand-fold.

This is an amazing and important story, and it bears profound lessons. Imagine for a moment that someone told you back in 1980 that even though the computer industry was about to go through a period of extraordinary growth, all of the U.S. firms then dominant in the industry would suffer drastic losses of market share during the decade and some would virtually leave the business. What would you have predicted for 1990?

Would you imagine that U.S. companies would still command nearly 70 percent of world computer revenue? Despite lavish government programs around the world designed to overtake the U.S. in computing—the major target of every industrial policy—the U.S. held its own in market share and more than tripled its lead in real revenues and profits.

An Entrepreneurial Explosion

It was an industrial miracle. Before we try to copy the strategies of countries that failed, we should try to understand the meaning of America’s surprising success.

What happened was an entrepreneurial explosion: the completely unexpected emergence of some fourteen thousand new software firms. These companies were the catalyst. The U.S. also generated thousands of computer hardware and microchip manufacturers, and they also contributed heavily to the miracle of the 1980s. But the efflorescence of software was decisive. Giving dominance to the U.S. were thousands of young people turning to the personal computer with all the energy and ingenuity that previous generations invested in their Model T automobiles.

A high school hacker and Harvard dropout, Bill Gates of Microsoft, wrote the BASIC language for the PC and emerged ten years later as the richest man in America. Scores of others followed in his wake, with major software packages and substantial fortunes, which—like Gates’—were nearly all reinvested in their businesses.

During the 1980s, the number of software engineers increased about 28 percent a year, year after year. The new software firms converted the computer from the cult tool of a priesthood of data processing professionals—hovering over huge air-conditioned “mainframes”—into a highly portable, relatively inexpensive appliance that anyone could learn to use.

This entrepreneurial explosion was a total surprise. It is safe to say that of all the hundreds of reports on technological competitiveness released in the U.S., Europe and Japan during the 1970s, not one pointed to software hackers as the critical element—not one suggested that getting Bill Gates to drop out of Harvard would be crucial to the success of American computer technology during the 1980s.

But the transformation of the computer into a commodity appliance was largely achieved in this unexpected and entirely unplanned way. In both hardware and software, the transformation was made possible by thousands of new companies that were as flexible and innovative as the new desktop computers and software programs they were producing. Although the “experts” claim that the industry is in a slump, entrepreneurial leaders such as Sun, Compaq, Conner, Microsoft, and Intel remain among the fastest growing firms in the U.S. economy today. The eighties generation of some 120 microchip firms is the fastest growing in history. And the software hackers keep on innovating.

In contrast to the American approach to the computer industry over the last decade, the Europeans have launched a series of unsuccessful national industrial policies, led by national “champion“ firms, imitating a spurious vision of IBM. Their only modest successes have come from buying up American firms in trouble.

Following similar policies, the Japanese have performed scarcely better. For all their splendid technological achievements, the one field in which Japan did not triumph in the 1980s was computers. Rather than imitate the American model, which allows thousands of computer companies to flourish in a competitive environment, the Japanese, like the Europeans, adopted the old centralized IBM model. They gambled that big companies, big capital and big mainframe systems with dumb (i.e., passive) terminals attached would be the wave of the future. They gained virtually no market share until the late 1980s when they began producing laptop computers. By early 1990, they had won only four percent of the American market.

Meanwhile, American entrepreneurs have launched a whole series of new computer industries: supercomputers, graphics computers, supermini computers, mini-supercomputers, desktop workstations, multimedia systems, network computers, cellular computers, file server computers, notebook computers, transaction processors, script entry computers—all accompanied by new software. The latest U.S. innovation is an array of special purpose supercomputers. As much as a hundred-fold cheaper than ordinary supercomputers, these new devices handily outperform them at special functions such as reading Pap smear tests, doing three-dimensional graphics, solving Navier-Stokes equations (for complex fluid flows), recognizing speech, compressing and rendering video images, and many other uses. Thinking that the “game” was general purpose supercomputers, the Japanese have about caught up in that field, but still find themselves in the wake of American entrepreneurs who constantly change the rules.

Free market enterprise will always beat industrial policy. But today the U.S. government is doggedly trying to kill its entrepreneurial culture with deadly capital gains taxes—retaxing profits already taxed at high rates at the corporate level—and it has launched a national campaign against the so-called “junk bonds” or high-yield securities that over the last four years have financed some 80 percent of computer industry expansion. The result is to constrict capital access and hike the cost of capital for American companies, which are forced to compete with foreign firms that face virtually no capital gains taxes. The problem doesn’t stop with taxes: American companies also confront a growing body of contradictory and unsound environmental and bureaucratic regulations. And liability laws, which habitually favor plaintiffs and drain billions of dollars from those with “deep pockets,” gravely threaten American enterprise. Even though U.S. computer and software industries remain the world leaders, these problems have to be confronted.

American Colleges as Seedbeds of Innovation

Despite the fact that most of the rationales for over-taxation, regulation and litigation arise on campus, the American system of higher education is an important source of competitiveness. By contrast with U.S. high schools, U.S. colleges are competitive because each must face thousands of rivals, unlike most foreign universities, which enjoy local monopolies. As a result, our institutions attract would-be entrepreneurs from around the world. The lesson of the American system of higher education is that what breeds competitiveness is competition.

Under the stress of this competition among institutions, U.S. colleges and universities created an entirely new culture of technology within the last ten years. Beyond the 28 percent annual increase in software engineers, the number of trained computer scientists rose 46 percent a year throughout the 1980s—that is, every year, 46 percent more than the year before. In 1980 there were 11,000 computer scientists in the U.S.; in 1986, there were 41,800; and in 1990 the figure exceeded 100,000. Not only did American colleges and universities expand existing computer science programs, but they also launched thousands of new programs. To respond to the radically new technologies and computer languages spawned by earlier graduates, these universities often had to invent entirely new courses. Hundreds of colleges provided adult education in a variety of high technology fields.

Perhaps the key figure in the high technology revolution was a professor at Caltech named Carver Mead. He foresaw as early as the 1960s that he and his students would eventually be able to build computer chips far denser and more complex than experts believed possible, or than anyone at the time could design by hand. Therefore he set out to create programs to computerize chip design.

By the end of the 1980s, largely as a result of the work of Mead and his students, any trained person with a workstation computer costing some $20,000 could not only design a major new chip but could also manufacture prototypes on his desktop.
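
To see what “computerize chip design” means in practice, consider a toy sketch. The netlist format and placement rule below are invented purely for illustration; they stand in for the vastly more sophisticated tools that grew out of Mead’s work, but they show the essential move: the circuit is described symbolically, and a program, not a draftsman, turns the description into a layout.

    # Toy illustration of programmatic chip design. The netlist format
    # and placement rule are invented for illustration only.
    netlist = {
        "nand1": ("a", "b", "n1"),     # gate name: (input, input, output)
        "nand2": ("n1", "n1", "out"),  # inputs tied together: an inverter
    }

    GRID_WIDTH = 4  # cells per row on our imaginary die

    # A "silicon compiler" in miniature: assign each gate a grid slot,
    # a job that once had to be done by hand for every device.
    layout = {name: (i % GRID_WIDTH, i // GRID_WIDTH)
              for i, name in enumerate(netlist)}

    print(layout)  # {'nand1': (0, 0), 'nand2': (1, 0)}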

Just as digital desktop publishing programs led to the creation of some ten thousand new publishing companies, so desktop publishing of chip designs and prototypes unleashed tremendous entrepreneurial creativity in the microchip business. During the decade of the 1980s the number of new chip designs produced in the United States rose from just under ten thousand a year to well over one hundred thousand. And it all began with one obscure college professor teaching, year after year, a few students who went on to found scores of small companies and thus share this new microchip breakthrough with the world.

The 1990s: A New High Tech Revolution Ahead

In the 1990s, we are about to see a dramatic acceleration of the progress first sown in American universities and colleges by the likes of Carver Mead. Right now, it is possible to put twenty million transistors on a single sliver of silicon the size of your thumbnail; by the year 2000, it will be a billion. To understand what a billion transistors means, think of the central processing units of twenty Cray 2 supercomputers, the most powerful computers on the market today, each costing some $20 million.

Just after the turn of the century, American (and Japanese) computer companies will be able to put the computer power of twenty Cray 2 supercomputers on a single chip and manufacture it for under $100. Of course, in an industry of revolutionary surprises, the chip won’t necessarily take the form of twenty general purpose supercomputers. But the dimensions of progress are summed up by that measure. It signifies that, in the next decade or so, we’re going to see about a million-fold rise in the cost effectiveness of computing hardware.
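
The arithmetic behind that estimate is worth making explicit. A rough sketch, using only the figures quoted above and no other market data, lands within the same order of magnitude as the essay’s round number:

    # Rough arithmetic behind the "million-fold" estimate, using only
    # the figures cited in the text above.
    crays_per_chip = 20            # Cray 2-class processors on one chip
    cost_per_cray = 20_000_000     # dollars per Cray 2, per the text
    chip_cost = 100                # projected cost of the single chip

    old_cost = crays_per_chip * cost_per_cray   # $400,000,000
    gain = old_cost / chip_cost                 # 4,000,000
    print(f"about {gain:,.0f}-fold")            # same order as a million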

This impending million-fold improvement in computer efficiency is the most important fact in the world economy today. Gorbachev apparently senses it, and so do his generals. But do the “gloom and doom” experts have the slightest conception that this is going on? Do they have any idea of the explosive impact such progress will bring? To get an idea of the likely effect, we should examine the impact of a much smaller but still huge advance achieved during the 1980s.

In 1977, virtually all computer power was commanded by large mainframe computers, mostly from IBM, with dumb terminals attached. Ten years later, by 1987, less than one percent of the world’s computer power was commanded by such large computer systems. In 1987, there were some 80 million personal computers in the world. By 1990, there were an estimated 50 million personal computers in the United States alone; by comparison, the U.S. has more than three times as much computer power per capita as Japan.

The 1990s counterpart of the mainframe computer—similarly vulnerable to the onrush of more powerful tools—is the television industry. Just as there were a few thousand mainframe computers linked to dumb terminals, there are today just over 1,400 television stations and a handful of networks supplying millions of dumb terminals known as television sets, or “idiot boxes.”

The experts will tell you that the Japanese made the right decision when, ten years ago, they launched a multibillion dollar program to develop “high definition” television. In fact, it is widely assumed that HDTV will dominate electronics by the end of the next decade. The advice of the experts, therefore, is “catch up and copy,” summoning a massive government effort to create our own high definition television sets.

HDTV does represent a significant advance; the new sets have a resolution five times higher than current models. The television industry may well improve its technology in other respects. But all these gains will be dwarfed by the coming technology of the telecomputer: the personal computer upgraded with supercomputer powers for the processing of full-motion video.

As we’ve seen, the computer industry will improve its cost effectiveness about a million times during the next decade. Unlike HDTV, which is mostly an analog system specialized for the single purpose of TV broadcast and display, the telecomputer is a fully digital technology. It creates, processes, stores and transmits information in the non-degradable form of numbers, expressed in bits and bytes. This means the telecomputer will benefit from the same learning curve of steadily increasing powers as the microchip with its billion-transistor potential and the office computer with its ever-proliferating software. With tens of thousands of hardware and software firms in the U.S., the computer industry is an entrepreneurial force vastly more innovative and vigorous than the television broadcast and manufacturing industries. And the television set, even with the benefit of high definition cosmetics, is still just a passive receiver.
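
That non-degradability deserves a concrete illustration, since it is the root of the digital advantage. The toy sketch below uses hypothetical signal values of my own choosing; it contrasts an analog copy, which accumulates a little noise at every stage, with a digital copy, which re-decides each bit at every stage and so arrives intact.

    import random

    # Toy contrast between analog and digital copying. The values are
    # hypothetical; the point is structural, not quantitative.
    signal = 0.75                      # an analog level, e.g. a voltage
    bits = [0, 1, 1, 0, 1, 0, 0, 1]    # the same information as digits

    for stage in range(100):
        # Each analog hop adds irreversible noise to the level itself.
        signal += random.gauss(0, 0.01)
        # Each digital hop suffers the same noise, but the receiver
        # re-decides every bit against a threshold and regenerates it.
        bits = [1 if b + random.gauss(0, 0.01) > 0.5 else 0 for b in bits]

    print(signal)   # has drifted away from 0.75
    print(bits)     # still exactly [0, 1, 1, 0, 1, 0, 0, 1]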

The digital computer or telecomputer can receive full-motion video just as well as any television set can. Indeed, the computer can dispense with most of the complex conversion processes of analog HDTV and accept perfect digital signals from fiber optic telephone lines. The computer, however, is not only a receiver; it is also a processor of video images, capable of windowing, zooming, storing, editing, and replaying. Furthermore, the computer can originate and transmit video images just as high in quality as, and much cheaper than, anything the current television and film industries can provide.

This is a huge difference. It goes beyond the possibility of receiving perhaps a hundred one-way TV channels to having access to as many channels as there are computers attached to the network: millions of potential two-way channels around the world. With every desktop a possible broadcasting station, thousands of U.S. firms are already pursuing the potential market of video systems as universal and simple to use as the telephone is today. Imagine a world in which you can have access to any theater, church, business, college classroom, or library anywhere. The freedom this technological innovation will extend to the individual is beyond compare.

Gorbachev and His Generals Get the Message

Most Americans have never heard of Carver Mead, the leading protagonist of the high tech revolution, but once again, Gorbachev got the message. Many people imagine that the breakdown of the Soviet system was the result of a popular uprising, or demand for consumer goods, or a hunger for democracy. But the views of the people have never mattered under communism. What matters are the views of the generals.

Whatever reforms Gorbachev might have wanted to carry forth, he could not have moved forward without the support of the generals. Gorbachev’s generals were not moved by any lust for freedom or desire for more McDonald’s in downtown Moscow. They were moved by the increasing impact of microchip and other computer technology on the future of warfare and the ability of the state to control the individual.

At a recent Moscow conference, a leading American libertarian declared that he opposed the military industrial complex in the U.S. as much as in the Soviet Union. It is a statement that goes unchallenged in most American centers of learning. But in Moscow, his Soviet commentator and others protested. They declared that without the U.S. military industrial complex, he could not even be celebrating liberty in Moscow, and no one would be discussing free markets in the Kremlin.

When President Reagan introduced the Strategic Defense Initiative, it was widely denounced and caricatured as an impossible dream of a perfect shield in the sky. But neither Reagan nor any of his advisors ever imagined that it could be perfect. They did know that it would be good enough to send a message to Gorbachev and his generals.

Nonetheless, the experts in the academy, the media, and even in the defense establishment said it was impossible to program the needed equipment. For one thing, they argued, in order to master the command, control and communications of such a system, you would need a supercomputer a hundred times more powerful than the most powerful supercomputers of the day.

Now we know that the experts once again underestimated American ingenuity. The new supercomputers will indeed be a hundred times as powerful as existing computers. But it will be possible to put one of them in each of the many thousands of interceptors in the missile defense array. It will be possible to create a completely decentralized kind of strategic defense in which the previous perplexities of command, control and communications can be entirely overcome. Such weapons will be crucial in defense against accidental attacks or Third World madmen like Saddam Hussein. For the first time it will also be possible to create defensive equipment with pattern recognition powers usable against terrorists as well as military aggressors.

The message received by Gorbachev and his generals was not merely the potential capability of SDI. It was the recognition that a totalitarian government, no matter how many scientists, engineers, and mathematicians it produces, and no matter how many five-year plans it devises, cannot keep pace with the fruits of freedom in an open society.

An Image of Sand and Glass

If Gorbachev and his generals could get the message, perhaps we can now dare to pass it on to American college campuses and faculties, which, unlike Moscow, are still benighted by the shades of Karl Marx. Let us send them instead an image of sand and glass.

The sand comes in the form of a silicon microchip inscribed with a logical pattern as complex as a street map of the United States, switching its traffic flawlessly in trillionths of a second.

The glass comes in the form of fiber optic threads as thin as human hair and as long as Long Island, fed by laser diodes as small as a grain of salt and brighter than the sun. Each system, in place today for AT&T between Chicago and the East Coast, can send the equivalent of a thousand Bibles a second across the land and indeed could transmit the entire contents of the Library of Congress down one fiber in eight hours. Using copper technology, by contrast, it would take five hundred years.
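
Those figures are mutually consistent, as a back-of-envelope check shows. The numbers in the sketch below are my own assumptions, not the text’s: a Bible of roughly 4.4 million one-byte characters, and “copper technology” taken to mean a 64 kilobit-per-second voice-grade line.

    # Back-of-envelope check on the fiber figures above.
    bible_bits = 4_400_000 * 8                # ~35 million bits per Bible
    fiber_rate = 1000 * bible_bits            # 1,000 Bibles/s ~ 35 Gbit/s

    eight_hours = 8 * 3600                    # seconds
    library_bits = fiber_rate * eight_hours   # ~1e15 bits in eight hours

    copper_rate = 64_000                      # bits per second
    copper_years = library_bits / copper_rate / (3600 * 24 * 365)
    print(round(copper_years))                # ~500 years, as the text says

On these assumptions the implied fiber capacity is about 35 gigabits per second, which makes the eight-hour Library of Congress figure an order-of-magnitude claim rather than an exact one.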

This image is not some far-off dream. It is here now and will be gathering momentum for the next decade. Together the two technologies made from ordinary sand—fiber optics and silicon chips—will form a global network of computers and cables, a world-wide web of glass and light that leaves all previous history in its wake.

Consisting of technologies that defy the normal constraints of time and space, this vision also transcends the materialist superstitions that have governed most of human history. For thousands of years, the route to power was the control of land and armies. Today, when you can put “worlds on a grain of sand,” control of specific territories declines in importance. What matters is not the control of lands but the liberation of minds.

The American frontier, celebrated by George Roche and Hillsdale College, is a frontier of mind and spirit, and it epitomizes the disciplines and values that have brought about the technological revolution, which itself opens new frontiers of mind and spirit. But it is not any peculiarly American character trait that has made our nation the world leader on this technological frontier. It is American freedom beckoning to the world with a vision, not of service to the state, but of service to others—it is the values of freedom, family, faith and courage defended by schools like Hillsdale College, and which are embodied in our Constitution and our Judeo-Christian heritage.

As we move through the 1990s, with its promise of million-fold gains in computing, it is crucial that our schools transmit the values of American freedom and faith to future generations. As President Reagan told the Moscow students: “In the beginning was the spirit, and it was from this spirit that the material abundance of creation issued forth.”