The computers of the future, seen from 1964

(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: lawrence@krubner.com

This is from 1964, though parts of it seem to be talking about the modern era:

Today there are probably more than twenty thousand computers in use within the United States, and correspondingly large numbers are installed in many other countries around the world. Computers run at speeds of up to millions of operations per second, and do so with negligible rates of error. Their linguistic abilities have been broadened impressively through development of elaborate programming systems, and their memories can be virtually unlimited in size over a range of times of recall.

By achieving reliability along with capability, computers have won broad commercial acceptance. But what of the future? What can we expect as computers enter their third decade? Some conservatives have been predicting a deceleration of computer growth for at least five years now. Is there a plateau just over the horizon?

…But a complication arises when we try to distribute small chunks of computation widely on a regular basis. The electric utility finds it easy to accommodate numerous customers consuming as little as 1 kilowatt-hour or 1 watt-hour at a time. It does not even have to charge a premium for the privilege of using small chunks if the total monthly consumption of a customer is large enough.

Not so for computation, as indicated by present experiments with computer systems that share their time among a number of concurrent demands. These experiments, while demonstrating the feasibility of making a conventional computer accessible to many small remote users simultaneously, also demonstrate the sizable hidden cost of such service. Overhead in supervising user programs, as well as in shuffling them around memory, can increase actual costs to several times the figure implied by a naive analysis based on more conventional computer techniques. But today’s computers were not built to be time-shared. With a new generation of computers, overhead of the kind mentioned may shrink to relative insignificance.
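
To make the overhead claim concrete, here is a back-of-the-envelope model in Python. All of the numbers (slice length, switching and swapping cost) are my own illustrative assumptions, not figures from the article; the point is only that when the per-switch overhead is comparable to the useful work done per switch, the actual cost becomes a multiple of the naive figure.

def cost_multiplier(quantum_ms, overhead_ms):
    # Ratio of total machine time consumed to useful computation delivered
    # per scheduling slice.
    return (quantum_ms + overhead_ms) / quantum_ms

# A naive analysis counts only the useful slice; a 1960s time-sharing system
# also pays to save state and shuffle programs around memory on every switch.
for quantum, overhead in [(100, 10), (50, 50), (20, 60)]:
    m = cost_multiplier(quantum, overhead)
    print(f"slice {quantum} ms, overhead {overhead} ms -> actual cost x{m:.1f}")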

…The concept of an information-processing utility poses many questions. Will the role of information utilities be sufficiently extensive and cohesive to create a whole new industry? If so, will this industry consist of a single integrated utility, like American Telephone and Telegraph, or will there be numerous individual utilities, like Consolidated Edison and the Boston Gas Company? Will the design and manufacture of computing components, terminal equipment, and programming systems be accomplished by subsidiaries of the information utility, as in the telephone industry, or will there be a separate industry of independent private manufacturers, like General Electric and Westinghouse in today’s electrical equipment industry?

…Perhaps the most important question of all concerns the legal matter of government regulation. Will the information utility be a public utility, or will it be privately owned and operated? Will some large companies have their own information utilities, just as some companies today have their own generating plants?

This bit foresees the conflict between regulation and market forces regarding the rise of the big cloud providers such as Amazon:

Given the advanced state of development of present communications lines, it is unlikely that information utilities will wish to invest in their own communication networks. This may be taken as an argument against the necessity for stifling free competition and placing information utilities under public regulation; yet, there is another massive investment that the information utilities will not be able to sidestep as easily, if at all — namely, investment in the large programming systems required to supervise the operation of the information utility and provide its services. The information utility should be able to shift part of this burden to the shoulders of its customers, but it will have to bear responsibility itself for the design, maintenance, and modification of the core of the programming system. The vast potential magnitude of this system, plus the fact that its usefulness may not extend beyond the physical machinery for which it was constructed, plus the possibility of programming waste from having too many entries in the field, may tip the balance in favor of a regulated monopoly.

Interesting to note that the unusual sense of “user” (now widely accepted in the tech industry) was already current in 1964:

The organizational impact of the information utility will extend well beyond the one or two industries directly concerned. User industries, such as banking and retailing, may also be greatly affected.

This was written when credit cards were still rare in the USA (and at a time when debit cards were wholly unknown):

Suppose, for example, that businesses of all sizes have simple terminals linking them electronically to a central information exchange. Then each business can make instantaneous credit checks and offer its customers the convenience of universal credit cards. These cards, referred to by some as “money keys,” together with the simple terminals and information exchange, can all but eliminate the need for currency, checks, cash registers, sales slips, and making change. When the card is inserted in the terminal and the amount of the purchase keyed in, a record of the transaction is produced centrally and the customer’s balance is updated. A signal is transmitted to the terminal from the central exchange if the customer’s balance is not adequate for the sale. Positive credits to the customer’s account, such as payroll payments, benefits, dividends, and gifts are entered in a similar way. Periodic account statements are figured automatically and delivered to customers, perhaps directly to a private terminal for some, or by postal service for others.
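
What this passage describes, almost step for step, is a modern online debit authorization. Here is a minimal sketch of that flow in Python; the class, the card numbers, and the amounts are all hypothetical, invented only to mirror the steps in the quote.

class CentralExchange:
    # The "central information exchange": keeps every balance and produces
    # a central record of each transaction.
    def __init__(self):
        self.balances = {}   # card number -> account balance
        self.ledger = []     # central transaction records

    def credit(self, card, amount):
        # Positive entries: payroll payments, benefits, dividends, gifts.
        self.balances[card] = self.balances.get(card, 0.0) + amount
        self.ledger.append((card, +amount))

    def purchase(self, card, amount):
        # The terminal keys in the amount; the exchange signals back whether
        # the customer's balance is adequate for the sale.
        if self.balances.get(card, 0.0) < amount:
            return False     # the "balance not adequate" signal
        self.balances[card] -= amount
        self.ledger.append((card, -amount))
        return True

exchange = CentralExchange()
exchange.credit("money-key-001", 250.00)            # payroll payment
print(exchange.purchase("money-key-001", 19.95))    # True: sale approved
print(exchange.purchase("money-key-001", 500.00))   # False: declined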

And, just think: marketing could be customized to each person, in industries such as insurance:

Personalized insurance would have considerable marketing appeal, and offers several subtle advantages.

I am terrified and sad to consider how long people have been developing a line of thought that, to me, points toward agent-based simulations, and yet we still are not there:

Among the chief potential users of custom information are persons engaged in simulation studies and dynamic modeling. Simulation is about the most promising approach known for the general analysis of complex systems and stochastic processes. On the operating level, it affords the user a way of asking the question, what if. The use of simulation by staff specialists, systems analysts, decision makers, social scientists, and others will markedly expand as the information utility makes powerful computers and programming systems easily accessible.

Most users of simulation will not have the knowledge or desire to build their own models, especially as simulation starts being applied by line managers and operating personnel. Assistance in the formulation, adjustment, and validation of models will be provided by an on-line simulation center, joined by the information utility to both the users and the relevant information sources. Simulation service, like information, will be obtained by a procedure as simple as dialing a telephone number.
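
For readers who have never run one, here is the kind of “what if” study the passage means, as a toy Monte Carlo simulation in Python. The scenario (customers queueing at a single service point) and every parameter are my own illustrative assumptions.

import random

def average_wait(arrival_rate, service_time, n_customers=10_000, seed=1):
    # Simulate n_customers arriving at random (exponential inter-arrival
    # times) and being served one at a time; return the mean wait.
    rng = random.Random(seed)
    clock, free_at, total_wait = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)   # next customer arrives
        start = max(clock, free_at)              # wait if the server is busy
        total_wait += start - clock
        free_at = start + service_time
    return total_wait / n_customers

for rate in (0.5, 0.8, 0.95):   # what if demand grows?
    print(f"arrival rate {rate}/min -> average wait {average_wait(rate, 1.0):.2f} min")

Pushing the arrival rate toward the service capacity makes the average wait climb steeply; that is exactly the kind of question the article imagines a line manager putting to an on-line simulation center.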

Here is that rare case where someone makes a prediction about life 36 years in the future and gets it exactly right:

Barring unforeseen obstacles, an on-line interactive computer service, provided commercially by an information utility, may be as commonplace by 2000 AD as telephone service is today. By 2000 AD man should have a much better comprehension of himself and his system, not because he will be innately any smarter than he is today, but because he will have learned to use imaginatively the most powerful amplifier of intelligence yet devised.

Source