A Comparative Analysis of the History of Computer Science and Computer Engineering in the USA and Ukraine

HOWARD H. AIKEN AND THE COMPUTER

Howard Aiken's contributions to the development of the computer - notably the Harvard Mark I (IBM ASCC) machine and its successor, the Mark II - are often excluded from the mainstream history of computers on two technicalities. The first is that the Mark I and Mark II were electromechanical rather than electronic; the second is that Aiken was never convinced that computer programs should be treated as data in what has come to be known as the stored-program, or von Neumann, concept.

It is not proposed to discuss here the origins and significance of the stored program. Nor do I wish to deal with the related problem of whether the machines before the stored program were or were not “computers”. This subject is complicated by the confusion in the actual names given to machines. For example, the ENIAC, which did not incorporate a stored program, was officially named a computer: Electronic Numerical Integrator And Computer. But the first stored-program machine to be put into regular operation was Maurice Wilkes's EDSAC: Electronic Delay Storage Automatic Calculator. It seems rather senseless to deny many truly significant innovations (by H. H. Aiken and by Eckert and Mauchly), which played an important role in the history of computers, on the arbitrary ground that they did not incorporate the stored-program concept. Additionally, in the case of Aiken, it is significant that there is a current computer technology that does not incorporate stored programs and that is designated (at least by Texas Instruments) as “Harvard architecture”, though it should more properly be called “Aiken architecture”. In this technology the program is fixed and not subject to any alteration save by intent - as in some computers used for telephone switching and in ROM.
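To make the contrast concrete, here is a minimal, purely illustrative sketch in Python; the instruction format and memory layout are invented for this example and are not taken from any real machine. The first routine models a stored-program machine, where instructions and data share one memory and a program could overwrite its own instructions; the second models a Harvard-style machine, where the program is a fixed, read-only sequence and only the separate data store can change.

    # Toy illustration only: the three-field instruction format is invented.

    def run_stored_program(memory):
        """Stored-program (von Neumann) style: instructions and data share
        one memory list, so a running program could rewrite its own code."""
        pc = 0
        while memory[pc][0] != "HALT":
            op, a, b = memory[pc]
            if op == "ADD":              # memory[a] = memory[a] + memory[b]
                memory[a] += memory[b]
            elif op == "COPY":           # copy any word - data or instruction
                memory[a] = memory[b]
            pc += 1
        return memory

    def run_harvard(program, data):
        """Harvard (Aiken) style: the program is a fixed, read-only sequence;
        only the separate data store can be altered."""
        for op, a, b in program:
            if op == "ADD":
                data[a] += data[b]
        return data

    # Stored-program machine: code and operands live side by side in one memory.
    print(run_stored_program([("ADD", 3, 4), ("HALT", 0, 0), 0, 2, 5]))
    # Harvard-style machine: the same addition with program and data kept apart.
    print(run_harvard((("ADD", 0, 1),), [2, 5]))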

OPERATION OF THE ENIAC.

Aiken was a visionary, a man ahead of his time. Grace Hopper and others remember his prediction in the late 1940s, even before the vacuum tube had been wholly replaced by the transistor, that the time would come when a machine even more powerful than the giant machines of those days could be fitted into a space as small as a shoe box.

Some weeks before his death Aiken had made another prediction. He pointed out that hardware considerations alone did not give a true picture of computer costs. As hardware has become cheaper, software has been apt to get more expensive. And then he gave us his final prediction: “The time will come”, he said, “when manufacturers will give away hardware in order to sell software”. Time alone will tell whether or not this was his final look ahead into the future.

DEVELOPMENT OF COMPUTERS IN THE USA

In the early 1960s, when computers were hulking mainframes that took up entire rooms, engineers were already toying with the then-extravagant notion of building a computer intended for the sole use of one person. By the early 1970s, researchers at Xerox's Palo Alto Research Center (Xerox PARC) had realized that the pace of improvement in the technology of semiconductors - the chips of silicon that are the building blocks of present-day electronics - meant that sooner or later the PC would be extravagant no longer. They foresaw that computing power would someday be so cheap that engineers would be able to afford to devote a great deal of it simply to making non-technical people more comfortable with these new information-handling tools. In their labs, they developed or refined much of what constitutes PCs today, from “mouse” pointing devices to software “windows”. Although the work at Xerox PARC was crucial, it was not the spark that took PCs out of the hands of experts and into the popular imagination. That happened inauspiciously in January 1975, when the magazine Popular Electronics put a new kit for hobbyists, called the Altair, on its cover. For the first time, anybody with $400 and a soldering iron could buy and assemble his own computer. The Altair inspired Steve Wozniak and Steve Jobs to build the first Apple computer, and a young college dropout named Bill Gates to write software for it. Meanwhile, the person who deserves the credit for inventing the Altair, an engineer named Ed Roberts, left the industry he had spawned to go to medical school. Now he is a doctor in a small town in central Georgia.

To this day, researchers at Xerox and elsewhere pooh-pooh the Altair as too primitive to have made use of the technology they felt was needed to bring PCs to the masses. In a sense, they are right. The Altair incorporated one of the first single-chip microprocessors - a semiconductor chip that contained all the basic circuits needed to do calculations - called the Intel 8080. Although the 8080 was advanced for its time, it was far too slow to support the mouse, windows, and elaborate software Xerox had developed. Indeed, it wasn't until 1984, when Apple Computer's Macintosh burst onto the scene, that PCs were powerful enough to fulfill the original vision of the researchers. “The kind of computing that people are trying to do today is just what we made at PARC in the early 1970s,” says Alan Kay, a former Xerox researcher who jumped to Apple in the early 1980s.

MACINTOSH PERFORMA 6200/6300

Researchers today are proceeding in the same spirit that motivated Kay and his Xerox PARC colleagues in the 1970s: to make information more accessible to ordinary people. But a look into today's research labs reveals very little that resembles what we think of now as a PC. For one thing, researchers seem eager to abandon the keyboard and monitor that are the PC's trademarks. Instead they are trying to devise PCs with interpretive powers that are more humanlike - PCs that can hear you and see you, can tell when you're in a bad mood and know to ask questions when they don't understand something.

It is impossible to predict the invention that, like the Altair, will crystallize new approaches in a way that captures people's imagination.

Top 20 computer systems

From soldering irons to SparcStations, from MITS to Macintosh, personal computers have evolved from do-it-yourself kits for electronic hobbyists into machines that practically leap out of the box and set themselves up. What enabled them to get from there to here? Innovation and determination. Here are the top 20 systems that made that rapid evolution possible.

MITS Altair 8800

There once was a time when you could buy a top-of-the-line computer for $395. The only catch was that you had to build it yourself. Although the Altair 8800 wasn't actually the first personal computer (Scelbi Computer Consulting's 8008-based Scelbi-8H kit probably took that honor in 1973), it grabbed attention. MITS sold 2000 of them in 1975 - more than any single computer before it. Based on Intel's 8-bit 8080 processor, the Altair 8800 kit included 256 bytes of memory (upgradable, of course) and a toggle-switch-and-LED front panel. For amenities such as keyboards, video terminals, and storage devices, you had to go to one of the companies that sprang up to support the Altair with expansion cards. In 1975, MITS offered 4- and 8-KB Altair versions of BASIC, the first product developed by Bill Gates' and Paul Allen's new company, Microsoft. If the personal computer hobbyist movement was simmering, 1975 saw it come to a boil with the introduction of the Altair 8800.

Apple II

Those of you who think of the IBM PC as the quintessential business computer may be in for a surprise: the Apple II (together with VisiCalc) was what really made people look at personal computers as business tools, not just toys. The Apple II debuted at the first West Coast Computer Faire in San Francisco in 1977. With a built-in keyboard, graphics display, eight readily accessible expansion slots, and BASIC built into ROM, the Apple II was actually easy to use. Some of its innovations, like built-in high-resolution color graphics and a high-level language with graphics commands, are still extraordinary features in desktop machines. With a 6502 CPU, 16 KB of RAM, a 16-KB ROM, a cassette interface that never really worked well (most Apple IIs ended up with the floppy drive that was announced in 1978), and color graphics, the Apple II sold for $1298.