Discover the Exciting Stories of the IT Industry from the 1960s

The 1960s was perhaps the most contradictory and exciting decade in the history of computing. While the world was gripped by Cold War tensions, the Vietnam War, and the hippie movement, a quiet but all-consuming revolution was taking place in the depths of research laboratories and corporate boardrooms. This was the decade when the computer evolved from an exotic toy for scientists into indispensable infrastructure for modern civilization. Not only did the technology advance; the mindset itself changed. The software industry was born, plans for the first global networks were drawn up, and people interacted with machines in real time for the first time. The story of the 1960s is not about bits and bytes, but about risk-taking, genius, and the foundations on which our digital world was built.

The "Big Blue" All-or-Nothing Game – The IBM System/360 Legend

In the early 1960s, IBM (International Business Machines) was already the undisputed king of computing, but its empire stood on shaky foundations. The company's product portfolio was chaotic: it manufactured seven different computer families that were neither hardware- nor software-compatible with one another. If a customer outgrew their machine and wanted to upgrade to a larger one, everything (the software, the data structures, and the hardware control processes) had to be rewritten, an expensive and painful process that opened the door for competitors.

In 1961, under the leadership of Vice President T. Vincent Learson, the SPREAD (Systems Programming, Research, Engineering, and Development) committee was formed, which articulated the future in a radical 80-page report. A single, unified computer family must be created that uses the same architecture and software from the smallest model to the most powerful. This idea was considered heresy at the time. Engineers rebelled, saying that a machine optimized for scientific calculations could not be good for business data processing as well. However, Thomas J. Watson Jr., IBM's president, understood the strategic significance and approved the project.

The scale of the development was staggering. The $5 billion budget (tens of billions of dollars in today's money) exceeded the cost of the Manhattan Project, the United States' atomic weapons program. It was the largest privately financed commercial gamble in history up to that point. Watson later said: "This was the biggest gamble I ever played". If the System/360 failed, IBM would probably go bankrupt.

Chief architect Gene Amdahl and his team tackled the technical challenges, introducing concepts that are now considered fundamental, such as the 8-bit byte (instead of the earlier 6-bit unit), the 32-bit word length, and general-purpose registers. Completing the hardware, however, was only the beginning. The real nightmare was the software, the OS/360 operating system. Leadership of that project fell to Fred Brooks, who directed thousands of programmers amid ever-growing chaos. This is when the famous "Brooks's Law" was born, which he later formulated in his seminal book "The Mythical Man-Month": "Adding manpower to a late software project makes it later". Training new people, and the growing number of communication channels among them, consumes more time than the extra hands contribute.
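
To put a number on that overhead, here is a quick back-of-the-envelope sketch (in Python, purely illustrative): with n people on a team there are n(n-1)/2 possible pairwise communication channels, so the coordination paths grow far faster than the headcount.

```python
# Pairwise communication channels among n team members: n * (n - 1) / 2.
# A rough illustration of the overhead behind Brooks's Law.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for team_size in (5, 10, 20, 50):
    print(f"{team_size:>3} people -> {channels(team_size):>4} channels")
# 5 -> 10, 10 -> 45, 20 -> 190, 50 -> 1225: doubling the team more than
# quadruples the coordination paths, before any training time is counted.
```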

Finally, on April 7, 1964, IBM announced the System/360. The success was overwhelming. Companies loved the promise of "compatibility". They could buy the smaller model and know that their software would run on the larger machines years later. With this step, IBM not only swept the market but also created the basic model of modern computing: platform thinking. The System/360 architecture proved so robust that IBM's mainframe computers today are still capable of running some of the binary code written in the 1960s.

The "Seven Dwarfs" and the BUNCH – Survival in IBM's Shadow

IBM's dominance was so overwhelming (they controlled more than 70% of the market) that the press referred to the industry simply as "Snow White and the Seven Dwarfs". The dwarfs—Burroughs, UNIVAC, NCR, Control Data Corporation (CDC), Honeywell, General Electric (GE), and RCA—fought a desperate battle for the crumbs. The brutality of the situation is well illustrated by the fact that at the turn of the decade two of the giants, GE (1970) and RCA (1971), gave up the fight and exited the computer business, seeing no prospect of profitability against IBM. At that point, the acronym changed to "BUNCH" (Burroughs, UNIVAC, NCR, CDC, Honeywell), which aptly described the remaining quintet.

The key to survival was specialization. Since they couldn't win by "force", competitors tried to succeed through technological innovation or special market niches:

  • Burroughs: They bet on technical elegance and the banking sector. Their B5000 machine was revolutionary. The hardware was specifically designed to support high-level programming languages (especially ALGOL). While Assembly ruled on other machines, Burroughs engineers believed that the future belonged to structured languages. Their stack-based architecture was decades ahead of its time and in many respects was safer and more efficient than IBM's solutions.
  • UNIVAC: The former market leader (which was synonymous with computers in the 1950s) remained strong in the government and large enterprise sector. They were the first to experiment with dual-processor systems to increase reliability.
  • NCR (National Cash Register): As their name suggests, they came from commerce. Their strategy was brilliant. They didn't compete for supercomputers but targeted the digitization of cash registers and back-end systems, dominating retail data processing.
  • Honeywell: They followed the "cheaper and simpler" strategy. Their famous "Liberator" software could convert programs written for the IBM 1401 to run on Honeywell machines, thus luring away price-sensitive customers.

This competitive situation, although it seemed like a comfortable monopoly for IBM, actually created a constant pressure to innovate. It was the BUNCH companies that often introduced technological innovations first (virtual memory, multiprocessing), which IBM only adopted later, but with greater marketing.

Seymour Cray and the "Wizard of Chippewa Falls" – The Speed Obsessives

Of the "Seven Dwarfs", Control Data Corporation (CDC) chose the boldest path. William Norris, the company's CEO, set a single goal: they would manufacture the world's fastest scientific computers. The key figure in this vision was a reclusive, brilliant engineer, Seymour Cray.

Cray didn't like corporate bureaucracy. In 1962, he told Norris that he could only build the new supercomputer if he could move away from the company's Minneapolis headquarters. Norris agreed and built him a lab in his hometown, Chippewa Falls, in the woods of Wisconsin. There, far from the harassment of managers, Cray and a team of about 34 people (a headcount that famously included the janitor) set out to do the impossible: defeat IBM's development army of thousands.

The result was the CDC 6600, unveiled in 1964. This machine wasn't just faster than its competitors; it was the first machine to be called a "supercomputer". The CDC 6600's secret lay in its revolutionary architecture. Cray recognized that the central processing unit (CPU) was slowed down by administrative tasks such as reading data or printing. Therefore, in the 6600, he built 10 smaller so-called "peripheral processors" (PP) alongside a single, brutally fast central processor. These "small" processors did the dirty work, allowing the central brain to focus exclusively on mathematical calculations. This configuration was the precursor to today's modern GPUs and heterogeneous systems.
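
As a loose illustration of that division of labor (a toy Python sketch, not anything resembling CDC code; the thread count and timings are arbitrary), a pool of "peripheral" worker threads can absorb slow I/O chores while the main loop does nothing but arithmetic:

```python
# Toy sketch of the 6600's division of labor: "peripheral" worker threads
# absorb the slow I/O chores so the "central processor" loop only does math.
import threading, queue, time

io_jobs = queue.Queue()

def peripheral_processor():
    while True:
        job = io_jobs.get()
        if job is None:                 # sentinel: shut this worker down
            return
        time.sleep(0.01)                # stand-in for a card read or a printed line

workers = [threading.Thread(target=peripheral_processor) for _ in range(10)]
for w in workers:
    w.start()

for i in range(100):                    # hand the "administrative" work to the PPs...
    io_jobs.put(f"print line {i}")

total = sum(x * x for x in range(1_000_000))    # ...while the CPU just computes
print("central processor result:", total)

for _ in workers:                       # tell the ten peripheral processors to stop
    io_jobs.put(None)
for w in workers:
    w.join()
```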

The machine's speed (3 megaflops) was three times that of IBM's top machine of the day, the Stretch (IBM 7030). The cooling was also unique: the heat from hundreds of thousands of transistors was carried away by circulating Freon, making the inside of the machine look like an industrial refrigerator. The CDC 6600 immediately became a favorite of nuclear research institutes (such as Los Alamos), meteorological services, and the military.

When Thomas Watson Jr. saw the market data, he wrote an angry internal memo (the famous "Watson memorandum"): "I don't understand how it's possible that a 34-person laboratory, where the janitor is included in the headcount, was able to defeat us, the world's largest computer manufacturer". Cray's response was reportedly laconic: "It seems Mr. Watson has answered his own question". CDC's success proved that in the IT industry, sheer size is no substitute for genius and focused engineering work.

The Minicomputer Rebellion – When the Machine Came Down to the People

While IBM and CDC followed the "bigger is better" principle, in an old wool mill in Maynard, Massachusetts, Digital Equipment Corporation (DEC) went in a completely different direction. Ken Olsen, the founder (who had previously worked at MIT's Lincoln Laboratory), recognized that there was a huge demand for computers that didn't cost millions, didn't require a separate wing of a building, and were interactive.

DEC's philosophy was that the computer should be a tool, not a deity. In 1960, they introduced the PDP-1 (Programmed Data Processor). Although its $120,000 price still seemed high, it was a fraction of a mainframe's cost. The PDP-1's real distinction was its cathode ray tube (CRT) display and keyboard: the user saw what they typed at once, and the machine responded immediately. This direct feedback seemed magical to engineers accustomed to punch-card batch processing.

However, the real breakthrough came with the PDP-8, introduced in 1965. This was history's first successful minicomputer. Its price was just $18,000, and it was the size of a large refrigerator (or a small closet), so it fit in an average laboratory, university department, or even a submarine. More than 50,000 PDP-8s were sold, which was an astronomical number at the time.

The minicomputer revolution was not only technological but also cultural. Because these machines were cheaper, access to them was not as strictly regulated. Students and young researchers could spend nights in front of the machines, experimenting, playing, "hacking". This freedom laid the groundwork for the later hacker culture, the UNIX operating system, and ultimately the philosophy of personal computers. DEC showed that computing could be decentralized and personal.

"Spacewar!" and the Big Bang of the Gaming Industry

The interactive capabilities of the DEC PDP-1 also inspired the world's first true computer game. In 1962, a group of students led by Steve "Slug" Russell at MIT (members of the Tech Model Railroad Club) decided to demonstrate the new machine's capabilities. They didn't choose a boring computational task but translated their sci-fi fandom into code. Thus Spacewar! was born.

In the game, two spaceships—the slender "Needle" and the stocky "Wedge"—fought each other on the screen while the gravitational field of the star in the center attracted both. The game's physics was surprisingly realistic, and the controls were handled using switches on the front of the machine (later they made special control boxes for it—the first gamepads). Spacewar! spread like wildfire at American universities. Since software was not yet copyrighted at the time, everyone who had access to a PDP-1 copied the code and even improved it. Someone added a "hyperspace" function (panic button that randomly teleported the ship), others made the starry background more accurate.

Although Spacewar! never went into commercial distribution (since the hardware needed to run it cost $120,000), its cultural impact is immeasurable. This was the first evidence that the computer could also be a tool for entertainment and art. Nolan Bushnell, who later founded Atari, encountered the game during his college years, an experience that inspired the video game industry explosion of the 1970s.

The "Software Crisis" and the Birth of Software Engineering

By the mid-1960s, the IT industry faced a strange paradox. Hardware was developing exponentially (Moore's Law was starting to kick in), but software couldn't keep pace. Programs were getting bigger and more complex, and development projects were failing one after another. They were late, over budget, full of bugs, and unmaintainable. The previous "artist" or "cowboy" programming approach (where one brilliant lone hero wrote the code) no longer worked for multi-million-line systems.

This phenomenon was called the "software crisis". To solve the problem, the NATO Science Committee convened a conference in October 1968 in Garmisch-Partenkirchen, Germany. This conference became historically significant. It was here that the term "Software Engineering" was officially adopted. The goal was to make software development a disciplined engineering science like bridge building or mechanical engineering, with standards, methodologies, and quality assurance.

The conference highlighted that programming is not just coding but also design, documentation, and testing. Concepts began to crystallize such as structured programming (Edsger W. Dijkstra's famous rant against the "GOTO" statement also occurred during this period), modularity, and the seeds of object-oriented design (through the Simula 67 language).

At the same time, NASA's Apollo program proved the importance of software engineering in practice. Margaret Hamilton, who led software development at MIT's Instrumentation Laboratory, directed the creation of the Apollo Guidance Computer (AGC) software. Hamilton's team wrote code robust enough to handle failures during the critical moments of the moon landing. When Apollo 11 threw the famous "1202" and "1201" alarms during descent (indicating that the computer was overloaded), the asynchronous executive designed by Hamilton's team automatically shed lower-priority tasks (such as updating radar display data) and concentrated all resources on guidance and landing. Hamilton's work elevated software development to the ranks of respected engineering professions; she herself is often credited with coining the term "software engineering".

SABRE – The Real-Time Business Revolution

While engineers struggled in laboratories, a revolution was also taking place in the business world. In the early 1960s, airline ticket booking was a nightmare. At American Airlines (AA) headquarters, in a huge room, agents sat around rotating tables, keeping track of flights on paper cards. Checking and recording a reservation took an average of 90 minutes, and errors (overbooking, lost data) were everyday problems.

The solution came from a chance encounter. C.R. Smith, AA's president, and R. Blair Smith, an IBM sales executive, sat next to each other on a flight. From their conversation came the idea for SABRE (Semi-Automated Business Research Environment). The goal was a system that could access and modify the central database in real-time from anywhere in the country.

The technical challenge was enormous. IBM drew on its experience with the SAGE air defense system (developed to track Soviet bombers). By 1964, SABRE was in full operation. Two IBM 7090 mainframes (the most powerful machines of the era) handled tens of thousands of transactions daily. Available seats appeared on agents' terminals within seconds, and reservations were immediately written to the central files. SABRE not only made American Airlines a market leader but also laid the foundations of modern e-commerce and online transaction processing (OLTP). Every modern bank transfer, online purchase, or hotel reservation is a distant descendant of SABRE.

The Data Storage Leap – The "Disk Pack" and Portability

In the early 1960s, data was typically stored on magnetic tape or punch cards. Tape was cheap and had high capacity, but it had one huge drawback: it was sequential. If the data you needed was at the end of the tape, you had to wind through the whole thing, which could take many minutes. The lack of random access held back the development of real-time systems (like SABRE).

In 1962, IBM introduced the 1311 Disk Storage Drive, which brought a brilliant innovation: the removable disk pack. The device looked like an industrial washing machine, and the disk pack consisted of six 14-inch magnetic disks stacked on top of each other in a protective enclosure. One pack's capacity was 2 million characters (about 2 MB). The revolutionary innovation was that the disks could be removed from the drive and placed on a shelf or transferred to another machine. This enabled the physical mobility of data and "infinite" storage capacity (you just had to insert a new pack). Disk packs became an iconic visual element of 1960s and 1970s data centers and established the dominance of hard disk drives (HDDs) for the next 40 years.
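
The difference the 1311 made can be sketched with an ordinary file standing in for both media (a hypothetical example; the record size and file layout are made up): sequential access has to read everything before the record it wants, while random access seeks straight to a byte offset.

```python
# Sequential vs. random access, illustrated with an ordinary file.
import tempfile, os

RECORD = 100                                   # fixed-length records of 100 bytes
path = os.path.join(tempfile.mkdtemp(), "pack.dat")
with open(path, "wb") as f:
    for i in range(20_000):                    # ~2 MB, about one 1311 disk pack
        f.write(f"{i:06d}".encode().ljust(RECORD, b"."))

# "Tape": scan from the start until the last record is reached.
with open(path, "rb") as f:
    for _ in range(19_999):
        f.read(RECORD)
    tape_style = f.read(RECORD)

# "Disk": jump straight to the record's byte offset.
with open(path, "rb") as f:
    f.seek(19_999 * RECORD)
    disk_style = f.read(RECORD)

assert tape_style == disk_style
print(disk_style[:6].decode())                 # 019999
```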

BASIC – The Democratization of Programming at 4 AM

On May 1, 1964, at 4 AM in Hanover, New Hampshire, at Dartmouth College's computing center, two tired but excited professors, John Kemeny (János Kemény) and Thomas Kurtz, ran the first program in a completely new programming language. The program was simple, just two lines: PRINT "Hello, World!" and END. But these two lines made history. BASIC (Beginner's All-purpose Symbolic Instruction Code) was born, which in the following decades would be millions of people's first encounter with programming.

Kemeny and Kurtz's goal was radical in the context of the era: they wanted anyone (not just mathematicians and engineers) to be able to use the computer. In the early 1960s, programming was a matter of genius or at least years of study. Assembly language was hardware-specific, FORTRAN was scientific, COBOL focused on business applications. All were difficult, full of mysterious syntax and cryptic error messages.

BASIC, in contrast, used simple English words. PRINT printed, INPUT requested data, GOTO jumped to another line, IF...THEN meant conditional branching. A ten-year-old child could understand it. The language was interactive; the user typed the command and immediately saw the result. This direct feedback seemed magical in a world accustomed to batch processing, where you submitted a program on punch cards, waited 2-3 hours (or days), and then received a printout saying there was a syntax error on line 5.
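
For flavor, here is a minimal Python sketch of the kind of interactive session those statements made possible; the comments note the BASIC lines each step loosely corresponds to (the BASIC listing in the comments is invented for illustration, and Python's while loop stands in for GOTO).

```python
# A loose modern analogue of a beginner's interactive BASIC session.
# The BASIC line numbers in the comments are an invented illustration.
while True:                                 # the loop a BASIC program would build with GOTO 10
    name = input("WHAT IS YOUR NAME? ")     # 10 INPUT N$
    if name == "":                          # 20 IF N$ = "" THEN 40
        print("GOODBYE")                    # 40 PRINT "GOODBYE"
        break                               # 50 END
    print("HELLO, " + name.upper())         # 30 PRINT "HELLO, "; N$
```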

However, BASIC's true revolutionary power lay in combination with the Time-Sharing System (TSS), which Kemeny and Kurtz also developed. Previously, a computer could only be used by one person at a time, and even then only through punch cards. The Dartmouth TSS allowed 100 users simultaneously to work on the central machine, each on their own terminal, feeling as if the entire machine was theirs alone. This technology is the ancestor of modern cloud computing and multi-user operating systems.
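
The essence of time-sharing can be sketched in a few lines (a toy model, with generators standing in for user programs and no real terminals involved): the machine runs each job for a short slice, round-robin, fast enough that every user feels the computer is theirs alone.

```python
# Toy round-robin time-sharing: each "user program" is a generator that
# yields whenever its time slice is up; the scheduler cycles through them.
from collections import deque

def user_program(name: str, steps: int):
    for step in range(1, steps + 1):
        print(f"{name}: step {step}")
        yield                                  # give the machine back

ready = deque([user_program("alice", 3),
               user_program("bob",   2),
               user_program("carol", 4)])

while ready:                                   # the "time-sharing monitor"
    job = ready.popleft()
    try:
        next(job)                              # run one slice
        ready.append(job)                      # back of the queue
    except StopIteration:
        pass                                   # program finished, drop it
```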

BASIC's impact goes beyond technology. It was the language in which Bill Gates and Paul Allen wrote the first Microsoft product (Altair BASIC, in 1975), the language Steve Wozniak programmed in, and the first language "burned" into the ROM (Read-Only Memory) of personal computers. The Commodore 64, the Apple II, and the TRS-80 all booted into BASIC. An entire generation learned to program by typing in code from computer magazines and then modifying it until they understood how it worked. BASIC was the gateway.

Douglas Engelbart's "Mother of All Demos" – A 90-Minute Vision of the Future

On Monday afternoon, December 9, 1968, at the Brooks Hall conference center in San Francisco, a reclusive, bespectacled engineer stood on the podium to present "some interesting things". The approximately one thousand people gathered for the event (computer professionals, military leaders, and university researchers) had no idea that in the next 90 minutes they would glimpse the future. Douglas Engelbart's presentation later received the name "The Mother of All Demos" and is still considered one of the most important events in computer history.

Engelbart presented the oN-Line System (NLS), on which an entire team (himself and colleagues from the Stanford Research Institute (SRI) Augmentation Research Center) had been working for years. The demonstration was stunning because it showed things that seemed like science fiction at the time but are natural today:

1. The mouse: Engelbart's wooden prototype was the world's first working mouse. Even the name was a joke: since the cable trailing out the back looked like a tail, they called the device a "mouse". The audience watched in amazement as Engelbart moved it and a cursor on the screen followed, precisely tracking the direction and speed of his movements. The mouse's birth was no accident: Engelbart had experimented with various pointing devices (light pen, trackball, joystick) for years, and the mouse proved the most accurate and easiest to use.

2. Graphical user interface (GUI) and windows: In the NLS system, the screen was divided into multiple independent areas, "windows". In one window he edited text, in another he viewed a diagram, in a third, code. This is completely natural today (you probably have several windows open right now), but in 1968 it was magic. Most computers still displayed green characters on a black background.

3. Hypertext and hyperlinks: Engelbart showed how to "jump" between documents using links. He clicked on a word and was immediately taken to another document. This was the conceptual predecessor of HTML and the World Wide Web (WWW), which Tim Berners-Lee would only build some 20 years later.

4. Real-time collaborative editing: One of the most spectacular parts of the demo was when Engelbart established a live video connection with Bill English, who was sitting 48 kilometers away in Menlo Park. Both worked on the same document simultaneously and saw each other's changes in real-time. This is a basic function of Google Docs today, but in 1968 it seemed unimaginable.

5. Video conferencing: He and his colleagues at SRI not only exchanged messages but could also see each other on a giant projection screen. This was the predecessor of modern Zoom and Teams, and it worked in 1968, when the internet didn't even exist yet!

The audience's reaction was mixed. Many gave a standing ovation, but others were skeptical. Some didn't see the practical benefit of "opening windows" or "clicking" on text. The industry only began to adopt these ideas in the 1980s (with the appearance of the Xerox Star and then the Apple Macintosh). But Engelbart laid the foundations, and the "Mother of All Demos" proves that vision can precede a technology's widespread adoption by decades.

The Birth of ARPANET and the First "LO" – The Ancestor of the Internet

In the mid-1960s, the United States Department of Defense faced a delicate problem. At the height of the Cold War, after the Cuban Missile Crisis, the threat of nuclear war was constant, and military and scientific computers operated in isolation. If the Soviet Union launched a nuclear attack and communication centers were destroyed, the entire network would collapse. A communication system was needed that had no center and could continue to function even if any node was taken out.

The Advanced Research Projects Agency (ARPA, later DARPA) asked the best researchers to solve the problem. Key figures included Paul Baran, a Polish-born engineer who worked at the RAND Corporation. In 1964, Baran published a revolutionary study titled "On Distributed Communications", in which he described the concept of a packet-switched network.

Baran's idea was simple but brilliant. Instead of sending a message through a single, continuous "wire" (as in a phone call), the message is broken into small packets. Each packet contains the source and destination address, as well as a sequence number. The packets travel on different routes through the network and reassemble at the destination. If a route breaks (say, a city is destroyed in an attack), the packets automatically go another way. The system is thus "resilient" and "survives" partial destruction.
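
The idea can be shown in a few lines of Python (a conceptual sketch only; the packet fields and message text are made up and bear no relation to real ARPANET formats): chop the message into numbered packets, let them arrive in any order, and reassemble by sequence number.

```python
# Chop a message into numbered packets, deliver them out of order,
# and reassemble by sequence number -- the core idea of packet switching.
import random

def packetize(message: str, size: int = 8):
    return [{"seq": i, "src": "UCLA", "dst": "SRI",
             "data": message[i * size:(i + 1) * size]}
            for i in range((len(message) + size - 1) // size)]

def reassemble(packets):
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetize("LOGIN attempt, October 29, 1969, 10:30 PM")
random.shuffle(packets)                        # packets take different routes
assert reassemble(packets) == "LOGIN attempt, October 29, 1969, 10:30 PM"
print(f"{len(packets)} packets delivered and reassembled correctly")
```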

Independently of Baran, in the United Kingdom, Donald Davies at the National Physical Laboratory also developed the theory of packet switching (and he coined the name "packet"). ARPA project leader Lawrence Roberts combined the two concepts and launched the development of ARPANET.

The first nodes were connected in 1969. UCLA was first: on September 2, the Interface Message Processor (IMP), a router built by Bolt, Beranek and Newman, was installed there. The second node was the Stanford Research Institute (SRI), in October.

And then came the historic moment: on October 29, 1969, at 10:30 PM, Charley Kline, a UCLA student, sat down at the terminal. A phone line was open to SRI, where Bill Duvall was waiting. Kline tried to log in to the SRI machine. The plan was to type the word LOGIN letter by letter, with Duvall confirming by phone that each one arrived correctly.

  • Kline typed L. Duvall on the phone: "I see the L"
  • Kline typed O. Duvall: "I see the O"
  • Kline typed G. And the system crashed.

Thus the very first message in internet history was LO, which happens to be the beginning of the phrase "Lo and behold", meaning "Behold!" or "Look at that!". The irony is that the first attempt ended in a crash, but about an hour later the bug was fixed and the full word LOGIN went through. By the end of the year, four nodes (UCLA, SRI, UC Santa Barbara, and the University of Utah) formed ARPANET, and the digital communication revolution had begun.

Packet Switching – How to Survive Nuclear War (and YouTube)

Paul Baran's theoretical work was not only of military significance but also foreshadowed the technology that today forms the backbone of the internet. Baran examined three types of network topologies:

  1. Centralized: There is a central node, everyone connects to it. Fast, but if the center dies, the entire network goes down. This was the model of early telephone exchanges.
  2. Decentralized: Multiple "centers" exist, which are interconnected. Somewhat better at surviving partial destruction, but still vulnerable.
  3. Distributed: There is no center. Every node is equal and connects in multiple directions. If one or more nodes fail, messages bypass the dead nodes. This was Baran's vision, and this became the basis of the internet (a small routing sketch after this list shows the idea in action).
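
Here is the promised sketch of the third topology's resilience, using a made-up five-node mesh: a breadth-first search finds a route, and when a node "is destroyed", the same search simply routes around it.

```python
# Distributed routing, sketched: breadth-first search for a path, then the
# same search after a node "is destroyed" -- traffic simply goes around it.
from collections import deque

mesh = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D", "E"},
        "D": {"B", "C", "E"}, "E": {"C", "D"}}

def route(net, start, goal, dead=frozenset()):
    paths = deque([[start]])
    seen = {start} | set(dead)
    while paths:
        path = paths.popleft()
        if path[-1] == goal:
            return path
        for nxt in net[path[-1]] - seen:
            seen.add(nxt)
            paths.append(path + [nxt])
    return None                                # network partitioned

print(route(mesh, "A", "E"))                   # ['A', 'C', 'E']
print(route(mesh, "A", "E", dead={"C"}))       # ['A', 'B', 'D', 'E']
```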

The other stroke of genius in packet switching is how efficiently it shares capacity. In a traditional phone call, the line is "busy" for the entire duration of the call, even during silence. With packet switching, you only use bandwidth when you are actually sending data, and your packets "share" the wires with everyone else's. That is why millions of people can watch YouTube simultaneously without any of them permanently occupying a single piece of the network.

Baran's original work was initially rejected by AT&T (the American telephone monopoly). They said it "wouldn't work" and that "the packet would get lost". In reality, they feared that packet switching would make their existing telephone networks obsolete. So ARPA developed the first routers (IMPs) with Bolt, Beranek and Newman (BBN) and proved that Baran was right.

ASCII and Bob Bemer – Creating a Common Language

In the early 1960s, computing was plagued by the fact that every manufacturer used a different coding system to store letters and numbers. IBM used EBCDIC, others used BCD, teleprinters used Baudot code. This meant that if you wanted to transfer text written on an IBM machine to a UNIVAC, it had to be converted, which caused errors and was expensive.

In 1963, the American Standards Association (now ANSI) adopted the ASCII (American Standard Code for Information Interchange) standard, which used 7 bits to store a character (128 different characters: English alphabet upper and lowercase, numbers, punctuation, and control characters). ASCII created the "common language" that enabled communication between different systems.

One of ASCII's main designers was Bob Bemer, an IBM engineer who fought for standardization for decades. One of Bemer's legendary inventions was the ESC (Escape) key, which he created so that terminals could send control characters (e.g., color change, cursor movement) separated from text. The ESC is still on every keyboard today, and whenever anyone presses it, they have Bemer to thank.

Although ASCII was "American" (it only contained the English alphabet), later international versions (e.g., Latin-1, then UTF-8) extended the character set to handle letters from other languages as well. UTF-8 is still backward compatible with ASCII today, since the first 128 characters are the same. Bemer's work thus affects every email, web page, and text file we use today.
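
Both points are easy to verify with a few lines of Python: character codes fit into 7 bits (ESC is control code 27), and encoding plain ASCII text as UTF-8 produces byte-for-byte the same values.

```python
# ASCII codes fit in 7 bits, ESC is control code 27, and UTF-8 encodes
# the first 128 characters to exactly the same byte values as ASCII.
for ch in ["A", "a", "0", " "]:
    print(f"{ch!r}: code {ord(ch):3d} = {ord(ch):07b} (7 bits)")

print("ESC:", ord("\x1b"))                     # 27, Bob Bemer's Escape character

text = "LOGIN 1969"
assert text.encode("ascii") == text.encode("utf-8")   # backward compatible
print(text.encode("utf-8"))                    # b'LOGIN 1969'
```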

The Founding of Intel – The "Traitorous Eight" and the Birth of Silicon Valley

On July 18, 1968, two respected engineers, Robert Noyce and Gordon Moore, left Fairchild Semiconductor and founded their own company. Their move was part of a longer story that became part of the Silicon Valley myth: the story of the "Traitorous Eight".

In 1957, eight young engineers (including Noyce and Moore) were working at the Mountain View, California laboratory of William Shockley, one of the inventors of the transistor. Shockley was a brilliant scientist but a catastrophic manager: paranoid, tyrannical, and unpredictable. The eight engineers secretly sought outside backing, found it through the industrialist Sherman Fairchild, and left Shockley to found Fairchild Semiconductor. Shockley angrily called them "traitors". Ironically, this "betrayal" became the moment of Silicon Valley's birth, because it established the culture in which leaving a company for a better opportunity is no sin.

Fairchild became successful (they developed the integrated circuit), but after 10 years, Noyce and Moore felt that the parent company on the East Coast was too bureaucratic. When they approached Arthur Rock (a San Francisco investor), Rock raised $2.5 million for them in half an hour, because investors believed in them so much.

Initially, they founded the company under the name NM Electronics (from the two founders' initials), but they quickly realized a better name was needed. The choice was "Intel", short for "Integrated Electronics". The goal was clear: to manufacture semiconductor memory to replace slow and unreliable magnetic core memory.

Intel's culture was completely different from the East Coast corporate style. Noyce and Moore rejected hierarchy: everyone, the founders included, worked in open cubicles rather than walled offices. There was no dress code and no privileges for top executives. This "startup culture" later became Silicon Valley's gold standard. Although the microprocessor (the famous 4004) only arrived in 1971, Intel's founding in 1968 was the moment when Silicon Valley truly began to become the world's technology center.

The Olivetti Programma 101 – The Forgotten First PC

When we look for the inventors of the "personal computer", we often think of the Apple, Altair, or IBM PC. But there was a device that preceded them by years and can rightfully claim the title of "the world's first personal computer". This was the Italian Olivetti Programma 101, which was presented at the 1965 New York World's Fair, where it was a huge success.

The P101 was designed by engineer Pier Giorgio Perotto and his team at Olivetti's laboratory in Ivrea. Roberto Olivetti, son of the company's charismatic former leader Adriano Olivetti (himself a humanist and social visionary), backed the project even when colleagues considered it "madness". The Olivettis believed that computing was not only for large corporations but also for individuals, an idea that preceded Apple's "Think Different" philosophy by decades.

The Programma 101's design was futuristic. It was designed by industrial designer Mario Bellini, and the machine was so elegant that it was later included in the permanent collection of the Museum of Modern Art (MoMA) in New York as one of the most beautifully designed objects of the 20th century. The machine didn't evoke the cold whiteness of data centers but looked like a device for a writer's desk.

The P101 was also technically remarkable. It included a magnetic card reader on which programs and data could be saved (a precursor to "portable storage"), and a small built-in printer that output results on paper tape. It was programmable, if only in a simple way (loops, conditional branches, and arithmetic operations), but that was already more than any calculator offered.

The machine was hugely successful. NASA also bought some; P101s were used for Apollo 11 spacecraft trajectory calculations at the ground control center. The U.S. military used them in the Vietnam War to coordinate artillery fire. The world's largest banks and insurers also ordered them. About 44,000 units were sold worldwide, which was a gigantic number for such a device at the time.

Unfortunately, Olivetti couldn't capitalize on its head start. Adriano Olivetti had died suddenly in 1960 (of a heart attack, on a train), and the later management didn't understand the strategic importance of computing. Under financial pressure, they sold the electronics division and exited the market just as it was beginning its explosive growth. The Programma 101 thus became one of the great "what if" stories: a proof of Italian genius that couldn't change the world but showed the way.

The HP 9100A – The "Programmable Calculator" That Wasn't Really a Calculator

In 1968, Hewlett-Packard introduced the HP 9100A, a desktop-sized programmable device that was technologically a full-fledged computer. But Bill Hewlett, the co-founder, categorically forbade calling it a "computer". Why? For business reasons.

If the machine was called a "computer", then in corporate purchasing departments, the decision would go to the IT department, to the "data processing center gurus". These people thought in terms of mainframes, and they wouldn't take seriously a device that "only" cost $4,900 (not millions). But if it's called a "programmable scientific calculator", then engineers and physicists could buy it on their own authority, from their own budgets, like paper forms. A few signatures and done.

The 9100A was indeed a wonderful machine. It handled trigonometric functions (sin, cos, tan), logarithms, and exponentials, and could store programs of up to 196 steps on magnetic cards. Programming used Reverse Polish Notation (RPN), which, although it looked strange, was extremely efficient; HP's engineers claimed it was faster than algebraic notation. RPN later became a trademark of HP calculators.
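
To see why RPN suited a machine with tiny memory, here is a minimal stack-based evaluator (a generic sketch, not HP's implementation): operands go on a stack, and each operator consumes the top two, so no parentheses or precedence rules are ever needed.

```python
# Minimal Reverse Polish Notation evaluator: push numbers, and let each
# operator consume the top two stack entries -- no parentheses needed.
import math

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b, "/": lambda a, b: a / b}

def rpn(expression: str) -> float:
    stack = []
    for token in expression.split():
        if token in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[token](a, b))
        elif token == "sin":                   # one-argument functions work too
            stack.append(math.sin(stack.pop()))
        else:
            stack.append(float(token))
    return stack.pop()

# (3 + 4) * 2 in algebraic notation becomes "3 4 + 2 *" in RPN.
print(rpn("3 4 + 2 *"))        # 14.0
print(rpn("0.5 sin"))          # 0.479...
```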

The machine was hugely successful in the scientific and engineering community. Universities, research labs, and aerospace companies bought it. NASA, the space agency (again!) also used the 9100A. Science magazine in 1968 pointed out that this was "a new device category" that bridged the gap between mechanical calculators and large computers.

The term "personal computer" appeared in advertising related to the HP 9100A. This was one of the first times this word was used in print. The personal computer era thus didn't begin with the Altair (1975) or the Apple II (1977), but the concept had already emerged in the 1960s.

ELIZA – Artificial Intelligence's First "Psychologist"

In 1966, Joseph Weizenbaum, an MIT professor, created a program that could seemingly converse with a human in natural English. The program's name was ELIZA, after the flower girl in George Bernard Shaw's "Pygmalion" who is taught refined speech. The program's most famous script, "DOCTOR", imitated a Rogerian psychotherapist, who reflects the patient's own words back as questions.

A typical ELIZA conversation looked like this:

  • User: "I have a headache." ELIZA: "Why do you say you have a headache?"
  • User: "I'm nervous." ELIZA: "How long have you been nervous?"
  • User: "Because of my mother." ELIZA: "Tell me about your mother."

The trick was simple. ELIZA searched for keywords (e.g., "mother") and responded with pre-written templates. It didn't "understand" the meaning of sentences; it just matched patterns. If it found no keyword, it fell back on general phrases like "I see. Go on!" or "That's interesting. Tell me more!".
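
The mechanism can be sketched in a dozen lines of Python (a stripped-down imitation, not Weizenbaum's original, which was written in MAD-SLIP with a far richer script): look for a keyword, plug the captured words into a canned template, and fall back on stock phrases when nothing matches.

```python
# A stripped-down ELIZA-style responder: keyword spotting plus canned
# templates, with stock fallback phrases -- no understanding anywhere.
import random, re

RULES = [
    (r"\bmother\b|\bfather\b", ["Tell me more about your family."]),
    (r"\bi am (.+)",           ["How long have you been {0}?",
                                "Why do you say you are {0}?"]),
    (r"\bi feel (.+)",         ["Why do you feel {0}?"]),
]
FALLBACKS = ["I see. Go on.", "That's interesting. Tell me more."]

def respond(sentence: str) -> str:
    s = sentence.lower().rstrip(".!?")
    for pattern, templates in RULES:
        match = re.search(pattern, s)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(FALLBACKS)

print(respond("I am nervous"))                  # e.g. "How long have you been nervous?"
print(respond("It is because of my mother."))   # "Tell me more about your family."
print(respond("I have a headache."))            # a stock fallback phrase
```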

Weizenbaum's original goal was to demonstrate the limitations of natural language processing. He wanted to show how simple tricks could deceive people. But reality shocked him. People who used ELIZA began to emotionally bond with it. Even Weizenbaum's own secretary, who knew how the program worked, asked the professor to leave her alone because she wanted to discuss "private things" with ELIZA.

This phenomenon became the "ELIZA effect", which pointed out that people tend to attribute human qualities (empathy, understanding, intelligence) to the computer, even when they know it's just a program. Weizenbaum, who was originally an enthusiastic supporter of AI (artificial intelligence), later became one of the technology's loudest critics. He wrote a book titled "Computer Power and Human Reason" (1976), in which he warned the world that it's dangerous to entrust machines with decisions that require ethical judgment.

ELIZA was the forerunner of modern chatbots (Siri, Alexa, ChatGPT) and an early, informal encounter with the Turing test. But Weizenbaum's warning is still relevant today: we readily believe that the machine "understands us" when it is merely following templates.

Unimate – The First Industrial Robot on the Assembly Line

While programmers experimented with AI, another revolution began in industry: robots entering factories. In 1961, at General Motors' (GM) Trenton (New Jersey) plant, a 1.5-ton hydraulic robot arm was installed that lifted hot metal parts from the mold and placed them on the assembly line. This was Unimate, the world's first industrial robot.

Unimate was invented by engineer George Devol in the 1950s, but for years he couldn't find a buyer. Factories didn't believe that a machine could reliably perform repetitive tasks. Joseph Engelberger, a young entrepreneur (who was later called the "father of robotics"), believed in Devol's vision and founded the Unimation company. Eventually they managed to convince GM to do a pilot project.

The robot's success was overwhelming. Unimate could work 24 hours without stopping, didn't tire, didn't make mistakes, and wasn't injured by hot metal or toxic fumes. GM soon ordered dozens more robots, and other automakers followed suit. By the end of the 1960s, Unimate robots were welding, painting, and assembling cars throughout America and Japan.

The introduction of industrial robots also caused tension. Unions feared that machines would take people's jobs. But manufacturers argued that robots do the dangerous, monotonous work, freeing people to focus on higher-level, more creative tasks. This debate continues today and is increasingly relevant in the age of AI.

Why Are the 1960s the Most Important Decade in IT History?

Looking back at the 1960s, we see an era when computing stepped out of the experimental phase and formed into a real industry. Mainframes were no longer rare, minicomputers democratized access, programming began to become an engineering science, and the foundations of network communication were born. At the beginning of the decade, computers were still room-sized, air-conditioned sanctuaries, outputting results on paper tape, and only specialists understood them. By the end of the decade, they were interactive, communicated in real-time, and visions of the future (the mouse, windows, hypertext) had appeared.

The pioneers of the 1960s (Engelbart, Hamilton, Cray, Kemeny, Baran, Perotto, and others) didn't just build machines; they created a culture. The culture of hackers, engineers, entrepreneurs, and dreamers. The decade was characterized by a paradox: breakneck progress met with persistent skepticism. Many didn't believe that the mouse, packet switching, or minicomputers would ever matter, but the pioneers knew what they had seen and persisted with their visions.

Every element of our digital life today, from browser windows to cloud services, from video chat to robotic manufacturing, grew from a 1960s "crazy idea". These stories remind us that the greatest innovations often come from the most unexpected places: from a game, a failed project, a conversation on a flight, or a forest lab in Wisconsin.

The 1960s are not just the past but the foundation of the present and the blueprint for the future.

