The 10 most definitive technologies

From a pop culture standpoint, 1975 was a mixed bag: On the one hand, you had Saturday Night Live; on the other, you had the Bay City Rollers and The Six Million Dollar Man. From a technology standpoint, the results were less mixed: You had a couple of kids named Jobs and Wozniak tinkering with circuit boards and parts in a Silicon Valley garage, hoping to turn the computer world on its head. Then there was that Gates guy, messing around with code, dropping out of Harvard and launching a company that would eventually go on to make business history.

In fact, the year marked the start of an exciting new era in electronics and computer technology, in its own way every bit as groundbreaking and revolutionary as the social and artistic turmoil of the 1960s.

So here we are 30 years later. Technologies have come and gone; a few have had staying power, and only a handful can be said to have truly transformed everyday life. We’re focusing on that last group.

Mobile phones

Imagining a world with no mobiles is kind of like thinking about life without cars, TVs or, uh, interruptions while you’re eating. But in 1975, unless you were lucky enough to be inside a vehicle with a rare and expensive car phone, phone chatting was generally limited to the home, the office or a pay phone. “The [mobile] is a technology that meshed perfectly with public demand,” says Alan Nogee, a principal analyst at technology research firm In-Stat. “People like to talk, and mobiles let them do it from anywhere.”

The first mobile call took place on 3 April 1973, when Martin Cooper, then general manager of Motorola’s communications systems division, dialled up a rival at AT&T’s Bell Labs. By the time commercial mobile service began, in 1983, Cooper’s shoebox-like 30-ounce (850 g) phone had morphed into the 16-ounce (453 g) Motorola DynaTAC (priced at US$3,500). Within seven years, there were a million subscribers in the United States. Today there are more mobile subscribers in the world than wireline phone subscribers, and mobile phones weigh as little as 3 ounces (85 g). “It’s not unheard of, in certain developing countries, for people to pay up to 20 percent of their yearly income in order to have a mobile phone,” says Nogee.

Today, the arrival of 3G broadband networks is transforming the mobile phone from a voice-only device into a multimedia handset that lets users send and receive photos and videos and take advantage of an array of high-speed data-oriented services. This trend is, in turn, opening new sales opportunities for electronic component vendors, ranging from processor makers to display and audio subsystems manufacturers.


CMOS technology

Fairchild Semiconductor’s Frank Wanlass invented complementary-MOS (CMOS) technology. In the early 1960s, Wanlass realised that a complementary circuit of negative-channel MOS (NMOS) and positive-channel MOS (PMOS) would draw very little current. After some tinkering, he was able to create a CMOS device that shrank standby power by six orders of magnitude, compared to equivalent bipolar logic gates. On 5 December 1967, Wanlass was issued US Patent 3,356,858 for “Low Stand-By Power Complementary Field Effect Circuitry”.
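
To see the scale of that six-orders-of-magnitude claim, here is a minimal back-of-the-envelope sketch in Python. The standby currents are assumed, illustrative values rather than measured device figures:

```python
# Illustrative standby-power comparison between a bipolar logic gate and a
# CMOS gate. The currents are assumptions chosen to show the rough scale of
# Wanlass' "six orders of magnitude" claim, not measured device data.

SUPPLY_V = 5.0                   # supply voltage (volts)

bipolar_standby_current = 1e-3   # ~1 mA of bias current per gate (assumed)
cmos_standby_current = 1e-9      # ~1 nA of leakage per gate (assumed)

bipolar_power = SUPPLY_V * bipolar_standby_current
cmos_power = SUPPLY_V * cmos_standby_current

print(f"Bipolar gate standby power: {bipolar_power:.1e} W")
print(f"CMOS gate standby power:    {cmos_power:.1e} W")
print(f"Reduction: {bipolar_power / cmos_power:.0e}x")  # ~1e6, six orders of magnitude
```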

Yet Wanlass’ invention, for all its potential, would have to wait a few years to become truly significant. The technology revolution that began in the mid-1970s required chips that were both powerful and power-thrifty. CMOS ICs provided the ideal solution. “The digital watch was actually the first important application of CMOS technology,” says Nathan Brookwood, principal analyst at Insight64, a semiconductor industry research firm. Today CMOS forms the basis of the vast majority of high-density ICs manufactured. The technology is found in almost every electronic product, from handheld devices to mainframes.

Because they consume less power, CMOS devices also operate at lower temperatures than comparable bipolar devices. “CMOS allows high-density chips the size of a postage stamp to be cooled without exotic techniques,” notes Brookwood. “Without low-power CMOS technology, microprocessors would run so hot that they would literally melt.”

Although it’s a low-power technology, CMOS still generates some heat. Therefore, as densities increase, chip makers are facing the same problems they experienced with bipolar devices, says Len Jelinek, a principal analyst at research firm iSuppli. “Although CMOS hasn’t reached the end of its useful life by a long shot,” he says, “there is a need to begin looking toward new materials, such as strained silicon, which will take us to the next level of power and heat conservation.”

Configurable ASICs

As the tech revolution drove processor use to new heights, a need arose for devices that were pre-configured for specific uses. Such a “hardwired” technology (as opposed to adaptable general-purpose microprocessors) would boost performance and cut costs, by eliminating needless fetching, interpreting and other types of operational overhead. Available by the early 1980s, application-specific integrated circuits (ASICs) enabled the manufacturers of products ranging from mobile phones to video game consoles to order chips that were custom-tailored to their unique needs.

After a good, strong run, traditional ASIC technology gave way in the early 21st century to a new, improved version: the configurable ASIC. Like its predecessor, a configurable ASIC provides an architecture that’s tuned to a specific application, delivering high performance at low cost. With a configurable design, however, the chip’s structure can be changed by reconnecting its function units in different ways. In other words, new data path structures can be formed simply by changing how the data flows from one unit to another. Data path control can also be easily reprogrammed. “You have the ASIC’s advantages, but it can be reconfigured by the customers so that it has some of the advantage of a processor,” says Alan Varghese, a research director at technology research firm Allied Business Intelligence.
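
As a rough illustration, the Python sketch below models a data path as a few fixed function units wired together by a routing table; the unit names and the two configurations are hypothetical, chosen only to show how changing the routing changes the data flow:

```python
# A toy model of a configurable ASIC data path: fixed function units whose
# interconnection is described by a small routing table. Changing the table
# re-forms the data path without changing the hardware model itself.
# The unit names and configurations are hypothetical, for illustration only.

FUNCTION_UNITS = {
    "scale":  lambda x: x * 4,
    "offset": lambda x: x + 7,
    "clip":   lambda x: max(0, min(x, 255)),
}

def run_datapath(sample, routing):
    """Push a sample through the function units in the order the routing gives."""
    for unit_name in routing:
        sample = FUNCTION_UNITS[unit_name](sample)
    return sample

# Configuration A: scale, then offset, then clip.
config_a = ["scale", "offset", "clip"]
# Reconfigured data path B: same units, different data flow.
config_b = ["offset", "clip", "scale"]

print(run_datapath(60, config_a))   # 247
print(run_datapath(60, config_b))   # 268
```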

A configurable ASIC’s architecture determines just how configurable it is. If only slight modifications are needed, the device can be designed to have only minimal configurability and little increased cost. If wide flexibility is needed, the ASIC can be designed to be much more configurable, albeit at a higher cost.

Aditya Prasad, an analyst at technology research company Frost & Sullivan, is optimistic that ASIC technology can evolve to meet future needs. “The growth in end user markets will keep ASICs in contention,” he says.

EDA Software

The need in the 1970s to create increasingly complex circuitry threatened to outstrip the ability of engineers to “handcraft” designs. Fortunately, technology itself provided an answer to this problem, in the form of electronic design automation (EDA) tools.

EDA software uses computers to create, verify and simulate the performance of electronic circuits on either a chip or a PCB. EDA is perhaps the most significant of all the technologies created over the last three decades, for without EDA it would be impossible to design the highly dense, intricate circuits required by modern electronic devices. “It’s like the way writing has progressed from pen-and-ink to PCs,” says Insight64’s Brookwood.

EDA has moved through several different eras, including gate-level tools and register transfer level (RTL) technology. “Today we are moving up to the electronic system level,” says Gary Smith, chief EDA analyst at technology research firm Gartner. “That’s where electronic hardware and software are designed concurrently.”
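
To make the gate-level end of that progression concrete, here is a minimal Python sketch of the sort of netlist evaluation an EDA simulator performs. The half-adder netlist and signal names are illustrative and not taken from any particular tool:

```python
# A minimal sketch of gate-level simulation: a netlist is a list of
# (gate, output, inputs) tuples that is evaluated for a given set of
# input values. The half-adder netlist here is illustrative.

GATES = {
    "AND": lambda a, b: a & b,
    "XOR": lambda a, b: a ^ b,
}

# Half adder: sum = a XOR b, carry = a AND b.
NETLIST = [
    ("XOR", "sum",   ("a", "b")),
    ("AND", "carry", ("a", "b")),
]

def simulate(netlist, inputs):
    signals = dict(inputs)
    for gate, out, ins in netlist:
        signals[out] = GATES[gate](*(signals[name] for name in ins))
    return signals

for a in (0, 1):
    for b in (0, 1):
        result = simulate(NETLIST, {"a": a, "b": b})
        print(f"a={a} b={b} -> sum={result['sum']} carry={result['carry']}")
```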

To say that EDA is essential would be to understate the technology’s importance. “All products start with a design,” says Smith. “The electronics market would not exist without EDA tools.”


FPGAs

Although the configurable computing concept was first proposed in the 1960s, the first commercially available FPGA didn’t appear until 1985. Providing cells of logic blocks connected via software-configurable interconnections, an FPGA is essentially a blank slate on which a programmer can build a chip on the fly. Early FPGAs contained only a few logic cells and took whole seconds to reconfigure. Today, an FPGA can contain hundreds of thousands of cells and can be reconfigured in microseconds.
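
Here is a minimal Python sketch of what that blank slate looks like in practice, assuming a simplified two-input lookup-table (LUT) cell; real FPGA cells, routing fabrics and bitstream formats are far more elaborate:

```python
# A sketch of what "software-configurable" means in an FPGA: each logic cell
# is a small lookup table (LUT) whose contents are loaded from a bitstream.
# Programming the same cell with different truth tables changes its function.
# The 2-input LUT and the truth tables below are simplified illustrations.

class LogicCell:
    def __init__(self, truth_table):
        # truth_table holds the output for inputs (0,0), (0,1), (1,0), (1,1)
        self.truth_table = truth_table

    def evaluate(self, a, b):
        return self.truth_table[(a << 1) | b]

# The "bitstream" configures the cell as an AND gate...
and_cell = LogicCell([0, 0, 0, 1])
# ...and reconfiguring the same kind of cell turns it into an XOR gate.
xor_cell = LogicCell([0, 1, 1, 0])

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", and_cell.evaluate(a, b), "XOR:", xor_cell.evaluate(a, b))
```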

The FPGA’s evolution has followed the general principle of Moore’s Law: The number of logic cells embeddable in a chip has doubled every 18 months. Soon FPGAs should have millions of logic cells, making them incredibly powerful and inexpensive for large-scale computational tasks. Rather than relying on multiprocessing systems, FPGA-based systems can be reconfigured on the fly to perform extremely complex algorithms. FPGAs also provide integrated chip emulation, which speeds up the process of producing ASICs.
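
A quick back-of-the-envelope check on that doubling rate, starting from an assumed figure of 500,000 logic cells:

```python
# A rough check on the 18-month doubling claim: starting from an assumed
# 500,000 logic cells, how many cells would an FPGA hold after a few more
# doubling periods? The starting figure is illustrative.

cells = 500_000
for months in range(18, 91, 18):          # five doubling periods (7.5 years)
    cells *= 2
    print(f"After {months} months: ~{cells:,} logic cells")
```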

All that is required to improve FPGA performance is vendor-developed software, says Prasad. “FPGAs also have a much shorter time to market than ASICs,” he says.


Photolithography

Photolithography—a process akin to developing film in a darkroom—dates back to the early 19th century and the roots of photography. In chip making, photolithography transfers a pattern image from a photomask (which is comparable to a photographic negative) to the surface of a wafer substrate. In the mid-1970s, the combination of projection printing and positive photoresist technology revolutionised photolithography in chip production. Without mask/wafer contact, defect rates were dramatically lowered and yields improved substantially.

In 1979, GCA developed a step-and-repeat photolithography technology for wafer exposure that allowed ever finer resolutions and paved the way for line-width shrinks. This breakthrough, along with the move from visible light to ultraviolet light and ever shorter wavelengths, launched the rapid move toward increasingly finer process technologies that continues to this day. “Think about the problem of printing a billion points on a piece of silicon that is less than an inch on each side,” says Insight64’s Brookwood. “Photolithography is one of the key mechanisms that’s used to keep Moore’s Law on track.”
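
The step-and-repeat idea can be sketched in a few lines of Python: expose one field, move the wafer by the field pitch, repeat. The 150 mm wafer and 20 mm field below are assumed round numbers, not figures from GCA’s equipment:

```python
# A sketch of a step-and-repeat exposure: the stepper exposes one reticle
# field, moves the wafer by the field pitch, and repeats across the wafer.
# Wafer diameter and field size are assumed round numbers.

WAFER_DIAMETER_MM = 150.0     # assumed wafer size
FIELD_MM = 20.0               # assumed exposure field (reticle image) size

radius = WAFER_DIAMETER_MM / 2
shots = []
y = -radius
while y < radius:
    x = -radius
    while x < radius:
        # Keep a field only if its centre lands on the wafer.
        cx, cy = x + FIELD_MM / 2, y + FIELD_MM / 2
        if cx * cx + cy * cy <= radius * radius:
            shots.append((cx, cy))
        x += FIELD_MM
    y += FIELD_MM

print(f"{len(shots)} exposure fields per wafer pass")
```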

RFID technology

During World War II, the British had a desperate need to distinguish their own returning aircraft, flying in across the English Channel, from those of the enemy. A radio transponder was installed on Allied aircraft to give an affirmative response to an interrogating “identification, friend or foe” (IFF) signal, and RFID technology was born.
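
The exchange is simple enough to sketch in Python. The tag identifiers and the "IDENTIFY" query are made up for illustration and are not drawn from any real air-interface standard:

```python
# A toy version of the interrogate/respond exchange that IFF transponders and
# modern RFID tags both rely on: the reader broadcasts a query, and any tag
# that hears it answers with its identifier. Tag IDs are invented examples.

class Tag:
    def __init__(self, tag_id):
        self.tag_id = tag_id

    def respond(self, interrogation):
        if interrogation == "IDENTIFY":
            return self.tag_id
        return None

class Reader:
    def interrogate(self, tags_in_range):
        # Collect every non-empty reply from tags within range.
        return [reply for tag in tags_in_range
                if (reply := tag.respond("IDENTIFY")) is not None]

pallet_tags = [Tag("PALLET-0001"), Tag("PALLET-0002"), Tag("CASE-A17")]
print(Reader().interrogate(pallet_tags))
```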

RFID remained primarily an aviation-oriented technology until the 1980s. That’s when microelectronics shrank tag sizes and prices to the point where the tags could be economically used on industrial assembly lines. Although early RFID technology tracked mostly large assets, such as production line carriers and motor vehicles, prices had fallen low enough by the early 21st century that tags could be embedded directly into shipping pallets and even individual products. The technology is also beginning to be used to track people, such as children and hospital patients, sometimes to the consternation of privacy watchdogs.

Apart from improved technology, a concerted move toward standardisation has helped bring RFID into the mainstream. “Organisations ranging from MIT’s Auto-ID Center to leading supply chain players such as Wal-Mart have played a powerful role in benchmarking and popularising RFID,” says Marcus Torchia, a senior wireless analyst at the research firm Yankee Group. “RFID is well on its way toward becoming as ubiquitous as bar code technology.”

RISC processors

During the mid-1970s, improved performance measurement tools showed that the execution of most applications on then-dominant complex instruction set computer (CISC)-based systems was dominated by a few simple instructions. The vast majority of instructions, as it turned out, were seldom used. In short order, researchers realised that compilers could be used to generate software routines to perform the complex instructions that were then implemented in hardware on CISC machines.
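
A toy Python sketch of that compiler-side expansion, using a hypothetical MULADD complex instruction and made-up register names:

```python
# A sketch of the insight behind RISC: a complex, rarely used instruction can
# be expanded by the compiler into a short sequence of simple instructions
# instead of being wired into the hardware. The three-operand "MULADD"
# instruction and the register names here are hypothetical.

def expand_complex_op(op, dst, src1, src2):
    """Expand a complex instruction into simple RISC-style instructions."""
    if op == "MULADD":                       # dst = dst + src1 * src2
        return [
            ("MUL", "tmp", src1, src2),      # simple multiply
            ("ADD", dst, dst, "tmp"),        # simple add
        ]
    return [(op, dst, src1, src2)]           # already simple: pass through

for instr in expand_complex_op("MULADD", "r1", "r2", "r3"):
    print(instr)
```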

In October 1975, a project was launched at IBM’s Watson Research Center that, four years later, gave birth to a 32-bit RISC microprocessor. As planned, the device turned out to be faster than its CISC counterparts and could be designed and built more economically. These advantages eventually led to a revolution in computer performance and cost. “Even though the original IBM 801 processor was never used in a production system, it did become the basis for IBM’s later Power and PowerPC RISC processors,” says Linley Gwennap, principal analyst at research firm The Linley Group.

Despite its advantages, RISC has made only limited headway in the Windows-oriented PC/server markets, where, due to software compatibility issues, Intel’s x86 platform reigns supreme. Yet this situation is slowly changing. Most newer PC processors, beginning with Intel’s P6 microarchitecture, essentially function as RISC devices that emulate a CISC architecture. RISC’s rising dominance in other computer markets, however, continues unabated. The technology is now the overwhelming leader in workstations, mobile phones and video games.

Surface-mount technology

Before the late 1970s, electronic components such as ICs, transistors and resistors were designed for installation by people rather than machines. The arrival of surface-mount technology flipped things around. Surface-mount components, which typically look like little specks of metal, are mechanically designed with small metal tabs that are soldered directly to the surface of a PCB (unlike conventional components, which typically have wire leads that are inserted through board holes and soldered onto the opposite side). For this reason, surface-mount components are usually made as small and lightweight as possible. In fact, surface-mount components are a quarter to a tenth of the size and weight, and half to a quarter of the cost, of wire-mounted parts.
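
Taking those ratios at face value, a short Python sketch shows what they imply for a single part; the through-hole starting figures are assumed purely for illustration:

```python
# Applying the size, weight and cost ranges quoted in the text to an assumed
# through-hole resistor. The through-hole figures are illustrative only.

through_hole = {"size_mm3": 60.0, "weight_g": 0.30, "cost_cents": 2.0}

smt_size = (through_hole["size_mm3"] / 10, through_hole["size_mm3"] / 4)
smt_weight = (through_hole["weight_g"] / 10, through_hole["weight_g"] / 4)
smt_cost = (through_hole["cost_cents"] / 4, through_hole["cost_cents"] / 2)

print(f"Surface-mount size:   {smt_size[0]:.1f}-{smt_size[1]:.1f} mm^3")
print(f"Surface-mount weight: {smt_weight[0]:.3f}-{smt_weight[1]:.3f} g")
print(f"Surface-mount cost:   {smt_cost[0]:.1f}-{smt_cost[1]:.1f} cents")
```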

Surface-mount technology arrived just in time for the great move toward product miniaturisation. Besides trimming assembly costs, surface-mount devices have allowed engineers to cram more technology into smaller form factors. Imagine, for example, trying to make an iPod with conventional components; it would look like a boom box.

Surface-mount technology revolutionised circuit board assembly, in much the same way ICs changed discrete component technology, says iSuppli’s Jelinek. “It opened the opportunity to create all the handheld devices we have today, simply because we’ve been able to shrink the packages.”

802.11 wireless networks

A few years ago, a “hot spot” was an exciting restaurant or nightclub, not a coffee shop or a hotel room. The arrival of 802.11 wireless local-area network technology changed all that. “There was wireless LAN technology long before 802.11, but this was the standard the industry really grouped around, opening up the market,” says In-Stat’s Alan Nogee.

The IEEE 802.11b standard, ratified in late 1999, allowed data to fly through the air at Ethernet speeds: up to 11 Mbit/s (although real-world speeds were often half that, due to various forms of electrical and physical interference). In early 2000, major networking vendors such as Cisco Systems and Lucent Technologies, plus a host of smaller companies, quickly jumped on 802.11b.
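
Using those figures, here is a quick Python estimate of what that halving means for a file transfer; the 10 MB file size is just an example:

```python
# A rough transfer-time comparison using the figures above: 11 Mbit/s nominal
# versus roughly half that in practice. The file size is an example value.

FILE_MB = 10                               # example file size in megabytes
nominal_mbit_s = 11.0
real_world_mbit_s = nominal_mbit_s / 2     # "often half that"

file_bits = FILE_MB * 8 * 1_000_000
print(f"Nominal:    {file_bits / (nominal_mbit_s * 1e6):.1f} s")
print(f"Real world: {file_bits / (real_world_mbit_s * 1e6):.1f} s")
```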

Equipment prices subsequently fell, and popularity soared. Newer 802.11 standards, such as the 54 Mbit/s 802.11g, have boosted data throughput and reliability. The 108 Mbit/s 802.11n specification is currently awaiting final approval, which may occur as soon as next year.

The future of 802.11 appears almost limitless. Although most current 802.11 devices are either desktop or laptop PCs, the technology is rapidly spreading out to IP telephones, PBXes, media players, televisions and other products that can utilise text, audio or video data.

“Wireless LANs will be a utility, just like electricity, gas or phone service,” predicts Nogee. “Manufacturers will build all types of products, assuming that customers have wireless Internet access in their home or office.”
