Moore’s law is dead… and that’s fine

A passion for computing power: The origins

When I was around six, I used to play a lot on my dad’s phone, a Nokia 3310, and I was addicted to Snake. The game was extremely entertaining, and I spent literal hours trying to finish it (spoiler alert: I never did).

Nokia 3310’s snake game

With no phone or computer of my own, this was the closest I got to what chips and computing could do. It seems silly compared to what we have now, but at the time, it was peak technology.

And that’s how far we’ve come!

I later got the chance to get my hands on better technology and became interested in software development very early on. Every year that passed brought new technology: faster, thinner, better. Much of that progress came down to how much computing power you could fit into a small form factor. That’s where microprocessors come in.

A computing device (a computer, a phone, a tablet) consists of many parts, but some of the main ones are the processor, the battery, and the screen.

To build a thin, light device that is still powerful, you have to keep the processor as small and low-power as possible; otherwise you’ll need a huge battery, or a lot of space for the processor itself.

A transistor

Roughly speaking, a processor is mostly just a bunch of transistors packed together very tightly.

The more transistors you can fit on a chip, the “faster” it will be.

The rise and fall of Moore’s law

What is Moore’s law?

Moore’s Law is named after Intel cofounder Gordon Moore. In 1965, he observed that transistors were shrinking so fast that twice as many could fit onto a chip every year; in 1975, he revised that pace to a doubling every two years.

In other words, the technology around transistors advanced so quickly that they could be shrunk further every year, allowing more of them to fit into processors. Roughly speaking, that meant you could get chips twice as fast every two years.

The limits of Moore’s Law

As an example, the very first Intel chip was the Intel 4004, released in 1971, which held 2,300 transistors with a technology node size of 10 μm (micrometers, 0.000 01 m, 10^-5 m).

Intel 4004 chip

In 1999, Intel announced the Intel Pentium III, a processor counting 9.5 million transistors with a technology node size of 180 nm (nanometers, 0.000 000 180 m, 1.8x10^-7 m).

In just 28 years, that’s roughly a 4,100-fold increase in transistor count. If, following Moore’s law, the number of transistors had doubled every two years, we’d have expected an increase of 2^(28/2) = 16,384, so roughly 16,000-fold.

Moore’s law being more of a rule of thumb, we can consider that it still held at the time (even though there’s a factor of about 4 between the prediction and reality).

But looking at a more recent example: a few days ago, Intel announced that it will launch its Intel Core i9-12900K chip, counting 2.95 billion transistors with a technology node size of 7 nm (nanometers, 0.000 000 007 m, 7x10^-9 m).

We’re talking about a miniaturization of about 1,500 times, and about 1.3 million times more transistors on a single chip.

Intel Core i9–12900K chip

While this is incredibly impressive, there are 50 years between the releases of those two chips. Following Moore’s law naively, we should have gone from 2,300 transistors to 2300 x 2^(50/2) ≈ 77 billion transistors. That’s about 26 times more than what we actually have.
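The back-of-the-envelope numbers above can be checked in a few lines of Python, using the release years and transistor counts cited in this article:

```python
# Transistor counts and release years cited in this article.
t_4004, y_4004 = 2_300, 1971                # Intel 4004
t_piii, y_piii = 9_500_000, 1999            # Intel Pentium III
t_12900k, y_12900k = 2_950_000_000, 2021    # Intel Core i9-12900K

def moores_law(t0, years, period=2):
    """Transistor count predicted by a doubling every `period` years."""
    return t0 * 2 ** (years / period)

# 1971 -> 1999: actual growth vs predicted growth factor.
print(t_piii / t_4004)                 # ≈ 4130 (the "~4,100-fold" figure)
print(moores_law(1, y_piii - y_4004))  # 2^14 = 16384

# 1971 -> 2021: the naive prediction, ≈ 77 billion transistors...
print(moores_law(t_4004, y_12900k - y_4004))
# ...which is about 26x what the i9-12900K actually packs.
print(moores_law(t_4004, y_12900k - y_4004) / t_12900k)
```

Nothing fancy: it just makes the gap between the exponential prediction and reality explicit.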

If this isn’t evidence enough that Moore’s law doesn’t apply anymore for transistors, I don’t know what is.

How physics slows the pace

Moore’s law is no longer relevant

In 2015, Nvidia’s CEO, Jen-Hsun Huang, declared that Moore’s law no longer applied in terms of computing power. At the time, this was a bold move: Moore’s law had been fueling a lot of interest in technology and was seen as the pinnacle of computing development. Even coming from the CEO of Nvidia, such a claim met a lot of pushback. But the facts and numbers don’t lie.

Nvidia’s CEO, Jen-Hsun Huang

There’s actually a very good explanation for this slowdown. When you try to miniaturize transistors and pack as many as possible onto a little piece of silicon, you start running into the fundamental limits of physics (for example, the speed of light, and quantum effects that appear at very small scales).

With our current chip architectures, those limits cannot really be overcome. And even though we talk a lot about rethinking computing, the fundamentals have stayed basically the same for years. Quantum computing and other technologies get a lot of attention, but as of now, none of them can really meet our current computing needs.

The battle for the best quality/size ratio

One of the reasons we invested so much time in miniaturizing computing was to get mobile devices of a small size. Everyone wants a MacBook that is slim and light but still performs extremely well, and that’s where these chips come into play: the enormous computers of the past are no longer an option.

Everyone has an extremely powerful machine in their pocket, but software keeps growing in computing demands. Examples include the transition from 1080p to 4K, or the support of 120 fps compared to the historical 24 fps.

Comparison between 1080p and 4K

What now? Exporting calculations to the cloud

Say hello to cloud computing

But what if, instead of trying to pack as much power as possible into a small device that you carry with you, you could access that power from a distance and only need an “empty shell” of a device, with a screen and input methods?

Streaming software from distant servers

The idea that you can run a lot of calculations on a distant machine is not new. It’s the whole concept behind websites communicating with distant servers that do all the calculations for you.

Most machines connected to the internet right now can stream video; Netflix made sure of this. With a good enough internet connection, your whole OS can run on a distant server and stream the pixels back to you!
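To make the “distant calculations” idea concrete, here’s a toy sketch in Python. A local HTTP server plays the role of the distant machine and does the heavy work, while the “empty shell” client only sends input and receives the small result. The endpoint and the workload are purely illustrative, not Flaneer’s actual protocol:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class ComputeHandler(BaseHTTPRequestHandler):
    """The "distant machine": receives raw input, returns a small result."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        numbers = json.loads(self.rfile.read(length))
        body = json.dumps({"sum_of_squares": sum(n * n for n in numbers)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Start the "server" on any free local port, in the background.
server = HTTPServer(("127.0.0.1", 0), ComputeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "empty shell" client: no heavy lifting, just input and output.
req = Request(
    f"http://127.0.0.1:{server.server_port}",
    data=json.dumps(list(range(1000))).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urlopen(req) as resp:
    print(json.load(resp)["sum_of_squares"])  # computed server-side
server.shutdown()
```

Streaming a whole OS is of course vastly more involved (video encoding, input latency, GPU virtualization), but the division of labor is the same: the heavy computation happens remotely, and only lightweight data travels over the wire.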

No more limitations with Flaneer

That’s actually the whole concept of Flaneer: having incredibly powerful machines, not limited by size, available on any device, be it a laptop, a tablet, or even a phone!

Let’s meet here, to revolutionize the way you use your computer.

(You can find the original article here)


Flaneer lets you run any software in the cloud and use it in a browser.
