I was reading an article on PC Mag online about Apple starting mass production of its M2 chip.
What caught my eye was this particular phrase and the measurement in it: the M2 will be a 5nm processor.
'nm' stands for nanometre: one nm is one billionth of a metre. It is the unit used internationally to measure things like computer processing chips, atoms and DNA!
For an idea of scale, see the picture below:
Think about it... an atom can be 0.1 to 0.5nm across, and a chip is built from atoms. So if each atom were 0.5nm, a 5nm feature would be only about 10 atoms across. Wow!
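If you want to sanity-check that arithmetic yourself, here's a quick back-of-the-envelope sketch in Python. The atom sizes are just the rough figures quoted above, not precise values:

```python
# Back-of-the-envelope scale check: how many atoms span a 5nm feature?
feature_nm = 5.0       # the "5nm" figure from the article
atom_large_nm = 0.5    # a larger atom, roughly 0.5 nm across
atom_small_nm = 0.1    # a smaller atom, roughly 0.1 nm across

print(feature_nm / atom_large_nm)   # 10.0 -> about 10 large atoms across
print(feature_nm / atom_small_nm)   # 50.0 -> about 50 small atoms across

# And the unit itself: one nanometre is a billionth of a metre,
# so there are a billion nanometres in a metre.
print(1 / 1e-9)  # 1000000000.0
```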
Back on 27 September 2011, an article was published called 'Is 14nm the end of the road for silicon chips?' Obviously, we now know it isn't, and that was 10 years ago.
Technology has progressed in leaps and bounds, and what was amazing only a few years ago is now seen as ancient!
Will things ever stop getting smaller? I reckon miniaturisation is finite. After all, the smaller something is, the less scope you have to make it complex, and there is only so much detail you can pack into something so small.
An excerpt from an article published on 5 June 2020 by the Massachusetts Institute of Technology confirms my thinking with its in-depth look at the evolution of computer hardware:
'In 1965, Intel co-founder Gordon Moore predicted that the number of transistors that could fit on a computer chip would grow exponentially — and they did, doubling about every two years. For half a century Moore's Law has endured: computers have gotten smaller, faster, cheaper and more efficient, enabling the rapid worldwide adoption of PCs, smartphones, high-speed Internet and more.
This miniaturization trend has led to silicon chips today that have almost unimaginably small circuitry. Transistors, the tiny switches that implement computer microprocessors, are so small that 1000 of them laid end-to-end are no wider than a human hair. For a long time, the smaller the transistors were, the faster they could switch.
But today, we're approaching the limit of how small transistors can get. As a result, over the last decade researchers have been scratching their heads to find other ways to improve performance so that the computer industry can continue to innovate.'
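To put that 'doubling about every two years' into numbers, here's a quick illustrative sketch in Python. The 1965 starting count is a made-up round number (not Intel's actual figure), and in reality the doubling rate has slowed in recent years:

```python
# Illustrative Moore's Law curve: transistor count doubling every two years.
# The 1965 starting count is a made-up round number, not Intel's actual figure.
start_year, start_count = 1965, 2_000

for year in range(1965, 2026, 10):
    doublings = (year - start_year) // 2   # one doubling every two years
    print(f"{year}: ~{start_count * 2 ** doublings:,} transistors")

# The excerpt's hair comparison: a human hair is very roughly 75,000 nm wide,
# so 1,000 transistors laid end-to-end implies a pitch of about 75 nm each.
print(75_000 / 1_000)  # 75.0
```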
The article goes on to explain why innovation must occur, since the progress of machine learning, virtual reality and robotics 'will require huge amounts of computational power that miniaturization can no longer provide'.
Humans have always been creative, adaptive, and innovative. I'm sure some mad brain out there will figure out a way, and computer technology will continue on apace!