jamahir
We’re approaching the limits of computer power – we need new programmers now
John Naughton
The Guardian, 11 January 2020
Way back in the 1960s, Gordon Moore, the co-founder of Intel, observed that the number of transistors that could be fitted on a silicon chip was doubling every two years. Since the transistor count is related to processing power, that meant that computing power was effectively doubling every two years. Thus was born Moore’s law, which for most people working in the computer industry – or at any rate those younger than 40 – has provided the kind of bedrock certainty that Newton’s laws of motion did for mechanical engineers.
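Moore's observation amounts to simple exponential growth: a count that doubles every two years grows by a factor of 2^(t/2) after t years. A minimal sketch of that arithmetic (the function name and the 4004 starting point are illustrative choices, not from the article):

```python
# Moore's law as commonly stated: transistor counts double roughly every
# two years. This is exponential growth with a fixed doubling period.

def projected_transistors(initial: int, years: float,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count forward under a fixed doubling period."""
    return int(initial * 2 ** (years / doubling_period))

# Intel's 4004 (1971) had roughly 2,300 transistors. Twenty years of
# doubling every two years is ten doublings, i.e. a 2**10 = 1,024-fold rise.
print(projected_transistors(2300, 20))  # → 2355200
```

Running the projection over five decades shows why even small deviations from the two-year doubling period compound into enormous differences, which is part of why the law's end matters so much.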
But computing involves a combination of hardware and software and one of the predictable consequences of Moore’s law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there’s a legend that for years afterwards he could recite the entire program by heart.
As hardware grew more powerful, programmers' ambitions – and the size of their projects – grew with it, a dynamic chronicled in Fred Brooks's classic The Mythical Man-Month, which was published in 1975 and has never been out of print, for the very good reason that it's still relevant. And in the process, software became bloated and often inefficient.
As Moore’s law reaches the end of its dominion, the laws of software formulated by Nathan Myhrvold, Microsoft’s former chief technology officer, suggest that we basically have only two options. Either we moderate our ambitions or we go back to writing leaner, more efficient code. In other words, back to the future.
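What "leaner code" means in practice can be shown with a toy contrast (this example is illustrative and not from the article): two functions compute the same result, but one allocates an intermediate list it never needed, while the lean version streams the values with constant extra memory.

```python
# Both functions sum the squares of the first n integers.

def sum_squares_wasteful(n: int) -> int:
    squares = []                  # materialises every square in memory
    for i in range(n):
        squares.append(i * i)
    return sum(squares)

def sum_squares_lean(n: int) -> int:
    # A generator expression feeds sum() one value at a time,
    # so memory use stays constant regardless of n.
    return sum(i * i for i in range(n))

print(sum_squares_lean(10) == sum_squares_wasteful(10))  # → True
```

On small inputs the difference is invisible, which is exactly why cheap hardware let such habits go unnoticed; at scale, the wasteful version's memory footprint grows linearly while the lean one's does not.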
What just happened?
Writer and researcher Dan Wang has a remarkable review of the year in technology on his blog, including an informed, detached perspective on the prospects for Chinese domination of new tech.
Algorithm says no
There’s a provocative essay by Cory Doctorow on the LA Review of Books blog on the innate conservatism of machine-learning.
Fall of the big beasts
“How to lose a monopoly: Microsoft, IBM and antitrust” is a terrific long-view essay about company survival and change by Benedict Evans on his blog.
---
@Hamartia Antidote @RealNapster @fitpOsitive @ps3linux