We’re approaching the limits of computer power – we need new programmers now


John Naughton
The Guardian, 11 January 2020

Way back in the 1960s, Gordon Moore, the co-founder of Intel, observed that the number of transistors that could be fitted on a silicon chip was doubling every two years. Since the transistor count is related to processing power, that meant that computing power was effectively doubling every two years. Thus was born Moore’s law, which for most people working in the computer industry – or at any rate those younger than 40 – has provided the kind of bedrock certainty that Newton’s laws of motion did for mechanical engineers.
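To make the doubling concrete, here is a small back-of-the-envelope sketch (mine, not from the article; the 1971 starting figure of about 2,300 transistors, roughly the Intel 4004, is an assumption used only for illustration):

    # Back-of-the-envelope illustration of Moore's law: transistor counts
    # doubling every two years from an assumed 1971 starting point.
    start_year, start_count = 1971, 2_300   # roughly the Intel 4004

    for year in range(1971, 2022, 10):
        doublings = (year - start_year) / 2          # one doubling every two years
        count = start_count * 2 ** doublings
        print(f"{year}: ~{count:,.0f} transistors")

    # Twenty years of doubling every two years is 2**10, i.e. a ~1,000x increase.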

But computing involves a combination of hardware and software and one of the predictable consequences of Moore’s law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there’s a legend that for years afterwards he could recite the entire program by heart.

As the hardware got more powerful, software projects grew bigger and more complex, a story Fred Brooks told in The Mythical Man-Month, which was published in 1975 and has never been out of print, for the very good reason that it’s still relevant. And in the process, software became bloated and often inefficient.

As Moore’s law reaches the end of its dominion, Myhrvold’s laws of software (Nathan Myhrvold, once Microsoft’s chief technology officer, observed that software expands like a gas to fill the hardware available to it) suggest that we basically have only two options. Either we moderate our ambitions or we go back to writing leaner, more efficient code. In other words, back to the future.
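By way of illustration (again mine, not the article's), here is a minimal, hypothetical Python sketch of what "leaner" can mean in practice: both functions below return the same answer, but the first scans a list for every lookup while the second uses a set and does the same job with far less work on large inputs.

    # Hypothetical illustration of lean vs. wasteful code: same result,
    # very different amounts of work once the inputs get large.

    def count_hits_wasteful(events, watchlist):
        # list membership is a linear scan: O(len(events) * len(watchlist))
        watch = list(watchlist)
        return sum(1 for e in events if e in watch)

    def count_hits_lean(events, watchlist):
        # set membership is a hash lookup: roughly O(len(events))
        watch = set(watchlist)
        return sum(1 for e in events if e in watch)

    if __name__ == "__main__":
        events = [f"user{i % 5000}" for i in range(20_000)]
        watchlist = [f"user{i}" for i in range(0, 5000, 7)]
        assert count_hits_wasteful(events, watchlist) == count_hits_lean(events, watchlist)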

What just happened?
Writer and researcher Dan Wang has a remarkable review of the year in technology on his blog, including an informed, detached perspective on the prospects for Chinese domination of new tech.

Algorithm says no
There’s a provocative essay by Cory Doctorow on the LA Review of Books blog on the innate conservatism of machine-learning.

Fall of the big beasts
“How to lose a monopoly: Microsoft, IBM and antitrust” is a terrific long-view essay about company survival and change by Benedict Evans on his blog.

---

@Hamartia Antidote @RealNapster @fitpOsitive @ps3linux
 
---
Good programmers are still writing efficient code, even with 4GLs.

Progress has been too fast. We can easily slow down and save the planet as well.
 
---

I am reminded of two things by your post:

1. Programming in user-space can be done graphically on the front end, and the graphical compiler can in turn convert the graphic forms into efficient code (a rough sketch of the idea follows below). This would remove the need to learn any language.

2. Certain things in GUIs are not required. For example, zooming in on a picture when the mouse passes over it; instead, just the border can be highlighted. That would reduce some code, I believe.
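A very rough sketch of the idea in point 1, assuming a hypothetical drag-and-drop front end that emits a simple pipeline description, which a small "graphical compiler" then turns into ordinary code (all names here are made up for illustration):

    # Hypothetical sketch: a graphical front end is assumed to emit a
    # simple pipeline description like the one below; a small "graphical
    # compiler" then turns it into an ordinary Python function.

    pipeline = [                       # what the user "drew", not typed
        {"op": "multiply", "by": 2},
        {"op": "add",      "by": 5},
    ]

    def compile_pipeline(steps):
        ops = {
            "multiply": lambda by: (lambda x: x * by),
            "add":      lambda by: (lambda x: x + by),
        }
        funcs = [ops[s["op"]](s["by"]) for s in steps]

        def run(x):
            for f in funcs:
                x = f(x)
            return x
        return run

    run = compile_pipeline(pipeline)
    print(run(10))    # (10 * 2) + 5 -> 25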
 
---
From hand-operated plugs to relays, people worked their way to the concept of the transistor. The main concept was switching, switching at electronic speed. Programming? It's just how you use that switching. Software, at its core, is just controlling hardware.
Now, how much speed do we need? A bullet travels at a certain speed, and that is sufficient for most current needs. Speed is not always the motive; it's like salt: sometimes it's needed, other times not.
The next step in the speed dimension will take place in the materials arena first, then in programming. The Fraunhofer Institute is on its way on this; they are way ahead in light-based processors.
 
---
The Fraunhofer Institute is on its way on this; they are way ahead in light-based processors.

I always had a question in mind about light-based processors.

As I understand it, light generated by LEDs passes through holes to light sensors, and the switching on and off is supposed to represent 1s and 0s.

My question is: won't the heat generated by the LEDs deform the holes and prevent the correct transmission of light to the sensors?
 
---
Actually, I once read about this subject but have totally forgotten it now.
Mainly photo-injection and photodetector diodes are used, but how exactly they work, I don't know.
 
---
I am reminded of two things by your post:

1. Programming in user-space can be done graphically on the front end, and the graphical compiler can in turn convert the graphic forms into efficient code. This would remove the need to learn any language.

2. Certain things in GUIs are not required. For example, zooming in on a picture when the mouse passes over it; instead, just the border can be highlighted. That would reduce some code, I believe.
When VB first arrived, it was said that now a clerk would be able to write programs. Yet it is 2020 and we still need developers. It will not happen any time soon, because technology is always changing and hence so are the applications.

Such applications do exist now; they are called workflows. There are also good tools for things such as BDD. But ultimately you have to write code, and a lot of code.
So humans will always still be needed. And if they are not needed, they will destroy those companies because they will have nothing to do :)
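To illustrate that point about workflow and BDD tools bottoming out in hand-written code, here is a minimal, made-up sketch (not any real BDD framework): the human-readable scenario is only a thin layer over step functions that a developer still has to write.

    # Made-up sketch of the BDD idea: the scenario text is readable by a
    # clerk, but every step still maps to code a developer had to write.

    scenario = [                      # the layer a non-programmer reads
        "the cart contains 2 items priced 100",
        "a 10 percent discount is applied",
        "the total should be 180",
    ]

    # Hand-written step implementations: the part that never goes away.
    def given_cart(state, count, price):
        state["total"] = count * price

    def when_discount(state, percent):
        state["total"] *= (100 - percent) / 100

    def then_total(state, expected):
        assert state["total"] == expected, state["total"]

    state = {}
    given_cart(state, 2, 100)
    when_discount(state, 10)
    then_total(state, 180)
    print("scenario passed")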
 
---

Supposedly there will be a day when all code generation is automated. Unfortunately, before that can happen, the language the computer uses has to be bug-free, or the result won't match what the automated code writer expects.

So now the language has to be fixed. Unfortunately, even that has bugs where methods are not as precise as expected. This is where you draw up a schematic of how the language SHOULD work, but when implementing it you realize that in certain cases the schematic fails, and it turns into a pile of edge-case footnotes appended here and there. That would frustrate a logic-based computer trying to write code.
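One small, real example of the kind of edge-case footnote being described (my illustration, not from the post): Python 3's built-in round() does not follow the naive "halves always round up" schematic.

    # A real edge case of the kind described above: Python 3's round()
    # rounds halves to the nearest even number ("banker's rounding"),
    # not always upward as a naive schematic might assume.
    print(round(2.5))        # 2, not 3
    print(round(3.5))        # 4
    print(round(0.5))        # 0, not 1

    # Binary floating point adds its own footnote: 2.675 is stored as a
    # value slightly below 2.675, so rounding to two places gives 2.67.
    print(round(2.675, 2))   # 2.67, not 2.68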

Just think of it in terms of the look and feel of your desktop, and how much of it is similar to previous Windows versions, with this and that removed and this and that added.
 