DARPA creates first 1THz computer chip, earns Guinness World Record

Mujahid Memon

By Sebastian Anthony on October 31, 2014 at 10:50 am

DARPA, the US military’s R&D division, has been awarded a Guinness World Record for creating the world’s fastest solid-state chip, clocked at one terahertz — or 1,000 gigahertz, if that’s easier to digest. DARPA’s chip handily beats out the previous record holder, which was only capable of a paltry 850GHz. Computers and radio systems that operate up in the terahertz range have some very interesting and powerful properties, from the creation of hand-held tricorders and security scanners, through to wireless networks that are hundreds of times faster than 2.4 and 5GHz WiFi.

DARPA has been interested in terahertz chips for a long while now — not so much for super-fast computers (though they are being looked at), but for creating radio waves in the sub-millimeter-wave terahertz range. These waves (sometimes called T-rays), as far as the military is concerned, are a prime way of supercharging everything from radar, to imaging and reconnaissance, to ultra-high-bandwidth wireless networks. Basically, because the wavelengths of T-rays are so short (less than a millimeter), they can provide much higher resolution than, say, conventional radar, which uses frequencies with much longer wavelengths (anywhere from 100 meters down to a few centimeters). The high frequency of T-rays makes them very good for carrying large amounts of data, too — and, as an added bonus, they're very good at safely penetrating a few millimeters of skin (yes, airports are interested in terahertz security scanners).
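To put "sub-millimeter-wave" into perspective: wavelength is just the speed of light divided by frequency (λ = c/f). A quick back-of-the-envelope check in Python, using the WiFi bands mentioned above as reference points:

```python
# wavelength = speed of light / frequency
c = 299_792_458  # speed of light, in m/s

for label, freq_hz in [
    ("2.4GHz WiFi", 2.4e9),
    ("5GHz WiFi", 5.0e9),
    ("1THz (this chip)", 1.0e12),
]:
    print(f"{label}: wavelength = {c / freq_hz * 1000:.2f} mm")

# 2.4GHz WiFi: wavelength = 124.91 mm
# 5GHz WiFi: wavelength = 59.96 mm
# 1THz (this chip): wavelength = 0.30 mm  <- comfortably sub-millimeter
```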
A top-down view of a solid-state terahertz amplifier. In this case, this is the previous record holder, which hit 850GHz. (I can't find any imagery of the new chip, except for the awful image below.)

Read: How terahertz laser scanners will spy on you in airports

As you can imagine, though, it's hard to build a chip that's capable of switching on and off one trillion times per second. That seems to be the main breakthrough here: DARPA, working with Northrop Grumman, has built a 10-stage monolithic amplifier using fairly standard CMOS processes. Exact details aren't available, but I believe the chip is fabricated out of indium phosphide (InP), which is capable of switching at much higher frequencies (and at higher power levels) than common semiconductors such as silicon or gallium arsenide.
Official PR image for DARPA's World Record 1THz chip. Yup.

The DARPA/Northrop chip reportedly has excellent properties, with a gain of 9dB at 1THz and 10dB at 1.03THz. "Gains of six decibels or more start to move this research from the laboratory bench to practical applications — nine decibels of gain is unheard of at terahertz frequencies," says Dev Palmer, DARPA's THz Electronics program manager. "This opens up new possibilities for building terahertz radio circuits."
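For readers who don't think in decibels: gain in dB is a logarithmic power ratio, so the quoted figures convert to linear gains via 10^(dB/10). A quick check in Python, using only the numbers quoted above:

```python
# decibels express a power ratio: gain_dB = 10 * log10(P_out / P_in),
# so the linear power gain is 10 ** (gain_dB / 10)
def db_to_power_ratio(gain_db: float) -> float:
    return 10 ** (gain_db / 10)

print(f"{db_to_power_ratio(6):.2f}x")   # 3.98x  -- Palmer's "practical applications" threshold
print(f"{db_to_power_ratio(9):.2f}x")   # 7.94x  -- the chip's gain at 1THz
print(f"{db_to_power_ratio(10):.2f}x")  # 10.00x -- the chip's gain at 1.03THz
```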

A solid-state amplifier is just one piece of the terahertz puzzle, of course. To actually create usable T-rays, you need a complete transceiver and antenna — and I don't think DARPA is quite there yet. Still, the creation of transistors that are capable of switching at 1,000GHz is exciting. This doesn't mean that you're going to magically start seeing computers that operate in the 1THz range — the power requirements and heat dissipation would be utterly insane — but we can at least begin to sketch a roadmap towards a future where everything from computers, to networks, to surveillance and medical imaging is supercharged way beyond what is currently possible.


DARPA creates first 1THz computer chip, earns Guinness World Record | ExtremeTech
 
so, this terahertz "computer chip" is actually a radio-wave generator... it has no use beyond that... as for actual microprocessors, the future is not at terahertz but at zero hertz... clock-less processor... which my socialist movement is involved in designing at the moment.
 
Are you an Electrical/Electronic Engineer???

If yes, then you're talking about infinite-state combinational logic, and your designer movement doesn't know that this type of thing would require a chip with an infinite number of transistors...
 
Socialist Movement of Silicon Activists.

That will make you a SMoSA.
 
see this....
Difference between Combinational and Sequential Logic

Combinational Logic Circuits Vs Sequential Logic Circuits

even the asynchronous CPUs were not fully "clockless"
asynchronous - What happened to clockless computer chips? - Stack Overflow

And you're designing a thing at which Intel failed... cheers!
 
i am not an engineer... i am a designer, so i don't have the limited thinking of most engineers... our project simplifies ideas for itself, which is why we achieve...

by your logic, spacex and qnx os would never have existed.

sorry, but you don't seem to have confidence in others proposing new things, because you don't seem to have the confidence yourself to think beyond what presently exists... your present attitude is defeatist, which must change.

you presently live within others' achievements but you don't seem to realize how they achieved them... you are just repeating "industry" wordings, without thinking about whether they are right or wrong... you will never create an industry with this attitude.

can you write a microkernel operating system??

please do take all this as sincere advice from your senior friend... :-)

and please do wait for our project... below is part of my discussion with sahasranama from ( Bhukhari should be sent to Pakistan, says Yogi Adityanath | Page 4 )... i describe our microprocessor...
instruction set -- done -- less than 20 instructions ( a rough illustrative sketch follows after this list ).
i/o system -- new method of external data input and external data control... connection to processor is asynchronous serial.
memory system architecture -- ongoing.
alu design -- ongoing.
processor implementation -- will be on fpga.
control program ( os ) -- will be written once processor is implemented.
graphical user interface -- designed.
application programming -- applet form.
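no instruction list was published in the thread, so purely as an illustration of how small an instruction set can be and still compute, here is a hypothetical register-machine sketch in Python... every opcode name and the demo program below are invented for this example, not taken from the project:

```python
# hypothetical illustration only: a tiny 8-instruction register machine,
# showing how little an instruction set can contain and still compute.
# none of the opcode names below come from the project described above.

def run(program, num_regs=8):
    """Execute a list of (opcode, *operands) tuples on num_regs registers."""
    regs = [0] * num_regs
    pc = 0  # program counter
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "LOADI":    # LOADI r, value  -- load an immediate constant
            regs[args[0]] = args[1]
        elif op == "MOV":    # MOV dst, src    -- copy a register
            regs[args[0]] = regs[args[1]]
        elif op == "ADD":    # ADD dst, src    -- dst += src
            regs[args[0]] += regs[args[1]]
        elif op == "SUB":    # SUB dst, src    -- dst -= src
            regs[args[0]] -= regs[args[1]]
        elif op == "JMPZ":   # JMPZ r, target  -- jump if register is zero
            if regs[args[0]] == 0:
                pc = args[1]
        elif op == "JMP":    # JMP target      -- unconditional jump
            pc = args[0]
        elif op == "PRINT":  # PRINT r         -- stand-in for real i/o
            print(regs[args[0]])
        elif op == "HALT":
            break
    return regs

# example: compute 6 * 7 by repeated addition, then print 42
program = [
    ("LOADI", 0, 0),  # 0: r0 = 0 (accumulator)
    ("LOADI", 1, 6),  # 1: r1 = 6
    ("LOADI", 2, 7),  # 2: r2 = 7 (loop counter)
    ("LOADI", 3, 1),  # 3: r3 = 1 (constant)
    ("JMPZ", 2, 8),   # 4: while r2 != 0:
    ("ADD", 0, 1),    # 5:   r0 += r1
    ("SUB", 2, 3),    # 6:   r2 -= 1
    ("JMP", 4),       # 7: loop
    ("PRINT", 0),     # 8: prints 42
    ("HALT",),        # 9: stop
]
run(program)
```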
 
samosa... :lol: we are the samosas... :rofl: i will use that as a codeword...

Sure. Just remember that word is copyrighted by me.

hahahah


So use it freely, but credit should be given to Faujhistorian.

SMoSA.

Glad you liked it.
 
Clock is just a synchronization mechanism. So you intend to do asynchronous computing?
 
Same here, as I didn't understand your quoted comment...

when i said...
i am not an engineer... i am a designer, so i don't have the limited thinking of most engineers... our project simplifies ideas for itself, which is why we achieve...

i meant...

engineers in the modern world go through colleges... and college-type places teach what is current, because the goal of 99.99% of colleges is for their students to get jobs... the goal of modern colleges is not to educate, not to induce the student to question the basis and first principles...

what if the basis itself is wrong??

can anyone tell me why a clock is needed in a modern microprocessor??

i am a natural designer... i dropped out of college in 1996... i don't have to un-learn anything because i learnt by questioning the basis of everything... which is also why i am a socialist... so, i simplify things for myself... and by nature, i am not a specialist who can work in only one field... i am a jack of all trades...

on the other hand, the modern engineer who was "educated" in college will find it difficult to un-learn many things... in this context, the need for the clock in processors... and the modern engineer is "trained" in one field only... and will generally work on sub-systems... he does not look at the entire system... the entirety... he has a limited view in a project...

Clock is just a synchronization mechanism. So you intend to do asynchronous computing?

instruction execution and the external i/o will be clock-less... but one thing that will require fixed intervals is video ( still and moving ), though that need not use an internal clock crystal... and i am still designing the network architecture... but truly, removing the clock simplifies the overall system greatly.
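for readers unfamiliar with the idea: clockless ( self-timed ) designs usually replace the global clock with local request/acknowledge handshakes between stages, so each stage fires when its input is ready rather than on a clock edge... a toy simulation of that general technique in Python ( an illustration of the textbook idea, not this project's design ):

```python
# toy illustration of clockless (self-timed) design: pipeline stages
# synchronize with request/acknowledge handshakes instead of a global clock.
import threading

class Channel:
    """A bundled-data channel: one data slot plus req/ack events."""
    def __init__(self):
        self.data = None
        self.req = threading.Event()  # sender: "data is valid"
        self.ack = threading.Event()  # receiver: "data consumed"

    def send(self, value):
        self.data = value
        self.req.set()   # raise request
        self.ack.wait()  # wait for acknowledge
        self.ack.clear()

    def recv(self):
        self.req.wait()  # wait for request
        value = self.data
        self.req.clear()
        self.ack.set()   # raise acknowledge
        return value

def producer(out):
    for x in [3, 1, 4, 1, 5]:
        out.send(x)
    out.send(None)  # end-of-stream marker

def doubler(inp, out):
    while (x := inp.recv()) is not None:
        out.send(x * 2)
    out.send(None)

def consumer(inp):
    while (x := inp.recv()) is not None:
        print(x)  # prints 6 2 8 2 10

a, b = Channel(), Channel()
threads = [
    threading.Thread(target=producer, args=(a,)),
    threading.Thread(target=doubler, args=(a, b)),
    threading.Thread(target=consumer, args=(b,)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```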
 
What is the peak FLOPS I can expect from this socialist processor... in, say, LINPACK?
 
1. how can i give a clock-based number ( something per second ) when the processor is clock-less...

2. the control program ( os ) architecture is hybrid ( microkernel + monolithic ), and not connected to any present os or their program ports ( linpack )...

3. the first prototype will be implemented on fpga... so, fpga speed limits will be the natural limits of that prototype... but in asic form, i expect the processor to be quite fast... how fast, i can only demonstrate by the response of the system's running programs... :-)

4. the entire project has a political basis ( socialist ), and is meant to replace present systems and companies with a natural and simpler system of computing... so i don't really bother with justifying or judging by present standards. :-)
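as context for the question above: a LINPACK-style FLOPS figure counts completed floating-point operations against wall-clock time, so it does not presuppose any internal chip clock... a minimal sketch of that kind of measurement in Python ( illustrative only ):

```python
# minimal sketch of how a FLOPS figure is measured: count floating-point
# operations completed, divide by elapsed wall-clock time. nothing here
# depends on the processor's internal clocking scheme.
import time

n = 10_000_000
x, y, acc = 1.0000001, 0.9999999, 0.0

start = time.perf_counter()
for _ in range(n):
    acc += x * y  # one multiply + one add = 2 floating-point ops
elapsed = time.perf_counter() - start

print(f"{2 * n / elapsed / 1e6:.1f} MFLOPS (pure Python, so a very low bound)")
```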
 