
Temples of New India #2 : The Ultra Fast "Super Computers" in 2014

It's not in the Top 10 now.
At the time of its unveiling, it was the 4th fastest supercomputer in the world and the fastest in Asia.
As of 16 September 2011, it is ranked 58th.
I was just about to edit that, but you quoted it! I was a little confused...
Anyway, thanks; very informative thread!
 
"The Planning Commission agreed in principle to provide the funds to the Indian Space Research Organisation (ISRO) and the Indian Institute of Science (IISc), Bangalore to develop a supercomputer with a performance of 132.8 exaflops (132.8 quintillion floating-point operations per second). A quintillion has 18 zeros (a million has six).
The Indian supercomputer will not be used only for enhancing the country's space abilities; it will also be used to predict monsoons and provide precise weather inputs to boost agriculture. N Balakrishnan, associate professor at IISc Bangalore, said the target being set is "ambitious", referring to achieving the exaflop, the next level of computing performance, by 2017. "We have planned everything minutely.""

Rs 10,000-crore push for India’s supercomp plan - Hindustan Times
Are exaflop computers even possible by 2017?
I read somewhere that, at the current rate of progress in supercomputing development, 1-exaflop capacity can only be reached by 2019.
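That 2019 estimate can be roughly reproduced from the historical TOP500 trend. Assuming a ~10-petaflop leader in 2011 and leader performance doubling about every 14 months (both figures are illustrative assumptions, not from this thread), a few lines of Python give the projected year:

```python
import math

# Assumed starting point and growth rate (illustrative only):
start_year = 2011
start_flops = 10e15      # ~10 PFLOPS, roughly the 2011 TOP500 leader
target_flops = 1e18      # 1 EFLOPS
doubling_months = 14     # rough historical TOP500 doubling period

doublings = math.log2(target_flops / start_flops)
year_reached = start_year + doublings * doubling_months / 12
print(f"{doublings:.2f} doublings -> reached around {year_reached:.1f}")
```

With those assumptions the projection lands in late 2018, which is consistent with the "by 2019" estimate above.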
 
Even with such high-performance computers, why is the IMD not able to make correct predictions? I remember last year, for a couple of months, their predictions were very accurate, and all of us in Delhi wondered how the IMD had improved so much. But we are back to normal. Was it that the IMD got a 60-day trial version of some high-quality software from an international firm, and now they are looking for a pirated version rather than buying a licence? :-):-)
 
To be honest, it's not exactly that way.
I have been closely associated with the installation of PARAM 10000s.

I can tell you 75% of the components (the Sun UltraSPARC 400 MHz CPUs, the monitors, etc.) were imported from the USA.
What we did was invent the PARAMNet switch, and the cluster worked flawlessly.
100 gigaflops was the peak performance of the PARAM units at Pune and Delhi (referred to as the National PARAM Supercomputing Facility).

Many NITs and IITs were given a cluster of 8-CPU-based systems with 16 nodes, which ran Red Hat Linux.
Supercomputers are all about combining more and more processors to do massively parallel processing. Most countries import chips and electronics, so that is no problem; but we made the network switches and the architecture, which are the most important parts of a supercomputer.
BTW, we have a good semiconductor design industry.
 
Are exaflop computers even possible by 2017?
I read somewhere that, at the current rate of progress in supercomputing development, 1-exaflop capacity can only be reached by 2019.

Let me tell you something more exciting. The future of supercomputing lies with:

1. Superconductivity [reduced resistance, more transistors]
2. Quantum microprocessors [several processes at the same time]

With both of these, I guess the sky is the limit.
 
Let me tell you something more exciting. The future of supercomputing lies with:
1. Superconductivity [reduced resistance, more transistors]
2. Quantum microprocessors [several processes at the same time]
With both of these, I guess the sky is the limit.

Can confirm both.

My supervisor's firm is working on both (and yes, more often than not, small firms are more efficient at cutting-edge research than the giants). The productivity gains are huge.
 
Let me tell you something more exciting. The future of supercomputing lies with:

1. Superconductivity [reduced resistance, more transistors]
2. Quantum microprocessors [several processes at the same time]

With both of these, I guess the sky is the limit.
If quantum processors become as compact as today's PC processors, we will see the rise of machines wiser than humans.
The one thing at which even today's supercomputers can't beat the human brain is parallel processing. Our brain is a massively parallel processor, but it loses at brute-force algorithms.
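The brute-force side of that comparison is easy to show. Assuming a toy 4-digit PIN search (purely illustrative), a machine just enumerates every candidate, exactly the kind of mechanical search a brain is bad at:

```python
def crack_pin(check):
    """Exhaustively try all 10,000 four-digit PINs until `check` passes."""
    for candidate in range(10_000):
        pin = f"{candidate:04d}"
        if check(pin):
            return pin
    return None

secret = "7319"
found = crack_pin(lambda pin: pin == secret)
print(found)  # recovered purely by enumeration
```

A human could never grind through 10,000 guesses reliably, but struggles not at all with the massively parallel pattern recognition that the post contrasts it with.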
 
Heard it could be for making nuclear weapons... but then India had tested its first nuclear explosive device in 1974, so why an arms embargo in the late 1980s?

Kind of, yes. We can simulate a nuclear detonation and its attributes in software; as per scientists I interacted with in India, India has a blueprint to develop a 600 kt nuclear device based on data gathered from simulations.

Here, before fabrication, we run simulations first to determine the expected parameters; a simple nanophotonics and quantum-electronics simulation can take 3-7 days. So you get an idea of the scale of computing we are talking about.

To add further, I use an Intel Xeon with 128 GB of RAM in my workstation; it costs $6,000 and more.
 
Kind of, yes. We can simulate a nuclear detonation and its attributes in software; as per scientists I interacted with in India, India has a blueprint to develop a 600 kt nuclear device based on data gathered from simulations.

Here, before fabrication, we run simulations first to determine the expected parameters; a simple nanophotonics and quantum-electronics simulation can take 3-7 days. So you get an idea of the scale of computing we are talking about.

To add further, I use an Intel Xeon with 128 GB of RAM in my workstation; it costs $6,000 and more.

With Xeon E5s?
 
Superconductivity

That is important for two reasons:
1. Power consumption and Ohmic heating losses. Supercomputers are super-hungry power guzzlers too, and a lot of the power we feed them goes to waste as resistive heat. In the superconducting regime, this loss can be brought close to zero, improving efficiency.
2. Moore's Law. As we cram more and more transistors into an already small piece of electronic real estate, they start to interfere with each other. As we move towards nanocomputing, it will be possible to have transistors a few atoms in size. Needless to say, with smaller transistors you can pack a larger number on a single chip, and in turn get higher processing power. When you combine these powerful chips in a cluster, along with data buses and very fast storage devices, you get faster and better supercomputers.
[Image: transistor count and Moore's Law, 2011]
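Point 2 can be put in numbers. Assuming the classic statement of Moore's Law (transistor count doubling every two years) and a ~2.3-billion-transistor chip as the 2011 baseline (both assumptions for illustration):

```python
def transistors(year, base_year=2011, base_count=2.3e9, doubling_years=2):
    """Project the transistor count per chip under a simple doubling law."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (2011, 2015, 2021):
    print(year, f"{transistors(year):.2e}")  # count quadruples by 2015
```

Under those assumptions the 2011 count roughly quadruples by 2015 and grows about 32-fold by 2021, which is why shrinking the transistor matters so much.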
 
Let me tell you something more exciting. The future of supercomputing lies with:

1. Superconductivity [reduced resistance, more transistors]
2. Quantum microprocessors [several processes at the same time]

With both of these, I guess the sky is the limit.
Superconductivity is very possible but not cost-effective, as getting near absolute zero requires a huge amount of power. Type II superconductors could be used, but they are not too effective.
Quantum microprocessors (very efficient ones) will take quite some time. Bose-Einstein-condensate-based quantum computers are theoretically the most effective ones, but they are not quite developed yet.
But if anyone in India could do it, it will be the ISRO guys. I have complete faith in them, as they have never let the nation down.
 
Superconductivity is very possible but not cost-effective, as getting near absolute zero requires a huge amount of power. Type II superconductors could be used, but they are not too effective.
Quantum microprocessors (very efficient ones) will take quite some time. Bose-Einstein-condensate-based quantum computers are theoretically the most effective ones, but they are not quite developed yet.
But if anyone in India could do it, it will be the ISRO guys. I have complete faith in them, as they have never let the nation down.

You are forgetting:

- BARC
- IISc
- RCI
- TIFR
 
Kind of, yes. We can simulate a nuclear detonation and its attributes in software; as per scientists I interacted with in India, India has a blueprint to develop a 600 kt nuclear device based on data gathered from simulations.

Here, before fabrication, we run simulations first to determine the expected parameters; a simple nanophotonics and quantum-electronics simulation can take 3-7 days. So you get an idea of the scale of computing we are talking about.

To add further, I use an Intel Xeon with 128 GB of RAM in my workstation; it costs $6,000 and more.
For simulations we require algorithms, and to check whether our algorithms are correct, we have to match simulation data with real test data. So India needs more testing to gather data for very-high-yield designs and for confidence in our designs.
I read somewhere that India is asking the French to share their nuclear test data for our simulations in exchange for the Rafale deal.
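The "match simulation data with real test data" step described above is essentially model validation. A minimal sketch (all numbers invented purely for illustration) could compare predictions to measurements by relative error:

```python
def validate(simulated, measured, tolerance=0.05):
    """Accept the model if every prediction is within `tolerance` relative error."""
    errors = [abs(s - m) / m for s, m in zip(simulated, measured)]
    return all(e <= tolerance for e in errors), errors

# Invented numbers: model predictions vs. hypothetical test measurements
ok, errs = validate([12.1, 45.0, 58.5], [12.0, 44.0, 60.0])
print(ok, [round(e, 3) for e in errs])
```

The point is that without real test measurements to fill the `measured` side, a simulation code cannot be calibrated, which is exactly the argument for more testing (or for borrowed test data).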
 
You are forgetting:

- BARC
- IISc
- RCI
- TIFR
Yes, those guys. Except for BARC, I have no idea what the others are always up to. But BARC is cool; they did commendable work on the Arihant reactor and the thorium reactor.
BTW, what is RCI?
 
You are forgetting:

- BARC
- IISc
- RCI
- TIFR
Everyone just keeps bashing DRDO :lol:. It's not as if ISRO pulled off a miracle with MOM. We perceive ISRO as successful in comparison to DRDO because we have no substitute for ISRO's products, so we have to bear with them, and they face no constraints on timelines. MOM is nothing in front of MAVEN, but it gets the job done, so we cheer for it. When it comes to DRDO products, though, we want world-class results: the Kaveri engine producing 81 kN is good, but we want 100 kN.
 
