What's new

Micro stories - news items too brief to warrant their own thread

Future potential of brain chip is limitless after man controls robot arm with his thoughts - The Washington Post

Scientists at Caltech reported Thursday that they had developed an implantable chip that gave a tetraplegic man, Erik G. Sorto, the power to drink beer with a robot arm. This is just the beginning.

Researchers are working on all manner of silicon-based devices that go inside the body and manipulate the body’s signals to create motion. They believe these chips will not only be able to help those with paralysis one day -- but also usher in a new era of robot adjuncts controlled by someone’s thoughts that will be able to perform all manner of jobs from lifting dangerous objects to filing papers.

Here’s a look at some of the other promising research:

A group in Lausanne, Switzerland, announced in January that it had helped mice with near-severed spines walk again using a ribbon of stretchable silicone placed under the nerve tissue. They used the gadget to send electrical signals through the animals as well as to deliver chemicals for nerve impulse transmission. Within six weeks, the mice could not only walk but run and climb stairs again. Scientific American likened the technique to fixing a cut in a telephone cable.

Signals that start in the brain are supposed to travel down nerves in the spinal cord to muscles, but breaks in the nerves interrupt them. Patching the breaks with new wires, like jumping over the cut in a phone line, should restore communication.

In Ohio last year, doctors operated on a 22-year-old man to insert a chip into his brain. The chip connects to a port, which leads to a cable plugged into a computer programmed to decode messages from the brain. According to a report in The Washington Post, here’s how it would work:

The electrodes were designed to pulse and stimulate muscle fibers so that the muscles could pull on tendons in his hand.

If it all worked, a man who was paralyzed from the chest down would think about wiggling his finger, and in less than one-tenth of a second, his finger would move.

They would bypass his broken spinal cord and put a computer in its place.

The man, Ian Burkhart, was able to move his hand and fingers.

Much of the hope and promise of chips to aid those with spinal cord injuries comes from the 2011 case of Rob Summers, a college baseball star who was injured in a hit-and-run accident and paralyzed below the neck. According to Reuters:

The 2.5-ounce (72-gram) device began emitting electrical current at varying frequencies and intensities, stimulating dense bundles of neurons in the spinal cord. Three days later he stood on his own. In 2010 he took his first tentative steps.

The team wasn’t as optimistic about their next patients – unlike Summers, they didn’t have any sensation in their legs – and they were surprised when four men who were paralyzed from the chest down were able to voluntarily move their legs and feet after being implanted with the device. While the men were not able to walk, the experiment was hailed as a major success and offered new hope for the more than 6 million paralyzed Americans.

What's next? Writing in the Wall Street Journal, two scholars contend "brain implants today are where laser eye surgery was several decades ago." Gary Marcus, a professor of psychology at New York University, and Christof Koch, chief scientific officer of the Allen Institute for Brain Science in Seattle, wonder:

What would you give for a retinal chip that let you see in the dark or for a next-generation cochlear implant that let you hear any conversation in a noisy restaurant, no matter how loud? Or for a memory chip, wired directly into your brain's hippocampus, that gave you perfect recall of everything you read? Or for an implanted interface with the Internet that automatically translated a clearly articulated silent thought ("the French sun king") into an online search that digested the relevant Wikipedia page and projected a summary directly into your brain?

 
SpaceX Dragon Cargo Capsule Splashes Down in Pacific Ocean

spacex-dragon-splashdown.jpg


 
Hey guys!!! I often find myself confronted with something interesting, but don't feel it deserves its own thread. So I'm starting a thread that I'll update daily, dedicated to things that are interesting but don't need their own threads due to their lack of length or depth. Micro stories for short - mostly anything I or anyone else finds interesting and wants to share!

Anyone can contribute, but as noted, I'll at least sustain it on my own, though I always welcome contributions!

Thanks and I hope you enjoy, it'll be random!

SvenSvensonov

I'll start with:

These anamorphic drawings will screw up your brain

Italian artist Alessandro Diddi is back with more mind-boggling anamorphic drawings that seem to be popping out of the paper. Even if I know that these are plain 2D drawings, my brain keeps telling me it's 3D.

View attachment 199617

View attachment 199618

View attachment 199619

View attachment 199620

View attachment 199621

View attachment 199623

From These anamorphic drawings will screw up your brain

@levina @Gufi @Nihonjin1051 @thesolar65 @Gabriel92 @Jungibaaz
Saw this thread in Members Club...Will this help? Not a Story though!

 
A 'FOURTH INDUSTRIAL REVOLUTION' IS ABOUT TO BEGIN (IN GERMANY)

IMG_4134_1.JPG


Factories are about to get smarter. The machines that make everything from our phones to our sandwiches rely on creaking technology -- but not for long. "We will have a fourth industrial revolution," says professor Detlef Zühlke, a lead researcher in the factories of the future. And that fourth revolution is all about making factories less stupid.

Zühlke and his team have spent the past decade developing a new standard for factories, a sort of internet of things for manufacturing. "There will be hundreds of thousands of computers everywhere," Zühlke tells WIRED.co.uk. "Some of these technologies will be disruptive".

In Germany this impending revolution is known as Industry 4.0, with the government shovelling close to €500m (£357m) into developing the technology. In China, Japan, South Korea and the USA big steps are also being made to create global standards and systems that will make factories smarter. The rest of the world, Zühlke claims, is "quite inactive".

Zühlke is head of one of the largest research centres for smart factory technology in the world. The facility, located at the German Artificial Intelligence Research Centre (DFKI) in the south-western city of Kaiserslautern, houses a row of boxes packed with wires and circuitry.

At first it looks like any factory, but then you notice all the machines are on wheels. This, Zühlke explains, is the factory of the future. His vision is based on cyber physical systems, combining mechanical systems with electronics to connect everything together. And the wheels? One day different modules in the factory could potentially drive themselves around to allow factories to alter the production line. For now, moving the modules is done by humans.

The demo factory is currently producing business card holders. Each module performs a different task and they can be rearranged into any order, with the modules able to understand when it is their turn to carry out a task. A storage module feeds into an engraver, a robot arm, a laser marker, a quality control module and so forth. New modules can be added at any time, a process Zühlke compares to playing with Lego.

The idea owes a lot to how we've all been using home computers for years. For more than a decade it has been easy to plug in a new printer or other USB device and have it instantly recognised. On a computer this is known as "plug and play", in a factory Zühlke describes it as "plug and produce". A key breakthrough has been the development of a USB port on an industrial scale, Zühlke explains. This cable, which looks more like a giant hose, sends data and pressurised air to modules in a smart factory, with a control centre receiving information back.
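To make the "plug and produce" idea concrete, here's a toy sketch in Python. Every class, module name, and capability string below is invented for illustration; this is not DFKI's actual software or any real industrial protocol:

```python
# Toy "plug and produce" line: modules announce a capability when plugged in,
# and the controller routes each work step to whichever module claims it.
# All names here are illustrative.

class Module:
    def __init__(self, name, capability):
        self.name = name
        self.capability = capability

    def run(self, item):
        return f"{item} -> {self.capability} by {self.name}"

class Line:
    def __init__(self):
        self.modules = []          # plugged-in modules, in line order

    def plug_in(self, module):     # no reconfiguration step, just like USB
        self.modules.append(module)

    def produce(self, item, steps):
        for step in steps:
            module = next(m for m in self.modules if m.capability == step)
            item = module.run(item)
        return item

line = Line()
line.plug_in(Module("storage-1", "feed"))
line.plug_in(Module("engraver-1", "engrave"))
line.plug_in(Module("qc-1", "inspect"))
print(line.produce("card-holder", ["feed", "engrave", "inspect"]))
```

The point of the sketch is the Lego-like property Zühlke describes: plugging in a new module extends what the line can produce without rewiring the controller.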

In two years Zühlke expects the first wave of factories using smart technology to be fully operational, with widespread adoption in factories around the world in the next decade. For now, smart factories remain a research project.

There's still a lot of work to be done. Agreeing international standards for smart factories is key, Zühlke says. Such agreements may sound dull, but they will make manufacturing cheaper, quicker and more reliable. RFID, NFC and OPC UA, the machine to machine equivalent of HTML, are all making it easier for factories to standardise how different modules and components talk to one another.

Standards are the talk of the industry. Heiki Haarmann, head of corporate communications at industrial automation firm Festo, tells WIRED.co.uk that smart factories will only become a reality when everyone works together. If different factories create their own proprietary systems, she says, "we will not succeed". Regulators will also need to catch up. Such modular systems aren't approved by the US Food and Drug Administration (FDA) for the manufacture of food and drugs, which currently have to be created in a set order.

The rise of automation and smart factories also brings with it the fear that people will lose their jobs. Haarmann says it may not mean fewer employees, but "different" employees, educated to specialise in mechatronics and IT. Zühlke agrees, arguing that factory workers will have to re-skill to keep their jobs.

But the shift to smaller, more flexible manufacturing could allow companies to operate closer to their customers, potentially revitalising manufacturing in struggling economies. "It will make manufacturing more affordable in high wage countries," Zühlke explains, posing a "major threat to China". "If cheap labour isn't cheap anymore," he argues, companies will stop manufacturing in China.

A 'fourth industrial revolution' is about to begin (in Germany) (Wired UK)
 
Neat photo shows a US Navy sailor inside a F/A-18 Hornet afterburner

1263489745907027115.jpg


It looks like the entryway to a portal. Or like the pod of some spacecraft. But it’s a US Navy sailor checking out the afterburners of a fighter jet. The US Navy: “Aviation Machinist’s Mate 3rd Class Ryan Draper, from Palmdale, Calif., inspects an F/A-18 afterburner in the jet shop aboard the Nimitz-class aircraft carrier USS George Washington (CVN 73).”
 
Supersonic Decelerator Gets a Lift to Prepare for Launch

NASA teams are continuing preparations for the Low-Density Supersonic Decelerator (LDSD) test off the coast of Hawaii June 2-12. This week the team completed a number of key pre-test procedures, including a successful mate between the test vehicle and balloon support systems.

Unknown-768x1024.jpeg


So, you may be wondering what this LDSD technology is – and why it’s important to future missions to Mars. Put simply, it’s about mass, speed and safety. NASA is planning ambitious robotic and human missions to Mars, which will require larger, more complex spacecraft than we’ve ever flown before. They’ll need to haul sizeable payloads to accommodate long stays on the Martian surface, and must fly back and forth more quickly to minimize human exposure to space radiation. That means finding new ways to slow down when our spacecraft reach their destinations, effectively countering those faster flights and payloads of greater mass.

Current deceleration technologies date back to NASA’s Viking Program, which put two landers on Mars in 1976. The basic Viking parachute design has been used ever since, such as during the 2012 delivery of the Curiosity rover to Mars.


Now NASA seeks to use atmospheric drag as a solution. NASA’s LDSD project, led by the Jet Propulsion Laboratory in Pasadena, California, and sponsored by NASA’s Space Technology Mission Directorate in Washington, is conducting this full-scale flight test of two breakthrough technologies: a supersonic inflatable aerodynamic decelerator, or SIAD, and an innovative new parachute. These devices potentially will help us deliver double the current amount of payload — 1.5 metric tons — to the surface of Mars. They also will greatly increase the accessible surface area we can explore, and will improve landing accuracy from a margin of approximately 6.5 miles to a little more than 1 mile.
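To see why an inflatable decelerator helps, here's a back-of-the-envelope sketch using the standard drag equation, F = ½ρC<sub>d</sub>Av². The densities, diameters, and coefficients below are illustrative placeholders, not actual LDSD figures; the takeaway is only that drag grows with the square of the decelerator's diameter:

```python
# Why a bigger inflatable decelerator slows a craft more: drag force scales
# with frontal area. Standard drag equation F = 0.5 * rho * Cd * A * v^2.
# Every number below is illustrative, not an actual LDSD specification.
import math

def drag_force(rho, cd, diameter_m, v_ms):
    area = math.pi * (diameter_m / 2) ** 2   # frontal area of the decelerator
    return 0.5 * rho * cd * area * v_ms ** 2

rho = 0.01    # thin upper Martian atmosphere, kg/m^3 (illustrative)
cd, v = 1.5, 1000.0
rigid = drag_force(rho, cd, 4.5, v)      # rigid aeroshell-sized disc
inflated = drag_force(rho, cd, 6.0, v)   # inflated to a larger diameter
print(f"drag ratio: {inflated / rigid:.2f}x")   # (6/4.5)^2 ≈ 1.78x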

All these factors will dramatically increase the success of future missions on Mars.

From Supersonic Decelerator Gets a Lift to Prepare for Launch | LDSD 2015 Launch Status Updates
 
NASA Releases Best Images Ever of "Alien Lights" on Ceres

bright-spots-on-ceres.jpg


NASA has released the most detailed and clear images of the mysterious lights on the dwarf planet Ceres, but unfortunately the agency is no closer to explaining exactly what they are.

The Dawn probe took the image above from a distance of 4,500 miles; it is the most detailed view yet of the tiny world that never grew into a full planet.

The bright spots on Ceres have so far completely stumped scientists working on the mission, who have only offered up speculations about their origin.

"Dawn scientists can now conclude that the intense brightness of these spots is due to the reflection of sunlight by highly reflective material on the surface, possibly ice," said Christopher Russell, principal investigator for the Dawn mission from the University of California, Los Angeles.

Dawn will move even closer to Ceres on June 6, closing to a distance of 2,700 miles in an effort to discover whether volcanic activity is present. Scientists hope that as Dawn moves closer to the largest object in the asteroid belt between Mars and Jupiter, it will also be able to uncover more of the mystery surrounding these "Alien Lights."

Dawn was first launched in September 2007 with a mission of studying two of the three known protoplanets in the asteroid belt, Vesta and Ceres. Dawn's first stop was at Vesta arriving in orbit on July 16, 2011, where it spent 14 months surveying Vesta before leaving for Ceres in late 2012.

It entered orbit around Ceres on March 6, 2015 where it caught the first ever image of Ceres that showed the mysterious bright spots.

Dawn is the first NASA craft to use ion propulsion, which enabled it to enter and leave orbit around multiple celestial bodies. Previous spacecraft, such as the Voyager probes, used conventional propulsion, which restricted them to flybys.

Dawn made history as the first spacecraft to visit a dwarf planet and the first to orbit more than one body in space. It will spend months studying Ceres and is expected to remain in orbit long after its mission has come to a completion. Dawn was launched in the hope of learning more about other bodies in our solar system and how the solar system formed billions of years ago.

From NASA Releases Best Images Ever of "Alien Lights" on Ceres : SPACE : Science Times
 
'Tomorrowland' in Dolby Cinema: the best picture I've seen in a theater - CNET

In the Dolby Cinema presentation of Disney's "Tomorrowland," fiber-optic-fed laser light engines, 4K resolution, HDR contrast and Atmos sound combine to create a breathtaking cinema experience. Tomorrow's theater experience can't come home soon enough.

George Clooney's grizzled face filled the screen, and I was in awe.

Not at the face per se, but at all the wrinkles. The individual creases and lines. Fine hairs and pores. Then there were the subtle variations in skin tones. Or how his white tunic was so bright (yeah, it was a different scene than the one above), yet the darkness of his hair was so deep.

It's a face with which the world is familiar, but rendered with a level of fidelity I'd never seen.

I was sitting in Walt Disney Studio's premiere cinema, the El Capitan Theater in Hollywood, about to experience the most impressive all-around movie presentation I'd ever witnessed.

The sights of tomorrow
"Tomorrowland" is the first movie released in Dolby Vision, which we've been talking about for ages. It promises to deliver improved contrast, brightness and color for both theatrical cinema and home video, and in early demos we've seen, it lives up to the hype.

Dolby Cinema is a new type of theater certification, sort of like "THX" or "IMAX." It combines all of Dolby's various cinema technologies, including Dolby Vision as well as Atmos sound, 3D, and more. "Tomorrowland" is the first movie to take advantage of it all. "Inside Out" is next.

In cinema form, there are a few things at work here. The first is that the image is unusually bright for big-screen cinema. Dolby is claiming 31 foot-Lamberts (fL), on the 45-foot-wide-by-25-foot-tall Harkness Matt Plus white screen used at the El Capitan. 31 fL is a lot for a screen that size, and more than double the brightness of a typical cinema. CNET calibrates TVs to 40 fL for its comparisons. In the darkened theater, the image popped like you'd expect a good TV to do.
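To put those foot-Lambert figures in TV terms, 1 fL is roughly 3.426 cd/m² (nits), the unit usually quoted for home displays. A quick conversion (the 14 fL "typical cinema" value is my assumption, based on the common digital-cinema reference level):

```python
# Converting the article's foot-Lambert figures to nits (cd/m^2).
# 1 fL ≈ 3.426 cd/m^2. The "typical cinema" value is an assumed reference level.
FL_TO_NITS = 3.426

screens = [
    ("Dolby Cinema screen", 31),
    ("typical cinema (assumed)", 14),
    ("CNET TV calibration", 40),
]
for label, fl in screens:
    print(f"{label}: {fl} fL ≈ {fl * FL_TO_NITS:.0f} nits")
```

So even this unusually bright cinema screen sits around 106 nits, which is why a good living-room TV can "pop" more than a theater.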

Another factor is the contrast, which is something digital projectors have struggled with since their inception. There was one moment, early on, where the screen went black, and I actually thought, "Wow, an actual black."

I remember the early days of the digital cinema transition when "black" was the same mediocre gray we were getting in LCDs, DLP projectors and early plasma TVs at home. Dolby is claiming 1,000,000:1 contrast ratio. Unlike every other time a number like that is thrown around, this actually looked like it could be legit. Deep, beautiful images that almost seemed 3D. Thankfully, this presentation was in 2D, so I didn't have to wear glasses and could enjoy its full brightness.
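One way to sanity-check a claim like 1,000,000:1 is to restate it in photographic stops, where each stop is a doubling of light:

```python
# Restating a contrast ratio in photographic stops (doublings of light).
import math

ratio = 1_000_000
stops = math.log2(ratio)
print(f"{ratio}:1 is about {stops:.1f} stops of dynamic range")  # ~19.9 stops
```

That's roughly 20 stops between the darkest and brightest the screen can show, far beyond what a typical digital projector delivers.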

Then there was the color. "Tomorrowland" used the relatively wide P3 color space, a part of the Digital Cinema Initiatives (DCI) standard. A wider color space allows display of more colors, making the image appear closer to reality. P3 isn't quite as broad as the Rec. 2020 color space (which Dolby Vision can handle too, apparently), but it's a lot more than your TV can deliver.

Watching "Tomorrowland" there was an extra richness to the colors. Yellows and oranges especially caught my eye as being more vibrant than I'd expect to see on a regular TV or home projector.

And of course, the resolution. I was sitting six rows back, but the El Capitan has a stage, so I was probably more like 12 to 15 rows from the screen in a normal theater. From this distance, on a screen that huge, the detail was incredible. Far sharper than I'd seen in most theaters, though that's not saying much since the vast majority of digital cinema theaters are 2K, and finding a decent film projection these days is like trying to find a typewriter repairman.

Early digital cinema projectors were so low-resolution that from any seat you'd want to sit in, you'd be able to see the tell-tale grid of pixel structure, known as the "screen door effect" because it seemed like you were looking through a screen door. Here, sitting fairly close, the door was wide open, and all I saw was glorious, glorious detail. Like I've always said, in big screens, bring on the 4K.

A lot of the credit goes to the projector, a Christie 4K unit with dual RGB laser projection heads. Yep, actual lasers. Red, green and blue lasers generated in their own towers (with serious cooling) and run to the projector over fiber optics. Not sure that could sound more futuristic.

So when can I take it home?
While we're not getting lasers in the home any time soon -- unless you count expensive home-theater projectors like the Epson LS10000 CNET recently reviewed -- we are getting TVs with HDR, expanded color and of course 4K.

You'll need a pretty big TV to see all the detail with a 4K TV, but you'll easily see the benefits of HDR and expanded color.

HDR TVs are coming out this year, and some, like Samsung's JS9500 series, are on sale already. Only one announced so far, the Vizio Reference Series, uses Dolby Vision, but Dolby expects others in 2016 and later.

There are other potential HDR standards to compete against Dolby Vision, however, and forthcoming HDR-capable TVs might support more than one. We'll be picking apart that potential mess when we get more info, probably closer to the launch of 4K Blu-Ray this holiday season. I, for one, am optimistic that it will work out in the consumer's favor, if only because this time around, there won't be a format war.

Check out HDR Arrives and HDR for photography vs. HDR for TVs: What's the difference? for more info.

Expanded Color arrives this year as well. TVs are shipping this summer, and again, 4K BD should bring us the content, eventually.

Check out Ultra HD 4K TV color, part I: Red, green, blue and beyond and Ultra HD 4K TV color, part II: The (near) future for the full story.

Higher sounds
Of course, visuals are only part of a movie. The El Capitan, like a growing number of theaters, features Dolby Atmos. Atmos adds height speakers directly above the audience, but it also creates a new way to address those (and other) speakers. Called "object-based" surround, it allows sound designers to place "objects" (a gunshot, say) anywhere in the 3D space of the theater. The Atmos processing decides which speakers must be used to re-create that sound, in roughly that place, in any Atmos theater.

Atmos allows more flexibility in audio mixing and surround sound design. I've seen several movies in Atmos now, and it works really well. You get far more of a natural enveloping of sound than the traditional surround-speaker-here-surround-speaker-there we've had for decades.
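To illustrate the object-based idea, here's a classroom-style sketch of how a renderer might split one sound object's level across the two nearest speakers using a constant-power pan. This is a generic textbook technique for illustration only, not Dolby's actual Atmos renderer:

```python
# Toy object-based rendering: an object at some angle is rendered by splitting
# its level across the two nearest speakers with a constant-power pan.
# Generic illustration, not Dolby's algorithm.
import math

def pan_between(angle, left_angle, right_angle):
    """Constant-power gains for an object between two speakers (degrees)."""
    t = (angle - left_angle) / (right_angle - left_angle)   # 0..1 position
    theta = t * math.pi / 2
    return math.cos(theta), math.sin(theta)                 # (left, right) gains

gl, gr = pan_between(15, 0, 45)   # object a third of the way between speakers
print(f"gains: {gl:.3f}, {gr:.3f}, power sum = {gl**2 + gr**2:.3f}")
```

Because the renderer works from the object's position rather than fixed channels, the same mix adapts to however many speakers a particular theater has.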

Atmos at home
Atmos-enabled receivers are here, as well as a few so-enabled speakers like the Pioneer Elite SP-EBS73 -- and more are on the way. A smattering of Blu-ray discs offer Atmos soundtracks now, and more will arrive with 4K Blu-ray later this year.

If you want the full Atmos experience in your home, by the way, you don't need to mount speakers on your ceiling. Atmos-enabled home speakers use upward-firing drivers, which bounce the sound off the ceiling and do a decent job (from what I've heard so far) of approximating speakers actually mounted on the ceiling.

Do you have to upgrade to Atmos? No, but it does add another level of home-theater immersion. DTS has announced a similar surround method called DTS:X.

A reason to go to the theater tomorrow
I enjoyed "Tomorrowland." Brad Bird knows how to craft an enjoyable movie. But I was more impressed with the visuals and audio than the movie itself, to be honest. Combined, they're some of the first new cinema tech I'm actually excited about. It's not a step back or a distraction like many recent theater "advancements," such as 3D.

Will you be able to see Dolby Cinema in a theater near you? Maybe. Last month AMC announced that in addition to four initial installations, they'll have 50 Dolby Cinema theaters by the end of 2018, including theaters in San Francisco, Las Vegas, Philadelphia, Miami, Boston, Denver, and Seattle, with 100 planned by the end of 2024. That's not exactly a speedy roll-out, but that's just one company. We'll have to see how many others sign on.

In the meantime, if there is one near you, definitely check it out. Like, yesterday.

 
The Trillion Fold Increase In Computing Power, Visualized

1266833760202738858.gif


It’s easy to get hung up over the imperfections in our technology (srsly Apple, is it that hard to give a phone a back button?) and forget just how astounding modern processing power is. A community of IT professionals called Experts Exchange has now produced a fascinating infographic to remind us.

The visualization below, inspired by the recent 50th anniversary of Moore’s law, tells the story of the trillion fold increase in computing performance we’ve witnessed over the past sixty years. That’s impressive enough, but some of the other finds are downright astounding. The Apollo guidance computer that took early astronauts to the moon, for instance, has the processing power of 2 Nintendo Entertainment Systems, while the Cray-2 supercomputer from 1985—the fastest machine in the world for its time—roughly measures up to an iPhone 4.

Plenty of interesting insights to be found here. You can check out Expert Exchange’s original post for more info on their sources and methodology.

1266833760461379754.jpg
 
A Rare Glimpse Into the Eye of Typhoon Dolphin

1266858961485570088.jpg


To remote sensing scientists, peering directly into the eye of a tropical storm is like hitting a hole in one. That’s exactly what NASA’s CloudSat satellite did on May 16th, completing a stunning overpass of Typhoon Dolphin as the category 4 storm churned across the west Pacific.

CloudSat, part of NASA’s fleet of Earth-orbiting satellites, sends pulses of microwave energy through our planet’s atmosphere, some of which are reflected back to the spacecraft. Conceptually, CloudSat is similar to another tool we looked at last week, RapidScat, which NASA uses for mapping wind speed and direction. The strength of the signal CloudSat receives is related to the amount of ice or water in a cloud, while the time delay can be used to calculate the distance between the cloud and the Earth’s surface.
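The time-delay part is ordinary radar ranging: a pulse's round-trip time gives the distance to whatever reflected it, d = c·t/2. The delay value below is illustrative, not a CloudSat measurement:

```python
# Radar ranging behind CloudSat's cloud-height measurement: a pulse's
# round-trip time gives distance, d = c * t / 2. The 40 µs figure is
# illustrative, not an actual CloudSat number.
C = 299_792_458  # speed of light, m/s

def echo_distance_km(round_trip_s):
    return C * round_trip_s / 2 / 1000

# An echo arriving 40 microseconds before the surface echo came from a
# reflector roughly 6 km higher.
print(f"{echo_distance_km(40e-6):.1f} km")
```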

In the image directly below the aerial view of Typhoon Dolphin, we see the storm in cross section, with darker blues representing heavier precipitation. Combining this data with infrared images collected from Japan’s MTSAT satellite, researchers produced another cross-sectional view of the storm’s eye and its overall cloud structure:

1266858961531369256.jpg


What’s amazing about these images is that CloudSat, scanning the entire Earth, has a field of view of just 0.5 square miles. Zeroing in on a tropical storm is difficult enough, but most cyclones are over 250 miles wide, while the eye of the storm is a small fraction of that size. A hole in one, indeed.
 
A Stunning, Multi-Wavelength Image Of The Solar Atmosphere

1266928191063956809.jpg


This fantastical image comes courtesy of NASA’s Solar Dynamics Observatory. It depicts a brilliant array of “coronal loops,” arcs of magnetic flux that form around sunspots and extend into the solar atmosphere.

More from NASA:

The Atmospheric Imaging Assembly (AIA) instrument aboard NASA’s Solar Dynamics Observatory (SDO) images the solar atmosphere in multiple wavelengths to link changes in the surface to interior changes. Its data includes images of the sun in 10 wavelengths every 10 seconds. When AIA images are sharpened a bit, such as this AIA 171Å channel image, the magnetic field can be readily visualized through the bright, thin strands that are called “coronal loops.”

Loops are shown here in a blended overlay with the magnetic field as measured with SDO’s Helioseismic and Magnetic Imager underneath. Blue and yellow represent the opposite polarities of the magnetic field. The combined images were taken on Oct. 24, 2014, at 23:50:37 UT.
 
An Eerie Look into the Atomic Age:

This year, we saw top-secret photos of the birth of the atom bomb finally declassified. The photos of how the US government used that technology after World War II are just as interesting.

The Department of Energy has only existed since 1977. But its roots go way back to projects overseen by multiple other agencies, like the Army Corps of Engineers and the Atomic Energy Commission. The AEC was appointed to lead the charge into our wonderful, clean, nuclear-powered future after the war, but was abolished in the 1970s as the environmental impact and human dangers of radiation emerged. For three decades, though, it oversaw a broad range of projects, from putting a “nuclear heart” in a cow to the slightly less dramatic task of figuring out how to design safe nuclear power plants.

The Department of Energy keeps plenty of archival photos from the post-war era on Flickr, including one gigantic album of AEC-affiliated facilities, from Fermilab to the Stanford Linear Accelerator. It’s a vivid look at a complicated, sprawling organization whose legacy ranges from important to morally indefensible. Below you’ll find some of the photos, but go check out the DOE’s huge archive—seriously, it’s well worth a few minutes of your day.

1267873732113984686.jpg

This innocuous-looking image is actually historic. It shows the first moment that nuclear power was used to generate electricity—on December 20, 1951, at Argonne National Lab, outside of Chicago. Argonne was home to something called the EBR, or the Experimental Breeder Reactor, which the lab explains was “the first reactor to demonstrate the breeder principle—generating, or ‘breeding,’ more nuclear fuel than it consumed.”

1267873732287420846.jpg

The structure above is called a Cockcroft–Walton generator, and it’s a circuit that was used to generate the high voltages needed for particle accelerators. Photographer Mark Kaletka has a good description of what’s happening here. “The legs are resistors (blue cylinders), capacitors (silver doughnuts) and diodes,” he writes on Flickr. “The silver domes at the top get charged to 750,000 volts, which accelerates ionized hydrogen into the accelerator complex.”

1267873732349600686.jpg

A big part of the AEC’s mission was to develop facilities to test nuclear power plant designs. In 1969, in Idaho, it operated the Zero Power Physics Reactor—a low-power nuclear reactor that existed solely to let scientists test out different designs and assemblies for real, full-scale nuclear power plants.

1267873732518704558.jpg

The Fast Flux Test Facility was another “test reactor” built to test nuclear power plant designs—this one in Richland, Washington. “The original purpose of the facility, although not a breeder reactor, was to develop and test advanced fuels and materials,” explains the DOE, as well as isotopes for medical research.


1267873732710828718.jpg

The research being done at these test facilities quickly made its way into full-scale plants, like this reactor called Browns Ferry, in Alabama, seen here in 1970. This is Unit 1, the original reactor, and it’s still in operation today—after a $1.8 billion renovation in 2002.


1267873732885192110.jpg

A reactor assembly.


1267873733029843630.jpg

“More than 2500 grams of fully deuterated isotope hybrid blue-green algae were produced for use by the AEC’s Argonne scientists in the extraction, purification and characterization of proteins,” the DOE explains of this vivid 1972 photo. Below, an employee inspects grasshoppers with a magnifying glass in 1958—though for what purpose is lost.


1267873733111050158.jpg

“Light pipes are used to transmit the light flashes which occur when high energy particles pass through a scintillator to a photomultiplier tube,” says the department of this image, also from the 1970s.

 
Exoplanet Hunters Will Comb Starlight With Lasers

1269243334373939492.jpg


This picture shows the spectrum of light. You may have seen similar images in the past, but this one is something special—because it’s made of star light.

In April 2015, two so-called laser frequency combs were installed at the High Accuracy Radial velocity Planet Searcher (HARPS) planet-finding instrument of the European Southern Observatory’s 3.6-metre telescope at the La Silla Observatory in Chile. ESO explains what these devices and the spectra they produce are good for:

A laser frequency comb can be used as a ruler with which spectra from astronomical objects can be measured with unprecedented precision. They will allow the tiny changes in stellar velocity induced by an Earth-like planet as it orbits a star to be detected. [...] The increase in accuracy made possible by this new installation should in future allow HARPS to be able to detect Earth-mass planets in Earth-like orbits around other stars for the first time.
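A sense of scale for what the comb's "ruler" must resolve: an Earth-mass planet in an Earth-like orbit tugs its star back and forth at only about 9 cm/s, and the resulting Doppler shift on a visible spectral line is Δλ = λ·v/c. The 500 nm wavelength below is an arbitrary choice for illustration:

```python
# The scale of the radial-velocity measurement: Doppler shift Δλ = λ * v / c.
# An Earth analogue induces a stellar reflex velocity of roughly 9 cm/s;
# 500 nm is an arbitrary example wavelength.
C = 299_792_458   # speed of light, m/s
v_star = 0.09     # m/s, approximate reflex velocity from an Earth analogue
wavelength_nm = 500.0

shift_nm = wavelength_nm * v_star / C
print(f"Doppler shift: {shift_nm:.2e} nm")
```

The shift is around a ten-millionth of a nanometre, which is why a calibration ruler of unprecedented precision is needed before HARPS can see Earth-mass planets.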
 