What's new

Micro stories - news bits too small to have their own thread

NASA, Industry Complete Third Phase of UAS Flight Testing

Detect and Avoid | NASA

The Ikhana UAS soars over the Mojave Desert during a flight from NASA Armstrong Flight Research Center, Edwards, California.
Credits: NASA Photo / Carla Thomas


Evolving technologies necessary for Unmanned Aircraft Systems (UAS) to safely avoid other aircraft while moving through the nation's skies recently were put to the test using NASA's remotely piloted Ikhana aircraft.

Equipped with a prototype system of Detect-and-Avoid (DAA) sensors working in concert with airborne and ground-based computers, Ikhana made 11 flights involving more than 200 scripted encounters with approaching aircraft.

Depending on the specific scenario, either Ikhana detected one or more approaching aircraft and sent an alert to its remote pilot to take action, or Ikhana itself took action on its own by flying a programmed maneuver to avoid a collision – an aviation first.

NASA researchers (from left) Martin Hoffman, John Freudinger, and Ed Koshimoto observe one of the FT3 tests from the Research Ground Control Station at NASA Armstrong. Credits: NASA Photo / Ken Ulbrich

"We recorded some valuable data that will take some time to analyze fully, and we expect we'll need to make some minor refinements to our algorithms, but from what we saw during the tests, the results look promising," said Dennis Hines, NASA's director for programs for Armstrong Flight Research Center at Edwards, CA.

Staged from Armstrong and flown over the high desert of California, the DAA research was designated FT3, the third in a series of flight test campaigns for NASA’s Unmanned Aircraft Systems Integration in the National Airspace System (UAS-NAS) project.

“The successful completion of this flight test campaign represents the maturity of our detect-and-avoid system,” said Frank Pace, president of Aircraft Systems for General Atomics Aeronautical Systems, Inc.

As a NASA industry partner, the company developed one of the three primary DAA sensors flown on Ikhana, in this case a prototype radar system. It also contributed Ikhana systems software and the self-separation and collision-avoidance alerting logic software.

Honeywell supplied a specially instrumented twin-engine King Air to serve as an intruder for NASA’s Ikhana UAS.
Credits: NASA Photo / Ken Ulbrich


The other two sensors included an Automatic Dependent Surveillance – Broadcast (ADS-B) from BAE Systems, and a second generation Traffic alert and Collision Avoidance System (TCAS) from Honeywell International, Inc.

ADS-B is a satellite-based navigation tool in which an aircraft determines its position and then broadcasts that information, enabling other nearby airplanes equipped with the same tool to know exactly where everyone is in the sky.

As its name implies, TCAS keeps an electronic eye on the sky immediately surrounding an airplane. Should another airplane with a similar device fly too close, an alert will prompt the pilot to take action.
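For a sense of how an alert like this can be derived from surveillance data such as ADS-B reports, here is a minimal, illustrative sketch. It is not NASA's, General Atomics', or Honeywell's actual detect-and-avoid logic, and every aircraft state, threshold, and function name in it is an assumption: it simply projects two tracks forward, finds the time of closest approach, and alerts if the predicted miss distance falls below a chosen limit.

```python
# Illustrative only: a toy closest-point-of-approach (CPA) alert, not the
# UAS-NAS project's actual detect-and-avoid algorithms.
from dataclasses import dataclass
import math

@dataclass
class State:
    x: float; y: float      # position in a local flat frame, metres
    vx: float; vy: float    # velocity, metres per second

def cpa_alert(own: State, intruder: State,
              horiz_threshold_m: float = 1852.0,   # assumed: about 1 NM
              lookahead_s: float = 120.0) -> tuple[bool, float, float]:
    """Return (alert, time_to_cpa_s, miss_distance_m)."""
    rx, ry = intruder.x - own.x, intruder.y - own.y
    rvx, rvy = intruder.vx - own.vx, intruder.vy - own.vy
    closing_speed_sq = rvx * rvx + rvy * rvy
    if closing_speed_sq < 1e-9:                    # effectively co-moving
        t_cpa = 0.0
    else:
        t_cpa = max(0.0, -(rx * rvx + ry * rvy) / closing_speed_sq)
    miss = math.hypot(rx + rvx * t_cpa, ry + rvy * t_cpa)
    alert = t_cpa <= lookahead_s and miss < horiz_threshold_m
    return alert, t_cpa, miss

# Example: an intruder 8 km ahead, closing head-on at a combined 150 m/s.
own = State(0.0, 0.0, 75.0, 0.0)
intruder = State(8000.0, 200.0, -75.0, 0.0)
print(cpa_alert(own, intruder))   # alerts: CPA about 53 s away, ~200 m miss
```

A real DAA system layers far more on top of this (sensor fusion, altitude, uncertainty, maneuver guidance), but the basic geometry of "predict the miss distance, alert if it is too small" is the same.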

Honeywell also provided software that enabled the three sensors to work together, as well as a specially instrumented aircraft to play the role of an intruder encroaching on Ikhana's airspace.

"This phase of flight tests, and our ability to meet the challenge of integrating UAS into the NAS, wouldn't be possible without the strong partnership that exists between NASA and its aeronautical industry partners," Hines said.

Knowledge gleaned from the data recorded during this third phase of UAS-NAS flight tests not only will help researchers plan the next phase of flight tests – now targeted for next spring – but also will help inform organizations developing UAS-related operating standards.
 
Boeing Wants To Harvest Electricity From the Roar of Your Plane Taking Off

Jet engines are extraordinarily loud at roughly 140 decibels–and airports have struggled with mitigating their roar since the early days of commercial flight. An engineer at Boeing wants to make the cacophony more useful, if not silence it for good.

For Boeing, patents seem like a way to publicize the company’s most futuristic R&D; recent filings include nuclear- and laser-powered jet engines, Star Trek-style force fields for cars, and drones that turn into submarines. The company’s latest patent-turned-future-report? A filing from employee Chin Toh for a “method for producing electricity from airport acoustical energy.”

The idea is to harness the intensity of the noise generated at airports by planes taking off and landing, and turn it into electricity that the airport could use to power operations. “This acoustic energy is left to dissipate and represents a lost energy resource,” Toh writes. “Heretofore, there has been no way to recycle the acoustic energy generated by aircraft during takeoffs and landings.”

Toh describes all those sound waves as wasted energy that we should have started putting to good use a long time ago. The design would change how the average runway looks, lining its edges with thousands of “acoustic wave collectors,” not unlike a lighting system. These devices would collect and focus the sound (that is, the vibrations) of an engine’s roar during takeoff or landing. This vibrational energy would then be converted into air flow, powering a turbine that generates electricity. A substation at the end of each line of collectors would gather the resulting power and pass it on to the airport.

In theory, it’s a sound idea (har har), but it’s tough to know exactly what the energy output would be from the system–a crucial piece of information, considering that Toh’s design would require quite a bit of money and work to install at a working airport.

A few years ago, MIT’s School of Engineering published a useful explainer on harvesting sound energy, pointing out that the sounds humans can bear hearing don’t pack much of a punch compared to, say, photovoltaics:

What the human ear perceives as clanging cacophony—the roar of a train engine or the whine of a pneumatic drill—only translates to about a hundredth of a watt per square meter. In contrast, the amount of sunlight hitting a given spot on the earth is about 680 watts per meter squared.
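To put those two figures side by side, here is a quick back-of-envelope comparison. The power densities are the ones quoted above; the runway length and collector strip width are purely assumed for illustration and come from nowhere in Toh's patent.

```python
# Back-of-envelope comparison using the power densities quoted above.
# The collector area is an assumption for illustration, not from the patent.
ACOUSTIC_W_PER_M2 = 0.01      # about a hundredth of a watt per square metre
SOLAR_W_PER_M2 = 680.0        # sunlight hitting a given spot on Earth

runway_length_m = 3000.0      # assumed: a typical long commercial runway
strip_width_m = 2.0           # assumed: a 2 m strip of collectors per side
collector_area_m2 = 2 * runway_length_m * strip_width_m   # both sides

acoustic_watts = ACOUSTIC_W_PER_M2 * collector_area_m2
solar_watts = SOLAR_W_PER_M2 * collector_area_m2

print(f"Acoustic harvest: {acoustic_watts:.0f} W "
      f"(roughly a couple of light bulbs)")
print(f"Same area of solar: {solar_watts / 1e6:.1f} MW")
```

Even with a generous collector area, the acoustic side comes out at about a hundred watts against several megawatts for the same area of solar panels, which is exactly the gap the MIT piece is pointing at.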

Still, “the idea is definitely there, and it’s quite promising,” says David Cohen-Tanugi, a fellow in MIT’s Department of Materials Science and Engineering, in the piece.

So until our own technology for harvesting soundwaves improves in efficiency, Toh’s patent probably won’t make much practical sense. But it’s still an interesting example of how ambient phenomena in our environment could be harnessed as a source of energy.

Airport administrators are increasingly looking for ways to deal with noise pollution–a few years ago, Amsterdam’s Schiphol airport experimented with a land art installation that actually dampened runway noises using basic acoustics. Not every airport has the means or space to dampen the roar, but perhaps one day they’ll have the technology to make it useful.
 
NASA Developed Technology Aims to Save Commercial Airlines Fuel, Time

NASA Developed Technology Aims to Save Commercial Airlines Fuel, Time | NASA

The TASAR application can be seen in the far right screen. Credits: NASA/David C. Bowman

Two passenger airlines soon will test NASA-developed software designed to help air carriers save time and reduce fuel consumption and carbon emissions.

During the next three years, Virgin America and Alaska Airlines will use the Traffic Aware Planner (TAP) application to make "traffic aware strategic aircrew requests" (TASAR).

"TAP connects directly to the aircraft avionics information hub on the aircraft," said David Wing, TASAR project lead at NASA’s Langley Research Center in Hampton, Virginia. "It reads the current position and altitude of the aircraft, its flight route, and other real-time information that defines the plane's current situation and active flight plan. Then it automatically looks for a variety of route and/or altitude changes that could save fuel or flight time and displays those solutions directly to the flight crew."

TAP also can connect with the plane's Automatic Dependent Surveillance-Broadcast (ADS-B) receiver and scan the ADS-B signals of nearby air traffic to avoid potential conflicts in any proposed flight path changes, making it easier for air traffic controllers to approve a pilot's route change request.
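To make the idea concrete, here is a toy sketch of that kind of check. It is not NASA's TAP software; the data structures, numbers, and the five-mile separation threshold are all assumptions. It simply ranks candidate route changes by benefit and discards any that would come too close to ADS-B traffic.

```python
# Toy sketch of the TASAR idea: rank candidate route changes by benefit,
# dropping those that conflict with nearby ADS-B traffic. Not NASA's TAP code;
# all data structures and thresholds here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    description: str
    time_saved_min: float
    fuel_saved_kg: float
    min_separation_nm: float   # closest predicted approach to any traffic

MIN_SEPARATION_NM = 5.0        # assumed separation standard for the sketch

def best_requests(candidates: list[Candidate]) -> list[Candidate]:
    """Keep conflict-free candidates, best time savings first."""
    ok = [c for c in candidates if c.min_separation_nm >= MIN_SEPARATION_NM]
    return sorted(ok, key=lambda c: c.time_saved_min, reverse=True)

options = [
    Candidate("Direct to downstream waypoint", 4.0, 180.0, 7.2),
    Candidate("Climb to FL380", 2.5, 260.0, 3.9),      # too close to traffic
    Candidate("Offset 10 NM right of track", 1.0, 40.0, 12.5),
]

for c in best_requests(options):
    print(f"{c.description}: saves {c.time_saved_min:.1f} min, "
          f"{c.fuel_saved_kg:.0f} kg fuel")
```

The real software evaluates continuous route and altitude changes against live aircraft performance data, but the request it surfaces to the crew is this same kind of conflict-screened, benefit-ranked suggestion.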

For airlines with Internet connectivity in the cockpit, TAP also can access information -- such as real-time weather conditions, wind forecast updates and restricted airspace status -- to further increase flight efficiency. The software is loaded onto a tablet computer, which many airline pilots already use for charts and flight calculations.

Wing and his team already have tested the TASAR software twice aboard a Piaggio P180 Avanti aircraft, a high-performance technology test bed owned and operated by Advanced Aerospace Solutions, LLC of Raleigh, North Carolina. The system worked well on its initial test flight from Virginia to Kentucky, according to its test pilot, former airline captain William Cotton.

"We used it to make a route change request from air traffic control, which they granted," said Cotton. "We got a shortcut that saved four minutes off the flight time."

Even four minutes of flight time shaved off of each leg of a trip made by an airline could result in massive fuel and time savings, according to researchers. The software provided similar results as flight tests continued in the northeast corridor. A second round of flight tests was recently completed to ensure readiness for operational use by partner airlines.
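A quick back-of-envelope calculation shows why those four minutes add up. The per-leg savings is the figure from the flight test above; the fleet size and cruise fuel burn below are rough illustrative assumptions, not airline data.

```python
# Back-of-envelope on the "four minutes per leg" figure. The per-leg savings
# is from the article; the legs per day and cruise fuel burn are assumptions.
minutes_saved_per_leg = 4
legs_per_day = 700            # assumed: a mid-size carrier's daily departures
fuel_burn_kg_per_min = 40     # assumed: rough narrow-body cruise burn

annual_minutes = minutes_saved_per_leg * legs_per_day * 365
annual_fuel_kg = annual_minutes * fuel_burn_kg_per_min

print(f"Flight time saved per year: {annual_minutes / 60:,.0f} hours")
print(f"Fuel saved per year: {annual_fuel_kg / 1000:,.0f} tonnes")
```

Under those assumptions the savings land around 17,000 flight hours and tens of thousands of tonnes of fuel a year for a single carrier, which is the kind of scale the researchers are pointing at.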

The TASAR flight tests came after a dozen pilots provided feedback on the technology in a simulation at the University of Iowa Operator Performance Laboratory in Iowa City, Iowa. In addition, aerospace systems manufacturer Rockwell Collins of Cedar Rapids, Iowa, analyzed TASAR to make sure it is safe and can be readily certified by the Federal Aviation Administration.

The Piaggio P. 180 Avanti aircraft used for TASAR testing. Credits: NASA/David Wing

"We’re excited to partner with NASA to test this new technology that has the potential to help reduce fuel consumption and carbon emissions and save our guests time in the air.” said Virgin America Chief Operating Officer Steve Forte in Burlingame, California.

"Up until now there has been no way to deliver comprehensive wind and congestion data to pilots in near-real time," said Tom Kemp, Alaska Airlines’ vice president of operations in Seattle, Washington. "TASAR is a 'super app' that will give our pilots better visibility to what’s happening now versus three hours earlier when the flight plan was prepared."

Developers say the new technology won't require changes to the roles and responsibilities of pilots or air traffic controllers, which would allow the system to be implemented fast and start producing benefits right away.

"The system is meant to help pilots make better route requests that air traffic controllers can more often approve," said Wing. "This should help pilots and controllers work more effectively together and reduce workload on both sides from un-approvable requests. TASAR takes advantage of NASA's state-of-the-art TAP software, flight information directly from the aircraft and the emerging ADS-B and Internet infrastructure to help pilots get approved to fly the most efficient or time-saving trajectory possible."

NASA researchers expect this and other aviation technologies under development will help revolutionize the national airspace system, reducing delays and environmental impacts and improving passenger comfort and efficiency, even as the demand for air travel continues to grow.
 
The Spores of These Ancient Plants Literally Hop Along on Four Little Legs

These are horsetails. They are 150-million-year-old plants and the last of their kind. Instead of seeds, they give off spores. And instead of flying or swimming, these spores use humidity to walk, or even hop, on four little legs.

Okay, this is incredible. Horsetails, or Equisetum, are one of those “living fossil” plants that have managed to hang on after most of the rest of their family has died off. They look like someone cut a single branch off a Christmas tree and stuck it in the ground. When they reproduce, they send out spores, which look like little round seeds with four “arms” curved around their body. As it gets humid, here’s what happens.


Yep. The offspring of something named after horses trots on four legs. It’s not as graceful as a horse, but it’s been around for 150 million years, so clearly it gets the job done.
 
Bloodhound SSC: This 1,000 MPH Rocket Is The World's Most Powerful Car

The Bloodhound SSC landspeed racer is what happens when the people who build the car that broke the sound barrier try to build a car that beats 1,000 MPH. It’s already the most powerful land vehicle with over 135,000 thrust horsepower.
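As a rough illustration of what 135,000 thrust horsepower implies (the only figure taken from this post is the horsepower; the 1,000 mph target is used to back out an implied thrust, and the whole thing is simple assumed arithmetic, not the team's numbers): power delivered to the car is roughly thrust times speed.

```python
# Rough arithmetic: thrust horsepower is approximately thrust x speed. Only
# the 135,000 hp figure comes from the post; the 1,000 mph target speed is
# used to back out an implied thrust. Purely illustrative.
HP_TO_WATTS = 745.7
MPH_TO_MS = 0.44704

power_w = 135_000 * HP_TO_WATTS            # roughly 100.7 MW
speed_ms = 1_000 * MPH_TO_MS               # roughly 447 m/s

implied_thrust_n = power_w / speed_ms
print(f"Implied thrust at 1,000 mph: {implied_thrust_n / 1000:.0f} kN "
      f"({implied_thrust_n * 0.2248:.0f} lbf)")   # about 225 kN / ~50,600 lbf
```

In other words, the horsepower headline corresponds to a jet-plus-rocket thrust in the low hundreds of kilonewtons at top speed, which is fighter-jet territory strapped to a car.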

In 1997, RAF fighter pilot Andy Green piloted the first supersonic car, but that apparently wasn’t enough for him or for team leader Richard Noble and thus the Bloodhound SSC was born.

Though we’ve seen most of the car before, today was the first glimpse the team has offered of the full vehicle in racing spec. Speaking of “racing spec,” where the hell do you race a car like this? The team says they’ve found a stretch of land in South Africa that’s the requisite 12 miles long and three miles wide.

We’ll see in 2016 if they can do it.
 

This Plane Will Soar to the Edge of Space on Giant Air Currents

A glider designed to float to the edge of space on air currents will attempt its first flight on Wednesday. Next year, the Perlan Mission II will launch to soaring altitudes of 90,000 feet, where it’ll harvest invaluable data on Earth’s atmosphere and climate.

Sponsored by commercial airplane manufacturer Airbus Group, the Perlan Project is on a quest to soar to record heights using its Perlan Mission II glider. The first Perlan Project set the existing manned glider altitude record of 50,722 feet in 2006 by taking advantage of air currents known as “stratospheric mountain waves”— basically, the ocean waves of the sky. Our friends over at Flight Club explain:

From 1992-98, Perlan’s founder and NASA test pilot Einar Enevoldson collected evidence on a weather phenomenon that no one at the time even knew existed: stratospheric mountain waves. Like huge ocean waves, these waves of air are kicked off by strong winds blowing over the tops of high mountain ranges like the Andes. These waves of air then shoot straight up towards space. As a pilot, Einar quickly figured out that you can use a glider to ride those waves all the way up to near space. And he set out to prove it.
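The arithmetic of riding a wave is straightforward, and a rough illustrative calculation (every number below is an assumption, not the Perlan team's data) shows why it can carry a glider so high: as long as the air in the wave rises faster than the glider sinks through it, the glider climbs at the difference.

```python
# Rough arithmetic for wave soaring: if the rising air in a mountain wave
# climbs faster than the glider sinks through it, the glider goes up at the
# difference. All numbers below are illustrative assumptions.
FT_TO_M = 0.3048

wave_updraft_ms = 8.0      # assumed: a strong stratospheric wave
glider_sink_ms = 0.5       # assumed: a high-performance sailplane's min sink

net_climb_ms = wave_updraft_ms - glider_sink_ms
climb_m = (90_000 - 10_000) * FT_TO_M      # assumed release-to-target climb

print(f"Net climb rate: {net_climb_ms:.1f} m/s")
print(f"Time to climb 80,000 ft: {climb_m / net_climb_ms / 60:.0f} minutes")
```

In practice the glider's sink rate grows as the air thins, so the real climb is much slower and more tactical than this, but the basic energy source is just that excess of updraft over sink.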

The Perlan Mission II, which began in 2014, intends to best its own record by a wide margin, gliding well into the stratosphere after launching from a gusty mountain ridge in the Andes. The team will be trekking down to Argentina next year in search of a launch site close to the southern polar vortex, an air current that drives mountain waves into the stratosphere.

Gliding to the edge of space will give scientists a chance to study interactions between different layers of the atmosphere, which could pave a path toward high-altitude commercial flight. It’ll also afford researchers the opportunity to study Earth’s climate from a dizzying new perspective:

“Currently climate change models are based on a theoretical understanding of how different layers of the atmosphere interact with each other,” James Darcy, a spokesperson for Airbus, told Climate Central. “Models are perhaps more simple than they should be. The scientific aim of Perlan will be to better understand the weather in the upper reaches of the atmosphere and build a more accurate model of what’s happening. That will drive more accurate predictability with respect to climate change.”

But first things first, the little glider needs practice. Next week — weather permitting — a small jet will tow the Perlan 2 to an altitude of 5,000 feet, where it’ll be released to fly around for about 45 minutes before landing. We’re keeping our fingers crossed!

Test flight successful for 'edge of space' sailplane - AOPA
 
Paraplegic Man Walks Using Own Legs With Brain Signals Re-Routed to Knees

A team of scientists has successfully re-routed the signals from a paraplegic man’s brain to his knees, allowing him to walk using his own legs for the first time in five years.

The Guardian reports that researchers from the University of California at Irvine have developed a system that captures brain waves using an electroencephalogram (EEG) electrode cap and sends them wirelessly to a computer. There, a series of algorithms process the data to work out whether the wearer wishes to stand still or walk, before beaming commands to micro-controllers, which send impulses to nerves that then move the muscles in the legs.
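As a purely hypothetical sketch of that pipeline (this is not the UC Irvine team's code; the sampling rate, frequency band, threshold, and every function name below are assumptions), the decoding step can be pictured as a band-power feature pulled from a short EEG window, compared against a per-user calibrated threshold, with the result forwarded to the stimulator.

```python
# Hypothetical sketch of an EEG walk/idle decoder, loosely following the
# pipeline described above (EEG cap -> computer -> stimulator commands).
# Not the UC Irvine system; band choices and thresholds are assumptions.
import numpy as np

FS_HZ = 256                    # assumed EEG sampling rate
BAND_HZ = (8.0, 12.0)          # assumed control band (mu rhythm)

def band_power(window: np.ndarray, fs: float, band: tuple) -> float:
    """Mean spectral power of one EEG channel within a frequency band."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(spectrum[mask].mean())

def decode_intent(window: np.ndarray, threshold: float) -> str:
    """Classify a 1-second EEG window as 'walk' or 'idle'. Band power drops
    during imagined movement, so below-threshold power maps to 'walk'; the
    threshold would come from a per-user calibration session."""
    return "walk" if band_power(window, FS_HZ, BAND_HZ) < threshold else "idle"

def send_stimulator_command(intent: str) -> None:
    """Stand-in for the wireless link to the leg-muscle stimulator."""
    print(f"stimulator <- {intent}")

# Example with synthetic data standing in for one second of one EEG channel.
rng = np.random.default_rng(0)
window = rng.normal(size=FS_HZ)
send_stimulator_command(decode_intent(window, threshold=500.0))
```

The hard parts the researchers describe (training the patient to produce reliable signals, and keeping balance-related activity from being mistaken for walking intent) live in exactly this classification step.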

The system has been tested on a 26-year-old man who has been wheelchair-bound since an accident left him paralysed from the waist down five years ago when his spinal cord was severed. He underwent 20 weeks of training during the build-up to the experiments, improving muscle tone in his legs as well as learning how to create the right brain signals to reliably trigger the device.

But it worked: using a walking frame and harness to stop himself from falling over, the man was able to use his own legs to walk a 3.5-meter course. The experiment demonstrates that it’s possible to take brain signals and re-route them around an area of damage using just electronics. The research is published today in the Journal of NeuroEngineering and Rehabilitation.

The researchers do point out, though, that they’ve only tested the technique in one patient and that many more trials will be required in order to assess whether it can be used successfully by a wider number of people. While the patient managed to walk the 3.5-meter course, the computer occasionally faltered: the researchers claim that the brain signals required to aid balance can become confused with those which stimulate the walking motion.

And if it’s to be used to help people walk freely, then the team must also overcome the fact that an external computer is currently required. But the researchers write in their reports that “the cumbersome nature of the current noninvasive system... can potentially be addressed by a fully implantable brain-computer interface system, which can be envisioned to employ invasively recorded neural signals.”

We’re still some way off restoring full walking abilities to the paraplegic, then — but today, researchers just took a step closer to making it happen.
 
Really happy to hear this :yahoo::yahoo::yahoo::yahoo:
 
 
The Pentagon's Research Arm is Putting Money Into DNA Manufacturing

The Defense Advanced Research Projects Agency (DARPA) is the government division that tries to take cutting-edge technology and turn it into something the military can use. The agency’s latest target? Genetic engineering.

The Foundry is a facility that is part of the Broad Institute, a joint MIT-Harvard biomedical research institute. It started work on a project involving “assembling massive genetic systems involving many genes” two years ago, with $7 million in seed funding from DARPA; now that the approach has proved itself, DARPA is pouring in a lot more money, in the form of a $32 million contract.

What kind of stuff are they trying to achieve at the Foundry? Well, no one’s talking about genetically enhanced soldiers just yet. Rather, the focus is on things like agriculture and medicine: for example, re-engineering genomes to change the bacterial processes that convert nitrogen in the air to ammonia, reducing the need to fertilize crops.

It’s an exciting technology with virtually endless potential applications; funding from anywhere, even the creepy arm of military research, should help advance the project. Just keep an eye out for any of the Marvel comic villains in the meantime.

...

(Image: an Enclave recruitment poster.)

For genetic manipulation :partay:.

This 17-Ton Magnet Is Now Ready To Study Mysterious Particles

Last year, a massive, 17-ton, 52-foot wide electromagnet was successfully shipped from Long Island to Illinois. This week, it hit another milestone: it was successfully chilled to near absolute zero after 10 years’ inactivity, proving it’s ready to solve a whole new decade’s physics mysteries.

The magnet, built in the 90s at Brookhaven National Laboratory in New York, sat for 10 years unused before it was decided it’d be put to better use at Fermilab in Illinois. And so began the most daunting, improbable-sounding odyssey ever: ship the magnet 3,200 miles from New York to the Midwest without deconstructing or twisting any of its extremely delicate and complex superconducting rings. Hmm, sure, okay.

Symmetry magazine reports that the magnet was shipped on a barge from Long Island to Florida, and then sent up a bunch of rivers all the way to Illinois, where “a specially designed truck gently drove it the rest of the way to Fermilab.”

That’s not to mention its steel base, which did involve reconstruction and took the better part of the last year: Symmetry reports that two dozen 26-ton steel parts and a dozen smaller 11-ton pieces were involved, which sounds like the heaviest jigsaw puzzle ever.

But with the recent accomplishment (cooling the superconductor to minus 450 degrees Fahrenheit and powering it back up again), it’s all systems go. Those cryogenic temperatures are what keep the magnet’s coils superconducting, so they can carry the enormous currents needed for a strong, stable field. The magnet will be paired with a powerful particle beam now under construction to continue the study of mysterious subatomic particles called muons.

By trapping the beam-produced muons in a magnetic field, scientists can figure out if there are hidden subatomic forces affecting muon movement. This can help them learn more about those forces, about undiscovered particles, and more about the nature of the universe. Those new studies are set to begin in 2017.
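For a sense of the scale involved, here is a back-of-envelope calculation of the quantity such an experiment watches: the muon's anomalous spin precession in the storage ring's field. The roughly 1.45 tesla field strength is an assumption (a commonly quoted figure for this ring), not something stated in this post.

```python
# Back-of-envelope: the anomalous spin-precession frequency of stored muons,
# omega_a = a_mu * e * B / m_mu. The 1.45 T field is an assumed value, not a
# figure from the post above.
import math

E_CHARGE = 1.602176634e-19      # C
MUON_MASS = 1.883531627e-28     # kg
A_MU = 1.16592e-3               # muon anomalous magnetic moment (approx.)
B_FIELD = 1.45                  # T, assumed storage-ring field

omega_a = A_MU * E_CHARGE * B_FIELD / MUON_MASS        # rad/s
print(f"Anomalous precession: {omega_a / (2 * math.pi) / 1e3:.0f} kHz")
# -> roughly 230 kHz: the tiny "wobble" whose precise value would reveal
#    whether unknown particles or forces are tugging on the muon.
```

Measuring that wobble to parts per million is what demands such an extraordinarily uniform, stable magnetic field, and hence all the care taken in moving and re-chilling the ring.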
 
An Autonomous Shuttle Is Driving Public Streets for the First Time

This fall, a city in the Netherlands will become the first to allow fully autonomous shuttles regularly on its public roads–in the form of a small bus carting people between two towns.

They’re called WEpods, and they’re only large enough to fit six people comfortably. It’s a project of the town of Wageningen, which is in the central part of the Netherlands where farming is big business. The community is using the buses to shuttle visitors in between the towns of Ede and Wageningen (about a 17-minute drive) as well as around its university, a center for agriculture research. Autonomous buses will lend it an air of “new, flexible, sustainable and social mobility” for visiting businesspeople and tourists, the project’s website explains.


The buses–which are an altered version of those made by Swiss robotics company EasyMile and have been tested in several private projects–won’t go terribly fast: they’ll putter along at roughly 15 miles per hour, as Big Think reports. They also won’t go very far, and a human will always be watching remotely via camera to make sure nothing goes awry. But it’s still a big deal, since it’ll be the first regular use of a totally autonomous shuttle on a public road. Google and others have been testing their driverless cars in public for a while now, but those have humans inside in case of emergencies–meanwhile, smaller autonomous prototypes have seen short tests in public, but nothing permanent.

Seemingly anticipating public anxiety, the project’s creators launched an online forum where people can ask questions prior to the November 30th launch date. Some of these comments are fairly nuts (“I would feel in such a car as a cookie in the cookie jar, which are short lived.”). But another discussion on the forum is actually pretty informative–a postdoc researcher named Joris Ijsselmuiden, who studies robotics and agriculture and works on the project, posted a gif that shows how the pods identify street signs and objects using computer vision.

While of course the buses use GPS data, they also use computer vision to independently work out where the bus is. Ijsselmuiden explains:

If the accuracy of the GPS system decreases, for instance by trees along the road, it must be helped by landmark detection. Here the cameras detect objects along the way and compare them with known objects from earlier recordings. The position of these objects is known and so the vehicle can calculate where it is located.
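A toy version of that landmark correction (purely illustrative; the WEpods team's actual pipeline isn't described beyond the quote above, and every number, landmark, and function name here is made up) could fuse a drifting GPS estimate with camera-derived ranges to landmarks whose map positions are known, for example with a small least-squares fix.

```python
# Toy landmark-based position fix, illustrating the idea in the quote above:
# camera-detected landmarks with known map positions refine a drifting GPS
# estimate. Not the WEpods code; landmarks, ranges, and errors are invented.
import numpy as np

def landmark_fix(gps_estimate, landmarks, measured_ranges, iterations=10):
    """Gauss-Newton least squares: find the position whose distances to the
    known landmarks best match the measured ranges, starting from GPS."""
    pos = np.asarray(gps_estimate, dtype=float)
    landmarks = np.asarray(landmarks, dtype=float)
    ranges = np.asarray(measured_ranges, dtype=float)
    for _ in range(iterations):
        diffs = pos - landmarks                    # vectors landmark -> pos
        dists = np.linalg.norm(diffs, axis=1)
        residuals = dists - ranges
        jacobian = diffs / dists[:, None]          # d(distance)/d(position)
        step, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
        pos -= step
    return pos

# Mapped sign/lamp-post positions (metres, local frame) and vision ranges.
landmarks = [(0.0, 0.0), (30.0, 0.0), (15.0, 25.0)]
true_pos = np.array([12.0, 8.0])
ranges = [np.linalg.norm(true_pos - np.array(l)) for l in landmarks]

print(landmark_fix(gps_estimate=(14.0, 5.0),       # GPS off by a few metres
                   landmarks=landmarks, measured_ranges=ranges))
# -> approximately [12. 8.]
```

The real system presumably does this probabilistically and continuously as landmarks come and go from view, but the core trick is the same: known objects in the camera's view anchor the pod when GPS alone isn't trustworthy.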

It’s pretty cool to see this kind of machine learning literally in motion—even if it’s only going 15mph.
 
Elon Musk on Tesla’s autonomous cars - Tech Insider

Elon Musk says Tesla’s fully autonomous cars will hit the road in 3 years

"Tesla’s self-driving vehicles are not far off.

During an interview earlier this week with the Danish news site Borsen, Tesla CEO Elon Musk said the company is rolling out its "Autopilot" feature to the masses next month and the company’s fully autonomous vehicles will be ready in just a few short years.

“The Tesla that is currently in production has the ability to do automatic steering autopilot on highway. That is currently being beta tested and will go into wide release early next month. So, we are probably only a month away from having autonomous driving at least for highways and for relatively simple roads,” Musk said. “My guess for when we will have full autonomy is approximately three years.”

While Tesla will be ready to roll out its self-driving cars in just a few short years, the government may not be ready for them.

Tesla is already testing its self-driving car technology on public roads in California, but Musk said that he doesn’t expect regulators to allow fully autonomous vehicles on the road beyond testing purposes for another one to three years after Tesla finishes its self-driving car.

“In some markets regulators will be more forward leaning than others, but in terms of when it will be technologically possible, it will be three years,” Musk said.

Twenty years from now, Musk said it’s likely all cars being built will be self-driving and a very large percentage of vehicles will be electric.

Major automakers and tech companies — including Google, BMW, Mercedes-Benz, and reportedly Apple — are all currently working on autonomous driving technology. However, like Tesla, many car makers are first rolling out Autopilot-like functions into their newer cars, which enable the vehicle to do things like autonomously drive on the highway and self-park."
 
More Sad Remains Of The Soviet Buran Space Shuttle Program

Pictures of the Soviet Space Shuttle in its hangar have been making the rounds on the internet recently, but there’s another shuttle out there. Russian photographer Aleksander Markin came across the remains of the original wooden model, used for wind tunnel testing.

The pictures were uploaded back in 2013, and show off the 1:3 scale model of the Buran Space Shuttle, rotting away in the elements.

The scale model would have been used in a wind tunnel to figure out how the final spacecraft would fare returning to Earth.

As of a couple of years ago, it looks like it was abandoned alongside the rest of the shuttle program. It’s a bit of a shame to see that these items haven’t been preserved: they’re an interesting note in space history.

Take a look at the full album here.
 
