Drones learn to navigate autonomously by imitating cars and bicycles

A new algorithm allows drones to fly completely autonomously through the streets of a city and in indoor environments. To do so, the algorithm had to learn traffic rules from training examples provided by cyclists and car drivers. Researchers at the University of Zurich and the National Centre of Competence in Research NCCR Robotics developed DroNet, an algorithm that can safely drive a drone through the streets of a city. Designed as a fast eight-layer residual network, it produces two outputs for each input image: a steering angle that keeps the drone navigating while avoiding obstacles, and a collision probability that lets the drone recognise dangerous situations and react to them promptly.
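To make the description above concrete, the architecture can be pictured as a small residual network that maps a single camera frame to two outputs, one regressed steering angle and one collision probability. The Python (PyTorch) sketch below is only an illustration of that idea, not the authors' released code; the layer widths, the 200x200 grayscale input, and the block layout are assumptions.

import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Basic residual block: two 3x3 convolutions plus a 1x1 shortcut."""
    def __init__(self, in_ch, out_ch, stride=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_ch), nn.ReLU(),
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1),
            nn.BatchNorm2d(out_ch), nn.ReLU(),
            nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1),
        )
        self.shortcut = nn.Conv2d(in_ch, out_ch, 1, stride=stride)

    def forward(self, x):
        return self.body(x) + self.shortcut(x)

class DroNetSketch(nn.Module):
    """Illustrative two-headed network: steering angle + collision probability."""
    def __init__(self):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, 32, 5, stride=2, padding=2),  # assumed grayscale input
            nn.MaxPool2d(3, stride=2),
        )
        self.blocks = nn.Sequential(
            ResBlock(32, 32), ResBlock(32, 64), ResBlock(64, 128),
        )
        self.head_steer = nn.Linear(128 * 7 * 7, 1)  # steering angle (regression)
        self.head_coll = nn.Linear(128 * 7 * 7, 1)   # collision logit (classification)

    def forward(self, x):                            # x: (N, 1, 200, 200), assumed frame size
        feats = self.blocks(self.stem(x)).flatten(1)
        steer = self.head_steer(feats)               # unbounded angle estimate
        p_coll = torch.sigmoid(self.head_coll(feats))  # probability of imminent collision
        return steer, p_coll

Both heads share the same convolutional features, which is what lets one forward pass produce the two quantities the article mentions for every frame.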

drone.jpg

Credit: University of Zürich

One of the most difficult challenges in deep learning is collecting the several thousand ‘training examples’ an algorithm needs. To gather enough data to train their algorithm, Prof. Scaramuzza and his team recorded cars and bicycles driving in urban environments. By imitating them, the drone automatically learned to respect safety rules, such as how to follow the street without crossing into the oncoming lane, and how to stop when obstacles like pedestrians, construction works, or other vehicles block its way. Even more interestingly, the researchers showed that their drones learned to navigate not only through city streets but also in completely different environments where they were never taught to do so. Indeed, the drones learned to fly autonomously in indoor environments, such as parking lots and office corridors.
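In training terms, the imitation described above amounts to two supervised signals taken from the driving data: a steering-angle target to regress and a binary collision label to classify. The sketch below shows one plausible way to combine the two objectives, assuming mean-squared error for steering, binary cross-entropy for collisions, and an illustrative ramp-up of the classification term; the schedule and hyperparameters are assumptions, not the paper's published values, and `model` refers to the hypothetical two-headed network sketched earlier.

import math
import torch.nn.functional as F

def dronet_loss(pred_steer, pred_coll, true_steer, true_coll, epoch, ramp=10.0):
    """Combine steering regression and collision classification losses."""
    l_steer = F.mse_loss(pred_steer, true_steer)           # imitate recorded steering angles
    l_coll = F.binary_cross_entropy(pred_coll, true_coll)  # imitate collision / no-collision labels
    weight = max(0.0, 1.0 - math.exp(-epoch / ramp))       # illustrative ramp-in of the second term
    return l_steer + weight * l_coll

def train_step(model, optimizer, images, steer_labels, coll_labels, epoch):
    """One hypothetical optimisation step on a batch of driving-log frames."""
    optimizer.zero_grad()
    steer, p_coll = model(images)
    loss = dronet_loss(steer, p_coll, steer_labels, coll_labels, epoch)
    loss.backward()
    optimizer.step()
    return loss.item()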

Source (University of Zurich, “Drones learn to navigate autonomously by imitating cars and bicycles”, ScienceDaily, 23 January 2018.)

Original paper: Loquercio, A., Maqueda, A.I., Del-Blanco, C.R. and Scaramuzza, D., 2018. DroNet: Learning to fly by driving. IEEE Robotics and Automation Letters, 3(2), pp. 1088-1095.

Brain-Like Tissue 3D Printed for the First Time

A new 3D-printing technique can create tissues as soft as a human’s squishy brain or spongy lungs, something that has not been possible before. “Additive manufacturing,” or 3D printing, promises to allow doctors to produce tailored organs for patients using the patients’ own cells, which could help ease the severe shortage of organs available for people who need transplants. However, the technology still has significant limitations. To create these organs, bioengineers need to 3D print scaffolds that mimic the structure of the organs, which are then populated with the cells. So far, only relatively stiff materials have been printable this way, but some organs in the body, such as the brain and the lungs, have an extremely soft structure.


Credit: Zhengchu Tan et al./Imperial College London

“We have used a very soft material, which is a composite hydrogel, and printed the softer tissues similar to the brain and possibly lung as well,” Tan told Live Science. But the problem with 3D printing very soft materials is that the underlying layers tend to collapse as additional layers are added on top of them during the 3D-printing process, Tan said. Indeed, the process of 3D printing involves creating an object layer by layer, which means that the lower layers need to be able to support the weight of the growing structure.

To get around this problem, the researchers cooled things down — literally. “We are using a cryogenic printing process, which means that the previous layer is frozen,” Tan said. “Freezing makes the layer very solid and stable so that the next layer can be printed on top of that and the 3D object doesn’t collapse under its own weight.” After the printing is complete, the engineers can slowly thaw the object, and it keeps its shape, she said.
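As a rough way to picture the workflow Tan describes, the control logic boils down to: deposit a layer, wait until it has frozen solid, print the next layer on top, and thaw slowly at the end. The Python sketch below is purely hypothetical; the printer and temperature-sensor interfaces and the freezing setpoint are invented for illustration and are not the team's actual equipment or firmware.

import time

FREEZE_TEMP_C = -20.0  # assumed setpoint; the real value depends on the hydrogel

def print_soft_scaffold(layers, printer, sensor):
    """Deposit one soft hydrogel layer at a time, freezing each before the next.

    `printer` and `sensor` are hypothetical interfaces standing in for the
    extruder and a thermometer on the cryogenic build plate.
    """
    for layer in layers:
        printer.deposit(layer)                         # extrude one soft hydrogel layer
        while sensor.layer_temperature() > FREEZE_TEMP_C:
            time.sleep(1)                              # wait until the layer is frozen solid
        # the frozen layer can now support the next one without collapsing
    printer.thaw_slowly()                              # slow thaw so the finished object keeps its shape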

To 3D print the scaffold, the researchers used a novel composite hydrogel that consists of two components: a water-soluble synthetic polymer, polyvinyl alcohol, and a jelly-like substance called Phytagel. Then, they coated the resulting structure with collagen and populated it with human cells. For the purposes of the experiment, however, the researchers used skin cells instead of brain cells on a scaffold designed to mimic the human brain.

Source (Tereza Pultarova, “Scientists 3D-Printed Squishy, Brain-Like Tissue for the 1st Time”, Live Science, 13.01.2018)

Original paper: Tan, Z., Parisi, C., Di Silvio, L., Dini, D. and Forte, A.E., 2017. Cryogenic 3D printing of super soft hydrogels. Scientific Reports, 7(1), pp. 1-11.

Spaceships could use blinking dead stars to chart their way

Using only the timing of radiation bursts from pulsating stellar corpses, an experiment on the International Space Station was able to pinpoint its location in space in a first-ever demonstration. The technique operates like a stellar version of GPS, researchers with the Station Explorer for X-ray Timing and Navigation Technology experiment, SEXTANT, reported at a news conference January 11 during a meeting of the American Astronomical Society.

sextant.jpg

Credit: NASA

Known as pulsars, the dead stars emit beams of radiation that sweep past Earth at regular intervals, like the rotating beams from a lighthouse. Those radiation blips could allow a spaceship to find its location in space (SN: 12/18/10, p. 11). It’s similar to how GPS uses the timing of satellite signals to determine the position of your cell phone – and it would mean spacecraft would no longer have to rely on radio telescope communications to find their coordinates. That system becomes less accurate the further a spaceship gets from Earth.
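To see the GPS analogy concretely: each pulsar's pulses arrive slightly earlier or later depending on how far the receiver sits along that pulsar's line of sight, so a handful of timing offsets, together with an unknown clock error, pins down the 3-D position by least squares. The toy NumPy example below illustrates only this geometry; the pulsar directions, delays, and noise-free measurements are invented, and this is not SEXTANT's actual navigation software.

import numpy as np

C = 299_792.458  # speed of light in km/s

def solve_position(directions, delays):
    """Least-squares position from pulse-timing offsets.

    directions : (m, 3) unit vectors toward m pulsars
    delays     : (m,) measured arrival-time offsets in seconds relative to the
                 timing model at a reference point
    Solves delay_i = (n_i . r) / c + b for position r (km) and clock bias b (s).
    """
    m = len(delays)
    A = np.hstack([directions / C, np.ones((m, 1))])  # unknowns: r (km), b (s)
    x, *_ = np.linalg.lstsq(A, delays, rcond=None)
    return x[:3], x[3]

# Example: a receiver about 7,000 km from the reference point, observed via
# four fictitious pulsar directions and a small receiver clock error.
true_r = np.array([4000.0, -3000.0, 5000.0])          # km
true_bias = 1e-4                                      # s
dirs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [0.577, 0.577, 0.577]], dtype=float)
dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
obs = dirs @ true_r / C + true_bias                   # idealised, noise-free delays
r_est, b_est = solve_position(dirs, obs)
print(r_est, b_est)                                   # recovers true_r and true_bias

In practice the timing measurements are noisy and the geometry changes as the spacecraft moves, which is why SEXTANT's real solution required extended observations of several pulsars, but the underlying idea is this timing-to-position conversion.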

SEXTANT used an array of 52 X-ray telescopes to measure the signals from five pulsars. By analyzing those signals, the researchers were able to locate SEXTANT’s position to within 10 kilometers as it orbited Earth on the space station, astronomer Keith Gendreau of NASA’s Goddard Space Flight Center in Greenbelt, Md., reported. On Earth, knowing your location within 10 kilometers isn’t that impressive — GPS can do much better. But “if you’re going out to Pluto, there is no GPS navigation system,” Gendreau said. Far from Earth, pulsar navigation could improve upon the position estimates made using radio telescopes.

Source (Emily Conover, “Spaceships could use blinking dead stars to chart their way”, ScienceNews, 12.01.2018)

Here’s the Plan

Video description:

A married cat-dog couple of cupcake bakers dream of opening their own bakery. One day their oven breaks and they have to postpone their dream in order to earn money and replace it. Somewhere down the line, they drift apart from their dream and from themselves.

“Here’s the Plan” (“Este es el Plan”) is an 18-minute CG-animated short film from Chile. Directed by Fernanda Frick and financed by the National Council of Culture and Arts, it took almost two years of work and a talented team of 32 professionals to finish. It had its world premiere at the Nashville International Film Festival and its national premiere at Festival Chilemonos 2017.

DARPA Subterranean Challenge

The Defense Advanced Research Projects Agency (DARPA) has announced its latest challenge, called the Subterranean or “SubT” Challenge. The global competition asks entrants to develop systems that can help humans navigate, map and search in underground locations that are normally too perilous to visit.

“One of the main limitations facing war fighters and emergency responders in subterranean environments is a lack of situational awareness; we often don’t know what lies beneath us,” Timothy Chung, program manager in DARPA’s Tactical Technology Office (TTO), said in a statement. “The DARPA Subterranean Challenge aims to provide previously unimaginable situational awareness capabilities for operations underground.”

Groups all around the world will compete to solve problems that help people navigate in unknown, treacherous subterranean conditions, where time is of the essence, according to the statement.  Teams can compete in one of two tracks: a Systems track, to develop hardware-based solutions for a physical course, or a Virtual track, to develop software to test on a simulated course, DARPA said.

dsc.png

Credit: DARPA

The final competition, which will take place in 2021, will include three challenges that involve navigating in one of three environments: a network of human-made tunnels, a subterranean municipal-transit system and a network of underground natural caves. The final event will challenge teams to navigate networks that include elements of all three environments. The grand-prize winners will take home $2 million. The deadline to apply is Jan. 18, 2018.

Source (Tia Ghose, “Dig Deep: DARPA Contest Aims to Take People Underground”, LiveScience, 27.12.2017)