Local Cosmic Web from Galaxies

Dark matter is an elusive substance that makes up about 80% of the matter in the universe. It also provides the skeleton for what cosmologists call the cosmic web, the large-scale structure of the universe that, through its gravitational influence, dictates the motion of galaxies and other cosmic material. However, the distribution of local dark matter is currently unknown because it cannot be measured directly. Researchers must instead infer its distribution from its gravitational influence on other objects in the universe, such as galaxies.


Previous attempts to map the cosmic web started with a model of the early universe and then simulated the evolution of the model over billions of years. However, this method is computationally intensive and so far has not been able to produce results detailed enough to see the local universe. In the new study, the researchers took a completely different approach, using machine learning to build a model that uses information about the distribution and motion of galaxies to predict the distribution of dark matter.
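To make the approach concrete, here is a minimal, purely illustrative sketch of the general idea in PyTorch: a small 3D convolutional network that takes gridded galaxy fields (number density plus three velocity components) as input and predicts a dark-matter density field. The layer sizes, grid resolution, and loss function are assumptions chosen for brevity, not the architecture or training setup used in the paper.

```python
# Illustrative sketch only: a small 3D convolutional network that maps
# gridded galaxy fields (number density + 3 velocity components) to a
# dark-matter density field. Resolution, depth, and loss are assumptions,
# not the authors' architecture.
import torch
import torch.nn as nn

class GalaxyToDarkMatterNet(nn.Module):
    def __init__(self, in_channels=4, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(hidden, 1, kernel_size=3, padding=1),  # predicted dark-matter density
        )

    def forward(self, x):
        # x: (batch, 4, N, N, N) -- galaxy number density and vx, vy, vz on a grid
        return self.net(x)

# One training step, with placeholder tensors standing in for simulated volumes.
model = GalaxyToDarkMatterNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

galaxy_fields = torch.randn(2, 4, 32, 32, 32)   # placeholder simulated inputs
dm_density = torch.randn(2, 1, 32, 32, 32)      # placeholder dark-matter targets

optimizer.zero_grad()
loss = loss_fn(model(galaxy_fields), dm_density)
loss.backward()
optimizer.step()
```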

The researchers built and trained their model using a large set of galaxy simulations, called IllustrisTNG, which includes galaxies, gas, and other visible matter, as well as dark matter. The team specifically selected simulated galaxies comparable to those in the Milky Way and ultimately identified which properties of galaxies are needed to predict the dark matter distribution. The research team then applied their model to real data from the local universe taken from the Cosmicflows-3 galaxy catalog. The catalog contains comprehensive data about the distribution and movement of more than 17,000 galaxies in the vicinity of the Milky Way, all within 200 megaparsecs.
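Before a model like the one sketched above can be applied to a catalog such as Cosmicflows-3, the galaxy positions and velocities have to be turned into gridded fields. Below is a hedged illustration of that preprocessing step; the box size, grid resolution, and the use of simple histogram binning are assumptions made for the example, not the authors' pipeline.

```python
# Illustrative sketch: turning a galaxy catalog (positions in Mpc and
# peculiar velocities in km/s) into gridded number-density and mean-velocity
# fields. Grid size, box size, and binning scheme are assumptions.
import numpy as np

def grid_catalog(positions, velocities, box_size=400.0, n_cells=32):
    """positions: (N, 3) in Mpc, centered on the Milky Way (+/- 200 Mpc).
    velocities: (N, 3) peculiar velocities in km/s."""
    edges = np.linspace(-box_size / 2, box_size / 2, n_cells + 1)
    counts, _ = np.histogramdd(positions, bins=(edges, edges, edges))

    vel_fields = []
    for k in range(3):
        vsum, _ = np.histogramdd(positions, bins=(edges, edges, edges),
                                 weights=velocities[:, k])
        vel_fields.append(np.divide(vsum, counts, out=np.zeros_like(vsum),
                                    where=counts > 0))
    # channels: number density followed by mean vx, vy, vz per cell
    return np.stack([counts] + vel_fields, axis=0)

# Example with fake data standing in for a catalog of roughly 17,000 galaxies.
rng = np.random.default_rng(0)
positions = rng.uniform(-200.0, 200.0, size=(17000, 3))
velocities = rng.normal(0.0, 300.0, size=(17000, 3))
fields = grid_catalog(positions, velocities)
print(fields.shape)  # (4, 32, 32, 32)
```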

The map successfully reproduced known prominent structures in the local universe, including the “local sheet” (a region of space containing the Milky Way, nearby galaxies in the “local group,” and galaxies in the Virgo cluster) and the “local void” (a relatively empty region of space next to the local group). Additionally, it identified several new structures that require further investigation, including smaller filamentary structures that connect galaxies. For example, it has been suggested that the Milky Way and Andromeda galaxies may be slowly moving toward each other, but whether they will collide in many billions of years remains unclear. Studying the dark matter filaments connecting the two galaxies could provide important insights into their future.

Adapted and abridged from Source (Penn State. “Dark matter map reveals hidden bridges between galaxies.” ScienceDaily. ScienceDaily, 25 May 2021.)

Original paper: Sungwook E. Hong, Donghui Jeong, Ho Seong Hwang, Juhan Kim. Revealing the Local Cosmic Web from Galaxies by Deep Learning. The Astrophysical Journal, 2021; 913 (1): 76. DOI: 10.3847/1538-4357/abf040

A robot that senses hidden objects

Researchers at MIT (Massachusetts Institute of Technology) have developed a robot that uses radio waves, which can pass through walls, to sense occluded objects. The robot, called RF-Grasp, combines this powerful sensing with more traditional computer vision to locate and grasp items that might otherwise be blocked from view. The advance could one day streamline e-commerce fulfillment in warehouses or help a machine pluck a screwdriver from a jumbled toolkit.

Using optical vision alone, robots can’t perceive the presence of an item packed away in a box or hidden behind another object on the shelf — visible light waves, of course, don’t pass through walls. But radio waves can. For decades, radio frequency (RF) identification has been used to track everything from library books to pets. RF identification systems have two main components: a reader and a tag. The tag is a tiny computer chip that gets attached to — or, in the case of pets, implanted in — the item to be tracked. The reader then emits an RF signal, which gets modulated by the tag and reflected back to the reader.

The reflected signal provides information about the location and identity of the tagged item. The technology has gained popularity in retail supply chains — Japan aims to use RF tracking for nearly all retail purchases in a matter of years. The researchers realized this profusion of RF could be a boon for robots, giving them another mode of perception.
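As a toy illustration of how a backscattered signal can carry location information, the sketch below shows how the round-trip phase of the reflected wave constrains the reader-to-tag distance, up to a half-wavelength ambiguity that measuring at several frequencies helps narrow down. This is a deliberately simplified picture, not the specific localization method used by the MIT team.

```python
# Toy illustration of range information in a backscattered RF signal: the
# phase accumulated over the reader -> tag -> reader round trip constrains
# the distance, up to a half-wavelength ambiguity. A simplified sketch, not
# the localization method used by RF-Grasp.
import numpy as np

C = 3.0e8  # speed of light, m/s

def roundtrip_phase(distance_m, freq_hz):
    """Phase (radians) accumulated over the round trip reader -> tag -> reader."""
    wavelength = C / freq_hz
    return (4 * np.pi * distance_m / wavelength) % (2 * np.pi)

def candidate_distances(phase_rad, freq_hz, max_range_m=3.0):
    """All distances within max_range consistent with a measured phase."""
    wavelength = C / freq_hz
    base = phase_rad * wavelength / (4 * np.pi)
    step = wavelength / 2  # ambiguity interval
    n_max = int(max_range_m // step) + 1
    return [base + n * step for n in range(n_max) if base + n * step <= max_range_m]

# Measuring the phase at several carrier frequencies narrows the ambiguity,
# since only the true distance is consistent with all of them.
true_distance = 2.37  # meters (made-up example)
for f in [902e6, 915e6, 928e6]:  # typical UHF RFID band
    phase = roundtrip_phase(true_distance, f)
    print(f, [round(d, 3) for d in candidate_distances(phase, f)])
```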


RF-Grasp uses both a camera and an RF reader to find and grab tagged objects, even when they’re fully blocked from the camera’s view. It consists of a robotic arm attached to a grasping hand. The camera sits on the robot’s wrist. The RF reader stands independent of the robot and relays tracking information to the robot’s control algorithm. So, the robot is constantly collecting both RF tracking data and a visual picture of its surroundings. Integrating these two data streams into the robot’s decision-making was one of the biggest challenges the researchers faced.
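The sketch below shows one schematic way such a fusion could look in code: while the tagged item is occluded, the controller steers toward the coarse but wall-penetrating RF estimate, and once the camera actually detects the item, it hands control over to the more precise visual fix. The Pose type, the function, and the two-stage policy are illustrative assumptions, not the controller described in the paper.

```python
# Schematic sketch of combining RF tracking with vision in a grasping loop.
# The Pose type, the selection policy, and the example values are illustrative
# assumptions, not the control algorithm described in the paper.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    x: float
    y: float
    z: float

def next_gripper_target(rf_estimate: Pose, visual_detection: Optional[Pose]) -> Pose:
    """Pick the next target pose for the gripper.

    While the item is occluded, approach the coarse but occlusion-proof RF
    estimate; once the camera detects the item, switch to the visual fix.
    """
    return visual_detection if visual_detection is not None else rf_estimate

# Example: the item is still hidden, so the controller steers toward the RF fix.
rf_pose = Pose(0.42, -0.10, 0.05)
print(next_gripper_target(rf_pose, visual_detection=None))
```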

Adapted and abridged from Source (Massachusetts Institute of Technology. “A robot that senses hidden objects: System uses penetrative radio frequency to pinpoint items, even when they’re hidden from view.” ScienceDaily. ScienceDaily, 1 April 2021.)

Original paper: Boroushaki, T., Leng, J., Clester, I., Rodriguez, A. and Adib, F., 2020. Robotic Grasping of Fully-Occluded Objects using RF Perception. arXiv preprint arXiv:2012.15436.