Dog colour patterns explained

Scientists have unraveled the enigma of the inheritance of coat colour patterns in dogs. The researchers discovered that a genetic variant responsible for a very light coat in dogs and wolves originated more than two million years ago in a now-extinct relative of the modern wolf.

Credit: University of Bern

The Institute of Genetics of the University of Bern has worked on understanding dog colour patterns and discovered that the genetic variant responsible for a very light coat in dogs and wolves originated more than 2 million years ago in a now-extinct relative of the modern wolf. Wolves and dogs can make two different types of pigment: black eumelanin and yellow pheomelanin. Precisely regulated production of these two pigments at the right time and at the right place on the body gives rise to very different coat colour patterns. Prior to the study, four different patterns had been recognized in dogs, and several genetic variants had been theorized to cause them.

During the formation of coat colour, the so-called agouti signaling protein acts as the body's main switch for the production of yellow pheomelanin. If agouti signaling protein is present, the pigment-producing cells synthesize yellow pheomelanin; if it is absent, black eumelanin is formed instead.
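
Read as a rule, the switch is a simple conditional. The toy Python function below merely restates it as a mnemonic for the mechanism; the name and inputs are invented for this illustration and are not a model from the study.

```python
def pigment_produced(agouti_signaling_protein_present: bool) -> str:
    """Toy restatement of the switch: the protein's presence selects the pigment."""
    if agouti_signaling_protein_present:
        return "pheomelanin (yellow)"
    return "eumelanin (black)"


print(pigment_produced(True))   # pheomelanin (yellow)
print(pigment_produced(False))  # eumelanin (black)
```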

The production of agouti signaling protein is controlled by two promoters, a ventral promoter and a hair cycle-specific promoter. For the first time, the researchers characterized these two promoters in detail, in hundreds of dogs. They discovered two variants of the ventral promoter. One variant drives the production of normal amounts of agouti signaling protein; the other has higher activity and causes the production of an increased amount of the protein. The researchers also identified three different variants of the hair cycle-specific promoter. Combining the variants at the two promoters, they identified a total of five different combinations, which give rise to different coat colour patterns in dogs.
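
To make the combinatorics concrete, the sketch below simply enumerates the pairings of the two ventral-promoter variants with the three hair cycle-specific promoter variants. The variant labels are placeholders invented for this illustration, not the designations used in the paper; of the six possible pairings, the study reports five combinations that correspond to distinct patterns in dogs.

```python
from itertools import product

# Placeholder variant labels invented for this illustration; they are not the
# designations used in the paper.
ventral_variants = ["normal_activity", "increased_activity"]
hair_cycle_variants = ["hc_variant_1", "hc_variant_2", "hc_variant_3"]

# Two ventral variants x three hair cycle-specific variants gives six possible
# pairings; the study reports five combinations that produce distinct coat
# colour patterns in dogs.
for ventral, hair_cycle in product(ventral_variants, hair_cycle_variants):
    print(f"ventral promoter: {ventral:18} + hair-cycle promoter: {hair_cycle}")
```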

Source (University of Bern. “Genetic enigma solved: Inheritance of coat color patterns in dogs.” ScienceDaily. ScienceDaily, 12 August 2021.)

Original paper: Bannasch, D.L., Kaelin, C.B., Letko, A., Loechel, R., Hug, P., Jagannathan, V., Henkel, J., Roosje, P., Hytönen, M.K., Lohi, H. and Arumilli, M., 2021. Dog colour patterns explained by modular promoters of ancient canid origin. Nature Ecology & Evolution, pp.1-9.

Superstrata bike

Send Superstrata your dimensions, riding style and preferences, and they’ll 3D print you a carbon fibre bike frame made to fit. Prefer a stiffer ride? A bike for commuting, or for touring? Superstrata claim to have over 500,000 possible combinations. There are two versions available: the traditional Terra bike and the Ion e-bike. The Ion has a sleek in-tube battery (no bulky black boxes in sight), takes two hours to charge and lasts for up to 88 kilometres.

Credit: Superstrata

Source (ScienceFocus, “80 cool gadgets: Our pick of the best new tech for 2021”, 22.06.2021)


Toward next-generation brain-computer interface systems

A new kind of neural interface system that coordinates the activity of hundreds of tiny brain sensors could one day deepen understanding of the brain and lead to new medical therapies.

Credit: Brown University

Most current brain-computer interface (BCI) systems use one or two sensors to sample up to a few hundred neurons, but neuroscientists are interested in systems that are able to gather data from much larger groups of brain cells. Now, a team of researchers has taken a key step toward a new concept for a future BCI system — one that employs a coordinated network of independent, wireless microscale neural sensors, each about the size of a grain of salt, to record and stimulate brain activity. The sensors, dubbed “neurograins,” independently record the electrical pulses made by firing neurons and send the signals wirelessly to a central hub, which coordinates and processes the signals.
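
One way to picture this data flow is a small, purely illustrative simulation in which each sensor thresholds its local voltage trace, tags detected events with its own address and hands the resulting packets to a hub that merges them into one timeline. All class names, field names and numbers below are assumptions made for the sketch, not details of the actual neurograin hardware or firmware.

```python
import random
from dataclasses import dataclass


@dataclass
class SpikePacket:
    """Illustrative packet: one spike event reported by one sensor."""
    sensor_address: int
    timestamp_ms: float


@dataclass
class Neurograin:
    """Toy stand-in for a microscale sensor that detects threshold crossings."""
    address: int
    threshold_uv: float = 50.0

    def record(self, voltage_trace_uv, sample_interval_ms=1.0):
        # Report a packet for every sample that crosses the detection threshold.
        return [
            SpikePacket(self.address, i * sample_interval_ms)
            for i, v in enumerate(voltage_trace_uv)
            if v > self.threshold_uv
        ]


# A few sensors observe noisy traces; the hub merges their packets by time.
random.seed(0)
grains = [Neurograin(address=a) for a in range(3)]
hub_buffer = []
for grain in grains:
    trace = [random.gauss(0, 30) for _ in range(100)]  # fake voltage samples (microvolts)
    hub_buffer.extend(grain.record(trace))

hub_buffer.sort(key=lambda p: p.timestamp_ms)  # hub orders events on one timeline
print(f"hub received {len(hub_buffer)} spike packets from {len(grains)} sensors")
```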

The team first designed and simulated the electronics on a computer and went through several fabrication iterations to develop operational chips. A second challenge was developing the body-external communications hub that receives signals from those tiny chips. The device is a thin patch, about the size of a thumbprint, that attaches to the scalp outside the skull. It works like a miniature cellular phone tower, employing a network protocol to coordinate the signals from the neurograins, each of which has its own network address. The patch also supplies power wirelessly to the neurograins, which are designed to operate using a minimal amount of electricity.
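
The "miniature cellular phone tower" picture suggests some form of scheduled, address-based coordination. The sketch below shows a generic time-division polling scheme as one plausible illustration of that idea; it is not the protocol described in the paper, and the slot length and helper function are invented for the example.

```python
from typing import Dict, List


def build_polling_schedule(addresses: List[int], slot_ms: float) -> Dict[int, float]:
    """Assign each sensor address its own transmit slot within a repeating frame.

    A generic time-division scheme, used here only to illustrate how a hub
    could coordinate many individually addressed transmitters.
    """
    return {addr: i * slot_ms for i, addr in enumerate(addresses)}


# Example: 48 sensors (the number used in the rodent experiment described
# below), each given a hypothetical 1 ms slot in which to transmit to the hub.
addresses = list(range(48))
schedule = build_polling_schedule(addresses, slot_ms=1.0)
print(f"sensor 7 transmits at t = {schedule[7]:.1f} ms within each frame")
print(f"one full frame lasts {len(schedule) * 1.0:.0f} ms")
```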

The goal of this new study was to demonstrate that the system could record neural signals from a living brain — in this case, the brain of a rodent. The team placed 48 neurograins on the animal’s cerebral cortex, the outer layer of the brain, and successfully recorded characteristic neural signals associated with spontaneous brain activity.

Source (Brown University. “Toward next-generation brain-computer interface systems.” ScienceDaily. ScienceDaily, 12 August 2021.)

Original paper: Lee, J., Leung, V., Lee, A.H., Huang, J., Asbeck, P., Mercier, P.P., Shellhammer, S., Larson, L., Laiwalla, F. and Nurmikko, A., 2021. Neural recording and stimulation using wireless networks of microimplants. Nature Electronics, pp.1-11.

Artificial Intelligence learns better when distracted

Computer scientists from the Netherlands and Spain have determined how a deep learning system well suited to image recognition learns to recognize its surroundings. They were able to simplify the learning process by forcing the system to focus on secondary characteristics.

Credit: University of Groningen

Estefania Talavera Martinez, lecturer and researcher at the Bernoulli Institute for Mathematics, Computer Science and Artificial Intelligence of the University of Groningen in the Netherlands, wanted to understand why errors in AI image classification arise. She studied the use case of recognizing food encounters and soon found out that the images were not scanned thoroughly for clues. Her team therefore came up with a solution that distracts the CNN (convolutional neural network) from its primary targets: after the first successful detection, the target part of the image is blurred and the system is retrained. The methodology is less time consuming and gives better classification results.
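
The augmentation step, blurring the region the network already relies on before retraining, can be sketched in a few lines of NumPy. In the paper the focused region comes from a visual explanation technique; in this illustrative sketch it is simply passed in, and the box blur and function names are assumptions made for the example.

```python
import numpy as np


def box_blur(image: np.ndarray, kernel: int = 7) -> np.ndarray:
    """Simple box blur (uniform average), applied per channel."""
    pad = kernel // 2
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for dy in range(kernel):
        for dx in range(kernel):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1], :]
    return out / (kernel * kernel)


def distract(image: np.ndarray, focus_box: tuple) -> np.ndarray:
    """Blur the region the classifier currently relies on.

    focus_box = (top, left, height, width). In the paper the focused region
    comes from a visual explanation technique; here it is simply given.
    """
    top, left, h, w = focus_box
    blurred = box_blur(image)
    out = image.astype(float).copy()
    out[top:top + h, left:left + w, :] = blurred[top:top + h, left:left + w, :]
    return out


# Example: blur a 32x32 patch of a random "image"; the augmented image would
# then be used to retrain the classifier so it learns to use other cues.
rng = np.random.default_rng(0)
img = rng.random((128, 128, 3))
augmented = distract(img, focus_box=(48, 48, 32, 32))
print("patch std before/after blurring:",
      img[48:80, 48:80].std().round(3),
      augmented[48:80, 48:80].std().round(3))
```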

Source (University of Groningen. “Artificial Intelligence learns better when distracted.” ScienceDaily. ScienceDaily, 29 July 2021.)

Original paper: Morales, D., Talavera, E. and Remeseiro, B., 2021. Playing to distraction: towards a robust training of CNN classifiers through visual explanation techniques. Neural Computing and Applications, pp.1-13.

Cyber Kicks

Video description: Created in two months by Kris Theorin using motion capture and other techno wizardry, this personal project became a pulse-pounding foot chase through a neon-drenched cyberpunk city.

Credits: Directed, Animated, Edited, Sound Designed by Kris Theorin. “Up All Night” by Midnight Riot provided by Musicbed. Character Design and Modeling by Kris Theorin. Additional Models from KitBash3D, CGTrader, and TurboSquid.

Something’s Awry Productions