According to the study, the number of species in breeding birds (here: a blue tit) increased in the observation data, but this could only be a temporary trend. Photo: Pexels/Sony Dude

A recent study conducted by German ecologists reveals that the decline of local species diversity may frequently be underestimated.

Species richness is not a reliable metric for monitoring ecosystems. A new study by Lucie Kuczynski and Helmut Hillebrand shows that systematic biases can mask an imminent decline in biodiversity.

Seemingly healthy ecosystems with a constant or even increasing number of species may already be on the path to the decline and loss of species. Even in long-term datasets, such negative trends may only become apparent with a delay. This is due to systematic distortions in temporal trends for species numbers. 

"Our results are important in order to understand that the species number alone is not a reliable measure of how stable the biological balance in a given ecosystem is at the local level," explains Dr. Lucie Kuczynski, an ecologist at the University of Oldenburg's Institute for Chemistry and Biology of the Marine Environment (ICBM) in Oldenburg, Germany and the lead author of the study, in which she and her colleagues combined observational data for freshwater fish and birds with calculations based on simulations.

The research team, the other members of which were Professor Dr. Helmut Hillebrand from the ICBM and Dr. Vicente Ontiveros from the University of Girona in Spain, was surprised by the results: "We find it very worrying that a constant or even increasing species number does not necessarily mean that all is well in an ecosystem and that the number of species will remain constant in the long term," Hillebrand explains. "Apparently, we have so far underestimated the negative trends for freshwater fish, for example. Species are disappearing faster than expected at the local level," adds Kuczynski.

A dynamic equilibrium

Up to now, biodiversity research had worked on the assumption that the number of species in an ecosystem will remain constant in the long term if the environmental conditions neither deteriorate nor improve. "The hypothesis is that there is a dynamic equilibrium between colonisations and local extinctions," lead author Kuczynski explains. Increasing or decreasing species numbers are interpreted as a response to improving or deteriorating environmental conditions.

To find out whether a constant species richness is a reliable indicator of a stable biological balance, Kuczynski and her colleagues first analyzed several thousand datasets documenting the number of species of freshwater fish and breeding birds in different regions of Europe and North America over many years – 24 years on average for the fish and 37 for the birds – with the aim of identifying trends in individual communities. They then compared the empirical data with various simulation models based on different expectations regarding immigration and extinctions of species.

The team initially observed a general increase in the number of species in both fish and bird populations during the observation periods. However, a comparison with the simulations showed that this increase was smaller than would have been expected. The researchers attributed this discrepancy to an imbalance between colonisations and local extinctions: "According to our simulations, organisms such as freshwater fish, which have limited dispersal potential, colonise an ecosystem faster than in neutral models, while their extinction occurs later than expected," says Kuczynski.
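The neutral expectation the team compared against can be illustrated with a toy colonisation–extinction model, in which each species in a regional pool colonises or goes locally extinct at a fixed per-year rate. All numbers here (pool size, rates, seed) are illustrative assumptions, not values from the study:

```python
import random

def simulate_richness(pool_size=100, initial_richness=30,
                      colonisation_rate=0.05, extinction_rate=0.05,
                      years=30, seed=42):
    """Toy neutral model: each year, absent species may colonise and
    present species may go locally extinct, at fixed per-species rates.
    Returns the yearly species-richness trajectory."""
    rng = random.Random(seed)
    present = set(range(initial_richness))
    richness = [len(present)]
    for _ in range(years):
        for sp in range(pool_size):
            if sp in present:
                if rng.random() < extinction_rate:
                    present.discard(sp)
            elif rng.random() < colonisation_rate:
                present.add(sp)
        richness.append(len(present))
    return richness

trend = simulate_richness()
print(trend[0], trend[-1])  # richness at the start and end of the run
```

With equal rates, richness drifts toward the equilibrium pool_size × c / (c + e); the study's point is that empirical trajectories sit systematically below such neutral expectations.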

Doomed to extinction

This means that after an environmental change, species that are in fact doomed to extinction may remain present in an ecosystem for some time, while at the same time new species also move in. This effect disguises the impending loss of biodiversity, she explains. "There are transitional phases in ecosystems in which the number of species is higher than expected. Species extinction occurs only after these transition phases – and then usually faster than expected."

The team anticipates that a reassessment of which methods are best suited for monitoring the state of ecosystems will now be necessary, and that nature conservation targets – which in most cases aim to preserve existing species diversity – may also need to be redefined. The model developed by Kuczynski and her colleagues could serve as a tool to distinguish between the different mechanisms that influence species richness, and also provides information on the extent to which the observational data deviates from expected changes.

Brazil identifies flood-prone areas of cities

The study combined models that predict urban expansion and land-use changes with hydrodynamic models, and the results were validated using actual data for São Caetano do Sul, a city in metropolitan São Paulo.

Scientists affiliated with the National Space Research Institute (INPE) in Brazil have combined models that predict urban expansion and land-use changes with hydrodynamic models to create a methodology capable of supplying geographical information that identifies flood-prone areas of cities, especially those vulnerable to the impact of extremely heavy rainfall.

The groundbreaking study was based on data for São Caetano do Sul, a city in metropolitan São Paulo, but the methodology can be used by other cities to devise public policies and make decisions in addressing the impacts of these phenomena to avoid deaths of residents and destruction of buildings and infrastructure.

FAPESP funded the study via two projects (20/09215-3 and 21/11435-4). Preliminary results are reported in an article published in the journal Water. They were part of the Ph.D. research of Elton Vicente Escobar Silva, the first author of the article and a researcher at INPE.

In partnership with the Federal University of Paraíba (UFPB) and the Federal University of Rio Grande do Sul (UFRGS), and with local bodies, the researchers “tested” the modeling methodology using civil defense data for the city relating to a flood that occurred on March 10, 2019, when three people drowned and the floodwaters reached a depth of almost 2 meters in several streets. 

“I’ve worked with modeling for years, focusing on changes in land use and cover in urban areas. I wanted to combine this with flood simulation. The opportunity arose in connection with Elton’s project,” Cláudia Maria de Almeida, joint first author of the article and Silva’s thesis advisor, told Agência FAPESP. She is also a researcher at INPE, and she heads the institute’s urban remote sensing unit (CITIES Laboratory).

“The study innovated by combining hydrodynamic modeling for urban areas with the complexity of the underground runoff drainage network, and by using real data to calibrate and validate the model. We combined very high-resolution spatial imaging and deep learning. All this is linked to big data and smart cities,” she said.

Discussion of smart cities began in 2010, initially involving technological issues such as integrated traffic light control systems and bus stops with Wi-Fi. Sustainability and quality of life for residents have been included more recently.

According to the United Nations, the world population reached 8 billion in 2022, with 56% living in urban areas. The population is expected to rise to 9.7 billion by 2050, with 6.6 billion (68%) living in cities.

Cities are currently expanding at twice the rate of population growth. In the next three decades, urban areas worldwide are set to total more than 3 million square kilometers, equivalent to the territory of India.

City planning is not advancing at the same pace. For example, rampant urbanization incurs changes in land use and cover, expands impermeable surfaces, and alters hydrology. In conjunction with the higher frequency of extreme weather events due to climate change, this exposes cities to flooding and landslides in the rainy season.

Cross-tabulation

For hydrodynamic modeling, the researchers used a software package called HEC-RAS (Hydrologic Engineering Center’s River Analysis System), which simulates water flow and surface elevation, as well as sediment transport.

To identify flood-prone areas, they used two digital terrain models (DTMs) with different spatial resolutions of 0.5 m and 5 m. A DTM is a mathematical representation of the topography of the Earth’s surface, excluding all vertical objects. The model can be manipulated by computer programs and is typically visualized as a grid in which an elevation value is assigned to each pixel. Vegetation, buildings, and other characteristics are digitally removed. In this study, the researchers used four computation intervals (1, 15, 30, and 60 seconds) in their analysis of the simulations.
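The DTM idea can be made concrete with a minimal sketch: a grid of elevations checked against a simulated water-surface level. The grid values and water level below are invented for illustration; the study itself runs full HEC-RAS hydrodynamic simulations rather than this simple threshold check.

```python
# Toy digital terrain model (DTM): a grid of elevation values in metres,
# one per pixel, with vegetation and buildings already removed.
dtm = [
    [12.0, 11.5, 11.0, 11.2],
    [11.8, 10.9, 10.4, 10.8],
    [11.5, 10.6,  9.9, 10.5],
    [11.9, 10.8, 10.2, 11.0],
]

water_surface = 10.7  # hypothetical simulated water-surface elevation (m)

# Flag cells whose terrain lies below the water surface and record depths.
flood_prone = [[z < water_surface for z in row] for row in dtm]
depths = [water_surface - z for row in dtm for z in row if z < water_surface]

n_flooded = sum(cell for row in flood_prone for cell in row)
n_cells = sum(len(row) for row in dtm)
print(n_flooded, "of", n_cells, "cells flood-prone")  # 5 of 16
print("max depth:", round(max(depths), 2), "m")       # 0.8 m
```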

The best results were obtained from the simulations with a spatial resolution of 5m, which displayed maps with the highest coverage of flood-prone areas (278 out of 286 points, or 97.2%) in the shortest computation time. They identified the potential for flooding in areas not detected by civil defense authorities or citizens of São Caetano do Sul during actual flood events.

“We set out to create a methodology to support decision-makers. We simulated projected land-use changes several years ahead and their impact on the network of watercourses. On this basis, it’s possible to run simulations with scenarios. An example would be specifying millimeters of rain in a given timeframe to predict the impact on an area of a city in terms of flooding. Public administrators can use this capability to make decisions, avoiding economic damage as well as loss of life,” Silva said.

The researchers stressed the need for cities to update their databases for this type of analysis, as did São Caetano do Sul. “The model works with and is fed by data. It’s important for cities to have up-to-date information, including records relating to extreme cases, such as major floods and inundations,” Almeida said.

São Caetano do Sul is part of a dense conurbation that encompasses São Paulo city as well as the neighboring cities of Santo André and São Bernardo do Campo. It has had many floods and inundations – 29 between 2000 and 2022 alone, according to the researchers.

On the other hand, it ranks first among all 5,570 municipalities in Brazil for sustainability based on the Sustainable Development of Cities Index – Brazil (IDSC-BR), part of a series of reports produced by the United Nations Sustainable Development Solutions Network (SDSN) to monitor implementation of the Sustainable Development Goals (SDGs) in member countries.

With some 162,000 inhabitants, it has a comprehensive wastewater treatment system connected to 100% of homes. Almost all urban dwellings (95.4%) are located on public streets with trees, and a reasonably large proportion (37%) are on adequately urbanized streets (paved and with sidewalks, curbs, and drains), according to IBGE, Brazil’s census and statistics bureau.

Should robots be given a human conscience?

Humans have curated the best of human intelligence to inform AI, with the hopes of creating flawless machines – but could the flaws we left out be the missing pieces needed to ensure robots do not go rogue?

Modern-day society relies intrinsically on automated systems and artificial intelligence. AI is embedded in our daily routines and shows no signs of slowing; in fact, the use of robotic and automated assistance is ever-increasing.

Such pervasive use of AI presents technologists and developers with two ethical dilemmas – how do we build robots that behave in line with our values and how do we stop them from going rogue?

One author suggests that one option which is not explored enough is to code more humanity into robots, gifting robots with traits such as empathy and compassion.

Is humanity the answer?

In a new book called Robot Souls, to be published in August, academic Dr. Eve Poole OBE explores the idea that the solution to society’s conundrum about how to make sure AI is ethical lies in human nature.

She argues that in the bid to build perfect machines, humans stripped out the ‘junk code’, including emotions, free will, and a sense of purpose.

She said: “It is this ‘junk’ which is at the heart of humanity. Our junk code consists of human emotions, our propensity for mistakes, our inclination to tell stories, our uncanny sixth sense, our capacity to cope with uncertainty, an unshakeable sense of our own free will, and our ability to see meaning in the world around us.

“This junk code is in fact vital to human flourishing, because behind all of these flaky and whimsical properties lies a coordinated attempt to keep our species safe. Together they act as a range of ameliorators with a common theme: they keep us in the community so that there is safety in numbers.”

Robot souls

With AI increasingly taking up more decision-making roles in our daily lives, along with rising concerns about bias and discrimination in AI, Dr. Poole argues the answer might be in the stuff we tried to strip out of autonomous machines in the first place.

She said: “If we can decipher that code, the part that makes us all want to survive and thrive together as a species, we can share it with the machines. Giving them to all intents and purposes a ‘soul’.”

In the new book, Poole suggests a series of next steps to make this a reality, including agreeing on a rigorous regulation process, an immediate ban on autonomous weapons, and a licensing regime with rules that reserve any final decision over the life or death of a human to a fellow human.

She argues we should also agree on the criteria for legal personhood and a road map for AI toward it.

The human blueprint

“Because humans are flawed we disregarded a lot of characteristics when we built AI,” Poole explains. “It was assumed that robots with features like emotions and intuition, that made mistakes and looked for meaning and purpose, would not work as well.

“But on considering why all these irrational properties are there, it seems that they emerge from the source code of the soul. Because it is actually this ‘junk’ code that makes us human and promotes the kind of reciprocal altruism that keeps humanity alive and thriving.”

Robot Souls looks at developments in AI and reviews the emergence of ideas of consciousness and the soul.

It places our ‘junk code’ in this context and argues that it is time to foreground that code and use it to look again at how we are programming AI.

New research in structured light means researchers can exploit the many patterns of light as an encoding alphabet without worrying about how noisy the channel is.

South African researchers demo noise-free communication with structured light

A new approach to optical communication that can be deployed with conventional technology.

The patterns of light hold tremendous promise as a large encoding alphabet for optical communications, but progress has been hindered by their susceptibility to distortion in noisy channels such as atmospheric turbulence or bent optical fiber. Now researchers at the University of the Witwatersrand (Wits) have outlined a new optical communication protocol that exploits spatial patterns of light for multi-dimensional encoding in a manner that does not require the patterns to be recognized, thus overcoming the prior limitation of modal distortion in noisy channels. The result is a new state of the art in encoding: more than 50 vectorial patterns of light sent virtually noise-free across a turbulent atmosphere, opening a new approach to high-bit-rate optical communication.

Published this week in Laser & Photonics Reviews, the Wits team from the Structured Light Laboratory in the Wits School of Physics used a new invariant property of vectorial light to encode information. This quantity, which the team calls “vectorness”, scales from 0 to 1 and remains unchanged when passing through a noisy channel. Unlike traditional amplitude modulation, which offers only a two-letter alphabet (0 or 1), the team used this invariance to partition the 0-to-1 vectorness range into more than 50 levels (0, 0.02, 0.04, and so on up to 1), yielding an alphabet of over 50 letters. Because the channel over which the information is sent does not distort the vectorness, sender and receiver will always agree on the value, hence noise-free information transfer.
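The alphabet construction described above can be sketched in a few lines. The 0.02 step (51 levels from 0 to 1) follows the article; the encode/decode helpers are illustrative, not the Wits implementation:

```python
# Partition the invariant "vectorness" range [0, 1] into equal steps,
# one per symbol. Because the channel does not alter vectorness, the
# receiver recovers the symbol by simple rounding.

STEP = 0.02
LEVELS = int(round(1 / STEP)) + 1  # 51 distinguishable symbols

def encode(symbol_index: int) -> float:
    """Map a symbol index (0..LEVELS-1) to the vectorness value sent."""
    assert 0 <= symbol_index < LEVELS
    return symbol_index * STEP

def decode(measured_vectorness: float) -> int:
    """Recover the symbol index from the measured vectorness."""
    return int(round(measured_vectorness / STEP))

sent = encode(37)        # vectorness value transmitted
received = decode(sent)  # → 37, even after a distorting channel
print(sent, received)
```

The practical limit on alphabet size is then detector precision (how finely vectorness can be measured), not channel noise, which matches the quote from Forbes below.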

The critical hurdle that the team overcame is to use patterns of light in a manner that does not require them to be “recognized” so that the natural distortion of noisy channels can be ignored.  Instead, the invariant quantity just “adds up” light in specialized measurements, revealing a quantity that doesn’t see distortion at all.

“This is a very exciting advance because we can finally exploit the many patterns of light as an encoding alphabet without worrying about how noisy the channel is,” says Professor Andrew Forbes, from the Wits School of Physics. “In fact, the only limit to how big the alphabet can be is how good the detectors are and not at all influenced by the noise of the channel.”

Lead author and Ph.D. candidate Keshaan Singh added: “To create and detect the vectorness modulation requires nothing more than conventional communications technology, allowing our modal (pattern) based protocol to be deployed immediately in real-world settings.”

The team has already started demonstrations in optical fiber and in fast links across free space and believes that the approach can work in other noisy channels, including underwater.

Dutch scientists develop artificial molecules that behave like real ones

Scientists from Radboud University in Nijmegen, the Netherlands, have developed synthetic molecules that resemble real organic molecules. A collaboration of researchers, led by Alex Khajetoorians and Daniel Wegner, can now simulate the behavior of real molecules by using artificial molecules. In this way, they can tweak the properties of molecules in ways that are normally difficult or unrealistic, and they can understand much better how molecules change.

Emil Sierda, who was in charge of conducting the experiments at Radboud University: "A few years ago we had this crazy idea to build a quantum simulator. We wanted to create artificial molecules that resembled real molecules. So we developed a system in which we trapped electrons. Electrons surround a molecule like a cloud, and we used those trapped electrons to build an artificial molecule." The results the team found were astonishing. Sierda: "The resemblance between what we built and real molecules was uncanny."

Changing molecules

Alex Khajetoorians, head of the Scanning Probe Microscopy (SPM) department at Radboud University: "Making molecules is difficult enough. What is often harder, is to understand how certain molecules react, for example how they change when they are twisted or altered." How molecules change and react is the basis of chemistry, and leads to chemical reactions, like the formation of water from hydrogen and oxygen.

"We wanted to simulate molecules, so we could have the ultimate toolkit to bend them and tune them in ways that are nearly impossible with real molecules. In that way, we can say something about real molecules, without making them, or without having to deal with the challenges they present, like their constantly changing shape."

Benzene

Using this simulator, the researchers created an artificial version of one of the basic organic molecules in chemistry: benzene. Benzene is the starting component for a vast number of chemicals, such as styrene, which is used to make polystyrene. Khajetoorians: "By making benzene, we simulated a textbook organic molecule, and built a molecule that is made up of elements that are not organic." What is more, the artificial molecules are ten times bigger than their real counterparts, which makes them easier to work with.

Practical uses

The uses of this new technique are endless. Daniel Wegner, assistant professor within the SPM department: "We have only begun to imagine what we can use this for. We have so many ideas that it is hard to decide where to start."

By using the simulator, scientists can understand molecules and their reactions much better, which will help in every scientific field imaginable. Wegner: "New materials for future computer hardware are really hard to make, for instance. By making a simulated version, we can look for the novel properties and functionalities of certain molecules and evaluate whether it will be worth making the real material."

In the far future, all kinds of things may be possible: understanding chemical reactions step by step like in a slow-motion video, or making artificial single-molecule electronic devices, like shrinking the size of a transistor on a computer chip. Quantum simulators are even suggested to perform as quantum supercomputers. Sierda: "But that’s a long way to go, for now, we can start by beginning to understand molecules in a way we never understood before."

The research was conducted by a Radboud University collaboration between the groups of Malte Rösner (Theory of Condensed Matter), Mikhail Katsnelson (Theory of Condensed Matter), Gerrit Groenenboom (Theoretical Chemistry), Daniel Wegner (SPM), and Alex Khajetoorians (SPM).