
Olive Trees Were First Domesticated 7,000 Years Ago

Earliest evidence for cultivation of a fruit tree, according to researchers.

A joint study by researchers from Tel Aviv University and the Hebrew University has uncovered the earliest evidence for the domestication of a fruit tree. The researchers analyzed remnants of charcoal from the Chalcolithic site of Tel Zaf in the Jordan Valley and determined that they came from olive trees. Since the olive did not grow naturally in the Jordan Valley, this means that the inhabitants planted the trees intentionally about 7,000 years ago. Some of the earliest stamps were also found at the site, and as a whole, the researchers say, the findings indicate wealth and early steps toward the formation of a complex multilevel society.

The groundbreaking study was led by Dr. Dafna Langgut of The Jacob M. Alkow Department of Archaeology & Ancient Near Eastern Cultures, The Sonia & Marco Nadler Institute of Archaeology and the Steinhardt Museum of Natural History at Tel Aviv University. The charcoal remnants were found in the archaeological excavation directed by Prof. Yosef Garfinkel of the Institute of Archaeology at the Hebrew University. The findings were published in the journal Scientific Reports, published by Nature.

‘Indisputable Proof of Domestication’

According to Dr. Langgut, Head of the Laboratory of Archaeobotany & Ancient Environments, which specializes in the microscopic identification of plant remains, “trees, even when burned down to charcoal, can be identified by their anatomic structure. Wood was the ‘plastic’ of the ancient world. It was used for construction, for making tools and furniture, and as a source of energy. That’s why identifying tree remnants found at archaeological sites, such as charcoal from hearths, is a key to understanding what kinds of trees grew in the natural environment at the time, and when humans began to cultivate fruit trees.”

In her lab, Dr. Langgut identified the charcoal from Tel Zaf as belonging to olive and fig trees. “Olive trees grow in the wild in the land of Israel, but they do not grow in the Jordan Valley,” she says. “This means that someone brought them there intentionally – took the knowledge and the plant itself to a place outside its natural habitat. In archaeobotany, this is considered indisputable proof of domestication, which means that we have here the earliest evidence of the olive’s domestication anywhere in the world.”

 

7,000-year-old microscopic remains of charred olive wood (Olea) recovered from Tel Zaf (Photo: Dr. Dafna Langgut)

“I also identified many remnants of young fig branches. The fig tree did grow naturally in the Jordan Valley, but its branches had little value as either firewood or raw material for tools or furniture, so people had no reason to gather large quantities and bring them to the village. Apparently, these fig branches resulted from pruning, a method still used today to increase the yield of fruit trees.”

Evidence of Luxury

The tree remnants examined by Dr. Langgut were collected by Prof. Yosef Garfinkel of the Hebrew University, who headed the dig at Tel Zaf. Prof. Garfinkel: “Tel Zaf was a large prehistoric village in the middle Jordan Valley south of Beit She’an, inhabited between 7,200 and 6,700 years ago. Large houses with courtyards were discovered at the site, each with several granaries for storing crops. Storage capacities were up to 20 times greater than any single family’s calorie consumption, so clearly these were caches for storing great wealth. The wealth of the village was manifested in the production of elaborate pottery, painted with remarkable skill. In addition, we found articles brought from afar: pottery of the Ubaid culture from Mesopotamia, obsidian from Anatolia, a copper awl from the Caucasus, and more.”

Dr. Langgut and Prof. Garfinkel were not surprised to discover that the inhabitants of Tel Zaf were the first in the world to intentionally grow olive and fig groves, since growing fruit trees is evidence of luxury, and this site is known to have been exceptionally wealthy.

Dr. Langgut: “The domestication of fruit trees is a process that takes many years, and therefore befits a society of plenty, rather than one that struggles to survive. Trees give fruit only 3–4 years after being planted. Since groves of fruit trees require a substantial initial investment, and then live on for a long time, they have great economic and social significance in terms of owning land and bequeathing it to future generations – procedures suggesting the beginnings of a complex society. Moreover, it’s quite possible that the residents of Tel Zaf traded in products derived from the fruit trees, such as olives, olive oil, and dried figs, which have a long shelf life. Such products may have enabled long-distance trade that led to the accumulation of material wealth, and possibly even taxation – initial steps in turning the locals into a society with a socio-economic hierarchy supported by an administrative system.”

Dr. Langgut concludes: “At the Tel Zaf archaeological site we found the first evidence in the world for the domestication of fruit trees, alongside some of the earliest stamps – suggesting the beginnings of administrative procedures. As a whole, the findings indicate wealth, and early steps toward the formation of a complex multilevel society, with the class of farmers supplemented by classes of clerks and merchants.”

Big Brains Helped Large Animals Survive Extinction

TAU researchers: more brain power helped animals adapt to changing conditions and increased chances of survival.

What do an elephant, a rhino and a hippopotamus all have in common? All three, along with other large animals, survived the mass extinction that unfolded over a period of about 120,000 years, beginning when the last Ice Age set in. In contrast, other huge animals, such as giant armadillos (weighing a ton), giant kangaroos and mammoths, went extinct.

Researchers at Tel Aviv University and the University of Naples have examined the mass extinction of large animals over the past tens of thousands of years, and found that the species that survived had, on average, much larger brains than those that did not. The researchers conclude that a large brain (relative to body size) indicates relatively high intelligence and helped the surviving species adapt to changing conditions and cope with potential causes of extinction, such as human hunting.

The study was led by doctoral student Jacob Dembitzer of the University of Naples in Italy, Prof. Shai Meiri of Tel Aviv University’s School of Zoology and The Steinhardt Museum of Natural History, and Prof. Pasquale Raia and doctoral student Silvia Castiglione of the University of Naples. The study was published in the journal Scientific Reports.

Heavy Weight – No Guarantee

The researchers explain that the last Ice Age was characterized by the widespread extinction of large and giant animals on all continents on earth (except Antarctica). Among these:

  • America: Giant ground sloths weighing 4 tons, a giant armadillo weighing a ton, and mastodons
  • Australia: The marsupial Diprotodon weighing a ton, giant kangaroos, and a marsupial ‘lion’
  • Eurasia: Giant deer, woolly rhinoceros, mammoth, and giant elephants weighing up to 11 tons

Other large animals, however, such as elephants, rhinos, and hippos, survived this extinction event and exist to this day.

The researchers also note that in some places, the extinction was particularly widespread:

  • Australia: The red and grey kangaroos are today the largest native animals
  • South America: The largest survivors are the guanaco and vicuña (similar to the llama, which is a domesticated animal) and the tapir, while many of the species weighing half a ton or more have become extinct

Brains over Body

Jacob Dembitzer: “We know that most of the extinctions were of large animals, and yet it is not clear what distinguishes the large extant species from those that went extinct. We hypothesized that behavioral flexibility, made possible by a large brain in relation to body size, gave the surviving species an evolutionary advantage – it allowed them to adapt to the changes that have taken place over the last tens of thousands of years, including climate change and the appearance of humans. Previous studies have shown that many species, especially large ones, went extinct due to over-hunting by humans who entered their habitats. In this study, we tested our hypothesis for mammals over a period of about 120,000 years – from the onset of the last Ice Age, when modern humans began to spread all over the world with lethal weapons, until 500 years before our time. This hypothesis even helps us explain the large number of extinctions in South America and Australia, since the large mammals living on these continents had relatively small brains.”

The researchers collected data from the paleontological literature on 50 extinct species of mammal from all continents, weighing from 11 kg (an extinct giant echidna) up to 11 tons (the straight-tusked elephant, which was also found in the Land of Israel), and compared the size of their cranial cavity to that of 291 evolutionarily close mammal species that survived and exist today, weighing from 1.4 kg (the platypus) up to 4 tons (the African elephant). They fed the data into statistical models that weighted body size and the phylogenetic relatedness of the species.

Prof. Meiri: “We found that the surviving animals had brains 53% larger, on average, than evolutionarily closely related, extinct species of a similar body size. We hypothesize that mammals with larger brains were able to adapt their behavior and cope better with the changing conditions – mainly human hunting and possibly the climate changes that occurred during that period – compared to mammals with relatively small brains.”
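The comparison behind a figure like that can be sketched in miniature. The snippet below uses invented placeholder numbers – not the study’s data, and without the body-size and phylogenetic weighting its models applied – simply to show how a “percent larger on average” value falls out of matched survivor/extinct pairs:

```python
import math

# Hypothetical endocranial volumes (cm^3) for matched pairs of similar-sized
# relatives: (surviving species' brain, extinct species' brain). All numbers
# are invented placeholders for illustration, not the study's data.
matched_pairs = [
    (4700, 3100),
    (600, 380),
    (750, 500),
    (510, 330),
]

# Average the survivor/extinct ratio on a log scale (geometric mean), so that
# a doubling and a halving cancel symmetrically instead of biasing upward.
log_ratios = [math.log(surv / ext) for surv, ext in matched_pairs]
mean_ratio = math.exp(sum(log_ratios) / len(log_ratios))

print(f"survivors' brains are ~{100 * (mean_ratio - 1):.0f}% larger within pairs")
```

The real analysis additionally controls for how closely related the compared species are, since near relatives are not statistically independent data points.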

Research Based on a Comprehensive Study of 8,000 Birds in Israel

Tel Aviv University (TAU) researchers say that climate change may be responsible for changes in the morphology of many birds in Israel over the past 70 years. The body mass of some species decreased while in others body length increased, in both cases increasing the ratio between surface area and volume. The researchers contend that these are strategies to facilitate heat loss to the environment.

“The birds evidently changed in response to the changing climate,” the researchers concluded. “However, this solution may not be fully adequate, especially as temperatures continue to rise.”

The study was led by Professor Shai Meiri and PhD student Shahar Dubiner of the School of Zoology, Wise Faculty of Life Sciences, and the Steinhardt Museum of Natural History at TAU. The paper was published in the scientific journal Global Ecology and Biogeography.

Professor Meiri explains that according to “Bergmann’s rule,” an ecogeographical rule formulated in the 19th century, members of bird and mammal species living in a cold climate tend to be larger than members of the same species living in a warmer climate. This is because the ratio of surface area to volume is higher in smaller animals, permitting more heat loss (an advantage in warm regions), and lower in larger bodies, minimizing heat loss (a benefit in colder climates). Based on this rule, scientists have predicted that global warming will lead to a reduction in animal size, with a possible exception: birds living in the human environment (such as pigeons, house sparrows, and the hooded crow) may gain size due to increased food availability, a phenomenon already witnessed in mammals such as jackals and wolves.

Relying on the vast bird collection preserved by the Steinhardt Museum of Natural History at TAU, the researchers looked for changes in bird morphology over the past 70 years in Israel. They examined approximately 8,000 adult specimens of 106 different species, including migratory birds that annually pass through Israel such as the common chiffchaff, white stork, and black buzzard; resident wild birds like the Eurasian jay, Eurasian eagle-owl, and rock partridge; and commensal birds that live near humans. They built a complex statistical model consisting of various parameters to assess morphological changes — in the birds’ body mass, body length and wing length — during the relevant period.

“Our findings revealed a complicated picture,” Dubiner says. “We identified two different types of morphological changes: some species had become lighter – their mass had decreased while their body length remained unchanged; while others had become longer – their body length had increased, while their mass remained unchanged. These together represent more than half of the species examined, but there was practically no overlap between the two groups – almost none of the birds had become both lighter and longer.

“We think that these are two different strategies for coping with the same problem, namely the rising temperatures. In both cases, the surface area to volume ratio is increased by either increasing the numerator or reducing the denominator, which helps the body lose heat to its environment. The opposite, namely a decrease in this ratio, was not observed in any of the species.”
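The geometry behind these two strategies can be illustrated with a crude model of our own (not the researchers’ analysis): treat the body as a cylinder and check that both shedding mass at fixed length and gaining length at fixed mass raise the surface-area-to-volume ratio. All dimensions below are arbitrary illustrative values.

```python
import math

def surface_to_volume(length, radius):
    """SA/V of a cylinder: 2*pi*r*(r+L) / (pi*r^2*L) simplifies to 2/L + 2/r."""
    return 2.0 / length + 2.0 / radius

def radius_for_volume(volume, length):
    """Radius that keeps a cylinder's volume fixed for a given length."""
    return math.sqrt(volume / (math.pi * length))

# Baseline "bird": dimensions in cm, chosen arbitrarily for illustration.
L0, R0 = 20.0, 4.0
V0 = math.pi * R0**2 * L0

baseline = surface_to_volume(L0, R0)
# Strategy 1: lighter -- volume (mass) drops 20%, body length unchanged.
lighter = surface_to_volume(L0, radius_for_volume(0.8 * V0, L0))
# Strategy 2: longer -- length grows 10%, volume (mass) unchanged.
longer = surface_to_volume(1.1 * L0, radius_for_volume(V0, 1.1 * L0))

print(f"baseline SA/V: {baseline:.3f} per cm")
print(f"lighter  SA/V: {lighter:.3f} per cm")  # ratio rises -> more heat loss
print(f"longer   SA/V: {longer:.3f} per cm")   # ratio also rises
```

Either route increases the ratio relative to the baseline, which is the shared outcome the researchers describe.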

These findings were observed across the country, regardless of nutrition, and in all types of species. A difference was identified, however, between the two strategies: changes in body length tended to occur more in migrants, while changes in body mass were more typical of non-migratory birds. The very fact that such changes were found in migratory birds coming from Asia, Europe, and Africa suggests that this is a global phenomenon. The study also found that the impact of climate change over time on bird morphology is 10 times greater than the impact of similar differences in temperature between geographical areas.

“Our findings indicate that global warming causes fast and significant changes in bird morphology,” Dubiner concludes. “But what are the implications of these changes? Should we be concerned? Is this a problem, or rather an encouraging ability to adapt to a changing environment? Such morphological changes over a few decades probably do not represent an evolutionary adaptation, but rather certain phenotypic flexibility exhibited by the birds. We are concerned that over such a short period of time, there is a limit to the flexibility or evolutionary potential of these traits, and the birds might run out of effective solutions as temperatures continue to rise.”

Unravelling Recycling Practices from 500,000 Years Ago

The urge to collect in the prehistoric world: preserving memory of ancestors and connectedness with place and time.

What drove prehistoric humans to collect and recycle flint tools that had been made, used, and discarded by their predecessors? In a first-of-its-kind study at Tel Aviv University, researchers examined flint tools from one layer at the 500,000-year-old prehistoric site of Revadim in the south of Israel’s Coastal Plain, and propose a novel explanation: prehistoric humans, just like us, were collectors by nature and culture. The study suggests that they had an emotional urge to collect old human-made artefacts, mostly as a means for preserving the memory of their ancestors and maintaining their connectedness with place and time.

The study was led by PhD student Bar Efrati and Prof. Ran Barkai of the Jacob M. Alkow Department of Archaeology and Ancient Near Eastern Cultures at TAU’s The Lester and Sally Entin Faculty of Humanities, in collaboration with Dr. Flavia Venditti from the University of Tübingen in Germany and Prof. Stella Nunziante Cesaro from the Sapienza University of Rome, Italy. The paper appeared in the prestigious scientific journal Scientific Reports, published by Nature.

Prehistoric Vintage Tools

Bar Efrati explains that stone tools with two lifecycles have been found at prehistoric sites all over the world, but the phenomenon has never been thoroughly investigated. In the current study, the researchers focused on a specific layer at Revadim – a large, open-air, multi-layered site in the south of Israel’s Coastal Plain, dated to about 500,000 years ago. The rich findings at Revadim suggest that this was a popular spot in the prehistoric landscape, revisited over and over again by early humans drawn by an abundance of wildlife, including elephants. Moreover, the area is rich with good-quality flint, and most tools found at Revadim were in fact made of fresh flint. 

“The big question is: Why did they do it?” says Bar Efrati. “Why did prehistoric humans collect and recycle actual tools originally produced, used, and discarded by their predecessors, many years earlier? Scarcity of raw materials was clearly not the reason at Revadim, where good-quality flint is easy to come by. Nor was the motivation merely functional, since the recycled tools were neither unusual in form nor uniquely suitable for any specific use.”

Scars that Reveal the Past

The key to identifying the recycled tools and understanding their history is the patina – a chemical coating that forms on flint when it is exposed to the elements for a long period of time. A discarded flint tool that lay on the ground for decades or centuries thus accumulated an easily identifiable layer of patina, different in both color and texture from the fresh surfaces exposed by a second cycle of processing.

In the current study, 49 flint tools with two lifecycles were examined. Produced and used in their first lifecycle, these tools were abandoned, and years later, after accumulating a layer of patina, they were collected, reworked, and used again. The individuals who recycled each tool removed the patina, exposing fresh flint, and shaped a new active edge. Both edges, the old and the new, were examined by the researchers under two kinds of microscopes, and via various chemical analyses, in search of use-wear marks and/or organic residues. In the case of 28 tools, use-wear marks were found on the old and/or new edges, and in 13 tools, organic residues were detected, evidence of contact with animal bones or fat.

Surprisingly, the tools had been used for very different purposes in their two lifecycles – the older edges primarily for cutting, and the newer edges for scraping (processing soft materials like leather and bone). Another baffling discovery: in their second lifecycle the tools were reshaped in a very specific and minimal manner, preserving the original form of the tool, including its patina, and only slightly modifying the active edge.

Recycled Tools as Keepsakes

Prof. Ran Barkai: “Based on our findings, we propose that prehistoric humans collected and recycled old tools because they attached significance to items made by their predecessors.”

“Imagine a prehistoric human walking through the landscape 500,000 years ago, when an old stone tool catches his eye. The tool means something to him – it carries the memory of his ancestors or evokes a connection to a certain place. He picks it up and weighs it in his hands. The artifact pleases him, so he decides to take it ‘home’. Understanding that daily use can preserve and even enhance the memory, he retouches the edge for his own use, but takes care not to alter the overall shape – in honor of the first manufacturer. In a modern analogy, the prehistoric human may be likened to a young farmer still plowing his fields with his great-grandfather’s rusty old tractor, replacing parts now and then, but preserving the good old machine as is, because it symbolizes his family’s bond with the land.”

“In fact,” says Barkai, “the more we study early humans, the more we come to appreciate them, their intelligence, and their capabilities. Moreover, we discover that they were not so different from us. This study suggests that collectors and the urge to collect may be as old as humankind. Just like us, our early ancestors attached great importance to old artifacts, preserving them as significant memory objects – a bond with older worlds and important places in the landscape.”

Featured image: From Left to Right: Prof. Ran Barkai & Bar Efrati

Why do Locusts Form Destructive Swarms?

TAU researchers may have the answer.

Locust swarms that ruin all crops in their path have been a major cause of famine from Biblical times to the present. Over the last three years, large parts of Africa, India and Pakistan have been hard-hit by locust outbreaks, and climate change is expected to exacerbate the problem even further.

A new multidisciplinary study by experts in fields as varied as insect behavior and physiology, microbiology, and computational models of evolution, has led to valuable insights concerning locust swarming: “Locust swarms form when individual locusts, usually solitary and harmless, aggregate and begin to migrate. However, the causes for this behavior remain largely unknown, and an effective solution is yet to be found,” explains Prof. Amir Ayali from the School of Zoology at TAU’s George S. Wise Faculty of Life Sciences.

Following recent studies indicating that microbiomes can influence their hosts’ social behavior, the researchers hypothesized that the locusts’ microbiome may play a role in changing the behavior of its hosts to become more ‘sociable’. The study was published in Environmental Microbiology.

The Bacteria that Fly with Borrowed Wings 

To test their hypothesis, the researchers examined the gut microbiomes of locusts reared in the laboratory, and found a profound change when individuals reared in solitary conditions joined a large group of about 200 locusts.

Omer Lavy: “The most significant change was observed in bacteria called Weissella, almost completely absent from the microbiome of solitary locusts, which became dominant soon after their hosts joined the group.”

The researchers then developed a mathematical model to analyze the conditions under which inducing locust aggregation produces significant evolutionary advantages for Weissella, allowing these bacteria to spread to numerous other hosts. Based on these results, the researchers hypothesize that Weissella bacteria may play an important role in locust aggregation behavior. In other words, the bacteria may in some way encourage their hosts to change their behavior and become more ‘sociable’.
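A back-of-the-envelope sketch (our own illustration with invented parameters, not the study’s model) shows why host aggregation can pay off for a gut microbe: the expected number of onward transmissions scales with the host’s contact rate, which is orders of magnitude higher in a swarm than in solitary life.

```python
# Toy expected-transmission calculation. Both the contact rates and the
# per-contact transmission probability below are invented for illustration.
def expected_new_hosts(contacts_per_day, transmission_prob, days):
    """Expected number of new hosts a microbe colonizes via its carrier."""
    return contacts_per_day * transmission_prob * days

solitary = expected_new_hosts(contacts_per_day=0.1, transmission_prob=0.05, days=30)
swarming = expected_new_hosts(contacts_per_day=25, transmission_prob=0.05, days=30)

print(f"solitary host:  ~{solitary:.2f} new hosts colonized")
print(f"swarming host:  ~{swarming:.1f} new hosts colonized")
```

Under any parameters of this form, a bacterium that nudges its host toward aggregation multiplies its spread by the ratio of the contact rates – the kind of evolutionary advantage the researchers’ model quantifies rigorously.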

Prof. Ayali concludes: “Our study contributes to the understanding of locust swarming – a leading cause of famine from antiquity to the present. Our findings do not prove unequivocally that the Weissella bacteria are responsible for the swarming and migration of locusts. The results do, however, suggest a high probability that the bacteria play an important role in inducing this behavior – a new hypothesis never previously proposed. We hope that this new understanding will drive the development of new means for combating locust outbreaks – still a major threat to countless people, animals, and plants all over the globe.”

The project was led by Prof. Amir Ayali and PhD student Omer Lavy from the School of Zoology at TAU’s George S. Wise Faculty of Life Sciences. Participants included Prof. Lilach Hadany, Ohad Lewin-Epstein and Yonatan Bendett from the School of Plant Sciences and Food Security and Prof. Uri Gophna from The Shmunis School of Biomedicine and Cancer Research, all of the Wise Faculty. They were joined by Dr. Eran Gefen from the University of Haifa-Oranim.

Can Higher Temperatures Accelerate the Rate of Evolution?

TAU researchers use worms to demonstrate that epigenetic inheritance of sexual attractiveness can impact the evolutionary process.

Can the environment impact genetic diversity in the face of changing conditions, such as higher temperatures (think global warming)? Researchers at Tel Aviv University have discovered that epigenetic inheritance – inheritance that does not involve changes in the DNA sequence – can affect the genetic composition of a population for many generations. Under certain conditions, the environment can actually impact genetic diversity – a way, the researchers believe, for the environment to adjust it.

Worms Get It from their Mama’s Mama’s Mama’s… 

Females of the worm species C. elegans produce both egg cells (or “oocytes”) and sperm, and can self-reproduce (hence they are considered hermaphrodites). They produce their sperm in a limited amount, only when they are young. At the same time, there are also rare C. elegans males in the population that can provide more sperm to the female worms through mating.

In normal conditions, the hermaphrodites secrete pheromones to attract males for mating only when they grow old and run out of their own sperm (at this point mating becomes the only way for them to continue to reproduce). Therefore, when the hermaphrodite is young and still has sperm, she can choose whether or not to mix her genes by sexually reproducing with a male.

In the new study, exposure to elevated temperatures was found to encourage more hermaphrodites to mate, and this trait was also preserved in the offspring for multiple generations, even though they were raised in comfortable temperatures and did not experience the stress from the increased heat.

The study, which was published today in the journal Developmental Cell, was led by Prof. Oded Rechavi and Dr. Itai Toker, as well as Dr. Itamar Lev and MD-PhD student Dr. Yael Mor, who did their doctorates under Prof. Rechavi’s supervision at the School of Neurobiology, Biochemistry & Biophysics, George S. Wise Faculty of Life Sciences, and the Sagol School of Neuroscience. The study was conducted in collaboration with the Rockefeller University in New York.

Securing Genetic Diversity

Why did the higher temperatures make the C. elegans worms more attractive, mating more with males? Dr. Itai Toker explains: “The heat conditions we created disrupted the inheritance of small RNA molecules that control the expression of genes in the sperm, so the worm’s sperm could not fertilize the egg with the efficiency that it normally would. The worm sensed that the sperm it produced was partially damaged, and therefore began to secrete the pheromone and attract males at an earlier stage, while it was still young.”

If that wasn’t enough, Dr. Rechavi points out that the really fascinating finding was that the trait of enhanced attractiveness was passed on for many generations to offspring who did not experience the higher temperatures themselves. The researchers found that heritable small RNA molecules, not changes in the DNA, transmitted the enhanced attractiveness between generations. Small RNAs control gene expression through a mechanism known as RNA interference, or gene silencing – they can destroy mRNA molecules and thus prevent specific genes from functioning at a given time in a given tissue or cell.

Dr. Itai Toker adds that, “In the past, we discovered a mechanism that passes on small RNA molecules to future generations, in parallel and in a different way from the usual DNA-based inheritance mechanism. This enables the transmission of certain traits transgenerationally. By specifically inhibiting the mechanism of small RNA inheritance, we demonstrated that the inheritance of increased attractiveness depends on the transmission of small RNAs that control sperm activity.”

Mating, as opposed to self-fertilization, comes at a price for the hermaphroditic worms, as it allows them to pass on only half of their genome to the next generation. This “dilution” of the parent’s genetic contribution is a heavy price to pay. The benefit, however, is that it increases genetic diversity, and lab evolution experiments indeed suggested that this can be a useful adaptive strategy.

The researchers later experimented with evolution: They tracked the offspring of mothers who passed on the trait of attractiveness to males with the help of small RNAs, and allowed them to compete for males, over many generations, against normal offspring from a control group. The researchers observed how the inheritance of sexual attractiveness led to more mating under these competitive conditions, and that as a result the attractive offspring were able to spread their genes through the population more successfully.

 

Prof. Oded Rechavi (photo: Yehonatan Zur Duvdevani)

Environment’s Response to Global Warming?

In general, living things respond to their environment by changing their gene expression, without changing the genes themselves. The understanding that some epigenetic information, including information about the parents’ responses to environmental challenges, is encoded in small RNA molecules and can be passed down from generation to generation has revolutionized our understanding of heredity, challenging the dogma that has dominated evolutionary biology for a century or more. Until now, however, researchers had not found a way in which epigenetic inheritance can affect the genetic sequence (DNA) itself.

“Epigenetics in general, and the inheritance of parental responses facilitated by small RNAs in particular, is a new field that is garnering a lot of attention,” says Dr. Lev. “We have now proven that the environment can change not only the expression of genes, but, indirectly, also genetic heredity – and for many generations.”

“Generally, epigenetic inheritance of small RNA molecules is a transient matter: the organism is exposed to a particular environment and preserves the epigenetic information for 3–5 generations. In contrast, evolutionary change occurs over hundreds and thousands of generations. We looked for a link between epigenetics and genetics and found that a change in the environment relevant to global warming induces transgenerational secretion of a pheromone to attract males, and thus affects the evolution of the worms’ genome.”

Dr. Mor adds, “We think that it’s a way for the environment to adjust genetic diversity. After all, evolution requires variability and selection. The classical theory is that the environment can influence selection, but cannot affect variability, which is created randomly as a result of mutations. We found that the environment can actually impact genetic diversity under certain conditions.”

Finding the Optimal Location for the Tribal Bonfire

Early humans’ placement of cave hearths ensured maximum benefit and minimum smoke exposure.

In a first-of-its-kind study, the researchers developed a software-based smoke dispersal simulation model and applied it to a known prehistoric site. They discovered that the early humans who occupied the cave had placed their hearth at the optimal location – enabling maximum utilization of the fire for their activities and needs while exposing them to a minimal amount of smoke. The groundbreaking study provides evidence for high cognitive abilities in early humans who lived 170,000 years ago.

The study was led by PhD student Yafit Kedar, and Prof. Ran Barkai from the Jacob M. Alkow Department of Archaeology and Ancient Near Eastern Cultures at The Lester and Sally Entin Faculty of Humanities, together with Dr. Gil Kedar. The paper was published in Scientific Reports.

In the Back of the Cave? Or Towards the Front?

The use of fire by early humans has been widely debated by researchers for many years, regarding questions such as: At what point in their evolution did humans learn how to control fire and ignite it at will? When did they begin to use it on a daily basis? Did they use the inner space of the cave efficiently in relation to the fire? While all researchers agree that modern humans were capable of all these things, the dispute continues about the skills and abilities of earlier types of humans. One focal issue in the debate is the location of hearths in caves occupied by early humans for long periods of time.

“Multilayered hearths have been found in many caves, indicating that fires had been lit at the same spot over many years,” says Yafit Kedar. “In previous studies, using a software-based model of air circulation in caves, along with a simulator of smoke dispersal in a closed space, we found that the optimal location for minimal smoke exposure in the winter was at the back of the cave. The least favorable location was the cave’s entrance.”

Humans Need Balance

In the current study, the researchers applied their smoke dispersal model to an extensively studied prehistoric site – the Lazaret Cave in southeastern France, inhabited by early humans around 170-150 thousand years ago. “According to our model, based on previous studies, placing the hearth at the back of the cave would have reduced smoke density to a minimum, allowing the smoke to circulate out of the cave right next to the ceiling,” explains Kedar. “However, in the archaeological layers we examined, the hearth was located at the center of the cave.”

The team tried to understand why the occupants had chosen this spot, and whether smoke dispersal had been a significant consideration in the cave’s spatial division into activity areas. The researchers performed a range of smoke dispersal simulations for 16 hypothetical hearth locations inside the 290 sqm cave. To assess the health implications of smoke exposure, the simulated levels were compared with the World Health Organization’s recommendations for average smoke exposure.

Excavations at the Lazaret Cave, France (photo: De Lumley, M. A. (2018). Les restes humains fossiles de la grotte du Lazaret, Nice, Alpes-Maritimes, France: Des Homo erectus européens évolués en voie de néandertalisation (664 pp.). CNRS éditions.)

The researchers found that the average smoke density, based on measuring the number of particles per spatial unit, is in fact minimal when the hearth is located at the back of the cave – just as their model had predicted. However, Yafit Kedar and Dr. Gil Kedar explain that they also discovered that “In this situation, the area with low smoke density, most suitable for prolonged activity, is relatively distant from the hearth itself. Early humans needed a balance – a hearth close to which they could work, cook, eat, sleep, get together, warm themselves, etc., while exposed to a minimum amount of smoke. Ultimately, when all needs are taken into consideration – daily activities vs. the damages of smoke exposure – the occupants placed their hearth at the optimal spot in the cave.”
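The trade-off described here – a work area under the exposure threshold that is still close to the fire – can be illustrated with a toy calculation. Everything below (the grid dimensions, the distance-based decay law, the threshold, the candidate spots) is invented for illustration; the article does not publish the actual simulation model, which accounted for air circulation and WHO exposure guidelines.

```python
import numpy as np

# Toy 2-D cave grid: ~290 sqm as a 10 m x 29 m rectangle of 1 m cells.
W, D = 10, 29
ys, xs = np.mgrid[0:D, 0:W]

def smoke_density(hearth):
    """Crude proxy: smoke density falls off with squared distance from the hearth."""
    hy, hx = hearth
    r2 = (ys - hy) ** 2 + (xs - hx) ** 2
    return 1.0 / (1.0 + r2)

def score(hearth, safe_level=0.05):
    """Balance the two needs: find the nearest cell whose density is under
    the exposure threshold (smaller score = usable area closer to the fire)."""
    dens = smoke_density(hearth)
    safe = dens < safe_level
    if not safe.any():
        return float("inf")
    hy, hx = hearth
    return float(np.hypot(ys[safe] - hy, xs[safe] - hx).min())

# Candidate hearth spots along the cave's long axis
# (cf. the 16 hypothetical locations tested in the study).
candidates = [(d, W // 2) for d in range(1, D, 2)]
best = min(candidates, key=score)
```

With a more realistic dispersal model in place of `smoke_density`, the same minimization would reproduce the study's balance between proximity and exposure.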

Our Ancestors Nailed It

The study identified a 25 sqm area in the cave which would be optimal for locating the hearth in order to enjoy its benefits while avoiding too much exposure to smoke. Astonishingly, in the several strata examined in this study, the early humans actually did place their hearth within this area.

“Our study shows that early humans were able, with no sensors or simulators, to choose the perfect location for their hearth and manage the cave’s space as early as 170,000 years ago – long before the advent of modern humans in Europe. This ability reflects ingenuity, experience, and planned action, as well as awareness of the health damage caused by smoke exposure. In addition, the simulation model we developed can assist archaeologists excavating new sites, enabling them to look for hearths and activity areas at their optimal locations,” concludes Prof. Barkai.

In upcoming studies, the researchers intend to use their model to investigate the influence of different fuels on smoke dispersal, use of the cave with an active hearth at different times of year, use of several hearths simultaneously, and more.

And Let There Be Light

Efforts by TAU’s Clinical Law Program will help keep electricity running for those who are struggling to pay utility bills.

The recent drop in temperature in Israel has led to a significant increase in electricity consumption. But what about those who simply cannot afford basic necessities?

A petition jointly filed by Tel Aviv University’s Human Rights Clinic at The Buchmann Faculty of Law will help keep the electricity on for some of Israel’s most underprivileged populations. In response to the appeal, Israel’s High Court ruled that electricity must not be cut off for citizens who prove a difficult economic or medical condition, effective immediately. We spoke with attorney Adi Nir Binyamini from TAU’s Human Rights Clinic, one of the lawyers who handled the case. 

Electricity – A Fundamental Right?

In a precedent-setting decision, the High Court ruled on January 20 that access to electricity should be considered a fundamental right and that the Electricity Authority must, within six months, amend the criteria for power outages as a means of collecting debt. Meanwhile, the new ruling assists electricity consumers who find themselves in serious economic or medical distress, and ensures that they will not be left in the dark or the cold and without other basic needs.

The ruling came in response to a petition filed by the Association for Civil Rights in Israel (ACRI) in collaboration with the Human Rights Clinic at Tel Aviv University, Physicians for Human Rights and the Israel Union of Social Workers against the Electricity Authority, the Israel Electric Corp. and the Energy Minister. It was filed on behalf of several poor families whose electricity had been cut off for non-payment.

The High Court of Justice ruled that, until the Electricity Authority establishes appropriate criteria and procedures (within six months from the time of the ruling), it must enable consumers facing power cuts from lack of payment to demonstrate whether they are suffering financial or health problems that justify their continued access to electric power. The court said the Electricity Authority must conduct a hearing prior to cutting a customer’s power. It gave the national electricity provider six months to revise its procedures and ordered it to pay the petitioners 40,000 NIS ($12,800) in expenses, to be divided among them. “This is a dramatic change from the previous situation, when it was possible to cut off people’s electricity access due to the accumulation of debt, except for very few exceptions,” explains Att. Nir Binyamini.

 

From the second hearing at the High Court, on October 28, 2021 (from left to right): Gil Gan Mor (ACRI), Hicham Chabaita and Att. Adi Nir Binyamini from TAU’s Human Rights Clinic and Att. Maskit Bendel (ACRI)

The Beginning of a New Era

Binyamini, who has dealt with electricity litigation for several years now, says, “I feel personal and professional satisfaction that on the coldest day of the year, when people were left without heating, the High Court accepted our position and ruled not to cut off people’s electricity due to poverty and that debt must instead be collected by more moderate means.”

When asked how the Clinic got involved with the project, Binyamini explains that TAU’s Human Rights Clinic was previously part of a legal battle over water disconnections for consumers unable to pay their water bill. “After that was successfully completed, we took on the subject of electricity and have been working on it continuously for the past eight years. The Clinic represented and handled the two petitions that were submitted to the Israeli High Court, and over the years we have dealt with hundreds of individual cases of people being cut off from electricity. We have also been guiding and assisting social workers with individual cases.”

She adds that a large number of students from the Clinic have worked on the case over the years, and stresses that such practical experience is an extremely valuable component of legal education.

Upon the court’s ruling, Binyamini along with Att. Maskit Bendel of the ACRI issued a statement, saying: “We hope that the ruling, which opened with the words ‘and let there be light,’ heralds the beginning of a new era when it comes to protecting weak populations from having their electricity cut off.”

 

Attorney-at-law Adi Nir Binyamini from Tel Aviv University’s Human Rights Clinic (photo: Tomer Jacobson) 

Start Up Nation in Ancient Canaan

Thanks to advanced management skills, the Arava became the copper power of the ancient world.

A new Tel Aviv University study has determined that thanks to advanced management methods and impressive technological creativity, about three thousand years ago the copper industry of the Arava Valley (located deep in Israel’s southern Negev desert, along the Jordanian border) managed to thrive and become the largest and most advanced smelting center in the ancient world. The study was conducted by graduate student David Luria of TAU’s Jacob M. Alkow Department of Archaeology and Ancient Near Eastern Cultures and The Sonia & Marco Nadler Institute of Archaeology, and is being published in the prestigious journal PLOS ONE.

Ancient Practice of “Trial and Error”

According to Luria, the copper industry in Canaan at that time was concentrated in two large mining areas – one in Timna (north of Eilat) and the other in Faynan (in the northern Arava, in Jordan). Previous research on the subject has claimed that the high level of technology employed there was made possible thanks to Egyptian technologies brought to the region during the voyage of the Egyptian Pharaoh Shishak in 925 BC. This theory was strengthened in 2014 following the discovery of a scarab bearing the figure of Shishak in Faynan, and again later in 2019, following the development of a new scientific model that claimed that a sudden technological leap had taken place around the time of Shishak’s journey.

Luria, on the other hand, argues that the great economic and technological success of the copper industry in the Arava was not related to Egyptian capabilities, but rather to the talent of the Arava people, who learned to use the two advanced methods we know today as “trial and error” and “scaling up.” “Obviously these terms were not in use in ancient times, but the application of their practical principles was made possible due to a basic understanding of engineering and common sense, which were seen in other places in the ancient world as well,” says Luria.

Luria explains that the “trial and error” method allowed the Arava metalworkers to slowly improve technological processes, as well as to increase the volume and quality of production. In addition, “scaling up” made it possible to increase the dimensions of the existing means of production using materials and processes that were common at the time, thereby developing advanced production equipment within a short amount of time and with minimum cost and technological risk.

The Secret Behind the Technological Success

“Shishak’s expedition was not intended to physically take over the copper mines in the Arava, but rather to formulate a long-term agreement with the Arava people in order to bolster local production and thus increase copper exports to Egypt, which was suffering from local production difficulties at the time,” Luria says.

“It appears that the secret of the success of the ancient copper industry in the Arava lies in the skills and abilities of efficient managers, who were assisted at every stage of their decision-making by talented technological experts. Archeology today can’t identify who these executives were, but a careful analysis of the deposits left in the area can tell us an accurate story. These findings are the residues of copper production that have accumulated as heaps of waste that can be dated, and whose size allows us to assess the volume of production at any given time. Moreover, by conducting a chemical analysis of the copper content remaining in the waste, we can determine the quality of the production; when the amount of copper in the waste diminishes, we can conclude that the process had become more efficient.”
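The two proxies described here – heap size for production volume, residual copper fraction for process efficiency – lend themselves to a back-of-the-envelope illustration. The figures below are invented for the sketch, not measurements from Timna or Faynan:

```python
# (approx. date BCE, waste-heap mass in tonnes, residual copper fraction in the slag)
# Hypothetical values for illustration only.
layers = [
    (1100, 120, 0.050),
    (1000, 300, 0.030),
    (900,  650, 0.012),
]

heap_sizes = [mass for _, mass, _ in layers]
residual_cu = [frac for _, _, frac in layers]

# Growing heaps over time -> rising production volume.
production_rising = all(a < b for a, b in zip(heap_sizes, heap_sizes[1:]))
# Falling copper fraction in the waste -> an increasingly efficient smelt.
efficiency_rising = all(a > b for a, b in zip(residual_cu, residual_cu[1:]))

# Copper actually lost to the waste in each period (tonnes):
copper_lost = [mass * frac for _, mass, frac in layers]
```

Note that the absolute amount of copper lost can still grow while the process improves, simply because output grows faster – which is why the fraction in the waste, not the total, is the efficiency indicator.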

Luria also says that traces detected at these sites show that throughout the production period, the management team was able to close inefficient mines and open more efficient ones. Moreover, at certain points a decision was taken to reuse waste from earlier periods, which was produced in less efficient processes in which a lot of copper remained, rather than use the pure mineral. These decisions could not have been made without an excellent technical team that backed management decisions with regular technological testing. The management also engaged in extensive marketing of the copper throughout the ancient world.

“The important lesson to take away from this technological success is that the high-tech savvy of individuals – educated and energetic people who lived here in the first millennium BCE – succeeded, just like it does today, in bringing about a huge revolution in the local economy,” Luria concludes. “As they say, there is nothing new under the sun.”

Featured image: David Luria

Learning from The Fastest Growing Alga in The World

In scientific first, researchers successfully map photosynthetic properties of the Chlorella ohadii.

Sustainable food is grown, produced, distributed and consumed with the environment in mind, and is thus believed to help combat climate change. In a recent study, researchers set out to reveal the secret behind the rapid growth of “the fastest growing plant cell in the world,” the green alga Chlorella ohadii. Why? A better understanding of Chlorella ohadii, they assessed, might help improve the efficiency of photosynthesis in other plants as well, and in turn help develop new engineering tools that could provide a solution for sustainable food.

Can We Boost the Photosynthesis in Plants?

The study’s findings indicate that the main factors behind the plant’s rapid photosynthesis rate lie in its efficient metabolic processes. The researchers found that this alga has a unique ability to elicit a chemical reaction in which it is able to efficiently and quickly recycle one of the components used by an enzyme called RuBisCO, in a manner that significantly speeds up the photosynthetic processes.
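The effect of faster recycling on the overall rate can be sketched with a generic series-of-steps bottleneck model. This is a textbook illustration with made-up rate constants, not the kinetic model used in the study:

```python
def cycle_rate(k_fix, k_regen):
    """Throughput of two sequential steps in a cycle: the time per turnover
    is the sum of the step times, so the slower step dominates the rate."""
    return 1.0 / (1.0 / k_fix + 1.0 / k_regen)

baseline = cycle_rate(k_fix=3.0, k_regen=1.0)   # slow regeneration is the bottleneck
boosted  = cycle_rate(k_fix=3.0, k_regen=10.0)  # efficient recycling, as in C. ohadii
```

Speeding up only the regeneration step raises the whole cycle's throughput, which is the qualitative point behind the alga's rapid recycling of the component used by RuBisCO.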

The study was led by researchers from the Max-Planck Institute for Molecular Plant Physiology in Germany, with the participation of Dr. Haim Treves, a member of the School of Plant Sciences and Food Security at Tel Aviv University. The study was published in the prestigious journal Nature Plants.

In the framework of the study, the researchers sought to examine whether it is possible to improve the efficiency of photosynthesis in plants, an energetic process that has been occurring in nature for about 3.5 billion years. To try to answer this question, the researchers decided to focus on green algae, particularly the Chlorella ohadii variety. This alga is known for its ability to survive in extreme conditions of heat and cold, which forces it to exhibit resilience and grow very quickly.

The researchers assessed that a better understanding of Chlorella ohadii (named after the late botanist Prof. Itzhak Ohad) would make it possible to improve the efficiency of photosynthesis in other plants as well, and in turn to develop new engineering tools that could provide a solution for sustainable food.

Online Monitoring of Photosynthesis

In the process of photosynthesis, plants and algae convert water, light and carbon dioxide into the sugar and oxygen essential for their functioning. The researchers used innovative microfluidic methods based on complex physical, chemical and biotechnological principles in order to provide the algae with carbon dioxide in a measured and controlled manner and monitor the photosynthesis “online.”

By using a comparative analysis, the researchers identified a fundamental difference between the photosynthetic processes carried out in green algae and those in the model plants. They assess that the difference lies in variations in the metabolic networks, a deeper understanding of which will help in developing innovative engineering solutions in the field of plant metabolism, as well as the optimal engineering of future agricultural products.

“Past empirical studies have shown that photosynthetic efficiency is higher in microalgae than in C3 or C4 crops, both types of plants that have transport systems but which are completely different in terms of their anatomy and the way they carry out photosynthesis,” Dr. Treves explains. “The problem is that the scientific community does not yet know how to explain these differences accurately enough.”

Dr. Treves adds, “In our current study we mapped the patterns of energy production and photosynthetic metabolism in green algae and compared them to existing and new data collected from model plants. We were able to clearly identify the factors that influence the difference in these patterns. Our research reinforces previous assessments that the metabolic pathway responsible for recycling is one of the major bottlenecks in photosynthesis in plants. The next step is to extract the genes involved in this pathway, and in other pathways where we have detected differences, from the algae, and to test whether their insertion into other plants via metabolic engineering will increase their rate of growth or photosynthetic efficiency.

“The toolbox we have assembled will enable us to harness the conclusions from the study to accelerate future developments in engineering in the field of algae-based sustainable food as a genetic reservoir for plant improvement; monitoring the photosynthesis is a quantitative and high-resolution process, and algae offer an infinite source of possibilities for improving photosynthetic efficiency.”

Featured image: Dr. Haim Treves
