Have you ever imagined not feeling hunger, pain, or tiredness? You might think that would only be possible for a superhero or some other fictional character. Yet the case of a 7-year-old British girl is intriguing scientists precisely because she shows all of these traits.
Olivia Farnsworth, from Huddersfield, is being called the "bionic girl." She lacks basic bodily sensations such as pain, hunger, and tiredness. Because of her condition, she eats only because she has been conditioned to, can go three days straight without sleeping, and almost never cries.
Seven-year-old Olivia Farnsworth, who is being called the "bionic girl"
Olivia's disorder is known as "chromosome 6 deletion." The condition is rare in itself, but according to researchers, the cases recorded to date have each presented only one of the symptoms. Olivia is therefore believed to be unique in the world, since she displays all of the traits the anomaly can cause.
Olivia's condition came to light on a day of family drama. While out with her mother, Niki Trepak, 32, and her four siblings, Olivia was hit by a car and dragged for several meters. "It was horrific. I don't think I'll ever get over it. My children and I screamed and cried as we watched her being carried along," Niki told Mail Online.
Olivia in the hospital after the accident
Despite the frightening scene, the girl got up with a tire mark on her chest and began walking toward her family. She had only grazes on her hip and big toe and showed no reaction at all. After assessing her, doctors concluded that what saved her from worse injuries was precisely the fact that she stayed calm throughout. They then discovered that she had the rare genetic condition of chromosome 6 deletion.
According to her mother, differences in Olivia's behavior were noticeable from the time she was a newborn. Niki said Olivia never cried as a baby and stopped sleeping during the day when she was just 9 months old. Another consequence of the disorder was the slow growth of her hair, which only began to grow properly when she was 4.
Olivia has to be prompted to eat, since she feels no hunger
Although not suffering certain natural bodily reactions might seem like an advantage for Olivia, her condition demands constant attention and worries her mother greatly.
The risks and other symptoms of Olivia's condition
Niki says her daughter has no "sense of danger," which contributed to the accident in which she was run over. She therefore needs close supervision to avoid further dangerous situations.
The girl bit through her lips without reacting, and even needed plastic surgery to repair the damage
On one occasion, the little girl bit through her own lips and showed no reaction, yet she had to undergo plastic surgery to repair the damage. And because she feels neither hunger nor tiredness, the bionic girl is prompted to eat and takes medication in order to sleep.
Most of the time, Olivia seems a happy child, but she suffers sudden fits of rage because of her genetic condition. Some of these outbursts are quite violent and draw stares when they happen in public. "It happened at a park a few weeks ago. She punched and kicked me, and people were wondering what was going on. Nobody knows what's wrong with her, so when something like that happens it's embarrassing," her mother explained.
Olivia, center, with her mother Niki and her four siblings
Aware that Olivia needs special care, Niki and the girl's siblings take part in a support group for chromosome disorders. The aim is to understand her situation, learn how to cope, and help spread information about this rare genetic condition.
Nobody expects to find a corpse in their backyard or on a walk. Yet some discoveries can transform much of what we know about our ancestors and their customs. Here are 10 cases:
10. The wealthy women of Ethiopia
In June 2015 The Guardian reported that an excavation in the ancient kingdom of Aksum, in present-day Ethiopia, had uncovered countless artifacts. But what most surprised the scientists were the burial sites of two women from the Roman era.
Examination of the remains showed that both belonged to a higher class, since they were buried with jewelry and drinking vessels. One of them wore a necklace with more than a thousand beads.
The discovery of objects of Roman origin strengthened the theory that Aksum was an important stop along the trade routes.
9. The enigmatic Emma (and Richard)
This was probably one of the most recent and unusual discoveries: in 2012, the remains of King Richard III were found beneath a parking lot in the city of Leicester, England.
But the king was not alone. The following year, in an area adjacent to his body, archaeologists uncovered the burial site of an elderly woman who probably lived at some point in the 14th century. A scientist from the University of Leicester examined the remains and announced that, given the elaborate stone sarcophagus, the woman must have been someone important.
In the search for her identity, some suggested she might be Emma, a woman who had married a local man known as John of Holt. But scientists concluded the clues were too scarce to determine who she really was.
8. Ötzi the Iceman
In 1991, hikers in the mountains on the border between Italy and Austria found the extremely well-preserved body of a man who would later become known as "Ötzi." Since no other bodies were found in the area, the theory is that he died by accident or was murdered.
According to the evidence, Ötzi died in combat, having taken a blow to the head so strong that his brain struck the back of his skull. National Geographic reported that he lived some 5,300 years ago and died at around age 40.
His body was covered with more than 50 tattoos, probably made by cutting small incisions in the skin and then rubbing charcoal into the open wounds. He also suffered from ailments including hardened arteries, joint problems, and Lyme disease, a bacterial infection transmitted by ticks.
Ötzi is on display at the South Tyrol Museum of Archaeology in Bolzano, Italy.
7. The skeleton caught in a tree
In September 2015, residents of the town of Collooney, Ireland, got quite a surprise during a storm. The winds were so strong that a tree toppled over, and its roots revealed something improbable: a human skeleton. The body was so entangled in the roots that when the tree fell, the upper half of the skeleton was lifted into the air while the lower half remained buried in the ground. Research indicated the skeleton belonged to an adult man who had been killed with a sharp weapon between 1030 and 1200.
6. A mass grave
Construction workers near Schuylkill County, Pennsylvania, discovered a mass grave on private property. The current owners had bought the land in 1997 and had heard stories that, according to locals, the site was used to bury more than 1,500 area residents during the Spanish flu of 1918.
DNA was collected in an effort to identify the skeletons and notify any living relatives. The remains were reburied in a local cemetery.
5. The babies of Ashkelon
In 2014, while surveying the port of Ashkelon, Israel, archaeologist Ross Voss found a large quantity of small bones in the city's ancient sewer system. Investigating the remains, researchers identified more than 100 infants who had lived less than a week.
4. Against witches
In 2006, the discovery of a mass grave in Venice, Italy, revealed the skeleton of a woman who had been killed cruelly: her jaw was destroyed when a brick was forced into her mouth. Researchers believe she had been accused of witchcraft in the Middle Ages.
3. The best-preserved mummy
Picture the scene: you go to a bog to cut peat, a plant material used to heat homes. When you arrive, you come across a corpse with a rope around its neck. What do you do? Call the police, right? That is exactly what two brothers did in Denmark in the 1950s. Imagine everyone's surprise, then, when it turned out the supposed crime had taken place almost 2,400 years earlier.
The corpse had been naturally mummified by the soil, and its head and internal organs were intact! The mummy was named "Tollund Man" and is considered one of the best preserved ever found.
2. The Children of Llullaillaco
Also known as the mummies of Llullaillaco, they were discovered in 1999, when archaeologists were surveying the summit of the Llullaillaco volcano in the Andes, at an altitude of 6,739 meters.
According to researchers, the three Inca children had been chosen for the "Capacocha" sacrifice, which involved walking up into the frozen mountains and remaining there until death.
1. The elongated skulls of Onavas
The discovery of 25 bodies in the town of Onavas, Mexico, caused a stir in the region's farming community. Beyond the initial shock of finding the bones, people were astonished by the elongated shape of the skulls. The first theory was that the bodies belonged to extraterrestrials.
But scientists soon ended the excitement: the remains, more than a thousand years old, belonged to pre-Hispanic civilizations that bound their heads with bands to achieve that shape.
Amid gloomy reports of an impending epidemic of Alzheimer’s and other dementias, emerging research offers a promising twist. Recent studies in North America, the U.K. and Europe suggest that dementia risk among seniors in some high-income countries has dropped steadily over the past 25 years. If the trend is driven by midlife factors such as building “brain reserve” and maintaining heart health, as some experts suspect, this could lend credence to staying mentally engaged and taking cholesterol-lowering drugs as preventive measures.
At first glance, the overall message seems somewhat confusing. Higher life expectancy and falling birth rates are driving up the global elderly population. “And if there are more 85-year-olds, it’s almost certain there will be more cases of age-related diseases,” says Ken Langa, professor of internal medicine at the University of Michigan. According to the World Alzheimer Report 2015 (pdf), 46.8 million people around the globe suffered from dementia last year, and that number is expected to double every 20 years.
Looking more closely, though, new epidemiological studies reveal a surprisingly hopeful trend. Analyses conducted over the last decade in the U.S., Canada, England, the Netherlands, Sweden and Denmark suggest that “a 75- to 85-year-old has a lower risk of having Alzheimer’s today than 15 or 20 years ago,” says Langa, who discussed the research on falling dementia rates in a 2015 Alzheimer’s Research & Therapy commentary (pdf).
Some of the clearest evidence comes from the Cognitive Function and Aging Study (CFAS), led by Carol Brayne, professor of public health medicine at the University of Cambridge. The study surveyed U.K. adults 65 or older in Cambridgeshire, Newcastle and Nottingham in the 1990s and again around 2010. During that period, dementia rates in the older population fell 24 percent—from 8.3 to 6.5 percent. Put another way: if the frequency of dementia in seniors had stayed the same across that period, there would have been 214,000 more people with dementia than the 670,000 documented.
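As a quick consistency check on the CFAS figures (a sketch, taking the expected total to be the documented count plus the 214,000 shortfall):

```python
# Consistency check on the CFAS figures: 670,000 documented cases vs.
# 214,000 additional cases expected had 1990s dementia rates held.
documented = 670_000
shortfall = 214_000
expected = documented + shortfall          # cases expected at 1990s rates

relative_drop = shortfall / expected       # fraction below the expectation
print(f"expected: {expected:,}")           # 884,000
print(f"drop: {relative_drop:.0%}")        # 24%
```

The roughly 24 percent shortfall lines up with the decline in rates the study reports.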
Studies in Canada as well as the Netherlands, Sweden and elsewhere in Europe also suggest that dementia risk has declined in the past few decades. In the U.S. Langa and colleagues reported in Alzheimer’s & Dementia that the percent of adults over 70 years of age with cognitive impairment dropped from 12.2 to 8.7 between 1993 and 2002. The seniors were part of an ongoing longitudinal study funded by the National Institute on Aging (NIA), which surveys a representative sample of 20,000 older adults in the U.S. every two years.
But other research does not support this trend. A study led by Denis Evans, director of the Rush Institute for Healthy Aging in Chicago, sends a more sobering message. The researchers measured new cases of Alzheimer’s between 1997 and 2008 and found no change in disease risk over time. Another study estimated, based on U.S. Census Bureau data, that the number of people with Alzheimer’s will nearly triple by 2050—and that the percentage of seniors with dementia will creep upward.
All things considered, Langa agrees it’s very likely that, because of higher life expectancy, the absolute number of people with Alzheimer’s and other dementias will go up in the coming years. He notes, however, that if an older adult’s risk for dementia continues declining as it has in some high-income countries over the last few decades, “that increase in number of cases may be a little less eye-popping than it would be if the risk were staying the same.”
The different results could have come from different starting assumptions—Evans assumes the number of new dementia cases will stay the same in coming decades, while Langa takes into account the possibility that dementia risk could decline because of changes in lifestyle and health prevention measures in the last quarter century.
What could be driving the apparent downward trend in dementia frequency? Although the question cannot be answered definitively, other analyses have linked lower dementia risk to better control of cardiovascular risk factors such as hypertension and high cholesterol, and to building up “cognitive reserve” with more education. People with chronic health conditions, however, add additional complexity to the picture. Those with type 2 diabetes, for example, are at higher risk for dementia. In light of rising levels of diabetes—as well as obesity—it’s possible these conditions could offset or override the downward dementia trend going forward, Langa says.
Epidemiological studies such as these could be further complicated by an artificial rise in reported dementia cases due to several factors. One is simply a growing awareness of Alzheimer’s disease. As a result, physicians may be more likely to make a diagnosis today, compared with decades ago, even in someone with the same level of cognitive impairment. They may also be more prone to list Alzheimer’s as a cause of mortality on death certificates. Second, neuroimaging and basic science research aimed at identifying potential treatments is shifting the field’s focus to earlier in the disease trajectory, driven by the belief that interventions stand the greatest chance in people who have not become too impaired. There is as yet no cure for Alzheimer’s, although some drugs can alleviate symptoms. “From a clinical point of view, the concept of dementia syndrome has changed,” Brayne says. “Using current criteria, people get diagnosed at a much earlier stage.”
Despite the possible influence of growing disease awareness and changing diagnostic standards, bigger problems with assessing dementia rates could be methodological, says John Haaga, NIA deputy director for behavioral and social research. Different labs use different measures; the same group might use two measures 15 years apart. “How much change is real and how much is due to measurement differences?” Haaga says.
Evans sees a more fundamental problem with epidemiological studies of chronic disorders in older people. Although the diseases are scored in a binary fashion—you have it or you don’t—their underlying causes are often a continuous process. “When you diagnose Alzheimer’s disease, what you’re doing is placing a cut point on a bell-shaped curve” of cognitive function, Evans says. “You’re cutting off one tail to separate ‘Alzheimer’s’ from ‘non-Alzheimer’s.’” Putting the cut point in the same place every time is challenging. Well-trained researchers “will vary in how they do it,” Evans says. Yet because it is a point in the curve where the slope is changing sharply, even “a slight difference in where they place the cut can make a big difference in the number of people in the tail.”
On the whole, though, Haaga thinks the epidemiological data on dementia is improving. Whereas in the past “we were often extrapolating from small samples,” he says, “we’re now starting to be able to talk confidently about what the trends are in national populations.” Plus, efforts are underway to harmonize data sets, which should make it easier to compare results across different studies. That will become especially important as data comes in from other parts of the world—such as developing countries where the relative growth in dementia cases is predicted to exceed that of high-income nations.
Two thirds (66 percent) of people with dementia live in low- and middle-income countries, where less than 10 percent of population-based research has been conducted. Aptly named, the 10/66 study is investigating dementia and aging trends in these very regions. In India, for example, people do not live as long as they do in many developed countries, but life expectancy is continuing to go up, prompting a sharper rise in the number of dementia cases among the elderly. “In order to get Alzheimer’s or other dementias, you have to live long enough,” Langa says.
But changes in average life span can have different effects on different diseases. Eileen Crimmins, a gerontologist at the University of Southern California, studies how life expectancy influences chronic disease burden, as measured by the time one needs help and care. Two factors contribute to this: changes in mortality and changes in disease onset. “It is possible to end up with more people sick for a longer time when you are delaying death,” Crimmins told Scientific American via e-mail. “This is what has happened with heart disease.” More people in the U.S. are living with heart disease today, compared with decades ago, even though rates of death due to heart disease have gone down. Mortality and disease onset trends have played out more favorably for dementia, however. Nowadays fewer people have cognitive impairment, and people are not living as long with cognitive impairment, Crimmins says.
Crimmins, Brayne and Langa will discuss the research on falling dementia risk in a February 13 panel at the annual meeting of the American Association for the Advancement of Science in Washington, D.C. The trend is “intriguing and wonderful,” says NIA’s Haaga, who will moderate the panel. However, “I don’t want to give the impression that somehow the problem is now solved.” Even if progressively smaller percentages of the growing elderly population develop dementia in years to come, Haaga says, Alzheimer’s “is already the most expensive disease in the U.S., and it will continue to grow.”
Worldwide, the cost of dementia in 2015 was estimated at $818 billion. By 2030, it is expected to become a $2-trillion disease. As far as individual risk is concerned, however, “things are not getting worse,” Crimmins says. “Even if it is only a start of a trend, the likelihood of your getting dementia is getting smaller.”
Twenty-one weeks into her pregnancy, Brook Meakins and her husband started to regret their "babymoon"—a vacation celebrating just the two of them before their burgeoning family grew to three. Right before Christmas they had left for a luxurious trip to Bora Bora that they had planned well in advance. From there they went to Easter Island for another week and then decided to swing back through Bora Bora for a few more days.
But last week their nightmare began. They spent Martin Luther King Day at the emergency room back in California, where doctors gave Meakins a battery of tests for her rash and achy joints. Just days earlier the U.S. Centers for Disease Control and Prevention had issued its first-ever warning for pregnant women and Zika disease—advising expectant mothers to avoid 13 countries and Puerto Rico where the mosquito-borne virus was rapidly spreading. The list of spots to avoid did not include French Polynesia, but Zika has been known to surface there in the past and her doctors had said there was cause for concern. Although Meakins tried to avoid getting bitten—thinking of other mosquito-borne threats because Zika was not even on their radar—she did receive about a dozen bug bites on their trip, particularly in Bora Bora. “I have thought many times these past few days about how none of this would have happened if we had just canceled the trip,” Meakins says.
The CDC is not necessarily concerned with the effect of Zika virus disease on pregnant women like Meakins because its symptoms are relatively mild and typically last about a week. Instead, its link to a serious birth defect called microcephaly is what prompted the agency to put out the warning and why Meakins is losing sleep at night. That condition, which results in the fetus developing an abnormally small head, may also include potentially debilitating brain damage. In Brazil, where Zika is increasingly common, there have been more than 3,500 cases of microcephaly—more than 20 times the norm for the country. Doctors still do not know how often pregnant women with Zika go on to have children with microcephaly but they are now on the lookout for it.
Already the Hawaii State Department of Health announced last week that one baby in the state was born with microcephaly to a mother who was likely infected with Zika while living in Brazil in May 2015. And yesterday Illinois public health officials confirmed that two pregnant women in their state have also tested positive for the virus, one after traveling to Honduras and the other to Haiti. The CDC is now steeling itself for more such cases.
At the hospital on MLK Day, Meakins gave a blood sample that will be sent to the CDC for further testing. She says she was told not to expect the results for four to 14 days. The wait, unsurprisingly, is agonizing. “We are balls of worry right now, although we are trying to stay positive, love each other through and stay on top of the research as it comes in,” she says. Their fetus was the product of in vitro fertilization and so “he was a little miracle already,” she says.
Tests for Zika are cumbersome and limited. At the CDC, there are several tests they can do to hunt for signs of the virus. Right after infection there are several days when the virus is in the blood so the CDC could use a method called reverse transcription–polymerase chain reaction to amplify the viral genetic material if it is present and detect it. The window for such testing, however, only lasts about a week. “Once the person is infected and after their immune system kicks in and they start developing antibodies to fight the virus you can’t find it,” says Erin Staples, a medical epidemiologist at the CDC. At that point, the search would focus instead on antibodies that show up due to a recent exposure to the virus. If those antibodies are present, then the CDC would do another confirmatory test—one whose results would take about a week to complete—that involves taking the person’s blood that should contain antibodies and exposing it to the virus in the lab. That final test is necessary because Zika virus closely resembles similar pathogens like dengue and yellow fever, which could lead to false positives, Staples says.
Yet if Meakins does turn out to have had Zika, there are no steps doctors can take to try to prevent transmission of the disease to the baby—or to prevent microcephaly. Despite years of expertise preventing transmission of diseases like HIV to fetuses, there is too little information on Zika to know what would work with this tropical virus. Under certain circumstances with HIV, for example, doctors recommend a C-section delivery to help reduce chances of disease transmission, but there have been so few studies on Zika virus disease that “we don’t have enough information at this point to make any recommendations at this point in that area,” says Denise Jamieson, a medical officer in the CDC's Division of Reproductive Health. The CDC recommends instead that expectant mothers consult an infectious disease specialist with expertise in pregnancy and, if the fetus does appear to have microcephaly, that they deliver somewhere with a robust neonatal and obstetric care department.
On January 19 the CDC released guidelines for obstetricians stating that doctors should be sure to ask pregnant women about their recent travel history, and advise them if they had visited locations where there is active Zika transmission to be tested for the virus. The guidelines—which the CDC stresses will evolve with more information—also include instructions for close monitoring of such women with ultrasounds every few weeks and potential testing of amniotic fluid to look for the viral RNA from Zika. With those ultrasounds, “you can see the anatomy of the brain and have some estimate of the severity of the damage to the brain,” Jamieson says.
Regardless of the test results, Meakins’s doctor, David Marinoff, a specialist in high-risk obstetrics at Alta Bates Summit Medical Center in Berkeley, Calif., says he plans to do ultrasounds every four weeks to ensure the pregnancy is progressing smoothly and the baby’s brain is developing normally. “There might be something going on we didn’t test for,” he says, noting they also did tests for other mosquito-borne illnesses that she may have been exposed to in her travels. Moreover, with Zika, “no one knows what the attack rate is—how many women who are pregnant end up with babies who have microcephaly,” he says. For the Meakinses, at least for now, there is nothing to do but wait.
In case you haven’t heard, Washington, D.C., and other parts of the Mid-Atlantic region, are about to get walloped by a major storm that could bury the city in a record-breaking amount of snow.
The storm is expected to bring snows that could top 2 feet in the D.C. area and has already resulted in thousands of cancelled flights. While snows may not be quite as impressive further north, the storm’s fierce winds could whip up significant coastal flooding.
Part of the reason this Snowzilla storm is expected to dump so much snow is that it is pulling in abundant moisture. As the planet warms because of excess heat trapped by human-emitted greenhouse gases, the atmosphere can hold more moisture. Scientists already expect heavy downpours to increase as a result. But there has been little research into what that means for “epic blizzards” like this one.
It might seem that more moisture in the atmosphere along with warming temperatures should mean more rain than snow, and that’s true. But, it turns out, that’s only part of the story.
On Thursday, MIT climate researcher Paul O’Gorman reviewed a 2014 study he conducted that is one of the few to look at extreme snowfalls and warming. Speaking before a group of scientists during a talk at Columbia University, he detailed his use of climate models to look at how extreme snowfalls might change as the planet heats up. Global temperatures have already risen by nearly 2°F (1°C) since the late 1800s.
O’Gorman found that while both average annual snow amounts and extreme snowfalls would decline as temperatures rose, the extremes didn’t drop off as rapidly. Effectively, extreme snowfalls would become a bigger proportion of all snow events.
The reason for this disparity, O’Gorman found, has to do with the very particular temperature conditions in which extreme snows occur, sort of like a frozen version of the Goldilocks tale: If it’s too warm, you get rain, not snow, but if it’s too cold, there won’t be enough moisture in the air to fuel a full-on blizzard.
But looking across a whole winter, snow in general occurs across a wider band of temperatures—essentially, it takes less warming to chip away at the broad range of temperatures that produce ordinary snow than at the narrow band in which extreme snows occur.
One possible exception to this decrease could be in very cold places, such as the Canadian Arctic, where even with warming it would still be cold enough to snow, but the temperature increase would mean more moisture to fuel that snow.
O’Gorman’s study is one of very few to look at the issue of warming and extreme snowfalls, and, to date, the pattern he identified has yet to be seen in snowfall observations, he said. He suspects this is because there are fewer snow observations than those for rain because snow happens over a much smaller area of Earth’s land surface.
“I don’t expect the signal on snowfall to emerge for another 20 years or so,” O’Gorman said.
That study also looks at only one specific aspect of snowstorms. Another relatively unexplored factor is how warming might influence the storms, called extratropical cyclones, that actually bring the snow as they sweep across the country. Some research has suggested that, like hurricanes, these systems could become less frequent even as those that do occur become more intense, but it’s still an active area of research.
Discerning any role of warming in fueling this specific storm would require a specific attribution study, but one expected impact of this storm that does have a clear connection to climate change is the coastal flooding it could bring to areas from Maryland up to Long Island. As sea levels continue to rise from global warming, nor’easters and other intense storms are more likely to cause damaging floods.
But for a better picture of what the Snowpocalypses of the future might look like, much more research remains to be done.
In 2012 the remains of 27 hunter–gatherers were unearthed in a remote part of Kenya called Nataruk near Lake Turkana—many of whom, based on the startling state of their bony remnants, died horrifically violent deaths. Skulls were bashed in with blunt objects; knees and hands bound and broken. Razor-sharp obsidian spear tips were found lodged in two of the skeletons.
After exhuming and carbon-dating the skeletons, researchers from the University of Cambridge Leverhulme Center for Human Evolutionary Studies have published their findings in this week’s Nature, reporting that the remains are estimated to be from between 9,500 and 10,500 years ago, making it the earliest scientifically dated evidence of organized violence among foraging humans. (Scientific American is part of Nature Publishing Group.)
Whereas other recent evidence suggests that hunter–gatherer conflicts resulted in the men being slain while women and children were assimilated into the victorious group, the winners of this conflict were considerably less discerning. At least eight of the victims were female—including one carrying a six- to nine-month-old fetus—five were children and one was a teenager.
Although the recent discovery is the oldest for hunter–gatherers, it is not the oldest find of large-scale human violence—currently that title goes to remains discovered in the 1960s at the Jebel Sahaba site in Sudan, which could be up to 13,000 years old. Researchers say, however, that although the Sudan site is probably older, the dating methods were not as reliable as the one used for this new study. Also, many of those remains were found in what appeared to be a cemetery and could have accumulated over many years. “The significance of Nataruk lies partly in its early age but particularly in the fact that it is evidence of a single event,” explains professor of human evolution Robert Foley, who co-authored the new paper with his Cambridge colleague evolutionary biologist Marta Mirazón Lahr.
Foley also points out that the new findings suggest that group violence occurred among people whose way of life was nomadic and who often fought over resources—as opposed to, as the Jebel Sahaba cemetery implies, an established community where territories and the allotment of possessions and properties were more clearly defined. “This shows that even under hunter–gatherer conditions, conflicts between groups became serious enough for this level of killing,” Foley says.
His point is significant, given that one school of anthropological thought holds that coordinated conflict, and eventually what we would call warfare, only arose with the settlement of land and a sense of proprietorship over resources. This thinking, as Foley points out, traces back to the idea of the “noble savage” being uncorrupted by civilization, a phrase which first appeared in English in John Dryden's 1672 play The Conquest of Granada and which is often misattributed to 18th-century French philosopher Jean-Jacques Rousseau.
Yet violence predating advances like agriculture is more in line with a perhaps disheartening counterview in anthropology—one echoing English philosopher Thomas Hobbes’s idea that war and violence toward one another is the natural state of mankind—positing that conflict among humans and our predecessors has deep evolutionary roots that exist independently of cultural advance. “This is an important discovery because it provides crucial data to answer a long-standing debate about whether warfare predates agriculture,” comments Luke Glowacki, a Harvard University anthropologist who studies the evolutionary roots of violence. “This new study shows that warfare can and did occur in the absence of agriculture and complex social organization. It fills in important gaps in our understanding of the human propensity for violence and suggests a continuum between chimpanzee raiding and full-blown human warfare.”
Although the recently discovered fallen Africans are thought to have lived a mobile existence, there is evidence that they were at least somewhat settled. Nataruk at the time abutted a fertile lagoon and would have been rich in valuable resources such as water and fish. Also, pottery was found at the site, which among early hunter–gatherers is, as Rutgers University anthropology professor Brian Ferguson points out, “usually associated with increased sedentism, food storage and heightened social complexity.” It’s possible that the Lake Turkana victims were ensconced in a particularly productive locale, and those who massacred them came from somewhere much worse off.
“There is probably little doubt that settling down and giving up a nomadic way of life would have intensified the probability of violence between groups,” Foley says. “For most hunter–gatherers, mobility was a strategy for avoiding such conflicts. However, if one looks at chimpanzees we see similar levels of intergroup conflict, and that might be an indication that it has a deeper ancestry in our evolutionary past.”
To this point, Glowacki explains that studies looking at more modern hunter–gathers also suggest that humans probably have some deep-rooted tendency toward violence. “[Early] wars occur even in cases without resource competition,” he says. “In fact, resource abundance rather than competition sometimes contributes to increasing intensity of warfare because individuals are freed from worrying about providing for their basic subsistence needs.”
Foley feels that a contributor to this violent tendency in humans and our ancestors might be, ironically, the same development that allows for altruism and compassion—that is, cooperation. “I can see violent attacks as deriving from the ability of humans to form groups with high levels of solidarity—put simply, between-group rivalry may have come with group coordination and sociality.”
Unfortunately, as Nataruk and other instances of prehistoric group carnage suggest, friendship and ferocity seem to go hand in hand.