New biomarkers of inflammation identified as indicators of polyneuropathy risk

Polyneuropathy is one of the most common complications in people with diabetes. However, it can also occur in people with certain risk factors or diseases before the onset of diabetes. First symptoms are often pins-and-needles sensations in the feet. Although polyneuropathy is present in about 30% of people with diabetes, it often remains undiagnosed. Scientists from the German Diabetes Center (DDZ) in Düsseldorf, in cooperation with colleagues from Helmholtz Zentrum München (HMGU), both partners in the German Center for Diabetes Research (DZD), have now been able to show for the first time that six biomarkers of inflammation indicate the risk of polyneuropathy. The results were published in the current issue of the journal “Diabetes.”

Although many patients suffer from polyneuropathy, relatively little is currently known about its development, which also limits the therapeutic options. It is known that inflammatory processes contribute to other diabetic complications such as heart attack or stroke. The aim of the new study was therefore an extensive analysis of biomarkers that characterize inflammatory processes as a risk factor for distal sensory polyneuropathy (DSPN). Both people with type 2 diabetes and elderly individuals from the general population were examined.

“In our study, we identified novel biomarkers that indicate the risk of polyneuropathy. For the first time, we were also able to find indications that in addition to the innate immune system, the adaptive immune system could be involved in the development of the disease,” said Professor Christian Herder, MD, head of the study at the German Diabetes Center (DDZ). “These findings could open new therapeutic perspectives. The goal could be to influence the immune system accordingly and thus ultimately prevent the development or progression of neuropathy,” added Professor Dan Ziegler, MD, who is a neuro-diabetologist and deputy director of the Institute for Clinical Diabetology at the DDZ.

Study — Procedure and Design

The study included 513 men and women from the population-based KORA (Cooperative Health Research in the Region of Augsburg) F4/FF4 cohort, aged 62 to 81 years, who had no distal sensory polyneuropathy at the beginning of the study. Of these individuals, 127 developed DSPN during the 6.5-year follow-up period. Serum levels of 71 biomarkers of inflammation were measured using the novel proximity extension assay technology. Levels of 26 of these 71 biomarkers were higher in people who developed polyneuropathy during the study than in people without polyneuropathy. After statistical correction for multiple testing, higher concentrations of six biomarkers remained associated with DSPN risk. Three of these proteins (MCP-3/CCL7, MIG/CXCL9, IP-10/CXCL10) were chemokines, while the other three (DNER, CD40, TNFRSF9) were soluble forms of transmembrane receptors.
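
The article does not say which correction for multiple testing was applied. As a purely illustrative sketch of how such a screening step can work, the following snippet applies a Benjamini-Hochberg false discovery rate adjustment to 71 hypothetical p-values; the values, threshold and choice of method are assumptions, not the study's.

```python
# Minimal sketch of a multiple-testing correction over a panel of biomarkers.
# The p-values are invented; the study's actual method is not reported here.
import numpy as np

rng = np.random.default_rng(0)
p_values = rng.uniform(0.0001, 0.5, size=71)  # hypothetical per-biomarker p-values

def benjamini_hochberg(p, alpha=0.05):
    """Boolean mask of p-values significant at false discovery rate alpha."""
    p = np.asarray(p)
    m = len(p)
    order = np.argsort(p)                          # indices of p sorted ascending
    thresholds = alpha * np.arange(1, m + 1) / m   # BH critical values
    passed = p[order] <= thresholds
    significant = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()            # largest rank meeting its threshold
        significant[order[:k + 1]] = True
    return significant

mask = benjamini_hochberg(p_values)
print(f"{mask.sum()} of {len(p_values)} biomarkers remain significant after correction")
```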

The chemokines showed neurotoxic effects in a cell culture model, which points to their involvement in the development of neuropathy. When the data for these six biomarkers were added to a clinical risk model, the predictive quality of the model improved significantly. Further pathway analyses indicated that different cell types of innate and adaptive immunity are likely to be involved in the development of DSPN. Overall, the study reveals novel associations between biomarkers of inflammation and the risk of polyneuropathy and provides evidence of a complex interaction of innate and adaptive immunity in the development of this complication.
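
The clinical risk model itself is not described in this article. A minimal sketch of the general idea, comparing the discrimination (area under the ROC curve) of a model with and without six added biomarkers, might look as follows; the covariates, effect sizes and modelling choices are invented for illustration on simulated data.

```python
# Sketch of comparing a clinical risk model with and without added biomarkers,
# on simulated data; the study's actual model and variables are not public here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 513                                    # cohort size reported in the article
clinical = rng.normal(size=(n, 5))         # hypothetical clinical covariates
biomarkers = rng.normal(size=(n, 6))       # stand-ins for the six biomarkers

# Simulated DSPN outcome, loosely driven by both blocks of predictors
logit = 0.3 * clinical @ rng.normal(size=5) + 0.5 * biomarkers @ rng.normal(size=6) - 1.5
dspn = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

base = cross_val_score(LogisticRegression(max_iter=1000), clinical, dspn,
                       cv=5, scoring="roc_auc").mean()
full = cross_val_score(LogisticRegression(max_iter=1000),
                       np.hstack([clinical, biomarkers]), dspn,
                       cv=5, scoring="roc_auc").mean()
print(f"AUC, clinical model: {base:.3f}; with biomarkers added: {full:.3f}")
```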

Conclusion

This study significantly improves understanding of the role of inflammatory processes in the development of distal sensory polyneuropathy in the elderly both with and without type 2 diabetes. The main findings must now be replicated in other cohorts. In addition to biochemical investigations, investigations of immune cells are also important. The long-term aim of this work is to clarify whether and how modulation of inflammatory processes can supplement the options for prevention and therapy of distal sensory polyneuropathy.

Story Source:

Materials provided by Deutsches Zentrum fuer Diabetesforschung DZD. Note: Content may be edited for style and length.

Two methods for measuring children's exposure to radio frequencies are compared

A study has quantified emissions from radio frequency sources and, by means of personal and spot measurements, has analysed the levels of exposure that children experience.

The use of radio frequency sources is very widespread, whether in mobile phones, Wi-Fi, Bluetooth, phone antennas, or radio and TV broadcasting, and there is a huge amount of information on whether these emissions fall within the recommended levels. However, information about the exposure to which individuals, and children in particular, are subject remains scant, and among experts there are opposing views about the effect these sources could have on children. In any case, “the levels to which children are exposed need to be properly analysed to be able to conduct epidemiological studies, and for this it is essential to optimize the methodology,” pointed out Mara Gallastegi-Bilbao, the author of the work, who wrote her PhD thesis at the UPV/EHU and is currently working at the Biodonostia Institute.

That is why this work had a triple aim: firstly, by means of spot and personal measurements, to study the exposure of 8-year-old children to radio frequencies; secondly, to identify the locations and sources that contribute most to the overall exposure; and finally, to examine whether assessments obtained through spot measurements could replace personal measurements. To be able to compare the two methodologies, both personal and spot measurements were made during the course of the work.

“In recent years, personal measurements have been on the increase. Measurements of this type offer personal information, since the participants carry an exposure measuring device around with them. Such measurements demand a major effort in terms of money and time,” said Gallastegi. The researcher went on to add that “there are certain disadvantages, because carrying the device around does not allow the measurements to be made properly. Exposure is reduced because of the shielding effect of the body itself, so the values obtained are not totally reliable.” Apart from that, “the behaviour of the participants varies because they are somehow aware that they are being analysed,” said the researcher. In any case, “we conducted personal measurements among a small sample of 50 children over a three-day period,” she added.

“The spot measurements, by contrast, were conducted in environments in which children spend much of their daily lives, in other words, at home, in schools and in parks. Bearing in mind the level of exposure in these locations and the time spent in them, the TWA (time-weighted average) values were calculated,” said the author of the work. A time-weighted average weights each measured level by the time spent at the corresponding location. “A study was made of 104 children using a measuring device that analyses each source separately. Measurements of this type do not call for so much effort in terms of time and are not affected by the shielding effect,” she added.
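
As an illustration of the calculation described above, here is a minimal sketch of a time-weighted average; the locations, exposure levels and hours are invented.

```python
# Minimal sketch of a time-weighted average (TWA) of exposure: each location's
# measured level is weighted by the time spent there. Values are illustrative.
locations = {
    # location: (measured level in mW/m^2, hours per day spent there)
    "home":   (0.05, 14.0),
    "school": (0.02, 6.0),
    "park":   (0.01, 4.0),
}

total_hours = sum(hours for _, hours in locations.values())
twa = sum(level * hours for level, hours in locations.values()) / total_hours
print(f"Time-weighted average exposure: {twa:.4f} mW/m^2 over {total_hours:.0f} h")
```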

According to Gallastegi, “it is the first time that the two methodologies have been compared, in other words, besides carrying out spot measurements in the locations where children spend most of their time, a set of personal measurements was also made on a smaller sample.”

“Despite the fact that the absolute values obtained by one methodology and the other do not coincide, no differences were observed when it comes to classifying the child population into one level or another in terms of exposure to sources of radio frequency,” said Gallastegi. “For the epidemiological studies we were not that interested in knowing the exact value of the exposure, but in knowing whether children are exposed to low, moderate or high levels of exposure, since inadequate classifications could lead to incorrect interpretations,” added the researcher. Therefore, “the measurements of the exposure based on spot measurements could be useful and appropriate for classifying children into low, moderate or high levels of exposure, as long as they are studied in environments in which they spend most of their daily lives, and bearing in mind that the child population studied does not yet use mobile phones on a regular basis,” concluded Mara Gallastegi-Bilbao.
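
The article does not specify how agreement between the two classification methods was assessed. One common approach for categorical classifications such as low/moderate/high is the raw agreement rate together with Cohen's kappa, sketched below with invented labels.

```python
# Sketch of checking whether two measurement methods place children in the
# same exposure category. Labels are invented; the study's actual agreement
# statistic is not specified in the article.
from collections import Counter

personal = ["low", "low", "moderate", "high", "moderate", "low", "high", "low"]
spot     = ["low", "low", "moderate", "high", "low",      "low", "high", "low"]

n = len(personal)
observed = sum(p == s for p, s in zip(personal, spot)) / n

# Chance agreement estimated from the marginal frequencies of each method
pc, sc = Counter(personal), Counter(spot)
chance = sum(pc[c] * sc[c] for c in set(pc) | set(sc)) / n**2

kappa = (observed - chance) / (1 - chance)
print(f"Observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
```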

Garlic ingredient from the lab bench

Fresh garlic extracts contain a variety of healthy organosulfur compounds, among which ajoene forms a major oil-extractable ingredient. Now, chemists in the United Kingdom have synthesized ajoene from readily available components for the first time. The results, which are published in the journal Angewandte Chemie, show that ajoene is accessible on a large scale with very few synthetic steps. Chemical synthesis of biologically active compounds is important for their further evaluation in medicinal research.

If garlic is cut or chewed, enzymes present in the damaged tissue start to degrade its main organosulfur metabolite, alliin. The first degradation product is allicin, which gives fresh garlic preparations their characteristic pungent odor. However, this molecule decomposes further into various, largely oil-soluble compounds, all characterized chemically as organosulfides or disulfides. A more stable decomposition product and main component in oil extracts is ajoene. This compound has similar health-promoting effects to allicin and it exhibits anticancer activity.

Although ajoene can be isolated from garlic extracts, chemical synthesis would have many advantages. Synthesized ajoene would allow the introduction of chemical modifications, a key requirement in drug research. Therefore, Thomas Wirth and his group at Cardiff University, in collaboration with the Welsh company Neem Biotech in the United Kingdom, have now developed a fully synthetic approach based on simple, readily available components. The sequence starts with a simple dibromide and terminates with the oxidation of an organoselenium compound. Oxidative elimination of the selenium compound, the scientists noted, leads to the formation of the terminal carbon-carbon double bond characteristic of the ajoene molecule. At the same time, its sulfide moiety is oxidized to a sulfoxide, another characteristic chemical function in ajoene.

The biggest challenge in ajoene synthesis was minimizing the various side reactions typical for organosulfur compounds, Wirth and his team reported. Such side reactions profoundly decreased the yield in the biomimetic approach to ajoene, which started from allicin. But low yields turned out to be a problem in total synthesis as well. Therefore, the scientists explored several modifications in the reaction steps, but the most profound improvement, unexpectedly, came from scaling up the synthesis. On the 200-gram scale, the final oxidation yielded 56 percent of the product, the authors reported, which was twice as much as when working on the milligram scale.

The product was biologically active. Testing its activity against bacteria in a bioassay, Wirth and his group found that synthetic ajoene performed similarly to or even better than natural ajoene extracted from garlic. It inhibited biological communication called quorum sensing in Gram-negative bacteria, which may lead to biofilm formation. Inhibiting this could be a promising usage of ajoene, the authors suggested. And as total synthesis has now made this compound more easily accessible, its career in medicinal chemistry may be ready to take off.

Story Source:

Materials provided by Wiley. Note: Content may be edited for style and length.

Prospect of a new treatment for rheumatoid arthritis

An international research group led by Charité — Universitätsmedizin Berlin has completed testing a new drug to treat rheumatoid arthritis. The drug is effective in patients with moderate to severe forms of the disease who have shown an inadequate response to conventional disease-modifying drugs. Results from this research have been published in The Lancet.

Rheumatoid arthritis is a painful inflammatory condition affecting the joints and tendons, which is typically characterized by periods of increased disease activity. Prof. Dr. Gerd-Rüdiger Burmester, Head of Charité’s Medical Department, Division of Rheumatology and Clinical Immunology, conducted a study to assess the efficacy of upadacitinib in patients with an inadequate response to ‘conventional synthetic disease-modifying antirheumatic drugs’. Upadacitinib is a selective inhibitor of the enzyme Janus kinase 1 (JAK1) and has been shown to be efficacious in this patient group in earlier phase II clinical trials. By inhibiting JAK1, upadacitinib disrupts an important signaling pathway that is responsible for triggering inflammatory responses.

In the phase III study presented, patients treated with upadacitinib showed significant improvements in joint swelling when compared to patients receiving placebo. Patients also experienced less pain and showed improvements in joint function. Prof. Burmester is very pleased to see this new tablet-based treatment produce such significant improvements in clinical symptoms. “Our results prove that JAK inhibitors represent an effective treatment alternative in patients with long-term conditions who do not respond adequately to conventional drugs, and in those for whom biologics are not a good treatment option. JAK inhibitors could help these patients achieve a quick response to treatment, allowing them to gain control over their illness. The trial sponsor AbbVie is currently in the process of collating all trial results and submitting them to the European and US regulatory authorities for review.”

Story Source:

Materials provided by Charité – Universitätsmedizin Berlin. Note: Content may be edited for style and length.

Increased phosphate intake elevates blood pressure in healthy adults

If more phosphate is consumed with food, blood pressure and pulse rate increase in healthy young adults. This was shown by a study led by the University of Basel and published in the Journal of the American Society of Nephrology.

They make processed cheese spreadable, prevent coffee from clumping and help preserve many meat products: phosphates are a common additive in industrially produced foodstuffs.

Natural foods also contain phosphates, but modern eating habits mean that we are increasing our intake of them. Increased consumption of processed foodstuffs has significantly increased phosphate intake in recent years, which now often exceeds the daily intake of 700 mg recommended in the US.

Healthy people also at risk

As a high phosphate level can lead, for example, to deposits in blood vessels, a low-phosphate diet has long been recommended for people with chronic kidney problems.

However, an increase in dietary phosphate also increases the likelihood of developing or even dying from arteriosclerosis or a cardiovascular disease in healthy people. This has been shown by epidemiological studies that examine the connection between potential risk factors and certain diseases.

Physiological study with young adults

For the first time, a research team led by Professor Reto Krapf from the University of Basel has now verified this statistical connection in an interventional study with 20 healthy test subjects.

Over 11 weeks, half of the participants received an additional dose of sodium phosphate in tablet form alongside their normal diet. This increased the phosphate content in their blood to an above-average level, albeit one that is widespread in the population.

The second group took a phosphate binder that inhibits the substance’s uptake in the body. They also received salt in the form of sodium chloride to match the first group’s sodium intake.

Effect on blood pressure and pulse rate

After six weeks, the doctors examined the effects of the different diets on various cardiovascular indicators such as blood pressure and pulse. A comparison of the two groups showed that the increased phosphate intake significantly increased the systolic and diastolic blood pressure of healthy young adults — by 4.1 and 3.2 mmHg, respectively. At the same time, pulse rate increased by an average of four beats per minute.
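
The article does not state which statistical test was used. A typical way to compare such endpoints between two small groups is an independent two-sample t-test, sketched here on simulated systolic readings; the baseline mean, spread and group sizes are assumptions.

```python
# Sketch of comparing systolic blood pressure between the phosphate and
# control groups with a two-sample t-test, using simulated values; the
# study's actual test and raw data are not reported in this article.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control   = rng.normal(118.0, 8.0, size=10)        # hypothetical mmHg readings
phosphate = rng.normal(118.0 + 4.1, 8.0, size=10)  # mean shifted by the reported 4.1 mmHg

t_stat, p_value = stats.ttest_ind(phosphate, control)
print(f"Mean difference: {phosphate.mean() - control.mean():+.1f} mmHg, p = {p_value:.3f}")
```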

The researchers show that increased phosphate intake, more specifically an increased serum phosphate level, activates the sympathetic nervous system, which accelerates cardiac activity and increases blood pressure. The study demonstrated the effect to be reversible: two months after the end of the study, the participants’ levels had returned to normal.

No effect of vitamin D

Vitamin D is increasingly prescribed for various reasons. It stimulates intestinal phosphate absorption, thus further increasing the phosphate load, but it also has putative cardioprotective effects. Therefore, in the second phase of the study, the effect of an additional vitamin D supplement was examined. However, no measurable influence on the cardiovascular values was found in either group.

“Our results provide an important explanation for the association of dietary phosphate intake with increased cardiovascular morbidity and mortality in the general population,” says study leader Reto Krapf. “These conclusions are important for public health and should be further examined in larger studies in various population groups.”

The study was supported by the National Center of Competence in Research (NCCR) Kidney Control of Homeostasis.

Story Source:

Materials provided by University of Basel. Note: Content may be edited for style and length.

How sleep loss may contribute to adverse weight gain

In a new study, researchers at Uppsala University demonstrate that one night of sleep loss has a tissue-specific impact on the regulation of gene expression and metabolism in humans. This may explain how shift work and chronic sleep loss impair our metabolism and adversely affect our body composition. The study is published in the scientific journal Science Advances.

Epidemiological studies have shown that the risk for obesity and type 2 diabetes is elevated in those who suffer from chronic sleep loss or who carry out shift work. Other studies have shown an association between disrupted sleep and adverse weight gain, in which fat accumulation is increased at the same time as muscle mass is reduced — a combination that in and of itself has been associated with numerous adverse health consequences. Researchers from Uppsala and other groups have in earlier studies shown that metabolic functions regulated by, for example, skeletal muscle and adipose tissue are adversely affected by disrupted sleep and circadian rhythms. However, until now it has remained unknown whether sleep loss per se can cause molecular changes at the tissue level that can confer an increased risk of adverse weight gain.

In the new study, the researchers studied 15 healthy normal-weight individuals who participated in two in-lab sessions in which activity and meal patterns were highly standardised. In randomised order, the participants slept a normal night of sleep (over eight hours) during one session, and were instead kept awake the entire night during the other session. The morning after each night-time intervention, small tissue samples (biopsies) were taken from the participants’ subcutaneous fat and skeletal muscle. These two tissues often exhibit disrupted metabolism in conditions such as obesity and diabetes. At the same time in the morning, blood samples were also taken to enable a comparison across tissue compartments of a number of metabolites. These metabolites comprise sugar molecules, as well as different fatty and amino acids.

The tissue samples were used for multiple molecular analyses, which first of all revealed that the sleep loss condition resulted in a tissue-specific change in DNA methylation, a mechanism that regulates gene expression. DNA methylation is a so-called epigenetic modification that is involved in regulating how the genes of each cell in the body are turned on or off, and it is impacted by hereditary as well as environmental factors, such as physical exercise.
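
The exact analysis pipeline is not described in this article. Conceptually, because each participant provided samples after both conditions, the comparison is a paired one at every methylation site, followed by some form of multiple-testing control. A toy sketch on simulated data:

```python
# Toy sketch of a within-subject (paired) comparison of DNA methylation
# between the sleep and sleep-loss conditions, per methylation site.
# All values are simulated; the study's real pipeline is more elaborate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_subjects, n_sites = 15, 1000          # 15 participants, toy number of CpG sites
sleep = rng.beta(2, 5, size=(n_subjects, n_sites))  # methylation fractions in 0..1
wake = sleep + rng.normal(0, 0.02, size=sleep.shape)
wake[:, :50] += 0.05                    # pretend the first 50 sites truly shift
wake = np.clip(wake, 0, 1)

t_stat, p = stats.ttest_rel(wake, sleep, axis=0)  # paired test at every site
print(f"{(p < 0.05 / n_sites).sum()} sites pass a Bonferroni threshold of 0.05/{n_sites}")
```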

“Our research group were the first to demonstrate that acute sleep loss in and of itself results in epigenetic changes in the so-called clock genes that within each tissue regulate its circadian rhythm. Our new findings indicate that sleep loss causes tissue-specific changes to the degree of DNA methylation in genes spread throughout the human genome. Our parallel analysis of both muscle and adipose tissue further enabled us to reveal that DNA methylation is not regulated similarly in these tissues in response to acute sleep loss,” says Jonathan Cedernaes who led the study.

“It is interesting that we saw changes in DNA methylation only in adipose tissue, and specifically for genes that have also been shown to be altered at the DNA methylation level in metabolic conditions such as obesity and type 2 diabetes. Epigenetic modifications are thought to be able to confer a sort of metabolic ‘memory’ that can regulate how metabolic programmes operate over longer time periods. We therefore think that the changes we have observed in our new study can constitute another piece of the puzzle of how chronic disruption of sleep and circadian rhythms may impact the risk of developing, for example, obesity,” notes Jonathan Cedernaes.

Further analyses of, for example, gene and protein expression demonstrated that the response to wakefulness differed between skeletal muscle and adipose tissue. The researchers say that the period of wakefulness simulates the overnight wakefulness of many shift workers assigned to night work. A possible explanation for why the two tissues respond in the observed manner could be that overnight wakefulness exerts a tissue-specific effect on each tissue’s circadian rhythm, resulting in misalignment between these rhythms. This is something that the researchers found preliminary support for in this study, as well as in an earlier, similar but smaller study.

“In the present study we observed molecular signatures of increased inflammation across tissues in response to sleep loss. However, we also saw specific molecular signatures that indicate that the adipose tissue is attempting to increase its capacity to store fat following sleep loss, whereas we instead observed signs indicating concomitant breakdown of skeletal muscle proteins in the skeletal muscle, in what’s also known as catabolism. We also noted changes in skeletal muscle levels of proteins involved in handling blood glucose, and this could help explain why the participants’ glucose sensitivity was impaired following sleep loss. Taken together, these observations may provide at least partial mechanistic insight as to why chronic sleep loss and shift work can increase the risk of adverse weight gain as well as the risk of type 2 diabetes,” says Jonathan Cedernaes.

The researchers have only studied the effect of one night of sleep loss, and therefore do not know how other forms of sleep disruption or circadian misalignment would have affected the participants’ tissue metabolism.

“It will be interesting to investigate to what extent one or more nights of recovery sleep can normalise the metabolic changes that we observe at the tissue level as a result of sleep loss. Diet and exercise are factors that can also alter DNA methylation, and these factors can thus possibly be used to counteract adverse metabolic effects of sleep loss,” says Jonathan Cedernaes.

Story Source:

Materials provided by Uppsala University. Note: Content may be edited for style and length.

Rapid development in Central Africa increases the risk of infectious disease outbreaks

The Central Africa region is experiencing rapid urbanization, economic growth and infrastructure development. These changes, while generally positive and welcome, also make the region more vulnerable to explosive infectious disease outbreaks, according to an international group of scientists. Writing in the New England Journal of Medicine, the authors, all of whom have field research experience in the region, note that efforts to build up the health care infrastructure in Central Africa are critically needed to mitigate or prevent a large outbreak of Ebola or other infectious disease in the region. The authors represent 12 different organizations, including the National Institute of Allergy and Infectious Diseases, part of the National Institutes of Health.

Citing the example of the 2013-2016 Ebola outbreak in West Africa, they note that Liberia, Sierra Leone and Guinea all have large, urban and mobile populations. Among other factors, this enabled the Ebola virus to quickly spread through these countries and overwhelm their limited health care infrastructures, resulting in more than 28,000 cases of Ebola virus disease and 11,000 fatalities.

Through their Central Africa field work over several years — primarily in the Republic of the Congo and the Democratic Republic of the Congo (DRC) — the researchers have observed what they describe as the world’s fastest rate of urbanization. By 2030, they write, half of the Central Africa population is expected to live in urban areas. They have seen once-rutted jeep trails that were used to access remote villages evolve into paved roads, typically in connection with the growth of logging, mining and hydroelectric industries. Road construction and similar disturbances of the jungle terrain alter ecosystems in which pathogens and their hosts reside, they note. This increases the opportunity for new infectious diseases to emerge, and it reduces the time it takes people to travel to and from urban areas, allowing outbreaks to spread quickly.

“Clearly, Central Africa is rapidly approaching a tipping point,” the authors state. “Africa’s economic development is a positive change that cannot and should not be stopped. At the same time, rapid economic and demographic transitions bring the challenges of emerging infectious disease outbreaks of increased frequency, size, and global impact.”

They believe that increases in population, income and educational attainment could spur demand for improved services, including health care. Moreover, directed investments in clinical research infrastructure could include training health care workers to identify, report and properly handle cases of unknown emerging infectious disease; diagnose patients; provide clinical care; and test new vaccines and therapeutics.

“Directed and sustained investment is urgently needed, before ongoing demographic and economic changes conspire to cause major outbreaks of both national and international consequence,” they write.

Story Source:

Materials provided by NIH/National Institute of Allergy and Infectious Diseases. Note: Content may be edited for style and length.

New research presents alternative methods, like robo-advisors, to manage retirement income

The need to help retirees make prudent spending decisions has led to the growth of a large industry of financial advisors, but a new article suggests that improved policy approaches may be more effective. Published in Policy Insights from the Behavioral and Brain Sciences, the study reviews the psychology behind retirement spending decisions and presents five policy options that lead to smarter self-management of assets.

Decisions regarding decumulation, or the spending of savings during retirement, are often greatly affected by psychological factors, involve trade-offs between current and future income, and have long-term implications. Impulse, loss aversion, and the desire for ownership are all psychological predictors that can lead retirees to manage their assets poorly and spend savings too rapidly or too slowly.

Researchers from the University of California, Los Angeles and City, University of London built upon existing decumulation research in order to offer insight to public policy experts, financial industry regulators, and social program administrators. They proposed the following policy options to optimize retirement income solutions:

1. Financial Literacy: Financial training programs offered before complicated or risky investment decisions, similar to the training required to obtain a driver’s license.

2. Safe Automatic Options: A low-cost option that protects a proportion of an individual’s total wealth while still allowing the retiree discretion over the remainder. Such an option could be built into an employee’s retirement benefit program.

3. Precommitment: If individuals are at risk of future cognitive decline as a result of dementia or another similar disorder, commitment to financial decisions during younger working years could be especially useful. Options could include payments made over time.

4. Disclosures and Framing: Changes in language can also affect decumulation decisions. For example, reframing continued employment as an investment in future social security income may cause those considering retirement to feel more inclined to wait.

5. Customized Interventions: A financial robo-advisor that would perform an assessment of a retiree’s key psychological drivers, biases, and inclinations before leading them through personalized solutions. Such interventions would provide policy makers with more insight into individual biases, preferences, and problem-solving techniques, allowing them to work in tandem with individuals, rather than in opposition.

“Baby boomers are now retiring at the rate of almost 10,000 per day,” wrote authors Suzanne and Stephen Shu. “These millions of retirees, and the families and providers who look out for their financial well-being, are counting on a smarter approach to decumulation.”

Story Source:

Materials provided by SAGE. Note: Content may be edited for style and length.

Landslides triggered by human activity on the rise

More than 50,000 people were killed by landslides around the world between 2004 and 2016, according to a new study by researchers at the UK’s University of Sheffield. The team, who compiled data on over 4,800 fatal landslides during the 13-year period, also revealed for the first time that landslides resulting from human activity have increased over time. The research is published today in the European Geosciences Union journal Natural Hazards and Earth System Sciences.

The team found that over 700 fatal landslides that occurred between 2004 and 2016 had a human fingerprint. Construction works, legal and illegal mining, as well as the unregulated cutting of hills (carving out land on a slope) caused most of the human-induced landslides.

“We were aware that humans are placing increasing pressure on their local environment, but it was surprising to find clear trends within the database that fatal landslides triggered by construction, illegal hillcutting and illegal mining were increasing globally during the period 2004 to 2016,” says Melanie Froude, a postdoctoral researcher at Sheffield’s Department of Geography and lead author of the study.

While the trend is global, Asia is the most affected continent: “All countries in the top 10 for fatal landslides triggered by human activity are located in Asia,” says Froude. The number one country is India, which accounts for 20% of these events. It is also the country where human-triggered fatal landslides are increasing at the highest rate, followed by Pakistan, Myanmar and the Philippines.

Dave Petley, a professor and Vice-President for Research and Innovation at the University of Sheffield, started collecting data on fatal landslides after realising that many databases on natural disasters were “significantly underestimating the extent of landslide impact.” While earthquakes and storms are deadlier, landslides do cause a significant number of fatalities.

The researchers identified a total of 4,800 fatal landslides, excluding those triggered by earthquakes, that occurred around the world between 2004 and 2016 and caused a total of about 56,000 deaths. The most tragic event identified by the researchers was the Kedarnath landslide of June 2013 in India, which resulted in over 5,000 deaths. It was caused by extreme weather conditions that led to flash floods and massive mudflows, which struck thousands of religious pilgrims trapped in a mountain area.

Since 2004, Petley has painstakingly collected data on fatal landslides from online English-language media reports. To confirm that the reports were accurate, Petley — and more recently Froude, who reviewed all landslide accounts — checked each report whenever possible against government and aid agency articles, academic studies or personal communication. Details about the landslides, such as location, impacts and cause, were added to their Global Fatal Landslide Database.

“Collecting these reports and organising them into a database shows us where landslides are frequently harming people, what causes these landslides and whether there are patterns in fatal landslide occurrence over time. The database provides us with an overview of the impact of landslides on society,” explains Petley.

Aside from Asia, where 75% of landslides in the database occurred, the areas most affected are Central and South America, the Caribbean islands, and East Africa. In Europe, the Alps are the region with the most fatal landslides.

Consistent with past studies, the researchers also found that 79% of landslides in their database were triggered by rainfall. Most events happen during the northern hemisphere summer, when cyclones, hurricanes and typhoons are more frequent and the monsoon season brings heavy rains to parts of Asia.
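
To illustrate how a database of this kind supports such summaries, the following minimal pandas sketch tallies the share of rainfall-triggered events and sums deaths by month over a handful of invented records; it is not the Global Fatal Landslide Database itself.

```python
# Minimal sketch of summarising a landslide database by trigger and by month,
# in the spirit of the Global Fatal Landslide Database analysis. The records
# below are invented for illustration only.
import pandas as pd

records = pd.DataFrame({
    "date":    pd.to_datetime(["2010-07-02", "2011-08-15", "2013-06-16",
                               "2014-07-30", "2015-12-13", "2016-06-21"]),
    "trigger": ["rainfall", "rainfall", "rainfall",
                "rainfall", "construction", "mining"],
    "deaths":  [12, 30, 5000, 151, 8, 11],
})

share_rainfall = (records["trigger"] == "rainfall").mean()
print(f"Rainfall-triggered share: {share_rainfall:.0%}")

by_month = records.groupby(records["date"].dt.month)["deaths"].sum()
print("Deaths by month of year:")
print(by_month)
```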

The Natural Hazards and Earth System Sciences study highlights that fatal landslides are more common in settlements, along roads, and at sites rich in precious resources. They occur more frequently in poor countries and affect poor people disproportionately, the researchers say.

In the Himalayan mountain region, especially in Nepal and India, many of the fatal landslides triggered by construction occurred on road construction sites in rural areas, while in China many happened in urban building sites. “The prevalence of landslides in these settings suggests that regulations to protect workers and the public are insufficient or are not being sufficiently enforced. In the case of roads, maintaining safety during construction is difficult when it is economically unviable to completely shut roads because alternative routes would involve substantial detours of more than 100 miles,” says Froude.

Landslides triggered by hillcutting are mostly a problem in rural areas, where many people illegally collect material from hillslopes to build their houses. “We found several incidents of children being caught up in slides triggered as they collected coloured clay from hillslopes for the decoration of houses during religious festivals in Nepal. Educating communities who undertake this practice on how to do it safely will save lives,” Froude says.

“With appropriate regulation to guide engineering design, education and enforcement of regulation by specialist inspectors, landslides triggered by construction, mining and hillcutting are entirely preventable,” Froude emphasises. “The study highlights that we need to refocus our efforts globally on preventable slope accidents,” concludes Petley.

New research proposes using local data in resolving malnutrition

Kwashiorkor, one of the most extreme forms of malnutrition, is estimated to affect more than a hundred thousand children annually. The condition can make a starving child look healthy to the untrained eye, which makes it difficult to study and track. As a result, it has largely been overlooked by the scientific community. Researchers have recently attempted to increase its recognition by conducting a global study of more than 1.7 million children, but a new study published in the Food and Nutrition Bulletin reveals that kwashiorkor may be a local phenomenon that is underestimated by national statistics.

Out today, the study concludes that analyzing data only on a large, global scale carries serious risks:

  1. Health crises like kwashiorkor in particular villages may go undetected
  2. The real effects of health interventions may be underestimated
  3. Badly targeted interventions may lead to poor coverage, efficacy, and cost-efficiency
  4. Researchers may miss useful insights into the root causes of a disease like kwashiorkor

Nutrition researchers from Tufts University, Harvard University, and St. John’s Research Institute conducted a comprehensive survey of a geographic area including more than 1,300 children aged one to five years in the Democratic Republic of the Congo (DRC). A previous global study had suggested that 33 percent of malnutrition cases in the DRC were of the kwashiorkor variety, but that the proportion was higher in some provinces than in others. Hoping to better understand this dynamic, the researchers comprehensively surveyed 19 neighboring villages to determine the prevalence of the disease at the local level.

The survey found that rates of kwashiorkor varied from 0 to 14.9 percent across these villages, the latter figure indicating extreme nutritional stress within specific communities. The difference between areas that otherwise appeared statistically identical was extreme: one group, or “cluster,” of five adjacent villages had no cases of kwashiorkor, while in a neighboring cluster of five villages, 9.5% of children had the condition. By interviewing health service staff in the region and reviewing the children’s nutritional histories, the researchers were able to confirm that these numbers reflect a long-term pattern.
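
As a sketch of the kind of village- and cluster-level tabulation involved, here is a minimal example with invented counts chosen only to echo the reported pattern:

```python
# Sketch of computing kwashiorkor prevalence per village and per cluster of
# adjacent villages. The counts are invented; the survey reported village
# prevalences ranging from 0 to 14.9 percent.
from collections import defaultdict

surveys = [
    # (cluster, village, children surveyed, kwashiorkor cases)
    ("A", "v1", 70, 0), ("A", "v2", 65, 0), ("A", "v3", 80, 0),
    ("B", "v4", 72, 7), ("B", "v5", 68, 6), ("B", "v6", 75, 7),
]

totals = defaultdict(lambda: [0, 0])   # cluster -> [children, cases]
for cluster, village, n, cases in surveys:
    print(f"{village} (cluster {cluster}): {100 * cases / n:.1f}%")
    totals[cluster][0] += n
    totals[cluster][1] += cases

for cluster, (n, cases) in totals.items():
    print(f"cluster {cluster}: {100 * cases / n:.1f}% of {n} children")
```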

“Understanding that this clustering effect exists, at least in some regions, provides an opportunity to increase the effectiveness of treatment through better targeting in those regions and to explore potential risk factors for kwashiorkor,” write the researchers.

Story Source:

Materials provided by SAGE. Note: Content may be edited for style and length.