SwePub
Search the SwePub database

  Extended search

Hit list for the search "WFRF:(Arendt A.)"

Search: WFRF:(Arendt A.)

  • Result 1-25 of 66
1.
  • Pulit, S. L., et al. (author)
  • Atrial fibrillation genetic risk differentiates cardioembolic stroke from other stroke subtypes
  • 2018
  • In: Neurology-Genetics. - : Ovid Technologies (Wolters Kluwer Health). - 2376-7839. ; 4:6
  • Journal article (peer-reviewed) abstract:
    • Objective: We sought to assess whether genetic risk factors for atrial fibrillation (AF) can explain cardioembolic stroke risk. We evaluated genetic correlations between a previous genetic study of AF and AF in the presence of cardioembolic stroke using genome-wide genotypes from the Stroke Genetics Network (N = 3,190 AF cases, 3,000 cardioembolic stroke cases, and 28,026 referents). We tested whether a previously validated AF polygenic risk score (PRS) associated with cardioembolic and other stroke subtypes after accounting for AF clinical risk factors. We observed a strong correlation between previously reported genetic risk for AF, AF in the presence of stroke, and cardioembolic stroke (Pearson r = 0.77 and 0.76, respectively, across SNPs with p < 4.4 × 10⁻⁴ in the previous AF meta-analysis). An AF PRS, adjusted for clinical AF risk factors, was associated with cardioembolic stroke (odds ratio [OR] per SD = 1.40, p = 1.45 × 10⁻⁴⁸), explaining ~20% of the heritable component of cardioembolic stroke risk. The AF PRS was also associated with stroke of undetermined cause (OR per SD = 1.07, p = 0.004), but not with other primary stroke subtypes (all p > 0.1). Genetic risk of AF is associated with cardioembolic stroke, independent of clinical risk factors. Studies are warranted to determine whether AF genetic risk can serve as a biomarker for strokes caused by AF. (See the illustrative PRS sketch after this record.)
  •  
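
The polygenic risk score (PRS) in the record above is, at bottom, a weighted sum of risk-allele dosages that is standardized and then tested for association with an outcome. The Python sketch below illustrates only that generic construction; the SNP set, weights, sample size, and downstream model are illustrative assumptions, not material from the paper.

    # Minimal sketch of a generic polygenic risk score: weighted sum of allele dosages.
    # All numbers here are simulated placeholders, not data from the study above.
    import numpy as np

    rng = np.random.default_rng(0)
    n_individuals, n_snps = 1_000, 500

    dosages = rng.integers(0, 3, size=(n_individuals, n_snps)).astype(float)  # 0/1/2 risk alleles
    weights = rng.normal(0.0, 0.05, size=n_snps)  # per-SNP log-odds, e.g. from a prior AF GWAS

    prs = dosages @ weights                     # raw score for each individual
    prs_sd = (prs - prs.mean()) / prs.std()     # standardize so effects read "per SD of PRS"

    # In a study like the one above, prs_sd would enter a logistic regression for
    # cardioembolic stroke together with clinical AF risk factors; the reported
    # odds ratio per SD is then exp(beta) for this standardized score.
    print(prs_sd[:5])
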
2.
  • Blunden, Jessica, et al. (author)
  • State of the Climate in 2012
  • 2013
  • In: Bulletin of The American Meteorological Society - (BAMS). - 0003-0007 .- 1520-0477. ; 94:8, s. S1-S258
  • Journal article (peer-reviewed) abstract:
    • For the first time in several years, the El Niño-Southern Oscillation did not dominate regional climate conditions around the globe. A weak La Niña dissipated to ENSO-neutral conditions by spring, and while El Niño appeared to be emerging during summer, this phase never fully developed, as sea surface temperatures in the eastern equatorial Pacific returned to near-neutral conditions. Nevertheless, other large-scale climate patterns and extreme weather events impacted various regions during the year. A negative phase of the Arctic Oscillation from mid-January to early February contributed to frigid conditions in parts of northern Africa, eastern Europe, and western Asia. A lack of rain during the 2012 wet season led to the worst drought in at least the past three decades for northeastern Brazil. Central North America also experienced one of its most severe droughts on record. The Caribbean observed a very wet dry season and it was the Sahel's wettest rainy season in 50 years. Overall, the 2012 average temperature across global land and ocean surfaces ranked among the 10 warmest years on record. The global land surface temperature alone was also among the 10 warmest on record. In the upper atmosphere, the average stratospheric temperature was record or near-record cold, depending on the dataset. After a 30-year warming trend from 1970 to 1999 for global sea surface temperatures, the period 2000-12 had little further trend. This may be linked to the prevalence of La Niña-like conditions during the 21st century. Heat content in the upper 700 m of the ocean remained near record high levels in 2012. Net increases from 2011 to 2012 were observed at 700 m to 2000 m depth and even in the abyssal ocean below. Following sharp decreases linked to the effects of La Niña, sea levels rebounded to reach record highs in 2012. The increased hydrological cycle seen in recent years continued, with more evaporation in drier locations and more precipitation in rainy areas. In a pattern that has held since 2004, salty areas of the ocean surfaces and subsurfaces were anomalously salty on average, while fresher areas were anomalously fresh. Global tropical cyclone activity during 2012 was near average, with a total of 84 storms compared with the 1981-2010 average of 89. Similar to 2010 and 2011, the North Atlantic was the only hurricane basin that experienced above-normal activity. In this basin, Sandy brought devastation to Cuba and parts of the eastern North American seaboard. All other basins experienced either near- or below-normal tropical cyclone activity. Only three tropical cyclones reached Category 5 intensity, all in the western North Pacific; Bopha became the only storm in the historical record to produce winds greater than 130 kt south of 7°N. It was also the costliest storm to affect the Philippines and killed more than 1000 residents. Minimum Arctic sea ice extent in September and Northern Hemisphere snow cover extent in June both reached new record lows. June snow cover extent is now declining at a faster rate (-17.6% per decade) than September sea ice extent (-13.0% per decade). Permafrost temperatures reached record high values in northernmost Alaska. A new melt extent record occurred on 11-12 July on the Greenland ice sheet; 97% of the ice sheet showed some form of melt, four times greater than the average melt for this time of year. The climate in Antarctica was relatively stable overall. The largest maximum sea ice extent since records began in 1978 was observed in September 2012.
In the stratosphere, warm air led to the second smallest ozone hole in the past two decades. Even so, the springtime ozone layer above Antarctica likely will not return to its early 1980s state until about 2060. Following a slight decline associated with the global financial crisis, global CO2 emissions from fossil fuel combustion and cement production reached a record 9.5 ± 0.5 Pg C in 2011, and a new record of 9.7 ± 0.5 Pg C is estimated for 2012. Atmospheric CO2 concentrations increased by 2.1 ppm in 2012, to 392.6 ppm. In spring 2012, the CO2 concentration exceeded 400 ppm at 7 of the 13 Arctic observation sites. Globally, other greenhouse gases including methane and nitrous oxide also continued to rise in concentration, and the combined effect now represents a 32% increase in radiative forcing over a 1990 baseline. Concentrations of most ozone-depleting substances continued to fall.
  •  
3.
  •  
4.
  •  
5.
  •  
6.
  •  
7.
  • Pfeffer, W. Tad, et al. (author)
  • The Randolph Glacier Inventory : a globally complete inventory of glaciers
  • 2014
  • In: Journal of Glaciology. - 0022-1430 .- 1727-5652. ; 60:221, s. 537-552
  • Journal article (peer-reviewed) abstract:
    • The Randolph Glacier Inventory (RGI) is a globally complete collection of digital outlines of glaciers, excluding the ice sheets, developed to meet the needs of the Fifth Assessment of the Intergovernmental Panel on Climate Change for estimates of past and future mass balance. The RGI was created with limited resources in a short period. Priority was given to completeness of coverage, but a limited, uniform set of attributes is attached to each of the ~198 000 glaciers in its latest version, 3.2. Satellite imagery from 1999-2010 provided most of the outlines. Their total extent is estimated as 726 800 ± 34 000 km². The uncertainty, about ±5%, is derived from careful single-glacier and basin-scale uncertainty estimates and comparisons with inventories that were not sources for the RGI. The main contributors to uncertainty are probably misinterpretation of seasonal snow cover and debris cover. These errors appear not to be normally distributed, and quantifying them reliably is an unsolved problem. Combined with digital elevation models, the RGI glacier outlines yield hypsometries that can be combined with atmospheric data or model outputs for analysis of the impacts of climatic change on glaciers. The RGI has already proved its value in the generation of significantly improved aggregate estimates of glacier mass changes and total volume, and thus actual and potential contributions to sea-level rise.
  •  
8.
  • Windhorst, Rogier A., et al. (author)
  • JWST PEARLS. Prime extragalactic areas for reionization and lensing science : project overview and first results
  • 2023
  • In: Astronomical Journal. - : Institute of Physics (IOP). - 0004-6256 .- 1538-3881. ; 165:1
  • Journal article (peer-reviewed) abstract:
    • We give an overview and describe the rationale, methods, and first results from NIRCam images of the JWST “Prime Extragalactic Areas for Reionization and Lensing Science” (PEARLS) project. PEARLS uses up to eight NIRCam filters to survey several prime extragalactic survey areas: two fields at the North Ecliptic Pole (NEP); seven gravitationally lensing clusters; two high redshift protoclusters; and the iconic backlit VV 191 galaxy system to map its dust attenuation. PEARLS also includes NIRISS spectra for one of the NEP fields and NIRSpec spectra of two high-redshift quasars. The main goal of PEARLS is to study the epoch of galaxy assembly, active galactic nucleus (AGN) growth, and First Light. Five fields—the JWST NEP Time-Domain Field (TDF), IRAC Dark Field, and three lensing clusters—will be observed in up to four epochs over a year. The cadence and sensitivity of the imaging data are ideally suited to find faint variable objects such as weak AGN, high-redshift supernovae, and cluster caustic transits. Both NEP fields have sightlines through our Galaxy, providing significant numbers of very faint brown dwarfs whose proper motions can be studied. Observations from the first spoke in the NEP TDF are public. This paper presents our first PEARLS observations, their NIRCam data reduction and analysis, our first object catalogs, the 0.9–4.5 μm galaxy counts and Integrated Galaxy Light. We assess the JWST sky brightness in 13 NIRCam filters, yielding our first constraints to diffuse light at 0.9–4.5 μm. PEARLS is designed to be of lasting benefit to the community.
  •  
9.
  • Aaron-Morrison, Arlene P., et al. (author)
  • State of the climate in 2014
  • 2015
  • In: Bulletin of the American Meteorological Society. - : American Meteorological Society. - 0003-0007 .- 1520-0477. ; 96
  • Journal article (peer-reviewed) abstract:
    • Most of the dozens of essential climate variables monitored each year in this report continued to follow their long-term trends in 2014, with several setting new records. Carbon dioxide, methane, and nitrous oxide, the major greenhouse gases released into Earth's atmosphere, once again all reached record high average atmospheric concentrations for the year. Carbon dioxide increased by 1.9 ppm to reach a globally averaged value of 397.2 ppm for 2014. Altogether, 5 major and 15 minor greenhouse gases contributed 2.94 W m⁻² of direct radiative forcing, which is 36% greater than their contributions just a quarter century ago. Accompanying the record-high greenhouse gas concentrations was nominally the highest annual global surface temperature in at least 135 years of modern record keeping, according to four independent observational analyses. The warmth was distributed widely around the globe's land areas: Europe observed its warmest year on record by a large margin, with close to two dozen countries breaking their previous national temperature records; many countries in Asia had annual temperatures among their 10 warmest on record; Africa reported above-average temperatures across most of the continent throughout 2014; Australia saw its third warmest year on record, following record heat there in 2013; Mexico had its warmest year on record; and Argentina and Uruguay each had their second warmest year on record. Eastern North America was the only major region to observe a below-average annual temperature. But it was the oceans that drove the record global surface temperature in 2014. Although 2014 was largely ENSO-neutral, the globally averaged sea surface temperature (SST) was the highest on record. The warmth was particularly notable in the North Pacific Ocean, where SST anomalies signaled a transition from a negative to a positive phase of the Pacific decadal oscillation. In the winter of 2013/14, unusually warm water in the northeast Pacific was associated with elevated ocean heat content anomalies and elevated sea level in the region. Globally, upper ocean heat content was record high for the year, reflecting the continued increase of thermal energy in the oceans, which absorb over 90% of Earth's excess heat from greenhouse gas forcing. Owing to both ocean warming and land ice melt contributions, global mean sea level in 2014 was also record high and 67 mm greater than the 1993 annual mean, when satellite altimetry measurements began. Sea surface salinity trends over the past decade indicate that salty regions grew saltier while fresh regions became fresher, suggestive of an increased hydrological cycle over the ocean expected with global warming. As in previous years, these patterns are reflected in 2014 subsurface salinity anomalies as well. With a now decade-long trans-basin instrument array along 26°N, the Atlantic meridional overturning circulation shows a decrease in transport of 4.2 ± 2.5 Sv decade⁻¹. Precipitation was quite variable across the globe. On balance, precipitation over the world's oceans was above average, while below average across land surfaces. Drought continued in southeastern Brazil and the western United States. Heavy rain during April-June led to devastating floods in Canada's Eastern Prairies. Above-normal summer monsoon rainfall was observed over the southern coast of West Africa, while drier conditions prevailed over the eastern Sahel.
Generally, summer monsoon rainfall over eastern Africa was above normal, except in parts of western South Sudan and Ethiopia. The south Asian summer monsoon in India was below normal, with a record-dry June. Across the major tropical cyclone basins, 91 named storms were observed during 2014, above the 1981-2010 global average of 82. The Eastern/Central Pacific and South Indian Ocean basins experienced significantly above-normal activity in 2014; all other basins were either at or below normal. The 22 named storms in the Eastern/Central Pacific were the basin's most since 1992. Similar to 2013, the North Atlantic season was quieter than most years of the last two decades with respect to the number of storms, despite the absence of El Niño conditions during both years. In higher latitudes and at higher elevations, increased warming continued to be visible in the decline of glacier mass balance, increasing permafrost temperatures, and a deeper thawing layer in seasonally frozen soil. In the Arctic, the 2014 temperature over land areas was the fourth highest in the 115-year period of record and snow melt occurred 20-30 days earlier than the 1998-2010 average. The Greenland Ice Sheet experienced extensive melting in summer 2014. The extent of melting was above the 1981-2010 average for 90% of the melt season, contributing to the second lowest average summer albedo over Greenland since observations began in 2000 and a record-low albedo across the ice sheet for August. On the North Slope of Alaska, new record high temperatures at 20-m depth were measured at four of five permafrost observatories. In September, Arctic minimum sea ice extent was the sixth lowest since satellite records began in 1979. The eight lowest sea ice extents during this period have occurred in the last eight years. Conversely, in the Antarctic, sea ice extent countered its declining trend and set several new records in 2014, including record high monthly mean sea ice extent each month from April to November. On 20 September, a record large daily Antarctic sea ice extent of 20.14 × 10⁶ km² occurred. The 2014 Antarctic stratospheric ozone hole was 20.9 million km² when averaged from 7 September to 13 October, the sixth smallest on record and continuing a decrease, albeit statistically insignificant, in area since 1998.
  •  
10.
  •  
11.
  •  
12.
  •  
13.
  • Abreu, A., et al. (author)
  • Priorities for ocean microbiome research
  • 2022
  • In: Nature Microbiology. - : Springer Science and Business Media LLC. - 2058-5276. ; 7:7, s. 937-947
  • Journal article (peer-reviewed) abstract:
    • Studying the ocean microbiome can inform international policies related to ocean governance, tackling climate change, ocean acidification and pollution, and can help promote achievement of multiple Sustainable Development Goals. Microbial communities have essential roles in ocean ecology and planetary health. Microbes participate in nutrient cycles, remove huge quantities of carbon dioxide from the air and support ocean food webs. The taxonomic and functional diversity of the global ocean microbiome has been revealed by technological advances in sampling, DNA sequencing and bioinformatics. A better understanding of the ocean microbiome could underpin strategies to address environmental and societal challenges, including achievement of multiple Sustainable Development Goals way beyond SDG 14 'life below water'. We propose a set of priorities for understanding and protecting the ocean microbiome, which include delineating interactions between microbiota, sustainably applying resources from oceanic microorganisms and creating policy- and funder-friendly ocean education resources, and discuss how to achieve these ambitious goals.
  •  
14.
  •  
15.
  •  
16.
  • Carville, S F, et al. (author)
  • EULAR evidence-based recommendations for the management of fibromyalgia syndrome
  • 2008
  • In: Annals of the Rheumatic Diseases. - : BMJ. - 0003-4967 .- 1468-2060. ; 67:4, s. 536-541
  • Journal article (peer-reviewed) abstract:
    • Objective: To develop evidence-based recommendations for the management of fibromyalgia syndrome. Methods: A multidisciplinary task force was formed representing 11 European countries. The design of the study, including search strategy, participants, interventions, outcome measures, data collection and analytical method, was defined at the outset. A systematic review was undertaken with the keywords "fibromyalgia", "treatment or management" and "trial". Studies were excluded if they did not utilise the American College of Rheumatology classification criteria, were not clinical trials, or included patients with chronic fatigue syndrome or myalgic encephalomyelitis. Primary outcome measures were change in pain assessed by visual analogue scale and fibromyalgia impact questionnaire. The quality of the studies was categorised based on randomisation, blinding and allocation concealment. Only the highest-quality studies were used as the basis for recommendations. When there was insufficient evidence from the literature, a Delphi process was used to provide a basis for recommendation. Results: 146 studies were eligible for the review. 39 pharmacological intervention studies and 59 non-pharmacological were included in the final recommendation summary tables once those of a lower quality or with insufficient data were separated. The categories of treatment identified were antidepressants, analgesics, and "other pharmacological" and exercise, cognitive behavioural therapy, education, dietary interventions and "other non-pharmacological". In many studies sample size was small and the quality of the study was insufficient for strong recommendations to be made. Conclusions: Nine recommendations for the management of fibromyalgia syndrome were developed using a systematic review and expert consensus.
  •  
17.
  • Fernandez-de-las-Penas, C., et al. (author)
  • Carpal Tunnel Syndrome: Neuropathic Pain Associated or Not with a Nociplastic Condition
  • 2023
  • In: Biomedicines. - 2227-9059. ; 11:6
  • Journal article (peer-reviewed) abstract:
    • Carpal tunnel syndrome (CTS) has been traditionally classified as primarily a neuropathic condition with or without pain. Precision medicine refers to an evidence-based method of grouping patients based on their susceptibility to biology, prognosis of a particular disease, or their response to a specific treatment, and tailoring specific treatments accordingly. In 2021, the International Association for the Study of Pain (IASP) proposed a grading system for classifying patients into nociceptive, neuropathic, or nociplastic phenotypes. This position paper presents data supporting the possibility of subgrouping individuals with specific CTS-related pain into nociceptive, neuropathic, nociplastic or mixed-type phenotypes. Carpal tunnel syndrome is a neuropathic condition but can also be comorbid with a nociplastic pain condition. The presence of extra-median symptoms and the development of facilitated pain processing seem to be signs suggesting that specific CTS cases can be classified as the nociplastic pain phenotype. The clinical responses of therapeutic approaches for the management of CTS are inconclusive. Accordingly, the ability to identify the predominant pain phenotype in patients with CTS could likely be problematic for producing efficient treatment outcomes. In fact, the presence of a nociplastic or mixed-type pain phenotype would explain the lack of clinical effect of treatment interventions targeting the carpal tunnel area selectively. We propose a clinical decision tree using the 2021 IASP classification criteria for identifying the predominant pain phenotype in people with CTS-related pain, although CTS is a priori a neuropathic pain condition. The identification of a nociplastic-associated condition requires a more nuanced multimodal treatment approach to achieve better treatment outcomes.
  •  
18.
  • Shraim, M. A., et al. (author)
  • Features and methods to discriminate between mechanism-based categories of pain experienced in the musculoskeletal system: a Delphi expert consensus study
  • 2022
  • In: Pain. - : Ovid Technologies (Wolters Kluwer Health). - 0304-3959 .- 1872-6623. ; 163:9, s. 1812-1828
  • Journal article (peer-reviewed) abstract:
    • Classification of musculoskeletal pain based on underlying pain mechanisms (nociceptive, neuropathic, and nociplastic pain) is challenging. In the absence of a gold standard, verification of features that could aid in discrimination between these mechanisms in clinical practice and research depends on expert consensus. This Delphi expert consensus study aimed to: (1) identify features and assessment findings that are unique to a pain mechanism category or shared between no more than 2 categories and (2) develop a ranked list of candidate features that could potentially discriminate between pain mechanisms. A group of international experts were recruited based on their expertise in the field of pain. The Delphi process involved 2 rounds: round 1 assessed expert opinion on features that are unique to a pain mechanism category or shared between 2 (based on a 40% agreement threshold); and round 2 reviewed features that failed to reach consensus, evaluated additional features, and considered wording changes. Forty-nine international experts representing a wide range of disciplines participated. Consensus was reached for 196 of 292 features presented to the panel (clinical examination: 134 features; quantitative sensory testing: 34; imaging and diagnostic testing: 14; pain-type questionnaires: 14). From the 196 features, consensus was reached for 76 features as unique to nociceptive (17), neuropathic (37), or nociplastic (22) pain mechanisms and 120 features as shared between pairs of pain mechanism categories (78 for neuropathic and nociplastic pain). This consensus study generated a list of potential candidate features that are likely to aid in discrimination between types of musculoskeletal pain.
  •  
19.
  •  
20.
  • Antinori, A., et al. (author)
  • Updated research nosology for HIV-associated neurocognitive disorders
  • 2007
  • In: Neurology. ; 69:18, s. 1789-99
  • Journal article (peer-reviewed) abstract:
    • In 1991, the AIDS Task Force of the American Academy of Neurology published nomenclature and research case definitions to guide the diagnosis of neurologic manifestations of HIV-1 infection. Now, 16 years later, the National Institute of Mental Health and the National Institute of Neurological Diseases and Stroke have charged a working group to critically review the adequacy and utility of these definitional criteria and to identify aspects that require updating. This report represents a majority view, and unanimity was not reached on all points. It reviews our collective experience with HIV-associated neurocognitive disorders (HAND), particularly since the advent of highly active antiretroviral treatment, and their definitional criteria; discusses the impact of comorbidities; and suggests inclusion of the term asymptomatic neurocognitive impairment to categorize individuals with subclinical impairment. An algorithm is proposed to assist in standardized diagnostic classification of HAND.
  •  
21.
  • Hensel, A, et al. (author)
  • Association between global brain volume and the rate of cognitive change in elderly humans without dementia
  • 2005
  • In: Dementia and geriatric cognitive disorders. - : S. Karger AG. - 1420-8008 .- 1421-9824. ; 19:4, s. 213-221
  • Journal article (peer-reviewed) abstract:
    • Patients with mild cognitive deficits experience different types of evolution. They are at increased risk of developing dementia, but they also have a chance of remaining stable in cognition or of improving. We investigated whether global brain volume, callosal size and hippocampal size are associated with the rate of cognitive change in elderly people without dementia. Volumetric MR images were recorded from 39 controls and 35 patients with questionable dementia who were followed up longitudinally for a mean of 2.3 years. The outcome measure was the annual change in the test score in the Structured Interview for the Diagnosis of Alzheimer’s Dementia and Multi-Infarct Dementia, which includes all items of the Mini-Mental State Examination. Global brain volume, grey matter volume and white matter volume were the only significant independent predictors of the rate of cognitive change.
  •  
22.
  • Kienholz, C., et al. (author)
  • A new method for deriving glacier centerlines applied to glaciers in Alaska and northwest Canada
  • 2014
  • In: The Cryosphere. - : Copernicus GmbH. - 1994-0416 .- 1994-0424. ; 8:2, s. 503-519
  • Journal article (peer-reviewed) abstract:
    • This study presents a new method to derive centerlines for the main branches and major tributaries of a set of glaciers, requiring glacier outlines and a digital elevation model (DEM) as input. The method relies on a "cost grid-least-cost route approach" that comprises three main steps. First, termini and heads are identified for every glacier. Second, centerlines are derived by calculating the least-cost route on a previously established cost grid. Third, the centerlines are split into branches and a branch order is allocated. Application to 21 720 glaciers in Alaska and northwest Canada (Yukon, British Columbia) yields 41 860 centerlines. The algorithm performs robustly, requiring no manual adjustments for 87.8% of the glaciers. Manual adjustments are required primarily to correct the locations of glacier heads (7.0% corrected) and termini (3.5% corrected). With corrected heads and termini, only 1.4% of the derived centerlines need edits. A comparison of the lengths from a hydrological approach to the lengths from our longest centerlines reveals considerable variation. Although the average length ratio is close to unity, only ~50% of the 21 720 glaciers have the two lengths within 10% of each other. A second comparison shows that our centerline lengths between lowest and highest glacier elevations compare well to our longest centerline lengths. For > 70% of the 4350 glaciers with two or more branches, the two lengths are within 5% of each other. Our final product can be used for calculating glacier length, conducting length change analyses, topological analyses, or flowline modeling. (See the illustrative least-cost-route sketch after this record.)
  •  
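
The abstract above describes deriving centerlines as least-cost routes over a cost grid built from glacier outlines and a DEM. The sketch below shows only the generic least-cost-route step, as a plain Dijkstra search on a small hand-made cost grid; the cost values, grid, and endpoints are illustrative assumptions, not the paper's actual cost grid.

    # Minimal sketch: least-cost route between two cells of a 2-D cost grid
    # (Dijkstra on an 8-connected grid). The toy grid below stands in for the
    # DEM- and outline-derived cost grid used in the paper.
    import heapq

    def least_cost_route(cost, start, end):
        rows, cols = len(cost), len(cost[0])
        dist = {start: cost[start[0]][start[1]]}
        prev = {}
        heap = [(dist[start], start)]
        seen = set()
        while heap:
            d, (r, c) = heapq.heappop(heap)
            if (r, c) in seen:
                continue
            seen.add((r, c))
            if (r, c) == end:
                break
            for dr in (-1, 0, 1):            # 8-connected neighbourhood
                for dc in (-1, 0, 1):
                    if dr == 0 and dc == 0:
                        continue
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        nd = d + cost[nr][nc]
                        if nd < dist.get((nr, nc), float("inf")):
                            dist[(nr, nc)] = nd
                            prev[(nr, nc)] = (r, c)
                            heapq.heappush(heap, (nd, (nr, nc)))
        path, node = [end], end              # walk back from terminus to head
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1], dist[end]

    grid = [
        [1, 9, 9, 9],                        # low costs mark the "valley" a centerline should follow
        [1, 1, 9, 9],
        [9, 1, 1, 9],
        [9, 9, 1, 1],
    ]
    route, total_cost = least_cost_route(grid, (0, 0), (3, 3))
    print(route, total_cost)

On real raster grids, a library routine such as skimage.graph.route_through_array performs the same kind of search.
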
23.
  • Kienholz, C., et al. (author)
  • Geodetic mass balance of surge-type Black Rapids Glacier, Alaska, 1980-2001-2010, including role of rockslide deposition and earthquake displacement
  • 2016
  • In: Journal of Geophysical Research - Earth Surface. - 2169-9003 .- 2169-9011. ; 121:12, s. 2358-2380
  • Journal article (peer-reviewed) abstract:
    • We determine the geodetic mass balance of surge-type Black Rapids Glacier, Alaska, for the time periods 1980-2001 and 2001-2010 by combining modern interferometric synthetic aperture radar (InSAR)-derived digital elevation models (DEMs), DEMs derived from archival aerial imagery, laser altimetry data, and in situ surface elevation measurements. Our analysis accounts for both the large rockslides and the terrain displacements caused by the 2002 M7.9 earthquake on the Denali fault, which runs through Black Rapids Glacier. To estimate uncertainties, we apply Monte Carlo simulations. For the earthquake-triggered rockslides we find a volume of 56.62 ± 2.86 × 10⁶ m³, equivalent to an average debris thickness of 4.44 ± 0.24 m across the 11.7 km² deposit area on the glacier. Terrain displacement due to the earthquake corresponds to an apparent glacier volume change of -53.1 × 10⁶ m³, which would cause an apparent specific mass balance of -0.19 meter water equivalent (mwe) if not taken into account. The geodetic mass balance of Black Rapids Glacier is -0.48 ± 0.07 mwe a⁻¹ for the entire 30 year period, but more negative for the period 2001-2010 (-0.64 ± 0.11 mwe a⁻¹) than the period 1980-2001 (-0.42 ± 0.11 mwe a⁻¹), in agreement with trends indicated by in situ mass balance measurements. Elevation data indicate no net thickening of the surge reservoir between 1980 and 2010, in contrast to what is expected during the quiescent phase. A surge of Black Rapids Glacier in the near future is thus considered unlikely. (See the illustrative geodetic-balance sketch after this record.)
  •  
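
Stripped of the rockslide, earthquake, and uncertainty corrections that the record above is really about, a geodetic mass balance reduces to differencing two DEMs and converting the volume change to water equivalent. The sketch below shows that bare arithmetic; the elevation changes, pixel size, period length, and the 850/1000 density conversion are illustrative assumptions, not the paper's numbers.

    # Bare-bones geodetic mass balance from DEM differencing (illustrative values only).
    import numpy as np

    dh = np.array([[-1.2, -0.8],
                   [-0.5, -1.0]])        # surface elevation change per pixel (m) over the period
    pixel_area = 30.0 * 30.0             # assumed DEM pixel area (m^2)
    glacier_area = dh.size * pixel_area  # area represented by these pixels (m^2)
    years = 9.0                          # assumed length of the observation period (a)

    density_ratio = 850.0 / 1000.0       # commonly assumed volume-to-water-equivalent conversion

    volume_change = dh.sum() * pixel_area                                      # m^3
    specific_balance = volume_change * density_ratio / (glacier_area * years)  # m w.e. a^-1

    print(f"geodetic balance ~ {specific_balance:.2f} m w.e. a^-1")
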
24.
  • Veronese, N., et al. (author)
  • Inverse relationship between body mass index and mortality in older nursing home residents : a meta-analysis of 19,538 elderly subjects
  • 2015
  • In: Obesity Reviews. - : Wiley. - 1467-7881 .- 1467-789X. ; 16:11, s. 1001-1015
  • Research review (peer-reviewed) abstract:
    • Body mass index (BMI) and mortality in old adults from the general population have been related in a U-shaped or J-shaped curve. However, limited information is available for elderly nursing home populations, particularly about specific causes of death. A systematic PubMed/EMBASE/CINAHL/SCOPUS search until 31 May 2014 without language restrictions was conducted. As no published study reported mortality in standard BMI groups (<18.5, 18.5-24.9, 25-29.9, ≥30 kg/m²), the most adjusted hazard ratios (HRs) according to a pre-defined list of covariates were obtained from authors and pooled by random-effect model across each BMI category. Out of 342 hits, 20 studies including 19,538 older nursing home residents with 5,223 deaths during a median of 2 years of follow-up were meta-analysed. Compared with normal weight, all-cause mortality HRs were 1.41 (95% CI = 1.26-1.58) for underweight, 0.85 (95% CI = 0.73-0.99) for overweight and 0.74 (95% CI = 0.57-0.96) for obesity. Underweight was a risk factor for higher mortality caused by infections (HR = 1.65 [95% CI = 1.13-2.40]). RR results corroborated primary HR results, with additionally lower infection-related mortality in overweight and obese than in normal-weight individuals. As in the general population, underweight is a risk factor for mortality in old nursing home residents. However, uniquely, not only overweight but also obesity is protective, which has relevant nutritional goal implications in this population/setting. (See the illustrative meta-analysis sketch after this record.)
  •  
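
The pooled hazard ratios in the record above come from random-effects meta-analysis. The sketch below implements standard DerSimonian-Laird pooling of log hazard ratios as a generic illustration; the input HRs and standard errors are invented, and the paper's actual covariate-adjusted estimates are not reproduced here.

    # Generic DerSimonian-Laird random-effects pooling of hazard ratios (illustrative inputs).
    import math

    def pool_random_effects(hrs, ses):
        y = [math.log(hr) for hr in hrs]              # log hazard ratios
        w = [1.0 / se ** 2 for se in ses]             # fixed-effect (inverse-variance) weights
        y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance (DL estimator)
        w_re = [1.0 / (se ** 2 + tau2) for se in ses]
        y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
        se_re = math.sqrt(1.0 / sum(w_re))
        lo, hi = y_re - 1.96 * se_re, y_re + 1.96 * se_re
        return math.exp(y_re), (math.exp(lo), math.exp(hi)), tau2

    # Three made-up studies reporting an HR for one BMI category versus normal weight.
    hr, ci, tau2 = pool_random_effects([0.80, 0.90, 0.70], [0.10, 0.15, 0.20])
    print(f"pooled HR = {hr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), tau^2 = {tau2:.3f}")
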
25.
  • Arendt, Anthony A., et al. (author)
  • Glacier changes in Alaska : can mass-balance models explain GRACE mascon trends?
  • 2009
  • In: Annals of Glaciology. - 0260-3055 .- 1727-5644. ; 50:50, s. 148-154
  • Journal article (peer-reviewed) abstract:
    • Temperature and precipitation data from three weather stations in the St Elias Mountains of Alaska and northwestern Canada were used to drive one-dimensional (1-D, elevation-dependent) and zero-dimensional (0-D) degree-day mass-balance models. Model outputs were optimized against a 10-day-resolution time series of mass variability during 2003-07 obtained from Gravity Recovery and Climate Experiment (GRACE) mass concentration (mascon) solutions. The models explained 52-60% of the variance in the GRACE time series. Modelled mass variations matched the phase of the GRACE observations, and all optimized model parameters were within the range of values determined from conventional mass-balance and meteorological observations. We describe a framework for selecting appropriate weather stations and mass-balance models to represent glacier variations of large regions. There is potential for extending these calibrated mass-balance models forwards or backwards in time to construct mass-balance time series outside of the GRACE measurement window. (See the illustrative degree-day sketch after this record.)
  •  
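
The record above optimizes degree-day mass-balance models against GRACE mascon time series. The sketch below is a minimal zero-dimensional positive-degree-day model of that general family; the degree-day factor, temperature thresholds, and the synthetic one-year forcing are illustrative assumptions, not the paper's calibrated parameters or station data.

    # Minimal 0-D positive-degree-day mass-balance model (illustrative parameters only).
    def degree_day_balance(daily_temp_c, daily_precip_mwe,
                           ddf=0.004, t_melt=0.0, t_snow=1.0):
        """Cumulative specific balance (m w.e.) from daily temperature and precipitation."""
        balance = 0.0
        for t, p in zip(daily_temp_c, daily_precip_mwe):
            accumulation = p if t <= t_snow else 0.0   # precipitation counted as snow when cold
            melt = ddf * max(t - t_melt, 0.0)          # melt proportional to positive degree-days
            balance += accumulation - melt
        return balance

    # Synthetic year: 180 cold, snowy days followed by 185 warm, dry days.
    temps = [-10.0] * 180 + [5.0] * 185
    precip = [0.004] * 180 + [0.0] * 185
    print(f"annual balance ~ {degree_day_balance(temps, precip):.2f} m w.e.")

In a study like the one above, parameters such as the degree-day factor are the quantities tuned so that the modelled mass series best matches the 10-day GRACE mascon solutions.
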
Create references, e-mail, monitor and link
