SwePub

Results list for search "WFRF:(Walker K. A.) ;lar1:(gu);srt2:(2012)"

Search: WFRF:(Walker K. A.) > Göteborgs universitet > (2012)

  • Results 1-5 of 5
1.
  • Achberger, Christine, 1968, et al. (author)
  • State of the Climate in 2011
  • 2012
  • In: Bulletin of the American Meteorological Society. - 0003-0007. ; 93:7, pp. S1-S263
  • Journal article (peer-reviewed), abstract:
    • Large-scale climate patterns influenced temperature and weather patterns around the globe in 2011. In particular, a moderate-to-strong La Niña at the beginning of the year dissipated during boreal spring but reemerged during fall. The phenomenon contributed to historical droughts in East Africa, the southern United States, and northern Mexico, as well as the wettest two-year period (2010-11) on record for Australia, particularly remarkable as this follows a decade-long dry period. Precipitation patterns in South America were also influenced by La Niña. Heavy rain in Rio de Janeiro in January triggered the worst floods and landslides in Brazil's history. The 2011 combined average temperature across global land and ocean surfaces was the coolest since 2008, but was also among the 15 warmest years on record and above the 1981-2010 average. The global sea surface temperature cooled by 0.1°C from 2010 to 2011, associated with the cooling influence of La Niña. Global integrals of upper ocean heat content for 2011 were higher than for all prior years, demonstrating the dominant role of the oceans in the Earth's energy budget. In the upper atmosphere, tropical stratospheric temperatures were anomalously warm, while polar temperatures were anomalously cold. This led to large springtime stratospheric ozone reductions in polar latitudes in both hemispheres. Ozone concentrations in the Arctic stratosphere during March were the lowest for that period since satellite records began in 1979. An extensive, deep, and persistent ozone hole over the Antarctic in September indicates that the recovery to pre-1980 conditions is proceeding very slowly. Atmospheric carbon dioxide concentrations increased by 2.10 ppm in 2011, and exceeded 390 ppm for the first time since instrumental records began. Other greenhouse gases also continued to rise in concentration, and the combined effect now represents a 30% increase in radiative forcing over a 1990 baseline.
Most ozone-depleting substances continued to decline. The global net ocean carbon dioxide uptake for the 2010 transition period from El Niño to La Niña, the most recent period for which analyzed data are available, was estimated to be 1.30 Pg C yr⁻¹, almost 12% below the 29-year long-term average. Relative to the long-term trend, global sea level dropped noticeably in mid-2010 and reached a local minimum in 2011. The drop has been linked to the La Niña conditions that prevailed throughout much of 2010-11. Global sea level increased sharply during the second half of 2011. Global tropical cyclone activity during 2011 was well below average, with a total of 74 storms compared with the 1981-2010 average of 89. Similar to 2010, the North Atlantic was the only basin that experienced above-normal activity. For the first year since the widespread introduction of the Dvorak intensity-estimation method in the 1980s, only three tropical cyclones reached Category 5 intensity, all in the Northwest Pacific basin. The Arctic continued to warm at about twice the rate of lower latitudes. Below-normal summer snowfall, a decreasing trend in surface albedo, and above-average surface and upper-air temperatures resulted in a continued pattern of extreme surface melting and net snow and ice loss on the Greenland ice sheet. Warmer-than-normal temperatures over the Eurasian Arctic in spring resulted in a new record-low June snow cover extent and spring snow cover duration in this region. In the Canadian Arctic, the mass loss from glaciers and ice caps was the greatest since GRACE measurements began in 2002, continuing a negative trend that began in 1987. New record-high temperatures occurred at 20 m below the land surface at all permafrost observatories on the North Slope of Alaska, where measurements began in the late 1970s.
Arctic sea ice extent in September 2011 was the second-lowest on record, while the extent of old ice (four and five years old) reached a new record minimum, just 19% of normal. At the opposite pole, austral winter and spring temperatures were more than 3°C above normal over much of the Antarctic continent. However, winter temperatures were below normal in the northern Antarctic Peninsula, continuing the downward trend there over the last 15 years. In summer, an all-time record-high temperature of -12.3°C was set at the South Pole station on 25 December, exceeding the previous record by more than a full degree. Antarctic sea ice extent anomalies increased steadily through much of the year, from briefly setting a record low in April to well above average in December. The latter trend reflects the dispersive effects of low pressure on sea ice and the generally cool conditions around the Antarctic perimeter.
2.
  • Klionsky, Daniel J., et al. (author)
  • Guidelines for the use and interpretation of assays for monitoring autophagy
  • 2012
  • In: Autophagy. - : Informa UK Limited. - 1554-8635 .- 1554-8627. ; 8:4, pp. 445-544
  • Research review (peer-reviewed), abstract:
    • In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
3.
  • Appeltans, W., et al. (author)
  • The Magnitude of Global Marine Species Diversity
  • 2012
  • In: Current Biology. - : Elsevier BV. - 0960-9822 .- 1879-0445. ; 22:23, pp. 2189-2202
  • Journal article (peer-reviewed), abstract:
    • Background: The question of how many marine species exist is important because it provides a metric for how much we do and do not know about life in the oceans. We have compiled the first register of the marine species of the world and used this baseline to estimate how many more species, partitioned among all major eukaryotic groups, may be discovered. Results: There are ~226,000 described eukaryotic marine species. More species were described in the past decade (~20,000) than in any previous one. The number of authors describing new species has been increasing at a faster rate than the number of new species described over the past six decades. We report that there are ~170,000 synonyms, that 58,000-72,000 species have been collected but not yet described, and that 482,000-741,000 more species have yet to be sampled. Molecular methods may add tens of thousands of cryptic species. Thus, there may be 0.7-1.0 million marine species. Past rates of description of new species indicate there may be 0.5 ± 0.2 million marine species. On average, 37% (median 31%) of species in over 100 recent field studies around the world might be new to science. Conclusions: Currently, between one-third and two-thirds of marine species may be undescribed, and previous estimates of well over one million marine species appear highly unlikely. More species than ever before are being described annually by an increasing number of authors. If the current trend continues, most species will be discovered this century.
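The abstract's headline total can be reproduced from the component counts it quotes. A minimal sketch (illustrative arithmetic only, using the figures stated in the abstract; the variable names are ours, not the authors'):

```python
# Figures quoted in the abstract (Appeltans et al., Current Biology 2012).
described = 226_000                       # ~226,000 described eukaryotic marine species
collected_undescribed = (58_000, 72_000)  # collected but not yet described
unsampled = (482_000, 741_000)            # yet to be sampled

low = described + collected_undescribed[0] + unsampled[0]
high = described + collected_undescribed[1] + unsampled[1]
print(f"Estimated total: {low / 1e6:.2f}-{high / 1e6:.2f} million species")
# → 0.77-1.04 million, consistent with the abstract's 0.7-1.0 million
```

Cryptic species detected by molecular methods would add to this range, which is why the abstract treats it as a lower-bound-style estimate.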
4.
  • Kaptoge, S., et al. (author)
  • C-Reactive Protein, Fibrinogen, and Cardiovascular Disease Prediction
  • 2012
  • In: New England Journal of Medicine. - 0028-4793 .- 1533-4406. ; 367:14, pp. 1310-1320
  • Journal article (peer-reviewed), abstract:
    • Background There is debate about the value of assessing levels of C-reactive protein (CRP) and other biomarkers of inflammation for the prediction of first cardiovascular events. Methods We analyzed data from 52 prospective studies that included 246,669 participants without a history of cardiovascular disease to investigate the value of adding CRP or fibrinogen levels to conventional risk factors for the prediction of cardiovascular risk. We calculated measures of discrimination and reclassification during follow-up and modeled the clinical implications of initiation of statin therapy after the assessment of CRP or fibrinogen. Results The addition of information on high-density lipoprotein cholesterol to a prognostic model for cardiovascular disease that included age, sex, smoking status, blood pressure, history of diabetes, and total cholesterol level increased the C-index, a measure of risk discrimination, by 0.0050. The further addition to this model of information on CRP or fibrinogen increased the C-index by 0.0039 and 0.0027, respectively (P < 0.001), and yielded a net reclassification improvement of 1.52% and 0.83%, respectively, for the predicted 10-year risk categories of "low" (<10%), "intermediate" (10% to <20%), and "high" (≥20%) (P < 0.02 for both comparisons). We estimated that among 100,000 adults 40 years of age or older, 15,025 persons would initially be classified as being at intermediate risk for a cardiovascular event if conventional risk factors alone were used to calculate risk.
Assuming that statin therapy would be initiated in accordance with Adult Treatment Panel III guidelines (i.e., for persons with a predicted risk of ≥20% and for those with certain other risk factors, such as diabetes, irrespective of their 10-year predicted risk), additional targeted assessment of CRP or fibrinogen levels in the 13,199 remaining participants at intermediate risk could help prevent approximately 30 additional cardiovascular events over the course of 10 years. Conclusions In a study of people without known cardiovascular disease, we estimated that under current treatment guidelines, assessment of the CRP or fibrinogen level in people at intermediate risk for a cardiovascular event could help prevent one additional event over a period of 10 years for every 400 to 500 people screened. (Funded by the British Heart Foundation and others.)
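The "one event per 400 to 500 people screened" conclusion follows directly from the two figures in the abstract. A minimal sketch (illustrative arithmetic only, using the numbers stated in the abstract; variable names are ours):

```python
# Figures quoted in the abstract (Kaptoge et al., NEJM 2012).
intermediate_risk = 13_199  # remaining participants at intermediate risk
events_prevented = 30       # additional events prevented over 10 years

number_needed_to_screen = intermediate_risk / events_prevented
print(f"One event prevented per ~{number_needed_to_screen:.0f} people screened")
# → ~440, within the abstract's stated range of 400 to 500
```

This is a number-needed-to-screen calculation: total people assessed divided by events averted, so the 400-500 range simply brackets the ~440 point estimate.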
5.
  • Carpenter, S. R., et al. (author)
  • General resilience to cope with extreme events
  • 2012
  • In: Sustainability. - : MDPI AG. - 2071-1050. ; 4:12, pp. 3248-3259
  • Journal article (peer-reviewed), abstract:
    • Resilience to specified kinds of disasters is an active area of research and practice. However, rare or unprecedented disturbances that are unusually intense or extensive require a more broad-spectrum type of resilience. General resilience is the capacity of social-ecological systems to adapt or transform in response to unfamiliar, unexpected and extreme shocks. Conditions that enable general resilience include diversity, modularity, openness, reserves, feedbacks, nestedness, monitoring, leadership, and trust. Processes for building general resilience are an emerging and crucially important area of research.