SwePub
Search the SwePub database


Result list for search "WFRF:(Sandberg Andreas) srt2:(2010-2014)"


  • Result 1-34 of 34
2.
  • Sandberg, Andreas, 1984-, et al. (author)
  • Modeling performance variation due to cache sharing
  • 2013
  • In: Proc. 19th IEEE International Symposium on High Performance Computer Architecture. - : IEEE Computer Society. - 9781467355858 ; , s. 155-166
  • Conference paper (peer-reviewed)
    • Shared cache contention can cause significant variability in the performance of co-running applications from run to run. This variability arises from different overlappings of the applications' phases, which can be the result of offsets in application start times or other delays in the system. Understanding this variability is important for generating an accurate view of the expected impact of cache contention. However, variability effects are typically ignored due to the high overhead of modeling or simulating the many executions needed to expose them. This paper introduces a method for efficiently investigating the performance variability due to cache contention. Our method relies on input data captured from native execution of applications running in isolation and a fast, phase-aware, cache sharing performance model. This allows us to assess the performance interactions and bandwidth demands of co-running applications by quickly evaluating hundreds of overlappings. We evaluate our method on a contemporary multicore machine and show that performance and bandwidth demands can vary significantly across runs of the same set of co-running applications. We show that our method can predict application slowdown with an average relative error of 0.41% (maximum 1.8%) as well as bandwidth consumption. Using our method, we can estimate an application pair's performance variation 213x faster, on average, than native execution.
  •  
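The phase-overlap evaluation described in this entry can be sketched in a few lines: slide one application's per-phase profile against the other's and score each overlapping with a contention model. Everything below is an illustrative assumption (the function names, the made-up per-phase miss rates, and especially the toy `slowdown` formula, which stands in for the authors' actual cache sharing model).

```python
# Hypothetical sketch: evaluate many phase overlappings of two co-running
# applications cheaply, instead of re-running them natively.

def slowdown(miss_rate_a, miss_rate_b):
    """Toy contention model: an application's slowdown grows with its
    co-runner's cache pressure (stand-in for the real sharing model)."""
    return 1.0 + 0.5 * miss_rate_b / (miss_rate_a + miss_rate_b + 1e-9)

def evaluate_overlappings(phases_a, phases_b):
    """Slide application B against A one phase at a time and return the
    predicted slowdown of A for every start-time offset."""
    n = len(phases_b)
    results = []
    for offset in range(n):
        total = 0.0
        for i, ma in enumerate(phases_a):
            mb = phases_b[(i + offset) % n]  # B's phase active alongside A's
            total += slowdown(ma, mb)
        results.append(total / len(phases_a))
    return results

# Per-phase cache miss rates captured from isolated runs (made-up numbers).
app_a = [0.10, 0.40, 0.05, 0.30]
app_b = [0.35, 0.05, 0.45, 0.10]
preds = evaluate_overlappings(app_a, app_b)
print(f"slowdown range across offsets: {min(preds):.3f}-{max(preds):.3f}")
```

The spread between the minimum and maximum predicted slowdown is exactly the run-to-run variability the paper argues is usually ignored.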
5.
  • Baliakas, Panagiotis, et al. (author)
  • Clinical effect of stereotyped B-cell receptor immunoglobulins in chronic lymphocytic leukaemia: a retrospective multicentre study
  • 2014
  • In: The Lancet Haematology. - 2352-3026. ; 1:2, s. 74-84
  • Journal article (peer-reviewed)
    • Background About 30% of cases of chronic lymphocytic leukaemia (CLL) carry quasi-identical B-cell receptor immunoglobulins and can be assigned to distinct stereotyped subsets. Although preliminary evidence suggests that B-cell receptor immunoglobulin stereotypy is relevant from a clinical viewpoint, this aspect has never been explored in a systematic manner or in a cohort of adequate size that would enable clinical conclusions to be drawn. Methods For this retrospective, multicentre study, we analysed 8593 patients with CLL for whom immunogenetic data were available. These patients were followed up in 15 academic institutions throughout Europe (in Czech Republic, Denmark, France, Greece, Italy, Netherlands, Sweden, and the UK) and the USA, and data were collected between June 1, 2012, and June 7, 2013. We retrospectively assessed the clinical implications of CLL B-cell receptor immunoglobulin stereotypy, with a particular focus on 14 major stereotyped subsets comprising cases expressing unmutated (U-CLL) or mutated (M-CLL) immunoglobulin heavy chain variable genes. The primary outcome of our analysis was time to first treatment, defined as the time between diagnosis and date of first treatment. Findings 2878 patients were assigned to a stereotyped subset, of which 1122 patients belonged to one of 14 major subsets. Stereotyped subsets showed significant differences in terms of age, sex, disease burden at diagnosis, CD38 expression, and cytogenetic aberrations of prognostic significance. Patients within a specific subset generally followed the same clinical course, whereas patients in different stereotyped subsets-despite having the same immunoglobulin heavy variable gene and displaying similar immunoglobulin mutational status-showed substantially different times to first treatment. 
By integrating B-cell receptor immunoglobulin stereotypy (for subsets 1, 2, and 4) into the well established Dohner cytogenetic prognostic model, we showed that these subsets, which collectively account for around 7% of all cases of CLL and represent both U-CLL and M-CLL, constituted separate clinical entities, ranging from very indolent (subset 4) to aggressive disease (subsets 1 and 2). Interpretation The molecular classification of chronic lymphocytic leukaemia based on B-cell receptor immunoglobulin stereotypy improves the Dohner hierarchical model and refines prognostication beyond immunoglobulin mutational status, with potential implications for clinical decision making, especially within prospective clinical trials.
  •  
6.
  • Björnsson, Claes-Ingvar, et al. (author)
  • The location of the Crab pulsar emission region : restrictions on synchrotron emission models
  • 2010
  • In: Astronomy and Astrophysics. - : EDP Sciences. - 0004-6361 .- 1432-0746. ; 516, s. A65-
  • Journal article (peer-reviewed)
    • Recent observations of the Crab pulsar show no evidence of a spectral break in the infrared regime. It is argued that the observations are consistent with a power-law spectrum in the whole observable infrared-optical range. This is taken as the starting point for evaluating how self-consistent incoherent synchrotron models fare in a comparison with observations. Inclusion of synchrotron self-absorption proves important as does the restriction on the observed size of the emission region imposed by the relativistic beaming thought to define the pulse profile. It is shown that the observations can be used to derive two independent constraints on the distance from the neutron star to the emission region; in addition to a direct lower limit, an indirect measure is obtained from an upper limit to the magnetic field strength. Both of these limits indicate that the emission region is located at a distance considerably greater than the light cylinder radius. The implications of this result are discussed, and it is emphasized that, for standard incoherent synchrotron models to fit inside the light cylinder, rather special physical conditions need to be invoked.
  •  
7.
  • Bratt, Ola, et al. (author)
  • The Study of Active Monitoring in Sweden (SAMS) : A randomized study comparing two different follow-up schedules for active surveillance of low-risk prostate cancer
  • 2013
  • In: Scandinavian Journal of Urology. - : Medical Journals Sweden AB. - 2168-1805 .- 2168-1813. ; 47:5, s. 347-355
  • Research review (peer-reviewed)
    • Objective. Only a minority of patients with low-risk prostate cancer needs treatment, but the methods for optimal selection of patients for treatment are not established. This article describes the Study of Active Monitoring in Sweden (SAMS), which aims to improve those methods. Material and methods. SAMS is a prospective, multicentre study of active surveillance for low-risk prostate cancer. It consists of a randomized part comparing standard rebiopsy and follow-up with an extensive initial rebiopsy coupled with less intensive follow-up and no further scheduled biopsies (SAMS-FU), as well as an observational part (SAMS-ObsQoL). Quality of life is assessed with questionnaires and compared with patients receiving primary curative treatment. SAMS-FU is planned to randomize 500 patients and SAMS-ObsQoL to include at least 500 patients during 5 years. The primary endpoint is conversion to active treatment. The secondary endpoints include symptoms, distant metastases and mortality. All patients will be followed for 10-15 years. Results. Inclusion started in October 2011. In March 2013, 148 patients were included at 13 Swedish urological centres. Conclusions. It is hoped that the results of SAMS will contribute to fewer patients with indolent, low-risk prostate cancer receiving unnecessary treatment and more patients on active surveillance who need treatment receiving it when the disease is still curable. The less intensive investigational follow-up in the SAMS-FU trial would reduce the healthcare resources allocated to this large group of patients if it replaced the present standard schedule.
  •  
8.
  • Cambridge Handbook of Institutional Investment and Fiduciary Duty
  • 2014
  • Editorial collection (other academic/artistic)
    • The Cambridge Handbook of Institutional Investment and Fiduciary Duty is a comprehensive reference work exploring recent changes and future trends in the principles that govern institutional investors and fiduciaries. A wide range of contributors offer new perspectives on dynamics that drive the current emphasis on short-term investment returns. Moreover, they analyze the forces at work in markets around the world which are bringing into sharper focus the systemic effects that investment practices have on the long-term stability of the economy and the interests of beneficiaries in financial, social and environmental sustainability. This volume provides a global and multi-faceted commentary on the evolving standards governing institutional investment, offering guidance for students, researchers and policy-makers interested in finance, governance and other aspects of the contemporary investment world. It also provides investment, business, financial media and legal professionals with the tools they need to better understand and respond to new financial market challenges of the twenty-first century.
  •  
10.
  • Fristedt, Mårten, et al. (author)
  • Supply chain management in practice : a case study of McDonald’s Sweden
  • 2012
  • Reports (other academic/artistic)
    • Although much discussed in theory, supply chain management (SCM) is often problematic to carry out in practice. One exception is McDonald’s Sweden, which since its establishment has worked with suppliers and restaurants (franchisees) in a way that resembles what SCM literature recommends. The purpose of this report is to describe and analyse the supply chain of McDonald’s Sweden from suppliers to franchisees. Based on interviews with McDonald’s Sweden, suppliers and franchisees, McDonald’s supply chain is described and analysed according to SCM literature. Cooper and Ellram’s (1993) framework of SCM characteristics is used, complemented with the work of several other writers. The study describes a supply chain whose members to a large extent collaborate as described in SCM literature. The report identifies and describes how significant SCM characteristics, such as information sharing, joint planning, and the sharing of risks and rewards, are managed in the case. Finally, the report identifies market saturation and the search for economies of scale outside the primary supply chain as a challenge for future SCM practices. The case constitutes an interesting showcase where the ways in which the studied features are managed can inspire other businesses to succeed in SCM.
  •  
12.
  • Hawley, James P., et al. (author)
  • Introduction
  • 2014
  • In: Cambridge Handbook of Institutional Investment and Fiduciary Duty / edited by James P. Hawley, Andreas G.F. Hoepner, Keith L. Johnson, Joakim Sandberg, Edward J. Waitzer. - Cambridge : Cambridge University Press. - 9781107035874 ; , s. 1-6
  • Book chapter (other academic/artistic)
  •  
13.
  • Hayes, Matthew, et al. (author)
  • THE LYMAN ALPHA REFERENCE SAMPLE : EXTENDED LYMAN ALPHA HALOS PRODUCED AT LOW DUST CONTENT
  • 2013
  • In: Astrophysical Journal Letters. - 2041-8205 .- 2041-8213. ; 765:2, s. L27-
  • Journal article (peer-reviewed)
    • We report on new imaging observations of the Lyman alpha emission line (Ly alpha), performed with the Hubble Space Telescope, that comprise the backbone of the Lyman alpha Reference Sample. We present images of 14 starburst galaxies at redshifts 0.028 < z < 0.18 in continuum-subtracted Ly alpha, H alpha, and the far ultraviolet continuum. We show that Ly alpha is emitted on scales that systematically exceed those of the massive stellar population and recombination nebulae: as measured by the Petrosian 20% radius, RP20, Ly alpha radii are larger than those of H alpha by factors ranging from 1 to 3.6, with an average of 2.4. The average ratio of Ly alpha-to-FUV radii is 2.9. This suggests that much of the Ly alpha light is pushed to large radii by resonance scattering. Defining the Relative Petrosian Extension of Ly alpha compared to H alpha, xi(Ly alpha) = R-P20(Ly alpha)/R-P20(H alpha), we find xi(Ly alpha) to be uncorrelated with total Ly alpha luminosity. However, xi(Ly alpha) is strongly correlated with quantities that scale with dust content, in the sense that a low dust abundance is a necessary requirement (although not the only one) in order to spread Ly alpha photons throughout the interstellar medium and drive a large extended Ly alpha halo.
  •  
14.
  • Hayes, Matthew, et al. (author)
  • THE LYMAN ALPHA REFERENCE SAMPLE. II. HUBBLE SPACE TELESCOPE IMAGING RESULTS, INTEGRATED PROPERTIES, AND TRENDS
  • 2014
  • In: Astrophysical Journal. - 0004-637X .- 1538-4357. ; 782:1
  • Journal article (peer-reviewed)
    • We report new results regarding the Ly alpha output of galaxies, derived from the Lyman Alpha Reference Sample, and focused on Hubble Space Telescope imaging. For 14 galaxies we present intensity images in Ly alpha, H alpha, and UV, and maps of H alpha/H beta, Ly alpha equivalent width (EW), and Ly alpha/H alpha. We present Ly alpha and UV radial light profiles and show they are well-fitted by Sersic profiles, but Ly alpha profiles show indices systematically lower than those of the UV (n approximate to 1-2 instead of greater than or similar to 4). This reveals a general lack of the central concentration in Ly alpha that is ubiquitous in the UV. Photometric growth curves increase more slowly for Ly alpha than the far ultraviolet, showing that small apertures may underestimate the EW. For most galaxies, however, flux and EW curves flatten by radii approximate to 10 kpc, suggesting that if placed at high-z only a few of our galaxies would suffer from large flux losses. We compute global properties of the sample in large apertures, and show total Ly alpha luminosities to be independent of all other quantities. Normalized Ly alpha throughput, however, shows significant correlations: escape is found to be higher in galaxies of lower star formation rate, dust content, mass, and nebular quantities that suggest harder ionizing continuum and lower metallicity. Six galaxies would be selected as high-z Ly alpha emitters, based upon their luminosity and EW. We discuss the results in the context of high-z Ly alpha and UV samples. A few galaxies have EWs above 50 angstrom, and one shows f(esc)(Ly alpha) of 80%; such objects have not previously been reported at low-z.
  •  
15.
  • Khan, Muneeb, et al. (author)
  • A case for resource efficient prefetching in multicores
  • 2014
  • In: Proc. 43rd International Conference on Parallel Processing. - : IEEE Computer Society. - 9781479956180 ; , s. 101-110
  • Conference paper (peer-reviewed)
    • Modern processors typically employ sophisticated prefetching techniques for hiding memory latency. Hardware prefetching has proven very effective and can speed up some SPEC CPU 2006 benchmarks by more than 40% when running in isolation. However, this speedup often comes at the cost of prefetching a significant volume of useless data (sometimes more than twice the data required) which wastes shared last level cache space and off-chip bandwidth. This paper explores how an accurate resource-efficient prefetching scheme can benefit performance by conserving shared resources in multicores. We present a framework that uses low-overhead runtime sampling and fast cache modeling to accurately identify memory instructions that frequently miss in the cache. We then use this information to automatically insert software prefetches in the application. Our prefetching scheme has good accuracy and employs cache bypassing whenever possible. These properties help reduce off-chip bandwidth consumption and last-level cache pollution. While single-thread performance remains comparable to hardware prefetching, the full advantage of the scheme is realized when several cores are used and demand for shared resources grows. We evaluate our method on two modern commodity multicores. Across 180 mixed workloads that fully utilize a multicore, the proposed software prefetching mechanism achieves up to 24% better throughput than hardware prefetching, and performs 10% better on average.
  •  
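The analysis step this entry describes, using a sampled profile to find memory instructions that frequently miss and deciding where software prefetches pay off, can be sketched as below. The profile format, threshold, and instruction addresses are hypothetical; the non-temporal ("nta") choice for never-reused data mirrors the cache-bypassing idea in the abstract.

```python
# Hypothetical sketch of delinquent-load identification for software
# prefetch insertion. Field names and the 20% miss threshold are
# illustrative assumptions, not the authors' framework.

def plan_prefetches(samples, miss_threshold=0.2):
    """samples maps instruction address -> (accesses, misses, reuses_data).
    Returns a list of (addr, kind) prefetch decisions."""
    plan = []
    for addr, (accesses, misses, reuses) in samples.items():
        if accesses == 0 or misses / accesses < miss_threshold:
            continue  # hits often enough; prefetching would waste bandwidth
        # Streaming data that is never reused should bypass the shared
        # last-level cache (non-temporal prefetch) to avoid pollution.
        kind = "prefetch" if reuses else "prefetch-nta"
        plan.append((addr, kind))
    return plan

profile = {
    0x400a10: (1000, 900, False),  # streaming load, misses constantly
    0x400b20: (1000, 400, True),   # reused data, frequent misses
    0x400c30: (1000, 10, True),    # cache-friendly, leave alone
}
for addr, kind in plan_prefetches(profile):
    print(f"{addr:#x}: insert {kind}")
```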
16.
  • Khan, Muneeb, et al. (author)
  • A case for resource efficient prefetching in multicores
  • 2014
  • In: Proc. International Symposium on Performance Analysis of Systems and Software. - : IEEE Computer Society. - 9781479936045 ; , s. 137-138
  • Conference paper (peer-reviewed)
    • Hardware prefetching has proven very effective for hiding memory latency and can speed up some applications by more than 40%. However, this speedup comes at the cost of often prefetching a significant volume of useless data which wastes shared last level cache space and off-chip bandwidth. This directly impacts the performance of co-scheduled applications which compete for shared resources in multicores. This paper explores how a resource-efficient prefetching scheme can benefit performance by conserving shared resources in multicores. We present a framework that uses fast cache modeling to accurately identify memory instructions that benefit most from prefetching. The framework inserts software prefetches in the application only when they benefit performance, and employs cache bypassing whenever possible. These properties help reduce off-chip bandwidth consumption and last-level cache pollution. While single-thread performance remains comparable to hardware prefetching, the full advantage of the scheme is realized when several cores are used and demand for shared resources grows.
  •  
17.
  • Miftakhova, Regina, et al. (author)
  • DNA Methylation in ATRA-treated leukemia cell lines lacking a PML-RAR chromosome translocation
  • 2012
  • In: Anticancer Research. - : International Institute of Anticancer Research. - 0250-7005 .- 1791-7530. ; 32:11, s. 4715-4722
  • Journal article (peer-reviewed)
    • Deficient retinoic acid signaling has been suggested to be an important cause of the clinical inefficacy of all-trans retinoic acid (ATRA) therapy in non-promyelocytic (non-PML) forms of acute myeloid leukemia (AML). The general aim of the present work was to explore novel ways to take advantage of the anti-leukemic potential of ATRA, and, specifically, to search for a synergism between ATRA and epigenetic drugs. Because previous reports have found no major influence of ATRA on DNA methylation, we investigated whether ATRA-mediated differentiation of the U937 and HL-60 AML cell lines, both lacking a PML-retinoic acid receptor (RAR) fusion product, is accompanied by early-appearing and weak changes in CpG methylation. We report that in HL-60 cells, by using a highly quantitative analysis of a set of genes found to be abnormally expressed in AML, polymerase chain reaction (PCR)-amplified p16 gene promoter molecules (each with 15 CpG sites), exhibited a CpG methylation level of 0-4% in untreated cells, which increased to 4-21% after treatment with ATRA for seven days. In contrast to HL-60 cells, U937 cells exhibited a very high CpG methylation level in p16, and ATRA did not influence the promoter methylation of this gene. In the total CCGG sites of the genome, analysed using a methylation-sensitive restriction enzyme, CpG methylation was significantly lower in ATRA-treated HL-60 (p<0.01) and U937 cells (p<0.05) than in controls. Taken together, our findings show that ATRA can influence DNA methylation, and suggest that future research should investigate whether epigenetic modulation may evoke a clinical effect of ATRA in leukemia.
  •  
18.
  • Papadopoulos, Yiannis, et al. (author)
  • Automatic allocation of safety integrity levels
  • 2010
  • In: Proceedings of the 1st Workshop on Critical Automotive applications. - New York : Association for Computing Machinery (ACM). - 9781605589152 ; , s. 7-10
  • Conference paper (peer-reviewed)
    • In this paper, we describe a concept for the automatic allocation of general Safety Integrity Levels (SILs) to subsystems and components of complex hierarchical networked architectures that deliver sets of safety critical functions. The concept is generic and can be adapted to facilitate the safety engineering approach defined in several standards that employ the concept of integrity or assurance levels, including ISO 26262, the emerging automotive safety standard. SIL allocation is facilitated by HiP-HOPS, an automated safety analysis tool, and can be performed in the context of development using EAST-ADL2, an automotive architecture description language. The process rationalizes complex risk allocation and leads to optimal/economic allocation of SILs.
  •  
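Automatic SIL allocation of the kind this entry describes can be posed as a constrained search: SILs assigned to the redundant components protecting a function must together meet the function's required SIL, at minimum cost. The exhaustive search, the sum-based decomposition rule, the component names, and the exponential cost weights below are all illustrative assumptions, not the HiP-HOPS algorithm.

```python
# Minimal sketch of SIL allocation as cost-minimizing search under a
# sum-based decomposition rule (assumed here for illustration).
from itertools import product

def allocate_sils(components, requirements, cost):
    """Return the cheapest SIL assignment (dict) meeting all requirements.
    requirements: list of (required_sil, [component names])."""
    best, best_cost = None, float("inf")
    for sils in product(range(5), repeat=len(components)):
        assign = dict(zip(components, sils))
        if all(sum(assign[c] for c in cs) >= req for req, cs in requirements):
            c = sum(cost(s) for s in sils)
            if c < best_cost:
                best, best_cost = assign, c
    return best

# Two redundant sensors protect a SIL-3 function; an actuator alone
# carries a SIL-2 function. Cost grows steeply with SIL.
comps = ["sensor_a", "sensor_b", "actuator"]
reqs = [(3, ["sensor_a", "sensor_b"]), (2, ["actuator"])]
alloc = allocate_sils(comps, reqs, cost=lambda s: 10 ** s)
print(alloc)
```

The steep cost function makes the search split the SIL-3 requirement across the two sensors rather than burdening either one with SIL 3 alone, which is the economic effect the abstract points to.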
19.
  • Pardy, Stephen A., et al. (author)
  • THE LYMAN ALPHA REFERENCE SAMPLE. III. PROPERTIES OF THE NEUTRAL ISM FROM GBT AND VLA OBSERVATIONS
  • 2014
  • In: Astrophysical Journal. - 0004-637X .- 1538-4357. ; 794:2
  • Journal article (peer-reviewed)
    • We present new Hi imaging and spectroscopy of the 14 UV-selected star-forming galaxies in the Lyman Alpha Reference Sample (LARS), aimed for a detailed study of the processes governing the production, propagation, and escape of Ly alpha photons. New Hi spectroscopy, obtained with the 100 m Green Bank Telescope (GBT), robustly detects the Hi spectral line in 11 of the 14 observed LARS galaxies (although the profiles of two of the galaxies are likely confused by other sources within the GBT beam); the three highest redshift galaxies are not detected at our current sensitivity limits. The GBT profiles are used to derive fundamental Hi line properties of the LARS galaxies. We also present new pilot Hi spectral line imaging of five of the LARS galaxies obtained with the Karl G. Jansky Very Large Array (VLA). This imaging localizes the Hi gas and provides a measurement of the total Hi mass in each galaxy. In one system, LARS 03 (UGC 8335 or Arp 238), VLA observations reveal an enormous tidal structure that extends over 160 kpc from the main interacting systems and that contains >10(9) M-circle dot of Hi. We compare various Hi properties with global Ly alpha quantities derived from Hubble Space Telescope measurements. The measurements of the Ly alpha escape fraction are coupled with the new direct measurements of Hi mass and significantly disturbed Hi velocities. Our robustly detected sample reveals tentative correlations between the total Hi mass and linewidth, and key Ly alpha tracers. Further, on global scales, these data support a complex coupling between Ly alpha propagation and the Hi properties of the surrounding medium.
  •  
20.
  • Sandberg, Andreas, 1984-, et al. (author)
  • A simple statistical cache sharing model for multicores
  • 2011
  • In: Proc. 4th Swedish Workshop on Multi-Core Computing. - Linköping, Sweden : Linköping University. ; , s. 31-36
  • Conference paper (other academic/artistic)
    • The introduction of multicores has made analysis of shared resources, such as shared caches and shared DRAM bandwidth, an important topic to study. We present two simple, but accurate, cache sharing models that use high-level data that can easily be measured on existing systems. We evaluate our model using a simulated multicore processor with four cores and a shared L2 cache. Our evaluation shows that we can predict average sharing in groups of four benchmarks with an average error smaller than 0.79% for random caches and 1.34% for LRU caches.
  •  
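A statistical sharing model of the kind this entry describes can be sketched as a fixed point: in a randomly replaced shared cache, each application's share tends toward the fraction of insertions (misses) it contributes, which in turn depends on the share it already holds. The iteration scheme and the miss-ratio curves below are made-up illustrations, not the paper's model.

```python
# Illustrative fixed-point sketch of a simple statistical cache sharing
# model driven by per-application miss-ratio curves.

def share_fixed_point(miss_curves, cache_size, iters=100):
    """miss_curves: list of functions, miss rate as a function of the
    cache space an application receives. Returns predicted shares."""
    n = len(miss_curves)
    shares = [cache_size / n] * n  # start from an even split
    for _ in range(iters):
        rates = [f(s) for f, s in zip(miss_curves, shares)]
        total = sum(rates)
        # Random replacement: share is proportional to insertion rate.
        shares = [cache_size * r / total for r in rates]
    return shares

# Two applications: one whose misses drop quickly with more cache, and a
# streaming one that misses at a constant rate regardless of its share.
curves = [lambda s: 1.0 / (1.0 + s), lambda s: 0.5]
a, b = share_fixed_point(curves, cache_size=8.0)
print(f"shares: {a:.2f} vs {b:.2f}")
```

The streaming application ends up with the larger share, illustrating why high-miss-rate workloads crowd cache-friendly co-runners out of a randomly replaced shared cache.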
21.
  • Sandberg, Andreas, 1984-, et al. (author)
  • A Software Technique for Reducing Cache Pollution
  • 2010
  • In: Proc. 3rd Swedish Workshop on Multi-Core Computing. - Göteborg, Sweden : Chalmers University of Technology. ; , s. 59-62
  • Conference paper (other academic/artistic)
    • Contention for shared cache resources has been recognized as a major bottleneck for multicores—especially for mixed workloads of independent applications. While most modern processors implement instructions to manage caches, these instructions are largely unused due to a lack of understanding of how to best leverage them. We propose an automatic, low-overhead method to reduce cache contention by finding instructions that are prone to cache thrashing and a method to automatically disable caching for such instructions. Practical experiments demonstrate that our software-only method can improve application performance up to 35% on x86 multicore hardware.
  •  
22.
  • Sandberg, Andreas, 1984-, et al. (author)
  • Efficient techniques for predicting cache sharing and throughput
  • 2012
  • In: Proc. 21st International Conference on Parallel Architectures and Compilation Techniques. - New York : ACM Press. - 9781450311823 ; , s. 305-314
  • Conference paper (peer-reviewed)
    • This work addresses the modeling of shared cache contention in multicore systems and its impact on throughput and bandwidth. We develop two simple and fast cache sharing models for accurately predicting shared cache allocations for random and LRU caches. To accomplish this we use low-overhead input data that captures the behavior of applications running on real hardware as a function of their shared cache allocation. This data enables us to determine how much and how aggressively data is reused by an application depending on how much shared cache it receives. From this we can model how applications compete for cache space, their aggregate performance (throughput), and bandwidth. We evaluate our models for two- and four-application workloads in simulation and on modern hardware. On a four-core machine, we demonstrate an average relative fetch ratio error of 6.7% for groups of four applications. We are able to predict workload bandwidth with an average relative error of less than 5.2% and throughput with an average error of less than 1.8%. The model can predict cache size with an average error of 1.3% compared to simulation.
  •  
23.
  • Sandberg, Andreas, 1984-, et al. (author)
  • Full Speed Ahead : Detailed Architectural Simulation at Near-Native Speed
  • 2014
  • Reports (other academic/artistic)
    • Popular microarchitecture simulators are typically several orders of magnitude slower than the systems they simulate. This leads to two problems: First, due to the slow simulation rate, simulation studies are usually limited to the first few billion instructions, which corresponds to less than 10% of the execution time of many standard benchmarks. Since such studies only cover a small fraction of the applications' execution, they run the risk of reporting unrepresentative application behavior unless sampling strategies are employed. Second, the high overhead of traditional simulators makes them unsuitable for hardware/software co-design studies where rapid turn-around is required. In spite of previous efforts to parallelize simulators, most commonly used full-system simulators remain single threaded. In this paper, we explore a simple and effective way to parallelize sampling full-system simulators. In order to simulate at high speed, we need to be able to efficiently fast-forward between sample points. We demonstrate how hardware virtualization can be used to implement highly efficient fast-forwarding in the standard gem5 simulator and how this enables efficient execution between sample points. This extremely rapid fast-forwarding enables us to reach new sample points much quicker than a single sample can be simulated. Together with efficient copying of simulator state, this enables parallel execution of sample simulation. These techniques allow us to implement a highly scalable sampling simulator that exploits sample-level parallelism. We demonstrate how virtualization can be used to fast-forward simulators at 90% of native execution speed on average. Using virtualized fast-forwarding, we demonstrate a parallel sampling simulator that can be used to accurately estimate the IPC of standard workloads with an average error of 2.2% while still reaching an execution rate of 2.0 GIPS (63% of native) on average. We demonstrate that our parallelization strategy scales almost linearly and simulates one core at up to 93% of its native execution rate, 19,000x faster than detailed simulation, while using 8 cores.
  •  
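The sample-level parallelism this entry describes has a simple structure: a fast path races ahead to successive sample points while detailed simulation of each sample runs concurrently in workers. The sketch below only mimics that structure; the "fast-forward" and "detailed" functions are deterministic stand-ins, not gem5 or hardware virtualization.

```python
# Conceptual sketch of sample-level parallelism in a sampling simulator.
from concurrent.futures import ThreadPoolExecutor

def fast_forward(state, n_insts):
    """Cheap functional execution between sample points (no timing)."""
    return state + n_insts  # stand-in: state is just an instruction count

def detailed_sample(state, n_insts=1000):
    """Expensive detailed simulation of a short sample; returns an IPC
    estimate (here a deterministic stand-in value)."""
    return 1.0 + (state % 7) * 0.1

def sampling_simulate(n_samples, interval):
    state, futures = 0, []
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(n_samples):
            # Detailed simulation of the sampled state proceeds in a
            # worker while the fast path moves on to the next sample.
            futures.append(pool.submit(detailed_sample, state))
            state = fast_forward(state, interval)
        ipcs = [f.result() for f in futures]
    return sum(ipcs) / len(ipcs)

print(f"estimated IPC: {sampling_simulate(10, 100_000):.2f}")
```

Because fast-forwarding reaches the next sample point long before a sample finishes simulating, many samples are in flight at once, which is where the near-linear scaling reported above comes from.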
24.
  • Sandberg, Anders, et al. (author)
  • Model-Based Safety Engineering of Interdependent Functions in Automotive Vehicles Using EAST-ADL2
  • 2010
  • In: COMPUTER SAFETY, RELIABILITY, AND SECURITY. - Berlin, Heidelberg : Springer. - 9783642156502
  • Conference paper (peer-reviewed)
    • For systems where functions are distributed but share support for computation, communication, environment sensing and actuation, it is essential to understand how such functions can affect each other. Preliminary Hazard Analysis (PHA) is the task through which safety requirements are established. This is usually a document-based process where each system function is analyzed alone, making it difficult to reason about the commonalities of related functional concepts and the distribution of safety mechanisms across a system-of-systems. This paper presents a model-based approach to PHA with the EAST-ADL2 language and in accordance with the ISO/DIS 26262 standard. The language explicitly supports the definition and handling of requirements, functions and technical solutions, and their various relations and constraints as a coherent whole with multiple views. We show in particular the engineering needs for a systematic approach to PHA and the related language features for precise modeling of requirements, user functionalities, system operation contexts, and the derived safety mechanisms.
  •  
25.
  • Sandberg, Andreas, et al. (author)
  • Neutral gas in Lyman-alpha emitting galaxies Haro 11 and ESO 338-IG04 measured through sodium absorption
  • 2013
  • In: Astronomy and Astrophysics. - : EDP Sciences. - 0004-6361 .- 1432-0746. ; 552
  • Journal article (peer-reviewed)
    • Context. The Lyman alpha emission line of neutral hydrogen is an important tool for finding galaxies at high redshift, thus for probing the structure of the early universe. However, the resonance nature of the line and its sensitivity to dust and neutral gas is still not fully understood. Aims. We present measurements of the velocity, covering fraction and optical depth of neutral gas in front of two well-known, local blue compact galaxies that show Lyman alpha in emission: ESO 338-IG 04 and Haro 11. We thus observationally test the hypothesis that Lyman alpha can escape through neutral gas by being Doppler shifted out of resonance. Methods. We present integral field spectroscopy, obtained with the GIRAFFE/Argus spectrograph at VLT/FLAMES in Paranal, Chile. The excellent wavelength resolution allowed us to accurately measure the velocity of the ionized and neutral gas through the H alpha emission and Na D absorption, which trace the ionized medium and cold interstellar gas, respectively. We also present independent measurements from the VLT/X-shooter spectrograph that confirm our results. Results. For ESO 338-IG 04 we measure no significant shift of neutral gas: the best fit velocity offset is -15 +/- 16 km s(-1). For Haro 11, we see an outflow from knot B at 44 +/- 13 km s(-1), and infalling gas towards knot C with 32 +/- 12 km s(-1). Based on the relative strength of the Na D absorption lines, we estimate low covering fractions of neutral gas (down to 10%) in all three cases. Conclusions. The Na D absorption most likely occurs in dense clumps with higher column densities than the medium in which the bulk of the Ly alpha scattering takes place. Still, we find no strong correlation between outflowing neutral gas and strong Ly alpha emission. The Ly alpha photons from these two galaxies are therefore likely to be escaping due to a low column density and/or covering fraction.
  •  
26.
  • Sandberg, Andreas, 1984-, et al. (author)
  • Reducing Cache Pollution Through Detection and Elimination of Non-Temporal Memory Accesses
  • 2010
  • In: Proc. International Conference for High Performance Computing, Networking, Storage and Analysis. - Piscataway, NJ : IEEE. - 9781424475575 ; , s. 11-
  • Conference paper (peer-reviewed) abstract
    • Contention for shared cache resources has been recognized as a major bottleneck for multicores—especially for mixed workloads of independent applications. While most modern processors implement instructions to manage caches, these instructions are largely unused due to a lack of understanding of how to best leverage them. This paper introduces a classification of applications into four cache usage categories. We discuss how applications from different categories affect each other's performance indirectly through cache sharing and devise a scheme to optimize such sharing. We also propose a low-overhead method to automatically find the best per-instruction cache management policy. We demonstrate how the indirect cache-sharing effects of mixed workloads can be tamed by automatically altering some instructions to better manage cache resources. Practical experiments demonstrate that our software-only method can improve application performance up to 35% on x86 multicore hardware.
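The cache-bypass idea in the abstract (keeping non-temporal, streaming accesses out of the shared cache so they cannot evict a co-runner's reusable data) can be illustrated with a toy LRU model. This is a sketch only: the cache size, working-set sizes, and access patterns below are invented for illustration and are not taken from the paper.

```python
from collections import OrderedDict

class LRUCache:
    """Tiny fully associative cache model with LRU replacement."""

    def __init__(self, n_lines):
        self.n_lines = n_lines
        self.lines = OrderedDict()

    def access(self, line):
        """Touch one cache line; return 1 on hit, 0 on miss."""
        if line in self.lines:
            self.lines.move_to_end(line)
            return 1
        if len(self.lines) >= self.n_lines:
            self.lines.popitem(last=False)  # evict least recently used
        self.lines[line] = True
        return 0

def run(bypass_streaming):
    """Co-run a reuse-heavy and a streaming application on one shared cache.

    Returns the hit ratio seen by the reuse application. When the
    streaming application's accesses are treated as non-temporal and
    bypass the cache, the reuse application's working set stays resident.
    """
    cache = LRUCache(n_lines=64)
    reuse_hits = reuse_accesses = 0
    stream_addr = 10_000  # streaming data is disjoint from the reuse set
    for _ in range(100):  # interleaved "scheduling rounds"
        for line in range(48):            # reuse application: 48-line working set
            reuse_hits += cache.access(line)
            reuse_accesses += 1
        for _ in range(32):               # streaming application: no reuse at all
            stream_addr += 1
            if not bypass_streaming:      # non-temporal accesses skip the cache
                cache.access(stream_addr)
    return reuse_hits / reuse_accesses
```

In this toy setup the reuse application hits almost always when the streaming accesses bypass the cache, and essentially never when they are allowed to pollute it, which is the effect that automatically classifying and altering cache-polluting instructions is meant to exploit.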
  •  
27.
  • Sandberg, Andreas, 1984- (author)
  • Understanding Multicore Performance : Efficient Memory System Modeling and Simulation
  • 2014
  • Doctoral thesis (other academic/artistic) abstract
    • To increase performance, modern processors employ complex techniques such as out-of-order pipelines and deep cache hierarchies. While the increasing complexity has paid off in performance, it has become harder to accurately predict the effects of hardware/software optimizations in such systems. Traditional microarchitectural simulators typically execute code 10 000×–100 000× slower than native execution, which leads to three problems: First, high simulation overhead makes it hard to use microarchitectural simulators for tasks such as software optimization, where rapid turn-around is required. Second, when multiple cores share the memory system, the resulting performance is sensitive to how memory accesses from the different cores interleave. This requires that applications are simulated multiple times with different interleavings to estimate their performance distribution, which is rarely feasible with today's simulators. Third, the high overhead limits the size of the applications that can be studied. This is usually solved by only simulating a relatively small number of instructions near the start of an application, with the risk of reporting unrepresentative results.
In this thesis we demonstrate three strategies to accurately model multicore processors without the overhead of traditional simulation. First, we show how microarchitecture-independent memory access profiles can be used to drive automatic cache optimizations and to qualitatively classify an application's last-level cache behavior. Second, we demonstrate how high-level performance profiles, which can be measured on existing hardware, can be used to model the behavior of a shared cache. Unlike previous models, we predict the effective amount of cache available to each application and the resulting performance distribution due to different interleavings without requiring a processor model. Third, in order to model future systems, we build an efficient sampling simulator. By using native execution to fast-forward between samples, we reach new samples much faster than a single sample can be simulated. This enables us to simulate multiple samples in parallel, resulting in almost linear scalability and a maximum simulation rate close to native execution.
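The sampling strategy in the final paragraph (detailed simulation of short windows, with everything in between fast-forwarded) can be sketched as a toy model. The synthetic per-instruction cost, window size, and sample count below are invented for illustration; in the real simulator the fast-forwarding is native hardware execution, not a Python loop.

```python
import random

def instruction_cost(i):
    """Synthetic per-instruction cost with three program phases.

    This stands in for detailed (slow) simulation of one instruction.
    """
    return (1.0, 2.5, 1.5)[(i // 50_000) % 3]

def full_simulation(n):
    """Reference: simulate every instruction in detail."""
    return sum(instruction_cost(i) for i in range(n)) / n

def sampled_simulation(n, n_samples=50, window=1_000, seed=1):
    """Simulate only short windows in detail.

    Everything between two windows is skipped (fast-forwarded) rather
    than simulated; the average cost over the sampled windows estimates
    the average over the whole run.
    """
    rng = random.Random(seed)
    total = simulated = 0
    for _ in range(n_samples):
        start = rng.randrange(n - window)
        total += sum(instruction_cost(i) for i in range(start, start + window))
        simulated += window
    return total / simulated
```

With n = 600 000 instructions the sampled estimate simulates under a tenth of the trace yet lands close to the full average; since the samples are independent, they could also be simulated in parallel, which is the source of the scalability mentioned above.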
  •  
28.
  • Sandberg, Frida, et al. (author)
  • Circadian variation in dominant atrial fibrillation frequency in persistent atrial fibrillation
  • 2010
  • In: Physiological Measurement. - : IOP Publishing. - 0967-3334 .- 1361-6579. ; 31:4, s. 531-542
  • Journal article (peer-reviewed) abstract
    • Circadian variation in atrial fibrillation (AF) frequency is explored in this paper by employing recent advances in signal processing. Once the AF frequency has been estimated and tracked by a hidden Markov model approach, the resulting trend is analyzed for the purpose of detecting and characterizing the presence of circadian variation. With cosinor analysis, the results show that the short-term variations in the AF frequency exceed the variation that may be attributed to circadian rhythm. Using the autocorrelation method, circadian variation was found in 13 of 18 ambulatory ECG recordings (Holter) acquired from patients with long-standing persistent AF. Using the ensemble correlation method, the highest AF frequency usually occurred during the afternoon, whereas the lowest usually occurred during late night. It is concluded that circadian variation is present in most patients with long-standing persistent AF, though the short-term variation in the AF frequency is considerable and should be taken into account.
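Cosinor analysis, as used in the abstract above, fits a single sinusoid with a fixed (here 24-hour) period to a time series. A minimal sketch on synthetic data (not the paper's AF-frequency trends; the sample values are invented for illustration):

```python
import math

def cosinor_fit(t, y, period=24.0):
    """Fit y(t) ~ mesor + amplitude * cos(2*pi*t/period + acrophase).

    Assumes uniform sampling over an integer number of periods, which
    makes the regressors orthogonal, so the least-squares solution
    reduces to simple projections onto cos and sin.
    """
    w = 2.0 * math.pi / period
    n = len(t)
    mesor = sum(y) / n                      # rhythm-adjusted mean
    b = 2.0 / n * sum(yi * math.cos(w * ti) for ti, yi in zip(t, y))
    c = 2.0 / n * sum(yi * math.sin(w * ti) for ti, yi in zip(t, y))
    amplitude = math.hypot(b, c)
    acrophase = math.atan2(-c, b)           # peak timing, in radians
    return mesor, amplitude, acrophase

# Synthetic 3-day recording: a 24 h rhythm around 6 Hz, sampled every 30 min.
t = [0.5 * i for i in range(144)]
y = [6.0 + 0.4 * math.cos(2.0 * math.pi * ti / 24.0 - 1.0) for ti in t]
mesor, amplitude, acrophase = cosinor_fit(t, y)
```

On this noise-free example the fit recovers the generating parameters (mesor 6.0, amplitude 0.4, acrophase −1.0 rad); with real Holter-derived trends the residual around the fitted sinusoid quantifies the short-term variation discussed in the abstract.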
  •  
29.
  • Sandberg, Fredrik (author)
  • Recognition of Prior Learning in Health Care : From a Caring Ideology and Power, to Communicative Action and Recognition
  • 2012
  • Doctoral thesis (other academic/artistic) abstract
    • During the last decades, Recognition of Prior Learning (RPL) has become a more frequently used method to recognise adults' prior learning. This thesis analyses a process of RPL in health care, where health care assistants are assessed against subjects in the upper-secondary health care program. Prior research on RPL is to a high degree non-theoretical, and its focus is primarily on policy-level research. This thesis adds to the field by developing a critical social theory perspective on RPL. In the thesis, the RPL process is analysed through Jürgen Habermas' theory of communicative action and Axel Honneth's theory of recognition. General questions posed are: What are the power issues in the RPL process? What implications does the tension between the lifeworld of work and the system of education have? What consequences do mutual understanding and communication have for the outcome of the RPL process? What part does recognition play for the participants? The results disclose the power relations that emerge in the relationship between participants and teachers. A caring ideology is developed and problematised. Further, the importance of mutual understanding between participants and teachers in the assessment of prior learning is discussed, focusing on the consequences that a lack of mutual understanding could have for the outcome of such assessments. On a macro level, the analysis demonstrates the tension between the participants' prior learning and the educational system's demand for formalising prior learning. In addition, analyses of a more developmental character, intended to show the potential for critical learning, change and recognition, are presented. The results suggest that communicative action can be used to develop RPL into processes focusing on critical learning and change. Recognition of traits and abilities could also enhance individuals' positive relations with the self. Such recognition could develop self-confidence, and thus RPL could encourage learning and cultivate continuing self-realisation through work.
  •  
30.
  • Sandberg, Fredrik, 1976-, et al. (author)
  • The interactive researcher as a virtual participant : A Habermasian interpretation
  • 2013
  • In: Action Research. - : Sage Publications. - 1476-7503 .- 1741-2617. ; 11:2, s. 194-212
  • Journal article (peer-reviewed) abstract
    • This article explores the role of the interactive researcher by drawing on Jürgen Habermas’s theory of communicative action to develop the concept of virtual participant. An ideal interactive research project is used to explore the issues faced by interactive researchers in three phases – initial, implementation and conclusion. In each phase, an interactive research project is used to demonstrate the issues that are discussed. First, this article argues that the concept of communicative rationality can be helpful in understanding how mutually trusting relationships between practitioners and researchers can be established at the beginning of a project. Second, it argues that the idea of taking a virtual stand on validity claims can be used during a project to engage a performative attitude and achieve mutual understanding with actors in the practice system. Third, this article argues that the concept of the virtual participant can explain how the interactive researcher can engage in performative action without becoming captive to the practice system. The concept of the virtual participant helps to enhance understanding of the complexity of the role of the interactive researcher.
  •  
31.
  • Schulze, S., et al. (author)
  • GRB 120422A/SN 2012bz : Bridging the gap between low- and high-luminosity gamma-ray bursts
  • 2014
  • In: Astronomy and Astrophysics. - : EDP Sciences. - 0004-6361 .- 1432-0746. ; 566
  • Journal article (peer-reviewed) abstract
    • Context. At low redshift, a handful of gamma-ray bursts (GRBs) have been discovered with luminosities that are substantially lower (L_iso ≲ 10^48.5 erg s⁻¹) than the average of more distant ones (L_iso ≳ 10^49.5 erg s⁻¹). It has been suggested that the properties of several low-luminosity (low-L) GRBs are due to shock break-out, as opposed to the emission from ultrarelativistic jets. This has led to much debate about how the populations are connected.
Aims. The burst of 2012 April 22 at redshift z = 0.283 is one of the very few examples of intermediate-L GRBs, with a gamma-ray luminosity of L_iso ~ 10^49.6-49.9 erg s⁻¹, that have been detected up to now. With the robust detection of its accompanying supernova SN 2012bz, it has the potential to answer important questions on the origin of low- and high-L GRBs and the GRB-SN connection.
Methods. We carried out a spectroscopy campaign using medium- and low-resolution spectrographs with 6-10 m class telescopes, which covered a time span of 37.3 days, and a multi-wavelength imaging campaign, which ranged from radio to X-ray energies over a duration of ~270 days. Furthermore, we used a tuneable filter centred at Hα to map star formation in the host and the surrounding galaxies. We used these data to extract and model the properties of different radiation components and fitted the spectral energy distribution to extract the properties of the host galaxy.
Results. Modelling the light curve and spectral energy distribution from the radio to the X-rays revealed that the blast wave expanded with an initial Lorentz factor of Γ₀ ~ 50, a low value in comparison to high-L GRBs, and that the afterglow had an exceptionally low peak luminosity density of ≲2 × 10^30 erg s⁻¹ Hz⁻¹ in the sub-mm. Because of the weak afterglow component, we were able to recover, for the first time, the signature of a shock break-out in an event that was not a genuine low-L GRB. At 1.4 h after the burst, the stellar envelope had a blackbody temperature of k_B T ~ 16 eV and a radius of ~7 × 10^13 cm (both in the observer frame). The accompanying SN 2012bz reached a peak luminosity of M_V = −19.7 mag, which is 0.3 mag more luminous than SN 1998bw. The synthesised nickel mass of 0.58 M_⊙, ejecta mass of 5.87 M_⊙, and kinetic energy of 4.10 × 10^52 erg were among the highest for GRB-SNe, which makes it the most luminous spectroscopically confirmed SN to date. Nebular emission lines at the GRB location were visible, extending from the galaxy nucleus to the explosion site. The host and the explosion site had close-to-solar metallicity. The burst occurred in an isolated star-forming region with an SFR that is 1/10 of that in the galaxy's nucleus.
Conclusions. While the prompt gamma-ray emission points to a high-L GRB, the weak afterglow and the low Γ₀ are very atypical for such a burst. Moreover, the detection of the shock break-out signature is a new quality for high-L GRBs. So far, shock break-outs were exclusively detected for low-L GRBs, while GRB 120422A had an intermediate L_iso of ~10^49.6-49.9 erg s⁻¹. Therefore, we conclude that GRB 120422A was a transition object between low- and high-L GRBs, which supports the failed-jet model connecting low-L GRBs driven by shock break-outs with high-L GRBs powered by ultra-relativistic jets.
  •  
32.
  • Spégel, Peter, et al. (author)
  • Glucose-dependent insulinotropic polypeptide lowers branched chain amino acids in hyperglycemic rats
  • 2014
  • In: Regulatory Peptides. - : Elsevier BV. - 0167-0115 .- 1873-1686. ; 189, s. 11-16
  • Journal article (peer-reviewed) abstract
    • Hypersecretion of the incretin hormone glucose-dependent insulinotropic polypeptide (GIP) has been associated with obesity and glucose intolerance. This condition has been suggested to be linked to GIP resistance. Besides its insulinotropic effect, GIP also directly affects glucose uptake and lipid metabolism. This notwithstanding, the effects of GIP on circulating metabolites other than glucose have not been thoroughly investigated. Here, we examined the effects of infusion of various concentrations of GIP in normo- and hyperglycemic rats on serum metabolite profiles. We found that, despite a decrease in serum glucose levels (−26%, p < 0.01), the serum metabolite profile was largely unaffected by GIP infusion in normoglycemic rats. Interestingly, levels of branched-chain amino acids (BCAAs) and the ketone body β-hydroxybutyrate were decreased by 21% (p < 0.05) and 27% (p < 0.001), respectively, in hyperglycemic rats infused with 60 ng/ml GIP. Hence, our data suggest that GIP provokes a decrease in BCAA levels and ketone body production. Increased concentrations of these metabolites have been associated with obesity and type 2 diabetes (T2D).
  •  
33.
  • Zackrisson, Erik, et al. (author)
  • Hunting for dark halo substructure using submilliarcsecond-scale observations of macrolensed radio jets
  • 2013
  • In: Monthly notices of the Royal Astronomical Society. - : Oxford University Press (OUP). - 0035-8711 .- 1365-2966. ; 431:3, s. 2172-2183
  • Journal article (peer-reviewed) abstract
    • Dark halo substructure may reveal itself through secondary, small-scale gravitational lensing effects on light sources that are macrolensed by a foreground galaxy. Here, we explore the prospects of using Very Long Baseline Interferometry (VLBI) observations of multiply imaged quasar jets to search for submilliarcsecond-scale image distortions produced by various forms of dark substructure in the 10^3-10^8 M_⊙ mass range. We present lensing simulations relevant for the angular resolutions attainable with the existing European VLBI Network, the global VLBI array, and an upcoming observing mode in which the Atacama Large Millimeter Array (ALMA) is connected to the global VLBI array. While observations of this type would not be sensitive to standard cold dark matter subhaloes, they can be used to detect the more compact forms of halo substructure predicted in alternative structure formation scenarios. By mapping approximately five strongly lensed systems, it should be possible to detect or robustly rule out primordial black holes in the 10^3-10^6 M_⊙ mass range if they constitute ≳1 per cent of the dark matter in these lenses. Ultracompact minihaloes are harder to detect using this technique, but 10^6-10^8 M_⊙ ultracompact minihaloes could in principle be detected if they constitute ≳10 per cent of the dark matter.
  •  
34.
  • Östlin, Göran, et al. (author)
  • THE Lyα REFERENCE SAMPLE. I. SURVEY OUTLINE AND FIRST RESULTS FOR MARKARIAN 259
  • 2014
  • In: Astrophysical Journal. - 0004-637X .- 1538-4357. ; 797:1
  • Journal article (peer-reviewed) abstract
    • The Lyα Reference Sample (LARS) is a substantial program with the Hubble Space Telescope (HST) that provides a sample of local-universe laboratory galaxies in which to study the detailed astrophysics of the visibility and strength of the Lyα line of neutral hydrogen. Lyα is the dominant spectral line in use for characterizing high-redshift (z) galaxies. This paper presents an overview of the survey, its selection function, and the HST imaging observations. The sample was selected from the combined GALEX+Sloan Digital Sky Survey catalog at z = 0.028-0.19, in order to allow Lyα to be captured with combinations of long-pass filters in the Solar Blind Channel (SBC) of the Advanced Camera for Surveys (ACS) onboard HST. In addition, LARS utilizes Hα and Hβ narrowband and u, b, i broadband imaging with ACS and the Wide Field Camera 3 (WFC3). In order to study galaxies in which large numbers of Lyα photons are produced (whether or not they escape), we demanded an Hα equivalent width W(Hα) ≥ 100 Å. The final sample of 14 galaxies covers far-UV (FUV, λ ~ 1500 Å) luminosities that overlap with those of high-z Lyα emitters (LAEs) and Lyman break galaxies (LBGs), making LARS a valid comparison sample. We present the reduction steps used to obtain the Lyα images, including our LARS eXtraction software (LaXs), which utilizes pixel-by-pixel spectral synthesis fitting of the energy distribution to determine and subtract the continuum at Lyα. We demonstrate that the use of SBC long-pass filter combinations increases the signal-to-noise ratio by an order of magnitude compared to the nominal Lyα filter available in the SBC. To exemplify the science potential of LARS, we also present some first results for a single galaxy, Mrk 259 (LARS #1). This irregular galaxy shows bright and extended (indicative of resonance scattering) but strongly asymmetric Lyα emission. Spectroscopy from the Cosmic Origins Spectrograph onboard HST, centered on the brightest UV knot, shows a moderate outflow in the neutral interstellar medium (probed by low-ionization-stage absorption features) and Lyα emission with an asymmetric profile. Radiative transfer modeling is able to reproduce the essential features of the Lyα line profile and confirms the presence of an outflow. From the integrated photometry we measure a Lyα luminosity of L_Lyα = 1.3 × 10^42 erg s⁻¹, an equivalent width W(Lyα) = 45 Å, and an FUV absolute magnitude M_FUV = −19.2 (AB). Mrk 259 would hence be detectable in high-z Lyα and LBG surveys. The total Lyα escape fraction is 12%. This number is higher than the low-z average, but similar to that at z > 4, demonstrating that LARS provides a valid comparison sample for high-z galaxy studies.
  •  
Type of publication
journal article (14)
conference paper (11)
reports (3)
doctoral thesis (2)
book chapter (2)
editorial collection (1)
research review (1)
Type of content
peer-reviewed (23)
other academic/artistic (11)
Author/Editor
Guaita, Lucia (4)
Johansson, Rolf (2)
Lönn, Henrik (2)
Huge-Brodin, Maria (2)
Rehme, Jakob (2)
Sandberg, Erik (2)
Sollerman, Jesper (2)
Xu, D. (1)
Wright, D. (1)
Martin, S. (1)
Winters, J.M. (1)
Rossi, A. (1)
Page, K. L. (1)
Cano, Z. (1)
Kann, D. A. (1)
Covino, S. (1)
Smedby, Karin E. (1)
Pandey, S. B. (1)
D'Elia, V (1)
Goldoni, P. (1)
Tagliaferri, G. (1)
Greiner, J. (1)
Schady, P. (1)
Sandberg, Frida (1)
Holmberg, Lars (1)
Mansouri, Larry (1)
Juliusson, Gunnar (1)
Abele, Andreas (1)
Papadopoulos, Yianni ... (1)
Reiser, Mark-Oliver (1)
Weber, Matthias (1)
Lindqvist, Andreas (1)
Wierup, Nils (1)
Persson, Magnus (1)
Geisler, Christian H (1)
Nilsson, Annika (1)
O'Brien, P. T. (1)
Fejes, Andreas, 1977 ... (1)
Klose, S. (1)
Krühler, T. (1)
Castro-Tirado, A. J. (1)
Fynbo, J. P. U. (1)
Hjorth, J. (1)
Levan, A. J. (1)
Milvang-Jensen, B. (1)
Perley, D. A. (1)
Pignata, G. (1)
Tanvir, N. R. (1)
Wiersema, K. (1)
Spégel, Peter (1)
University
Uppsala University (14)
Stockholm University (8)
Linköping University (5)
Lund University (5)
University of Gothenburg (2)
Umeå University (2)
Royal Institute of Technology (2)
Karolinska Institutet (2)
Örebro University (1)
Malmö University (1)
Chalmers University of Technology (1)
RISE (1)
Högskolan Dalarna (1)
Language
English (32)
Swedish (2)
Research subject (UKÄ/SCB)
Natural sciences (14)
Engineering and Technology (8)
Medical and Health Sciences (5)
Social Sciences (5)
