SwePub
Search the SwePub database


Search: WFRF:(Maguire Jr. Gerald Q.)

  • Result 1-50 of 332
1.
  • Eriksson, Thomas, et al. (author)
  • Are low-dose CT scans a satisfactory substitute for stereoradiographs for migration studies? A preclinical test of low-dose CT scanning protocols and their application in a pilot patient.
  • 2019
  • In: Acta Radiologica. - : Sage Publications. - 0284-1851 .- 1600-0455.
  • Journal article (peer-reviewed)abstract
    • BACKGROUND: Computed tomography (CT) has the potential to acquire the data needed for migration studies of orthopedic joint implants of patients who have had tantalum beads implanted at the time of joint replacement surgery. This can be accomplished with the same precision as radiostereometric analysis (RSA). Switching to CT would increase availability without the need for the specific facilities required for RSA. However, higher effective dose is a concern. PURPOSE: To investigate if migration measurements can be done with CT with an accuracy and effective dose comparable to that of conventional RSA. MATERIAL AND METHODS: Fourteen scanning protocols were tested in a hip phantom that incorporated tantalum beads and an uncemented femoral stem. The protocols were graded for clinical practice according to the three parameters of image quality, effective dose, and robustness of numerical data. After grading, the two protocols that graded best overall were applied to a pilot patient. RESULTS: All protocols produced scans in which the numerical data were sufficient for a migration analysis at least as precise as would be expected using RSA. A protocol with an effective dose of 0.70 mSv was shown to be applicable in a pilot patient. CONCLUSION: Low-dose CT scans with an effective dose comparable to a set of routine plain radiographs can be used for precise migration measurements.
  •  
2.
  • Försth, Peter, 1966-, et al. (author)
  • Motion Analysis in Lumbar Spinal Stenosis With Degenerative Spondylolisthesis : A Feasibility Study of the 3DCT Technique Comparing Laminectomy Versus Bilateral Laminotomy.
  • 2018
  • In: Clinical spine surgery. - : Wolters Kluwer. - 2380-0186 .- 2380-0194. ; 31:8, s. E397-E402
  • Journal article (peer-reviewed)abstract
    • Study Design: This was a randomized radiologic biomechanical pilot study in vivo. Objective: The objectives of this study were to evaluate if 3-dimensional computed tomography is a feasible tool in motion analyses of the lumbar spine and to study if preservation of segmental midline structures offers less postoperative instability compared with central decompression in patients with lumbar spinal stenosis with degenerative spondylolisthesis. Summary of Background Data: The role of segmental instability after decompression is controversial. Validated techniques for biomechanical evaluation of segmental motion in human live subjects are lacking. Methods: In total, 23 patients (mean age, 68 y) with typical symptoms and magnetic resonance imaging findings of spinal stenosis with degenerative spondylolisthesis (>3 mm) in 1 or 2 adjacent lumbar levels from L3 to L5 were included. They were randomized to either laminectomy (LE) or bilateral laminotomy (LT) (preservation of the midline structures). Documentation of segmental motion was made preoperatively and 6 months postoperatively with CT in provoked flexion and extension. Analyses of movements were performed with validated software. The accuracy of this method is 0.6 mm in translation and 1 degree in rotation. Patient-reported outcome measures were collected from the Swespine register preoperatively and 2 years postoperatively. Results: The mean preoperative values for 3D rotation and translation were 6.2 degrees and 1.8 mm. The mean increase in 3D rotation 6 months after surgery was 0.25 degrees after LT and 0.7 degrees after LE (P=0.79), while the mean increase in 3D translation was 0.15 mm after LT and 1.1 mm after LE (P=0.42). Both surgeries demonstrated significant improvement in patient-reported outcome measures 2 years postoperatively. Conclusions: The 3D computed tomography technique proved to be a feasible tool in the evaluation of segmental motion in this group of older patients. 
There was negligible increase in segmental motion after decompressive surgery. LE with removal of the midline structures did not create a greater instability compared with when these structures were preserved.
  •  
3.
  • Goldvasser, Dov, et al. (author)
  • In vivo and ex vivo measurement of polyethylene wear in total hip arthroplasty
  • 2014
  • In: Acta Orthopaedica. - : Medical Journals Sweden AB. - 1745-3674 .- 1745-3682. ; 85:3, s. 271-275
  • Journal article (peer-reviewed)abstract
    • Background - Determination of the amount of wear in a polyethylene liner following total hip arthroplasty (THA) is important for both the clinical care of individual patients and the development of new types of liners. Patients and methods - We measured in vivo wear of the polyethylene liner using computed tomography (CT) (obtained in the course of regular clinical care) and compared it to coordinate-measuring machine (CMM) readings. Also, changes in liner thickness of the same retrieved polyethylene liner were measured using a micrometer, and were compared to CT and CMM measurements. The distance between the centers of the acetabular cup and femoral head component was measured in 3D CT, using a semi-automatic analysis method. CMM readings were performed on each acetabular liner and data were analyzed using 3D computer-aided design software. Micrometer readings compared the thickest and thinnest regions of the liner. We analyzed 10 THA CTs and retrievals that met minimal requirements for CT slice thickness and explanted cup condition. Results - For the 10 cups, the mean difference between the CT readings and the CMM readings was -0.09 (-0.38 to 0.20) mm. This difference was not statistically significant (p = 0.6). Between CT and micrometer, the mean difference was 0.11 (-0.33 to 0.55) mm. This difference was not statistically significant (p = 0.6). Interpretation - Our results show that CT imaging is ready to be used as a tool in clinical wear measurement of polyethylene liners used in THA.
  •  
4.
  • Hatherly, Robert, et al. (author)
  • Technical Requirements for Na18F PET Bone Imaging of Patients Being Treated Using a Taylor Spatial Frame.
  • 2014
  • In: Journal of nuclear medicine technology. - : Society of Nuclear Medicine. - 1535-5675 .- 0091-4916.
  • Journal article (peer-reviewed)abstract
    • Diagnosis of new bone growth in patients with compound tibia fractures or deformities treated using a Taylor spatial frame is difficult with conventional radiography because the frame obstructs the images and creates artifacts. The use of Na(18)F PET studies may help to eliminate this difficulty. METHODS: Patients were positioned on the pallet of a clinical PET/CT scanner and made as comfortable as possible with their legs immobilized. One bed position covering the site of the fracture, including the Taylor spatial frame, was chosen for the study. A topogram was performed, as well as diagnostic and attenuation correction CT. The patients were given 2 MBq of Na(18)F per kilogram of body weight. A 45-min list-mode acquisition was performed starting at the time of injection, followed by a 5-min static acquisition 60 min after injection. The patients were examined 6 wk after the Taylor spatial frame had been applied and again at 3 mo to assess new bone growth. RESULTS: A list-mode reconstruction sequence of 1 × 1,800 and 1 × 2,700 s, as well as the 5-min static scan, allowed visualization of regional bone turnover. CONCLUSION: With Na(18)F PET/CT, it was possible to confirm regional bone turnover as a means of visualizing bone remodeling without the interference of artifacts from the Taylor spatial frame. Furthermore, dynamic list-mode acquisition allowed different sequences to be performed, enabling, for example, visualization of tracer transport from blood to the fracture site.
  •  
5.
  • Weidenhielm, Lars, et al. (author)
  • Prosthetic liner wear in total hip replacement : a longitudinal 13-year study with computed tomography.
  • 2018
  • In: Skeletal Radiology. - : Springer. - 0364-2348 .- 1432-2161. ; 47:6, s. 883-887
  • Journal article (peer-reviewed)abstract
    • This case report follows a woman who had a total hip replacement in 1992 when she was 45 years old. Six serial computed tomography (CT) examinations over a period of 13 years provided information that allowed her revision surgery to be limited to liner replacement as opposed to replacement of the entire prosthesis. Additionally, they provided data that ruled out the presence of osteolysis and indeed none was found at surgery. In 2004, when the first CT was performed, the 3D distance the femoral head had penetrated into the cup was determined to be 2.6 mm. By 2017, femoral head penetration had progressed to 5.0 mm. The extracted liner showed wear at the thinnest part to be 5.5 mm, as measured with a micrometer. The use of modern CT techniques can identify problems, while still correctable without major surgery. Furthermore, the ability of CT to assess the direction of wear revealed that the liner wear changed from the cranial to dorsal direction.
  •  
6.
  • West, Jay B., et al. (author)
  • Comparison and evaluation of retrospective intermodality image registration techniques
  • 1997
  • In: SPIE - The International Society for Optical Engineering. - : SPIE - International Society for Optical Engineering. ; , s. 332-347
  • Conference paper (peer-reviewed)abstract
    • All retrospective image registration methods have attached to them some intrinsic estimate of registration error. However, this estimate of accuracy may not always be a good indicator of the distance between actual and estimated positions of targets within the cranial cavity. This paper describes a project whose principal goal is to use a prospective method based on fiducial markers as a 'gold standard' to perform an objective, blinded evaluation of the accuracy of several retrospective image-to-image registration techniques. Image volumes of three modalities – CT, MR, and PET – were taken of patients undergoing neurosurgery at Vanderbilt University Medical Center. These volumes had all traces of the fiducial markers removed, and were provided to project collaborators outside Vanderbilt, who then performed retrospective registrations on the volumes, calculating transformations from CT to MR and/or from PET to MR, and communicated their transformations to Vanderbilt, where the accuracy of each registration was evaluated. In this evaluation the accuracy is measured at multiple 'regions of interest,' i.e., areas in the brain which would commonly be areas of neurological interest. A region is defined in the MR image and its centroid C is determined. Then the prospective registration is used to obtain the corresponding point C' in CT or PET. To this point the retrospective registration is then applied, producing a point C'' in MR. Statistics are gathered on the target registration error (TRE), which is the disparity between the original point C and its corresponding point C''. A second goal of the project is to evaluate the importance of correcting geometrical distortion in MR images, by comparing the retrospective TRE in the rectified images, i.e., those which have had the distortion correction applied, with that of the same images before rectification. 
This paper presents preliminary results of this study along with a brief description of each registration technique and an estimate of both preparation and execution time needed to perform the registration.
  •  
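The TRE evaluation described above (map a centroid C to C' with the gold-standard registration, then back to C'' with the registration under test, and measure the disparity from C) can be sketched in a few lines. This is an illustrative reconstruction, not code from the Vanderbilt project; transforms are modeled as 4x4 homogeneous matrices and all names are invented:

```python
import numpy as np

def apply(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p."""
    return (T @ np.append(p, 1.0))[:3]

def target_registration_error(c_mr, gold_mr_to_ct, retro_ct_to_mr):
    """Disparity between the original point C and C'' = retro(gold(C))."""
    c_ct = apply(gold_mr_to_ct, c_mr)      # C' in CT (prospective, gold standard)
    c_back = apply(retro_ct_to_mr, c_ct)   # C'' back in MR (retrospective)
    return float(np.linalg.norm(c_back - c_mr))

# A retrospective registration that exactly inverts the gold standard has TRE 0.
gold = np.eye(4)
gold[:3, 3] = [10.0, -5.0, 2.0]            # pure translation MR -> CT
tre = target_registration_error(np.array([1.0, 2.0, 3.0]), gold, np.linalg.inv(gold))
```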
8.
  • Aguilar, Antonio, et al. (author)
  • Positive Patient Identification using RFID and Wireless Networks
  • 2006
  • In: Proceedings of the HISI 11th Annual Conference and Scientific Symposium, Dublin, Ireland. - Dublin, Ireland.
  • Conference paper (peer-reviewed)abstract
    • The increased focus on patient safety in hospitals has yielded a flood of new technologies and tools seeking to improve the quality of patient care at the point-of-care. Hospitals are complex institutions by nature, and are constantly challenged to improve the quality of healthcare delivered to patients while trying to reduce the rate of medical errors and improve patient safety. Here a simple mistake such as patient misidentification, specimen misidentification, wrong medication, or wrong blood transfusion can cause the loss of a patient's life. The focus of this paper is the implementation and evaluation of a handheld-based patient identification system that uses radio frequency identification (RFID) and 802.11b wireless networks to identify patients. In this approach, each patient is given a RFID wristband which contains demographic information (patient ID number, patient summary, hospital code) of the patient. A handheld device equipped with 802.11b wireless connectivity and a RFID reader is then used by the medical staff to read the patient's wristband and identify the patient. This work was carried out at the Department of Medical Physics and Bioengineering at the University College Hospital Galway, Ireland and in co-operation with the National University of Ireland, Galway.
  •  
9.
  • Ahlin, Lars, et al. (author)
  • Autonomous Tactical Communications : Possibilities and Problems
  • 1997
  • In: MILCOM 97 Proceedings. - 0780342496 ; , s. 393-397
  • Conference paper (peer-reviewed)abstract
    • In the battlefield of the future, more and more information will be available for making decisions on a tactical level, provided that this information can be dispersed rapidly and accurately. As a consequence, advanced tactical decision support that now is limited to advanced platforms (e.g., combat aircraft) will become available at a much lower level, ranging from different kinds of vehicles down to the individual soldier by means of ultra-lightweight "wearable" equipment. Establishing reliable wireless communications in such a large group of users with unprecedented bandwidth demands and requirements on survivability constitutes a considerable engineering challenge. In the paper we will, after a short review of some existing approaches, investigate the specific engineering challenges and the fundamental limitations of such low-level, autonomous communication systems. Further, we give an example of a system architecture harmonized with a proposed structure for third-generation commercial wireless systems (e.g., UMTS). Our conclusions show that mainly distributed computing complexity, device power consumption, and available bandwidth constitute the fundamental problems.
  •  
10.
  • Aitken, Candice L., et al. (author)
  • Comparison of three methods used for fusion of SPECT-CT images of liver metastases
  • 1998
  • In: Fusion98, International Conference on Multisource-Multisensor Information Fusion. - : CSREA Press. - 1892512009 ; , s. 435-442
  • Conference paper (peer-reviewed)abstract
    • We compare three methods for fusing SPECT-CT images: ImageMatch - an automatic three-dimensional/two-dimensional method developed by Focus Imaging; IBM Visualization Data Explorer - a three-dimensional interactive method developed by International Business Machines, Inc.; and qsh - an interactive three-dimensional/two-dimensional method developed at New York University. While many fusion methods have proved successful for registering brain images, most methods have been less successful for thoracic and abdominal images. We use images of liver metastases obtained with a radiolabeled breast tumor-directed antibody to illustrate the strengths and weaknesses of the methods reviewed. The images used are typical clinical images from eight patients. We conclude that an optimal image fusion program should combine the strengths of each of the methods reviewed.
  •  
11.
  • Aitken, Candice L., et al. (author)
  • Tumor localization and image registration of 18-FDG SPECT scans with CT scans
  • 1999
  • In: Journal of Nuclear Medicine. - : Society of Nuclear Medicine. - 0161-5505 .- 1535-5667. ; 40:5, s. 290P-291P
  • Journal article (peer-reviewed)abstract
    • PURPOSE: The aim of this study was to determine the feasibility of registering routine clinical F-18 fluorodeoxyglucose (FDG) coincidence detection (CD) scans with computed tomographic (CT) scans for radiation treatment planning and case management. METHODS: F-18 FDG CD and chest CT scans, performed in 10 randomly selected patients with confirmed or possible adenocarcinoma of the lung, were evaluated. The quality of the matches was verified by comparisons of the center-to-center distance between a region of interest (ROI) manually drawn on the CT slice and warped onto the CD slice with an ROI drawn manually directly on the CD slice. In addition, the overlap between the two ROIs was calculated. RESULTS: All 10 F-18 FDG CD and CT scans were registered with good superimposition of soft tissue density on increased radionuclide activity. The center-to-center distance between the ROIs ranged from 0.29 mm to 8.08 mm, with an average center-to-center distance of 3.89 mm +/- 2.42 mm (0.69 pixels +/- 0.34 pixels). The ROI overlap ranged from 77% to 99%, with an average of 90% +/- 5.6%. CONCLUSIONS: Although the use of F-18 FDG CD shows great promise for the identification of tumors, it shares the same drawbacks as those associated with radiolabeled monoclonal antibody SPECT and ligand-based positron emission tomographic scans in that anatomic markers are limited. This study shows that image registration is feasible and may improve the clinical relevance of CD images.
  •  
12.
  • Aitken, Candice L., et al. (author)
  • Tumor localization and image registration of F-18FDG coincidence detection scans with computed tomographic scans
  • 2002
  • In: Clinical Nuclear Medicine. - : Ovid Technologies (Wolters Kluwer Health). - 0363-9762 .- 1536-0229. ; 27:4, s. 275-282
  • Journal article (peer-reviewed)abstract
    • Purpose: The aim of this study was to determine the feasibility of registering routine clinical F-18 fluorodeoxyglucose (FDG) coincidence detection (CD) scans with computed tomographic (CT) scans for radiation treatment planning and case management. Methods: F-18 FDG CD and chest CT scans, performed in 10 randomly selected patients with confirmed or possible adenocarcinoma of the lung, were evaluated. The quality of the matches was verified by comparisons of the center-to-center distance between a region of interest (ROI) manually drawn on the CT slice and warped onto the CD slice with an ROI drawn manually directly on the CD slice. In addition, the overlap between the two ROIs was calculated. Results: All 10 F-18 FDG CD and CT scans were registered with good superimposition of soft tissue density on increased radionuclide activity. The center-to-center distance between the ROIs ranged from 0.29 mm to 8.08 mm, with an average center-to-center distance of 3.89 mm +/- 2.42 mm (0.69 pixels +/- 0.34 pixels). The ROI overlap ranged from 77% to 99%, with an average of 90% +/- 5.6%. Conclusions: Although the use of F-18 FDG CD shows great promise for the identification of tumors, it shares the same drawbacks as those associated with radiolabeled monoclonal antibody SPECT and ligand-based positron emission tomographic scans in that anatomic markers are limited. This study shows that image registration is feasible and may improve the clinical relevance of CD images.
  •  
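The two match-quality measures used in the study above (center-to-center distance between a warped ROI and a directly drawn one, plus their overlap) are straightforward to compute on binary masks. A minimal sketch; the paper does not state its exact overlap definition, so the fraction-of-the-smaller-ROI convention here is an assumption:

```python
import numpy as np

def roi_metrics(roi_a, roi_b):
    """Compare two binary ROI masks: center-to-center distance (in pixels)
    and overlap as the fraction of the smaller ROI covered by the other."""
    ca = np.argwhere(roi_a).mean(axis=0)       # centroid of ROI A
    cb = np.argwhere(roi_b).mean(axis=0)       # centroid of ROI B
    dist = float(np.linalg.norm(ca - cb))
    inter = np.logical_and(roi_a, roi_b).sum()
    overlap = inter / min(roi_a.sum(), roi_b.sum())
    return dist, overlap

# Two 4x4-pixel square ROIs offset by one pixel horizontally.
a = np.zeros((10, 10), bool); a[3:7, 3:7] = True
b = np.zeros((10, 10), bool); b[3:7, 4:8] = True
dist, overlap = roi_metrics(a, b)
```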
13.
  • Alcala, Yvonne, et al. (author)
  • Qualifying CT for wrist arthroplasty : Extending techniques for total hip arthroplasty to total wrist arthroplasty
  • 2005
  • In: Medical Imaging 2005. - : SPIE - The International Society for Optical Engineering. - 0819457213 ; , s. 1155-1164
  • Conference paper (peer-reviewed)abstract
    • The purpose of this study was to extend previous work to detect migration of total wrist arthroplasty non-invasively, and with greater accuracy. Two human cadaverous arms, each with a cemented total wrist implant, were used in this study. In one of the arms, 1 mm tantalum balls were implanted, six in the carpal bones and five in the radius. Five CT scans of each arm were acquired, changing the position of the arm each time to mimic different positions patients might take on repeated examinations. Registration of CT volume data sets was performed using an extensively validated, 3D semi-automatic volume fusion tool in which co-homologous point pairs (landmarks) are chosen on each volume to be registered. Three sets of ten cases each were obtained by placing landmarks on 1) bone only (using only arm one), 2) tantalum implants only, and 3) bone and tantalum implants (both using only arm two). The accuracy of the match was assessed visually in 2D and 3D, and numerically by calculating the distance difference between the actual position of the transformed landmarks and their ideal position (i.e., the reference landmark positions). All cases were matched visually within one width of cortical bone and numerically within one half CT voxel (0.32 mm, p = 0.05). This method matched only the bone/arm and not the prosthetic component per se, thus making it possible to detect prosthetic movement and wear. This method was clinically used for one patient with pain. Loosening of the carpal prosthetic component was accurately detected and this was confirmed at surgery.
  •  
15.
  • Anderlind, Eva, et al. (author)
  • Will haptic feedback speed up medical imaging? An application to radiation treatment planning
  • 2008
  • In: Acta Oncologica. - OSLO, Norge : Taylor & Francis. - 0284-186X .- 1651-226X. ; 47:1, s. 32-37
  • Journal article (peer-reviewed)abstract
    • Haptic technology enables us to incorporate the sense of touch into computer applications, providing an additional input/output channel. The purpose of this study was to examine if haptic feedback can help physicians and other practitioners to interact with medical imaging and treatment planning systems. A haptic application for outlining target areas (a key task in radiation therapy treatment planning) was implemented and then evaluated via a controlled experiment with ten subjects. Even though the sample size was small, and the application only a prototype, results showed that haptic feedback can significantly increase (p < 0.05) the speed of outlining target volumes and organs at risk. No significant differences were found regarding precision or perceived usability. This promising result warrants further development of a full haptic application for this task. Improvements to the usability of the application as well as to the forces generated have been implemented and an experiment with more subjects is planned.
  •  
16.
  • Barbette, Tom, 1990-, et al. (author)
  • A High-Speed Load-Balancer Design with Guaranteed Per-Connection-Consistency
  • 2020
  • In: Proceedings of the 17th USENIX Symposium on Networked Systems Design and Implementation, NSDI 2020. - Santa Clara, CA, USA : USENIX Association. ; , s. 667-683
  • Conference paper (peer-reviewed)abstract
    • Large service providers use load balancers to dispatch millions of incoming connections per second towards thousands of servers. There are two basic yet critical requirements for a load balancer: uniform load distribution of the incoming connections across the servers and per-connection-consistency (PCC), i.e., the ability to map packets belonging to the same connection to the same server even in the presence of changes in the number of active servers and load balancers. Yet, meeting both these requirements at the same time has been an elusive goal. Today's load balancers minimize PCC violations at the price of non-uniform load distribution. This paper presents Cheetah, a load balancer that supports uniform load distribution and PCC while being scalable, memory efficient, resilient to clogging attacks, and fast at processing packets. The Cheetah LB design guarantees PCC for any realizable server selection load balancing mechanism and can be deployed in both a stateless and stateful manner, depending on the operational needs. We implemented Cheetah on both a software and a Tofino-based hardware switch. Our evaluation shows that a stateless version of Cheetah guarantees PCC, has negligible packet processing overheads, and can support load balancing mechanisms that reduce the flow completion time by a factor of 2–3×.
  •  
17.
  • Barbette, Tom, 1990-, et al. (author)
  • Cheetah : A High-Speed Programmable Load-Balancer Framework with Guaranteed Per-Connection-Consistency
  • 2022
  • In: IEEE/ACM Transactions on Networking. - : Institute of Electrical and Electronics Engineers (IEEE). - 1063-6692 .- 1558-2566. ; 30:1, s. 354-367
  • Journal article (peer-reviewed)abstract
    • Large service providers use load balancers to dispatch millions of incoming connections per second towards thousands of servers. There are two basic yet critical requirements for a load balancer: uniform load distribution of the incoming connections across the servers, which requires support for advanced load balancing mechanisms, and per-connection-consistency (PCC), i.e., the ability to map packets belonging to the same connection to the same server even in the presence of changes in the number of active servers and load balancers. Yet, simultaneously meeting these requirements has been an elusive goal. Today's load balancers minimize PCC violations at the price of non-uniform load distribution. This paper presents Cheetah, a load balancer that supports advanced load balancing mechanisms and PCC while being scalable, memory efficient, fast at processing packets, and offers resilience to clogging attacks comparable to that of today's load balancers. The Cheetah LB design guarantees PCC for any realizable server selection load balancing mechanism and can be deployed in both stateless and stateful manners, depending on operational needs. We implemented Cheetah on both a software and a Tofino-based hardware switch. Our evaluation shows that a stateless version of Cheetah guarantees PCC, has negligible packet processing overheads, and can support load balancing mechanisms that reduce the flow completion time by a factor of 2-3 ×.
  •  
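Cheetah's PCC guarantee can be illustrated by letting the connection itself carry an authenticated cookie naming the chosen server, so the load balancer needs no per-connection table. A toy sketch: the real system embeds an obfuscated cookie in existing packet-header fields (such as the TCP timestamp option), and the key, cookie layout, and helper names below are invented:

```python
import hmac, hashlib

SECRET = b"lb-secret"                          # invented obfuscation key

def make_cookie(server_id: int) -> bytes:
    """Cookie = 2-byte server index plus a 2-byte MAC, so the load balancer
    can recover the server statelessly and reject forged cookies."""
    sid = server_id.to_bytes(2, "big")
    tag = hmac.new(SECRET, sid, hashlib.sha256).digest()[:2]
    return sid + tag

def dispatch(cookie, servers, pick_server):
    """First packet of a connection carries no cookie: choose a server with
    any load-balancing policy and mint a cookie. Later packets carry the
    cookie, so the mapping survives server-pool changes (the PCC property)."""
    if cookie is None:
        sid = pick_server(servers)
        return servers[sid], make_cookie(sid)
    sid_bytes, tag = cookie[:2], cookie[2:]
    if tag != hmac.new(SECRET, sid_bytes, hashlib.sha256).digest()[:2]:
        raise ValueError("forged cookie")
    return servers[int.from_bytes(sid_bytes, "big")], cookie

servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
srv, cookie = dispatch(None, servers, lambda s: 1)             # policy picked server 1
srv_again, _ = dispatch(cookie, servers + ["10.0.0.4"], None)  # pool grew; same server
```

Because the server index travels with the connection, any load-balancing policy can be plugged into `pick_server` without risking PCC violations when the pool is resized.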
18.
  • Barbette, Tom, 1990-, et al. (author)
  • RSS++: load and state-aware receive side scaling
  • 2019
  • In: Proceedings of the 15th International Conference on emerging Networking EXperiments and Technologies. - Orlando, FL, USA : Association for Computing Machinery (ACM). - 9781450369985
  • Conference paper (peer-reviewed)abstract
    • While the current literature typically focuses on load-balancing among multiple servers, in this paper, we demonstrate the importance of load-balancing within a single machine (potentially with hundreds of CPU cores). In this context, we propose a new load-balancing technique (RSS++) that dynamically modifies the receive side scaling (RSS) indirection table to spread the load across the CPU cores in a more optimal way. RSS++ incurs up to 14x lower 95th percentile tail latency and orders of magnitude fewer packet drops compared to RSS under high CPU utilization. RSS++ allows higher CPU utilization and dynamic scaling of the number of allocated CPU cores to accommodate the input load, while avoiding the typical 25% over-provisioning. RSS++ has been implemented for both (i) DPDK and (ii) the Linux kernel. Additionally, we implement a new state migration technique, which facilitates sharding and reduces contention between CPU cores accessing per-flow data. RSS++ keeps the flow-state by groups that can be migrated at once, leading to a 20% higher efficiency than a state-of-the-art shared flow table.
  •  
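The core move in RSS++ (rewriting the NIC's RSS indirection table so hash buckets migrate from overloaded CPU cores to underloaded ones) can be sketched as a greedy loop. This is a simplification with invented names; the actual system balances using per-bucket packet counters and migrates the associated flow state in groups:

```python
def rebalance(table, bucket_load, core_count):
    """Greedily move the hottest hash buckets from the most-loaded core to
    the least-loaded one while doing so strictly reduces the imbalance.
    table[b] is the core serving bucket b; bucket_load[b] is its packet count."""
    core_load = [0] * core_count
    for b, core in enumerate(table):
        core_load[core] += bucket_load[b]
    while True:
        hot = max(range(core_count), key=lambda c: core_load[c])
        cold = min(range(core_count), key=lambda c: core_load[c])
        gap = core_load[hot] - core_load[cold]
        # buckets currently served by the hot core, hottest first
        candidates = sorted((b for b, c in enumerate(table) if c == hot),
                            key=lambda b: -bucket_load[b])
        for b in candidates:
            if bucket_load[b] < gap:      # moving b shrinks the imbalance
                table[b] = cold
                core_load[hot] -= bucket_load[b]
                core_load[cold] += bucket_load[b]
                break
        else:                             # no move helps: done
            return table

# Two cores; core 0 initially serves 90% of the traffic.
table = [0, 0, 0, 0, 1, 1, 1, 1]
load = [30, 30, 20, 10, 2, 2, 3, 3]       # per-bucket packet counts
rebalance(table, load, 2)
```

Each accepted move strictly shrinks the hot/cold gap, so the loop terminates; the real system additionally has to migrate the flow state of the moved buckets, which is why RSS++ groups state per bucket.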
19.
  • Barbette, Tom, 1990-, et al. (author)
  • Stateless CPU-aware datacenter load-balancing
  • 2020
  • In: Poster: Stateless CPU-aware datacenter load-balancing. - New York, NY, USA : Association for Computing Machinery (ACM). ; , s. 548-549
  • Conference paper (peer-reviewed)abstract
    • Today, datacenter operators deploy Load-balancers (LBs) to efficiently utilize server resources, but must over-provision server resources (by up to 30%) because of load imbalances and the desire to bound tail service latency. We posit one of the reasons for these imbalances is the lack of per-core load statistics in existing LBs. As a first step, we designed CrossRSS, a CPU core-aware LB that dynamically assigns incoming connections to the least loaded cores in the server pool. CrossRSS leverages knowledge of the dispatching by each server's Network Interface Card (NIC) to specific cores to reduce imbalances by more than an order of magnitude compared to existing LBs in a proof-of-concept datacenter environment, processing 12% more packets with the same number of cores.
  •  
24.
  • Beadle, H.W.P., et al. (author)
  • Location Aware Mobile Computing
  • 1997
  • In: Proceedings of ICT '97. - : IEEE. ; , s. 1319-1324
  • Conference paper (peer-reviewed)
  •  
26.
  • Beadle, H.W.P., et al. (author)
  • Using location and environment awareness in mobile communications
  • 1997
  • In: Proceedings of the International Conference on Information, Communications and Signal Processing, ICICS. - : IEEE. ; , s. 1781-1785
  • Conference paper (peer-reviewed)abstract
    • We are investigating the use of badge based wearable computers to create highly mobile location and environment aware systems. When coupled to intelligent servers the badges provide an unparalleled platform for human centred information environments. This paper describes the architecture of the badge, its distributed computing environment, and presents initial results of application development trials conducted by a class of telecommunications students at KTH.
  •  
27.
  • Biema, Michael K. van, et al. (author)
  • The constraint-based paradigm : integrating object-oriented and rule-based programming
  • 1990
  • In: Proceedings of the Twenty-Third Annual Hawaii International Conference on System Sciences. Volume 1. - : IEEE Computer Society. ; , s. 358-366
  • Conference paper (peer-reviewed)abstract
    • The authors introduce a novel formalism that combines the object-oriented and rule-based paradigms in an elegant and orthogonal way. The constraint-based model is a generalization of traditional object-oriented paradigms and is based on three orthogonal subparadigms. The first is constraint-based invocation, which is a generalization of the traditional invocation where dispatch is done based on the types of the arguments. In constraint-based invocation, dispatch is done based on constraints that are arbitrary user-defined predicates. The second subparadigm is instance inheritance, a dual to the concept of class inheritance in the sense that class inheritance structures classes and instance inheritance structures instances. The third is procedural attachments (also known as active values or access-oriented programming), where a function is called in a data-driven manner. The semantics of this concept are generalized to all objects in the constraint-based model. A central philosophical argument is that so-called multiparadigm languages should be developed not by combination of paradigms in a partially integrated system, but by their synergistic unification under a new, subsuming paradigm.
  •  
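Constraint-based invocation, i.e., dispatch on arbitrary user-defined predicates over the arguments rather than on their types, can be mimicked in a few lines. A toy Python sketch, not the paper's formalism; the first matching constraint wins:

```python
class ConstraintDispatcher:
    """Invoke the first registered method whose guard predicate accepts the
    arguments: dispatch on arbitrary constraints, not on argument types."""

    def __init__(self):
        self.cases = []                    # (predicate, implementation) pairs

    def when(self, predicate):
        def register(fn):
            self.cases.append((predicate, fn))
            return fn
        return register

    def __call__(self, *args):
        for predicate, fn in self.cases:
            if predicate(*args):
                return fn(*args)
        raise TypeError("no constraint matched the arguments")

area = ConstraintDispatcher()

@area.when(lambda w, h: w == h)            # user-defined constraint, not a type
def square_area(w, h):
    return w * w

@area.when(lambda w, h: True)              # always-true constraint: the fallback
def rect_area(w, h):
    return w * h
```

Here `area(3, 3)` selects the square case and `area(3, 5)` falls through to the rectangle case, purely on the strength of the predicates.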
28.
  • Biema, Michael K. van, et al. (author)
  • The Design and Implementation of a System Level Language for the DADO Parallel Machine
  • 1987
  • In: Proceedings of the Twentieth Annual Hawaii International Conference on System Sciences 1987. Volume 3. - : IEEE and ACM. ; , s. 152-162
  • Conference paper (peer-reviewed)abstract
    • In this paper we describe necessary criteria for the design of parallel system level languages and their support environment. We base the criteria on our own experiences in building a system level language, parallel PSL (Parallel Portable Standard Lisp), for the DADO machine developed at Columbia University. The DADO machine is a special purpose massively parallel binary tree structured architecture. We describe the process of language design and implementation on a 1023 node prototype machine. After generalizing what we have learned from this specific implementation to the generic task of building a system level language for a parallel machine, we conclude with a discussion of desirable characteristics such a language should have to allow the easy transition from a language with explicit parallelism to one where the parallelism is implicit.
  •  
29.
  •  
30.
  • Birnbaum, Bernard A., et al. (author)
  • Hepatic hemangiomas: diagnosis with fusion of MR, CT, and Tc-99m-labeled red blood cell SPECT images
  • 1991
  • In: Radiology. - : Radiological Society of North America. - 0033-8419 .- 1527-1315. ; 181:2, s. 469-474
  • Journal article (peer-reviewed)abstract
    • A method of image analysis was developed for correlation of hemangiomas detected at computed tomography (CT) and/or magnetic resonance (MR) imaging with increased blood pool activity evident at single photon emission CT (SPECT) performed after labeling of red blood cells with technetium-99m. Image analysis was performed in 20 patients with 35 known hepatic hemangiomas. After section thickness and pixel sizes of the different studies were matched, intrinsic landmarks were chosen to identify anatomically corresponding locations. Regions of interest (ROIs) drawn on the CT and/or MR images were translated, rotated, and reprojected to match the areas of interest on the corresponding SPECT images by means of a two-dimensional polynomial-based warping algorithm. Analysis of ROIs on 30 SPECT-MR and 20 SPECT-CT pairs of registered images provided absolute confirmation that 34 suspected hemangiomas identified on SPECT images correlated exactly with lesions seen on CT and/or MR images. Accuracy of fusion was within an average of 1.5 pixels +/- 0.8 (+/- 1 standard deviation). The technique enabled diagnostic confirmation of hemangiomas as small as 1.0 cm and proved useful for evaluating lesions located adjacent to intrahepatic vessels.
  •  
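The two-dimensional polynomial-based warping mentioned in the abstract above can be sketched as a least-squares fit of one low-order polynomial per output coordinate to matched landmark pairs. The sketch below is a generic illustration of such a warp, not the registration code used in the study; all function names and the second-order monomial basis are assumptions.

```python
# A generic landmark-based polynomial warp, fit per coordinate by least
# squares (normal equations), in pure Python.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Second-order monomial basis: 1, x, y, x^2, x*y, y^2
BASIS2 = [
    lambda x, y: 1.0, lambda x, y: x, lambda x, y: y,
    lambda x, y: x * x, lambda x, y: x * y, lambda x, y: y * y,
]

def fit_warp(src, dst, basis=BASIS2):
    """Fit one polynomial per output coordinate to matched landmark pairs."""
    rows = [[f(x, y) for f in basis] for x, y in src]
    m = len(basis)
    coeffs = []
    for axis in (0, 1):
        ata = [[sum(r[i] * r[j] for r in rows) for j in range(m)] for i in range(m)]
        atb = [sum(r[i] * d[axis] for r, d in zip(rows, dst)) for i in range(m)]
        coeffs.append(solve(ata, atb))
    return coeffs

def apply_warp(coeffs, pts, basis=BASIS2):
    """Map points through the fitted per-coordinate polynomials."""
    return [
        tuple(sum(c * f(x, y) for c, f in zip(cs, basis)) for cs in coeffs)
        for x, y in pts
    ]
```

With at least six well-distributed landmarks per coordinate, a second-order fit of this kind reproduces any affine mapping exactly; extra landmarks are absorbed in the least-squares sense.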
31.
  • Bogdanov, Kirill, 1987- (author)
  • Enabling Fast and Accurate Run-Time Decisions in Geo-Distributed Systems : Better Achieving Service Level Objectives
  • 2018
  • Doctoral thesis (other academic/artistic)abstract
    • Computing services are highly integrated into modern society and used by millions of people daily. To meet these high demands, many popular services are implemented and deployed as geo-distributed applications on top of third-party virtualized cloud providers. However, the nature of such a deployment leads to variable performance. To deliver high quality of service, these systems strive to adapt to ever-changing conditions by monitoring changes in state and making informed run-time decisions, such as choosing server peering, replica placement, and redirection of requests. In this dissertation, we seek to improve the quality of run-time decisions made by geo-distributed systems. We attempt to achieve this through: (1) a better understanding of the underlying deployment conditions, (2) systematic and thorough testing of the decision logic implemented in these systems, and (3) by providing a clear view of the network and system states, allowing services to make better-informed decisions. First, we validate an application’s decision logic used in popular storage systems by examining replica selection algorithms. We do this by introducing GeoPerf, a tool that uses symbolic execution and modeling to perform systematic testing of replica selection algorithms. GeoPerf was used to test two popular storage systems and found one bug in each. Then, using measurements across EC2, we observed persistent correlation between network paths and network latency. Based on these observations, we introduce EdgeVar, a tool that decouples routing- and congestion-based changes in network latency. This additional information improves estimation of latency and increases the stability of network path selection. Next, we introduce Tectonic, a tool that tracks an application’s requests and responses at both the user and kernel levels. In combination with EdgeVar, it decouples end-to-end request completion time into three components: network routing, network congestion, and service time. Finally, we demonstrate how this decoupling of request completion time components can be leveraged in practice by developing Kurma, a fast and accurate load balancer for geo-distributed storage systems. At runtime, Kurma integrates network latency and service time distributions to accurately estimate the rate of Service Level Objective (SLO) violations for requests redirected between geo-distributed datacenters. Using real-world data, we demonstrate Kurma’s ability to effectively share load among datacenters while reducing SLO violations by a factor of up to 3 in high-load settings or reducing the cost of running the service by up to 17%. The techniques described in this dissertation are important for current and future geo-distributed services that strive to provide the best quality of service to customers while minimizing the cost of operating the service.
  •  
32.
  • Bogdanov, Kirill, et al. (author)
  • Fast and accurate load balancing for geo-distributed storage systems
  • 2018
  • In: SoCC 2018 - Proceedings of the 2018 ACM Symposium on Cloud Computing. - New York, NY, USA : Association for Computing Machinery (ACM). - 9781450360111 ; , s. 386-400
  • Conference paper (peer-reviewed)abstract
    • The increasing density of globally distributed datacenters reduces the network latency between neighboring datacenters and allows replicated services deployed across neighboring locations to share workload when necessary, without violating strict Service Level Objectives (SLOs). We present Kurma, a practical implementation of a fast and accurate load balancer for geo-distributed storage systems. At run-time, Kurma integrates network latency and service time distributions to accurately estimate the rate of SLO violations for requests redirected across geo-distributed datacenters. Using these estimates, Kurma solves a decentralized rate-based performance model enabling fast load balancing (in the order of seconds) while taming global SLO violations. We integrate Kurma with Cassandra, a popular storage system. Using real-world traces along with a geo-distributed deployment across Amazon EC2, we demonstrate Kurma’s ability to effectively share load among datacenters while reducing SLO violations by up to a factor of 3 in high load settings or reducing the cost of running the service by up to 17%.
  •  
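Kurma's central estimate, described in the two entries above, combines the network latency distribution with the service time distribution to predict the rate of SLO violations for redirected requests. The sketch below is a deliberate simplification, assuming the two empirical distributions can be sampled independently; Kurma's actual rate-based performance model is more involved, and all names here are illustrative.

```python
import random

def slo_violation_rate(rtt_samples, service_samples, slo_ms, n=10000, seed=42):
    """Estimate the fraction of redirected requests whose end-to-end time
    (one network RTT plus remote service time) exceeds the SLO, by drawing
    from the two empirical distributions independently."""
    rng = random.Random(seed)  # seeded for reproducible estimates
    violations = sum(
        rng.choice(rtt_samples) + rng.choice(service_samples) > slo_ms
        for _ in range(n)
    )
    return violations / n
```

A load balancer can evaluate this estimate per candidate datacenter and redirect load only where the predicted violation rate stays within budget.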
33.
  •  
34.
  • Bogdanov, Kirill, 1987- (author)
  • Reducing Long Tail Latencies in Geo-Distributed Systems
  • 2016
  • Licentiate thesis (other academic/artistic)abstract
    • Computing services are highly integrated into modern society. Millions of people rely on these services daily for communication, coordination, trading, and access to information. To meet high demands, many popular services are implemented and deployed as geo-distributed applications on top of third-party virtualized cloud providers. However, the nature of such deployment provides variable performance characteristics. To deliver high quality of service, such systems strive to adapt to ever-changing conditions by monitoring changes in state and making run-time decisions, such as choosing server peering, replica placement, and quorum selection. In this thesis, we seek to improve the quality of run-time decisions made by geo-distributed systems. We attempt to achieve this through: (1) a better understanding of the underlying deployment conditions, (2) systematic and thorough testing of the decision logic implemented in these systems, and (3) by providing a clear view into the network and system states, which allows these services to make better-informed decisions. We performed a long-term cross-datacenter latency measurement of the Amazon EC2 cloud provider. We used this data to quantify the variability of network conditions and demonstrated its impact on the performance of the systems deployed on top of this cloud provider. Next, we validate an application’s decision logic used in popular storage systems by examining replica selection algorithms. We introduce GeoPerf, a tool that uses symbolic execution and lightweight modeling to perform systematic testing of replica selection algorithms. We applied GeoPerf to test two popular storage systems and we found one bug in each. Then, using traceroute and one-way delay measurements across EC2, we demonstrated persistent correlation between network paths and network latency. We introduce EdgeVar, a tool that decouples routing- and congestion-based changes in network latency. By providing this additional information, we improved the quality of latency estimation and increased the stability of network path selection. Finally, we introduce Tectonic, a tool that tracks an application’s requests and responses at both the user and kernel levels. In combination with EdgeVar, it provides a complete view of the delays associated with each processing stage of a request and response. Using Tectonic, we analyzed the impact of sharing CPUs in a virtualized environment and can infer the hypervisor’s scheduling policies. We argue for the importance of knowing these policies and propose to use them in applications’ decision-making processes.
  •  
35.
  • Bogdanov, Kirill, et al. (author)
  • The Nearest Replica Can Be Farther Than You Think
  • 2015
  • In: Proceedings of the ACM Symposium on Cloud Computing 2015. - New York, NY, USA : Association for Computing Machinery (ACM). ; , s. 16-29
  • Conference paper (peer-reviewed)abstract
    • Modern distributed systems are geo-distributed for reasons of increased performance, reliability, and survivability. At the heart of many such systems, e.g., the widely used Cassandra and MongoDB data stores, is an algorithm for choosing a closest set of replicas to service a client request. Suboptimal replica choices due to dynamically changing network conditions result in reduced performance as a result of increased response latency. We present GeoPerf, a tool that tries to automate the process of systematically testing the performance of replica selection algorithms for geo-distributed storage systems. Our key idea is to combine symbolic execution and lightweight modeling to generate a set of inputs that can expose weaknesses in replica selection. As part of our evaluation, we analyzed network round-trip times between geographically distributed Amazon EC2 regions, and showed a significant number of daily changes in nearest-K replica orders. We tested Cassandra and MongoDB using our tool, and found bugs in each of these systems. Finally, we use our collected Amazon EC2 latency traces to quantify the time lost due to these bugs. For example, due to the bug in Cassandra, the median wasted time for 10% of all requests is above 50 ms.
  •  
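The daily changes in nearest-K replica orders measured in the evaluation above can be illustrated with a small helper: rank replicas by measured latency in each snapshot and count how often the nearest-K ordering differs between consecutive snapshots. This is a hypothetical sketch for illustration, not GeoPerf itself.

```python
def nearest_k(latencies, k):
    """Return the k replica ids with the lowest measured latency, closest first.

    latencies: dict mapping replica id -> latency in ms.
    """
    return sorted(latencies, key=latencies.get)[:k]

def order_changes(snapshots, k):
    """Count consecutive latency snapshots whose nearest-k ordering differs."""
    orders = [nearest_k(s, k) for s in snapshots]
    return sum(1 for a, b in zip(orders, orders[1:]) if a != b)
```

Each order change is a point where a replica selection algorithm that caches or smooths its ranking may serve requests from a replica that is no longer among the closest.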
36.
  • Bogdanov, Kirill, et al. (author)
  • Toward Automated Testing of Geo-Distributed Replica Selection Algorithms
  • 2015
  • In: Proceedings of the 2015 ACM Conference on Special Interest Group on Data Communication. - New York, NY, USA : Association for Computing Machinery (ACM). ; , s. 89-90
  • Conference paper (peer-reviewed)abstract
    • Many geo-distributed systems rely on replica selection algorithms to communicate with the closest set of replicas. Unfortunately, the bursty nature of Internet traffic and ever-changing network conditions present a problem in identifying the best choices of replicas. Suboptimal replica choices result in increased response latency and reduced system performance. In this work we present GeoPerf, a tool that tries to automate testing of geo-distributed replica selection algorithms. We used GeoPerf to test Cassandra and MongoDB, two popular data stores, and found bugs in each of these systems.
  •  
37.
  • Bogdanov, Kirill, et al. (author)
  • Toward Automated Testing of Geo-Distributed Replica Selection Algorithms
  • 2015
  • In: Computer communication review. - : Association for Computing Machinery (ACM). - 0146-4833 .- 1943-5819. ; 45:4, s. 89-90
  • Journal article (peer-reviewed)abstract
    • Many geo-distributed systems rely on replica selection algorithms to communicate with the closest set of replicas. Unfortunately, the bursty nature of Internet traffic and ever-changing network conditions present a problem in identifying the best choices of replicas. Suboptimal replica choices result in increased response latency and reduced system performance. In this work we present GeoPerf, a tool that tries to automate testing of geo-distributed replica selection algorithms. We used GeoPerf to test Cassandra and MongoDB, two popular data stores, and found bugs in each of these systems.
  •  
38.
  • Brodén, Cyrus, et al. (author)
  • Accuracy and precision of a CT method for assessing migration in shoulder arthroplasty : an experimental study
  • 2019
  • In: Acta Radiologica. - : Sage Publications. - 0284-1851 .- 1600-0455.
  • Journal article (peer-reviewed)abstract
    • Background: Radiostereometric analysis (RSA) is the gold standard for measuring early implant migration, which is a predictive factor for implant survival. Purpose: To validate an alternative computed tomography (CT) technique to measure implant migration in shoulder arthroplasty. Material and Methods: A cadaver proximal humerus and a scapula, which had tantalum beads incorporated within them, were prepared to accept a short-stemmed humeral component and a two-pegged glenoid component of a commercial total shoulder arthroplasty (TSA) system. A five-degree-of-freedom rig, equipped with micrometers and goniometers, was used to translate and rotate the implant components relative to the respective bone to predetermined positions. Double CT examinations were performed for each position and CT motion analysis software (CTMA) was used to assess these movements. The accuracy and precision of the software were estimated using the rig’s micrometers and goniometers as the gold standard. The technique’s effective dose was also assessed. Results: The accuracy was in the range of 0.07–0.23 mm in translation and 0.22–0.71° in rotation. The precision was in the range of 0.08–0.15 mm in translation and 0.23–0.54° in rotation. The mean effective dose for the CT scans was calculated to be 0.27 mSv. Conclusion: In this experimental setting, the accuracy, precision, and effective dose of the CTMA technique were found to be comparable to those of RSA. Therefore, we believe clinical studies are warranted to determine if CTMA is a suitable alternative to traditional RSA for migration measurements in TSA.
  •  
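Accuracy and precision figures like those reported in the migration studies above are commonly derived from repeated (double) examinations against a known reference. The formulation below is one common convention, stated here as an assumption rather than as the papers' exact statistics: accuracy as the mean absolute deviation from the rig's reference positions, and precision as the spread of differences between paired double examinations.

```python
import statistics

def accuracy(measured, reference):
    """Mean absolute deviation of the measurements from reference values
    (e.g. the rig's micrometer and goniometer settings)."""
    return sum(abs(m - r) for m, r in zip(measured, reference)) / len(measured)

def precision(first_exam, second_exam):
    """Repeatability of paired double examinations: the sample standard
    deviation of the differences between the two measurement series."""
    diffs = [a - b for a, b in zip(first_exam, second_exam)]
    return statistics.stdev(diffs)
```

Under this convention, a precision of 0.1 mm means the two examinations of the same unmoved implant typically disagree by about a tenth of a millimetre.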
39.
  • Brodén, Cyrus, et al. (author)
  • Accuracy and Precision of Three-Dimensional Low Dose CT Compared to Standard RSA in Acetabular Cups : An Experimental Study
  • 2016
  • In: BioMed Research International. - : Hindawi Publishing Corporation. - 2314-6133 .- 2314-6141.
  • Journal article (peer-reviewed)abstract
    • Background and Purpose. The gold standard for detection of implant wear and migration is currently radiostereometry (RSA). The purpose of this study is to compare a three-dimensional computed tomography technique (3D CT) to standard RSA as an alternative technique for measuring migration of acetabular cups in total hip arthroplasty.Materials and Methods. With tantalum beads, we marked one cemented and one uncemented cup and mounted these on a similarly marked pelvic model. A comparison was made between 3D CT and standard RSA for measuring migration. Twelve repeated stereoradiographs and CT scans with double examinations in each position and gradual migration of the implants were made. Precision and accuracy of the 3D CT were calculated.Results. The accuracy of the 3D CT ranged between 0.07 and 0.32 mm for translations and 0.21 and 0.82° for rotation. The precision ranged between 0.01 and 0.09 mm for translations and 0.06 and 0.29° for rotations, respectively. For standard RSA, the precision ranged between 0.04 and 0.09 mm for translations and 0.08 and 0.32° for rotations, respectively. There was no significant difference in precision between 3D CT and standard RSA. The effective radiation dose of the 3D CT method, comparable to RSA, was estimated to be 0.33 mSv.Interpretation. Low dose 3D CT is a comparable method to standard RSA in an experimental setting.
  •  
40.
  • Brown, Lisa G., et al. (author)
  • Landmark-based 3D fusion of SPECT and CT images
  • 1993
  • In: Sensor fusion VI. - : SPIE - International Society for Optical Engineering. - 0819413240 ; , s. 166-174
  • Conference paper (peer-reviewed)abstract
    • In this paper we present interactive visualization procedures for registration of SPECT and CT images based on landmarks. Because of the poor anatomic detail available in many SPECT images, registration of SPECT images with other modalities often requires the use of external markers. These markers may correspond to anatomic structures identifiable in the other modality image. In this work, we present a method to nonrigidly register SPECT and CT images based on automatic marker localization and interactive anatomic localization using 3D surface renderings of skin. The images are registered in 3D by fitting low order polynomials which are constrained to be near rigid. The method developed here exploits 3D information to attain greater accuracy and reduces the amount of time needed for expert interaction.
  •  
41.
  •  
42.
  •  
43.
  •  
44.
  •  
45.
  • Chapnick, J. V., et al. (author)
  • Techniques for multimodality image registration
  • 1993
  • In: Bioengineering, Proceedings of the Northeast Conference. - 0780309251 ; , s. 221-222
  • Conference paper (peer-reviewed)abstract
    • The authors describe the development of techniques used for cross-modality correlation of medical images. To accomplish this goal, software routines were developed which automate and standardize the comparison of images within and between three-dimensional tomographic imaging modalities. Data from phantoms and clinical studies reflect the success of this technique.
  •  
46.
  • Crafoord, Joakim, et al. (author)
  • Comparison of two landmark based image registration methods for use with a body atlas
  • 2000
  • In: Physica medica (Testo stampato). - 1120-1797 .- 1724-191X. ; 16:2, s. 75-82
  • Journal article (peer-reviewed)abstract
    • We describe preliminary work registering abdominal MRI images from three healthy male volunteers. Anatomically selected 3D homologous point pairs (landmarks), from which eigenvalues were generated to form the basis for a 3D non-affine polynomial transformation, were placed on axial slices alone and on axial, coronal and sagittal slices. Registration accuracy was judged visually by comparing superimposed 3D isosurfaces from the reference, untransformed, and transformed volume data and by comparing merged 2D slices projected from the transformed and reference volume data superimposed with 2D isolines. The squared sum of intensity differences between the transformed/untransformed and the reference volume was significant at the 0.05 (p > 0.05) confidence level. The correlation coefficient improved by an average of 38% and the cross correlation between pixel values improved by an average of 22%. In each trial, the standard deviation of the landmarks after transformation was within one voxel and the standard error of the mean was not significantly different from zero at the 0.05 confidence level. Abdominal isosurface volume differences (between individuals) changed from an average of 14.5% before registration to 2.9% after registration. This experiment shows that it is possible to choose landmarks such that abdominal data from different subject volumes can be mapped to a common reference, and thus that it is possible to use this combined volume both to form an atlas and to warp abdominal data from an atlas to a patient volume.
  •  
47.
  • Dewyngaert, J. Keith, et al. (author)
  • Procedure for unmasking localization information from ProstaScint scans for prostate radiation therapy treatment planning
  • 2004
  • In: International Journal of Radiation Oncology, Biology, Physics. - : Elsevier BV. - 0360-3016 .- 1879-355X. ; 60:2, s. 654-662
  • Journal article (peer-reviewed)abstract
    • Purpose: To demonstrate a method to extract the meaningful biologic information from In-111-radiolabeled capromab pendetide (ProstaScint) SPECT scans for use in radiation therapy treatment planning by removing that component of the In-111 SPECT images associated with normal structures. Methods and Materials: We examined 20 of more than 80 patients who underwent simultaneous Tc-99m/In-111 SPECT scans, which were subsequently registered to the corresponding CT/MRI scans. A thresholding algorithm was used to identify Tc-99m uptake associated with blood vessels and CT electron density associated with bone marrow. Corresponding voxels were removed from the In-111 image set. Results: No single threshold value was found to be associated with the Tc-99m uptake that corresponded to the blood vessels. Intensity values were normalized to a global maximum and, as such, were dependent upon the quantity of Tc-99m pooled in the bladder. The reduced ProstaScint volume sets were segmented by use of a thresholding feature of the planning system and superimposed on the CT/MRI scans. Conclusions: ProstaScint images are now closer to becoming a biologically and therapeutically useful and accurate image set. After known sources of normal intensity are stripped away, the remaining areas that demonstrate uptake may be segmented and superimposed on the treatment-planning CT/MRI volume.
  •  
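The unmasking procedure in the abstract above amounts to zeroing In-111 voxels whose co-registered Tc-99m uptake (blood pool) or CT electron density (marrow/bone) exceeds a threshold. The sketch below works over flat lists of co-registered voxel values and is illustrative only; as the paper notes, no single global threshold worked for the Tc-99m channel, so the per-study thresholds are left to the caller.

```python
def mask_normal_uptake(in111, tc99m, density, tc_thresh, density_thresh):
    """Zero out In-111 voxels attributable to normal structures: those whose
    matching Tc-99m blood-pool uptake or CT electron density exceeds the
    given per-study thresholds. All three inputs are co-registered,
    voxel-aligned flat lists."""
    return [
        0 if (t > tc_thresh or d > density_thresh) else v
        for v, t, d in zip(in111, tc99m, density)
    ]
```

The surviving nonzero voxels can then be segmented and superimposed on the treatment-planning CT/MRI volume.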
48.
  •  
49.
  • Duchamp, Daniel J., et al. (author)
  • Software technology for wireless mobile computing
  • 1991
  • In: IEEE Network. - : IEEE. - 0890-8044 .- 1558-156X. ; 5:6, s. 12-18
  • Journal article (peer-reviewed)abstract
    • Some of the possibilities and requirements for mobile computing on wireless local area networks (LANs) are discussed from the systems software viewpoint. The design of the Student Electronic Notebook (SEN) is sketched to provide a partial catalog of problems in building a real system for wireless mobile computing. This project was initiated to investigate the potential of wireless mobile computing to reshape education. Some of the key directions for research in software technology for wireless, mobile computing are examined. Some of the authors' experience with wireless LANs is related.
  •  
50.
  •  
  • Result 1-50 of 332
Type of publication
journal article (144)
conference paper (123)
reports (34)
book chapter (8)
book (5)
doctoral thesis (5)
licentiate thesis (5)
other publication (4)
patent (4)
artistic work (1)
Type of content
peer-reviewed (267)
other academic/artistic (58)
pop. science, debate, etc. (7)
Author/Editor
Maguire Jr., Gerald ... (292)
Noz, Marilyn E. (159)
Zeleznik, Michael P. (56)
Kramer, Elissa L. (40)
Olivecrona, Henrik (31)
Schimpf, James H. (27)
Kostic, Dejan (26)
Horii, Steven C. (24)
Weidenhielm, Lars (19)
Maguire Jr., Gerald ... (16)
Sanger, Joseph J. (15)
Baxter, Brent S. (14)
Maguire, Gerald Q. J ... (12)
Olivecrona, Lotta (11)
Smith, Mark T. (11)
Ioannidis, John (11)
Barbette, Tom, 1990- (10)
Crafoord, Joakim (10)
Katsikas, Georgios P ... (9)
Hitchner, Lewis E. (9)
Roozbeh, Amir, 1983- (9)
Stolfo, Salvatore J. (8)
Megibow, Alec J. (8)
Farshin, Alireza, 19 ... (8)
Chiesa, Marco (7)
Reichert, Frank (7)
Lerner, Mark D. (6)
Birnbaum, Bernard A. (6)
Chapnick, Jeffrey V (6)
Bogdanov, Kirill (6)
Lundblad, Henrik (6)
Erdman, William A. (6)
Svedmark, Per (6)
Reddy, David P. (5)
Correia, Miguel (5)
Jonsson, Cathrine (5)
Yalew, Sileshi Demes ... (5)
Stark, Andreas (5)
Maguire Jr., Gerald ... (5)
Moy, Linda (5)
Mahmoud, Faaiza (4)
Ayani, Rassul (4)
Beadle, H.W.P. (4)
Kostic, Dejan, Profe ... (4)
Maguire Jr., Gerald ... (4)
Jacobsson, Hans (4)
Dewyngaert, J. Keith (4)
Duchamp, Daniel J. (4)
Griss, Martin L. (4)
Liu, Micky (4)
University
Royal Institute of Technology (331)
Karolinska Institutet (35)
Uppsala University (5)
Stockholm University (4)
Mid Sweden University (3)
Umeå University (2)
Lund University (2)
RISE (2)
University of Gothenburg (1)
Language
English (330)
Spanish (2)
Research subject (UKÄ/SCB)
Medical and Health Sciences (143)
Engineering and Technology (118)
Natural sciences (85)
Humanities (9)

Year


 
