SwePub
Search the SwePub database

  Advanced search

Result list for search "L773:0169 2607"

Search: L773:0169 2607

  • Results 1-50 of 110
1.
  • Dewaraja, YK, et al. (authors)
  • A parallel Monte Carlo code for planar and SPECT imaging: implementation, verification and applications in I-131 SPECT
  • 2002
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607. ; 67:2, pp. 115-124
  • Journal article (peer-reviewed) abstract
    • This paper reports the implementation of the SIMIND Monte Carlo code on an IBM SP2 distributed memory parallel computer. Basic aspects of running Monte Carlo particle transport calculations on parallel architectures are described. Our parallelization is based on equally partitioning photons among the processors and uses the Message Passing Interface (MPI) library for interprocessor communication and the Scalable Parallel Random Number Generator (SPRNG) to generate uncorrelated random number streams. These parallelization techniques are also applicable to other distributed memory architectures. A linear increase in computing speed with the number of processors is demonstrated for up to 32 processors. This speed-up is especially significant in Single Photon Emission Computed Tomography (SPECT) simulations involving higher energy photon emitters, where explicit modeling of the phantom and collimator is required. For I-131, the accuracy of the parallel code is demonstrated by comparing simulated and experimental SPECT images from a heart/thorax phantom. Clinically realistic SPECT simulations using the voxel-man phantom are carried out to assess scatter and attenuation correction.
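The partitioning scheme this abstract describes — equally dividing photon histories among processors — can be sketched in a few lines. This is a hedged illustration in Python, not the SIMIND/MPI implementation; the MPI communication and SPRNG stream setup are omitted.

```python
# Illustrative sketch (not the SIMIND source): equal partitioning of
# photon histories among processors. In the real code, each rank would
# simulate its share with an independent SPRNG random-number stream and
# the results would be combined via MPI.
def partition_photons(n_photons, n_procs):
    """Split n_photons into n_procs shares differing by at most one."""
    base, extra = divmod(n_photons, n_procs)
    # The first `extra` ranks simulate one additional history.
    return [base + (1 if rank < extra else 0) for rank in range(n_procs)]

shares = partition_photons(10_000_000, 32)  # one share per processor
```

Because the histories are independent, near-linear speed-up follows directly from shares that differ by at most one history.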
2.
  • Ljungberg, Michael, et al. (authors)
  • A Monte Carlo program for the simulation of scintillation camera characteristics
  • 1989
  • In: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607. ; 29:4, pp. 257-272
  • Journal article (peer-reviewed) abstract
    • There is a need for mathematical modelling for the evaluation of important parameters for photon imaging systems. A Monte Carlo program which simulates medical imaging nuclear detectors has been developed. Different materials can be chosen for the detector, a cover and a phantom. Cylindrical, spherical, rectangular and more complex phantom and source shapes can be simulated. Photoelectric, incoherent, coherent interactions and pair production are simulated. Different detector parameters, e.g. the energy pulse-height distribution and pulse pile-up due to finite decay time of the scintillation light emission, can be calculated. The energy resolution of the system is simulated by convolving the energy imparted with an energy-dependent Gaussian function. An image matrix of the centroid of the events in the detector can be simulated. Simulation of different collimators permits studies of spatial resolution and sensitivity. Comparisons of our results with experimental data and other published results have shown good agreement. The usefulness of the Monte Carlo code for the accurate simulation of important parameters in scintillation camera systems, stationary as well as SPECT (single-photon emission computed tomography) systems, has been demonstrated.
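The energy-resolution step mentioned above — convolving the energy imparted with an energy-dependent Gaussian — might look like the following sketch. The FWHM ∝ √E scaling and all parameter values are assumptions (a common model for NaI(Tl) crystals), not Ljungberg's code.

```python
import math
import random

# Assumed model: detector whose relative FWHM is fwhm_ref at the
# reference energy e_ref, with FWHM scaling as sqrt(E).
def sigma_at(energy_kev, fwhm_ref=0.10, e_ref=140.0):
    """Gaussian sigma (keV) of the energy response at energy_kev."""
    fwhm = fwhm_ref * e_ref * math.sqrt(energy_kev / e_ref)
    return fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # FWHM -> sigma

def smear(energy_kev, rng):
    """Sample a detected pulse height for a given imparted energy."""
    return rng.gauss(energy_kev, sigma_at(energy_kev))

rng = random.Random(1)
pulses = [smear(140.0, rng) for _ in range(10_000)]  # Tc-99m photopeak
```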
3.
  • Edvardsson, Hannes, et al. (authors)
  • Compact and efficient 3D shape description through radial function approximation
  • 2003
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 72:2, pp. 89-97
  • Journal article (peer-reviewed) abstract
    • A fast and simple method for three-dimensional shape description is described. The method views a 3D object as a radial distance function on the unit sphere, and thus reduces the dimensionality of the description problem by one. The radial distance function is approximated by Fourier methods in the basis of the spherical harmonic polynomials. The necessary integration is carried out on the object boundary, rather than on the unit sphere. Consequently, there is no need for a parameterisation of the object surface. The description makes it possible to compare shapes in a computationally very simple way. Solutions on how to cope with translated and rotated objects are discussed. The method is developed for star-shaped objects, but is stable even if the input image is non-star-shaped. The method is tested on a data set from magnetic resonance imaging (MRI) of the brain. Potential medical applications are discussed. © 2002 Elsevier Science Ireland Ltd. All rights reserved.
4.
  • Foracchia, Marco, et al. (authors)
  • POPED, a software for optimal experiment design in population kinetics.
  • 2004
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 74:1, pp. 29-46
  • Journal article (peer-reviewed) abstract
    • Population kinetic analysis is the methodology used to quantify inter-subject variability in kinetic studies. It entails the collection of (possibly sparse) data from dynamic experiments in a group of subjects and their quantitative interpretation by means of a mathematical model. This methodology is widely used in the pharmaceutical industry (where it is termed "pharmacokinetic population analysis") and recently it is becoming increasingly used in other areas of biomedical research. Unlike traditional kinetic studies, where the number of subjects can be quite small, population kinetic studies require large numbers of subjects. It is, therefore, of great interest to design these studies in the most efficient manner possible, to maximize the information content provided by the data. In this paper we propose an algorithm and a computer program, POPED, for the optimal design of a population kinetic experiment. In particular, the number of samples for each subject and the design of the individual sampling strategies, i.e. the number and location of the time points at which the output variable is sampled, will be considered. Among the various criteria proposed in the literature, D and ED optimality are the ones implemented in our software program, since they are the most widely used. A brief description of the techniques employed to perform design optimization is given, together with some details on their actual implementation. Some examples are then presented to show the program usage and the results provided.
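As a toy illustration of the D-optimality criterion this abstract mentions — choosing sampling times that maximize the determinant of the Fisher information — here is a sketch for a one-compartment bolus model. The model, parameter values and candidate times are all assumptions for illustration; POPED itself handles full population models and ED optimality as well.

```python
import math
from itertools import combinations

# Toy model (assumed): C(t) = (dose/vol) * exp(-k*t), parameters (vol, k).
def fim_det(times, dose=100.0, vol=10.0, k=0.1, sigma=1.0):
    """Determinant of the 2x2 Fisher information matrix for (vol, k)."""
    a = b = c = 0.0  # FIM entries [[a, b], [b, c]]
    for t in times:
        e = math.exp(-k * t)
        d_vol = -dose / vol**2 * e      # sensitivity of C(t) w.r.t. vol
        d_k = -dose / vol * t * e       # sensitivity of C(t) w.r.t. k
        a += d_vol * d_vol / sigma**2
        b += d_vol * d_k / sigma**2
        c += d_k * d_k / sigma**2
    return a * c - b * b

# D-optimal two-point design: pick the pair of candidate times that
# maximizes det(FIM) -- one early sample and one near the 1/k time scale.
candidates = [0.5, 1, 2, 4, 8, 16, 24]
best = max(combinations(candidates, 2), key=fim_det)
```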
5.
  • Gustafsson, Mikael, et al. (authors)
  • A distributed image-processing system for measurements of intracellular calcium in living cells
  • 1991
  • In: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 36:4, pp. 199-221
  • Journal article (peer-reviewed) abstract
    • During the last decade, image-processing techniques have been introduced as a valuable tool in biologically oriented research. In combination with novel fluorescent probes, these techniques permit assessment of subcellular distributions of several intracellularly important cations, such as free calcium ions and protons. Typically, systems used for image processing are located centrally around the experimental setup. This configuration has drawbacks, mainly because the laborious extraction and processing of data that generally follow an experimental session limits the access to the system for other investigators. We describe here the principles of a distributed image processing system, based on IBM-compatible personal computers (PCs), that without extra hardware can cope with all the necessary image processing involved in imaging of intracellular cations. The potential of the PC as an image processor, however, reaches beyond this specific application and many image processing tasks can be carried out successfully on a standard PC. Thus, the centrally located dedicated image processor is used only for image acquisition in the experimental situation. This in turn optimizes the utilization of expensive resources and increases efficiency. The mouse-operated software is described in detail, so that interested investigators can extract useful parts for integration into their own applications and experimental environment.
9.
  • Knutsson, Hans, et al. (authors)
  • Spatio-temporal filtering of digital angiography image data
  • 1998
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 57:1-2, pp. 115-123
  • Journal article (peer-reviewed) abstract
    • As welfare diseases become more common all over the world, the demand for angiography examinations is increasing rapidly. The development of advanced medical signal processing methods has, with few exceptions, been concentrated on CT and MR, while traditional contrast-based radiology depends on methods developed for ancient photography techniques, despite the fact that angiography sequences are generally recorded in digital form. This article presents a new approach for processing angiography sequences based on advanced image processing methods. The developed algorithm automatically processes angiography sequences containing motion artifacts that cannot be processed by conventional methods like digital subtraction angiography (DSA) and pixel shift, due to non-uniform motions. The algorithm can in simple terms be described as an ideal pixel-shift filter carrying out shifts of different directions and magnitudes according to the local motions in the image. Unlike conventional methods it is fully automatic: no mask image needs to be defined, and the manual pixel-shift operations, which are extremely time consuming, are eliminated. The algorithm is efficient and robust and is designed to run on standard hardware of a powerful workstation, which removes the need for expensive dedicated angiography platforms. Since there is no need to make additional recordings if the patient moves, the patient is exposed to less radiation and contrast fluid. The most exciting benefit of this method is, however, that it opens up new areas for contrast-based angiography that cannot be processed with conventional methods, e.g. non-uniform motions and multiple layers of moving tissue. Advanced image processing methods provide significantly better image quality and noise suppression, but also provide the means to compute flow velocity and visualize the flow dynamics in the arterial trees, e.g. using color. Initial tests have proven that it is possible to discriminate capillary blood flow from angiography data, which opens up interesting possibilities for estimating the blood flow in the heart muscle without use of nuclear methods.
10.
  • Koch, Sabine, et al. (authors)
  • Controlled Diagnosis-Oriented Enhancement of Automatically Segmented Radiographs in Dentistry
  • 1998
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 57:1-2, pp. 125-131
  • Journal article (peer-reviewed) abstract
    • A method for controlled diagnosis-oriented enhancement of selected regions of interest in intraoral radiographs is presented. Image enhancement is accomplished by adaptive non-linear grey scale transformation depending on the result of objective quality measurement. In order to assure reliable image quality measurement as well as controlled image enhancement, automatic image segmentation is applied to avoid the influence of disturbing factors (e.g. metallic restorations) on quality measurement and image enhancement. Based on existing a priori knowledge about object structure and composition of the selected regions of interest in intraoral radiographs, different image segmentation algorithms and image enhancement procedures were developed for different types of diagnosis. (C) 1998 Elsevier Science Ireland Ltd. All rights reserved.
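The control loop this abstract outlines — measure an objective quality value in the region of interest, then choose a non-linear grey-scale transform accordingly — can be sketched as follows. The gamma-based transform, the contrast measure and the target value are assumed stand-ins, not Koch et al.'s algorithm.

```python
# Sketch of quality-controlled grey-scale enhancement (assumptions:
# contrast = pixel standard deviation, transform = gamma curve).
def contrast(pixels):
    """Standard deviation of grey values, used as the quality measure."""
    mean = sum(pixels) / len(pixels)
    return (sum((p - mean) ** 2 for p in pixels) / len(pixels)) ** 0.5

def gamma_transform(pixels, gamma, max_val=255):
    """Non-linear grey-scale transformation y = max * (x/max)**gamma."""
    return [max_val * (p / max_val) ** gamma for p in pixels]

def enhance(pixels, target=40.0):
    """Try a few gammas; keep the one whose contrast is closest to target."""
    best = min((abs(contrast(gamma_transform(pixels, g)) - target), g)
               for g in (0.5, 0.75, 1.0, 1.5, 2.0))[1]
    return gamma_transform(pixels, best)
```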
11.
  • Kohli, Sunil, et al. (authors)
  • Individuals living in areas with high background radon : a GIS method to identify populations at risk
  • 1997
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 53:2, pp. 105-112
  • Journal article (peer-reviewed) abstract
    • Objective: to identify and link populations and individuals that live within high risk areas. Design: census registers and disease registers which contain data on individuals can only give aggregate statistics relating to postal code districts, town, county or state boundaries. However, environmental risk factors rarely, if ever, respect these man-made boundaries. What is needed is a method to rapidly identify individuals who may live within a described area or region and to further identify the disease(s) occurring among these individuals and/or in these areas. Method: this paper describes a method for linking the standard registers available in Sweden, notably the residence-property addresses they contain and the geographical coordinate setting of these, to map the population as a point coverage. Using standard GIS methods this coverage could be linked, merged or intersected with any other map to create new subsets of population. Representation of populations down to the individual level by automatised spatialisation of available census data is in its simplicity a new informatics method which in the designated GIS medium adds a new power of resolution. Results: we demonstrate this using the radon maps provided by the local communes. The Swedish annual population registration records of 1991 for the county of Östergötland and the property register available at the Central Statistical Bureau of Sweden formed the main data sources. By coupling the address in the population register to the property register, each individual was mapped to the centroid of a property. By intersecting the population coverage with the radon maps, the population living in high, normal or low risk areas was identified and then analysed and stratified by commune, sex and age. The resulting tables can be linked to other databases, e.g. disease registers, to visualise and analyse geographical and related patterns. The methodology can be adapted for use with any other environmental map or small area. It can also be expanded to the fourth dimension by linking likewise available migration information to generate immediately coordinate-set, accumulated exposure and similar data.
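The core spatial operation behind this study — deciding whether a property centroid falls inside a risk-area polygon — is a point-in-polygon query. A minimal ray-casting sketch (illustrative only, not the GIS software used in the study; the polygon and residents are invented):

```python
# Even-odd ray casting: count edge crossings of a horizontal ray.
def point_in_polygon(x, y, poly):
    """Return True if (x, y) lies inside poly, a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

high_risk_zone = [(0, 0), (4, 0), (4, 4), (0, 4)]   # toy radon polygon
residents = {"A": (1, 1), "B": (5, 5)}              # id -> property centroid
at_risk = [pid for pid, (x, y) in residents.items()
           if point_in_polygon(x, y, high_risk_zone)]
```

Intersecting the whole population point coverage with the radon map is then a loop of such queries, after which the matched individuals can be stratified by commune, sex and age.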
13.
  • Noz, Marilyn E., et al. (authors)
  • QSH : a minimal but highly portable image display and handling toolkit
  • 1988
  • In: Computer Methods and Programs in Biomedicine. - : ELSEVIER SCI IRELAND LTD. - 0169-2607 .- 1872-7565. ; 27:3, pp. 229-240
  • Journal article (peer-reviewed) abstract
    • We describe a software system developed to handle images obtained from different sources, namely, computer-assisted tomography, positron emission tomography, single photon emission tomography and magnetic resonance imaging. In developing the system, it was necessary to address the following points. (1) The types of values that were encountered in both the header information and the pixel elements, namely, integers, floating point numbers, complex numbers and strings. (2) The use of domain-dependent sets of keys, that is, how to choose keys and how to stabilize the use of keys among the user population. This is, for example, how information such as the patient name, or the activity in becquerel, is kept. It is necessary to keep both the key values and the units. (3) The development of a method for providing a database using flat files, i.e. linear text. (4) The maintenance of a history of values and operations. This is necessary in order to address the problem of determining how an image was produced. The connection between an image and how it was derived is analogous to describing how a secondary standard is derived from a primary one.
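A flat-file (linear text) header database that keeps both key values and units, as points (2) and (3) describe, could be as simple as the following round-trip sketch. The key names and the `key := value [unit]` syntax are invented for illustration; QSH's actual file format is not reproduced here.

```python
# Sketch of a flat-file key/value/unit header store (illustrative syntax).
def dump_header(entries):
    """entries: list of (key, value, unit) tuples; unit may be empty."""
    return "\n".join(f"{k} := {v} [{u}]" if u else f"{k} := {v}"
                     for k, v, u in entries)

def parse_header(text):
    """Inverse of dump_header: recover (key, value, unit) tuples."""
    out = []
    for line in text.splitlines():
        key, _, rest = line.partition(" := ")
        if rest.endswith("]") and " [" in rest:
            value, _, unit = rest[:-1].rpartition(" [")
            out.append((key, value, unit))
        else:
            out.append((key, rest, ""))
    return out

header = [("patient name", "ANON", ""), ("activity", "370", "MBq")]
```

Keeping the store as linear text is what makes the toolkit portable: any editor or script can read it, and a history of values (point 4) is just more appended lines.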
14.
  • Persson, Mats, 1954-, et al. (authors)
  • Development and maintenance of guideline-based decision support for pharmacological treatment of hypertension
  • 2000
  • In: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 61:3, pp. 209-219
  • Journal article (peer-reviewed) abstract
    • The objective was to build a computer-based decision support system (DSS), which could apply the formal rules embedded in guidelines regarding pharmacological treatment of hypertension. The aim was also to test VISUAL BASIC as a development tool for DSS's in health care. From the Swedish guidelines for treatment of hypertension, the most widely accepted and scientifically best proved treatment strategies were chosen and implemented as rules. A DSS that is capable of applying the evidence-based rules extracted from guidelines regarding drug treatment of hypertension, to any patient's medical profile, was constructed. The output consists of a recommendation regarding preferred generic drug class and also a written report, reflecting decision steps provided by the rule-base and inference engine. We also provide methods for formalising an implementable language of guidelines. A mainstream programming language like VISUAL BASIC can be an alternative when building complicated decision support systems. A logic formal notation can facilitate communication between the expert and the programmer. The program is a stand-alone product independent of computerized medical records and thereby easy to install and maintain.
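A rule base plus inference step of the kind described — evidence-based rules applied to a patient's medical profile, yielding a recommended generic drug class and a trace of the decision steps — might be sketched as below. The rules are illustrative placeholders, NOT the Swedish hypertension guidelines, and the real system was written in Visual Basic rather than Python.

```python
# Sketch of a guideline rule base (placeholder rules, not clinical advice).
RULES = [
    # (condition on the patient profile dict, recommended generic drug class)
    (lambda p: p.get("diabetes"), "ACE inhibitor"),
    (lambda p: p.get("angina"), "beta blocker"),
    (lambda p: p.get("age", 0) >= 65, "thiazide diuretic"),
]

def recommend(patient):
    """First matching rule wins; the fired rule's index is returned too,
    so a written report can reflect the decision steps taken."""
    for i, (cond, drug) in enumerate(RULES):
        if cond(patient):
            return drug, i
    return "thiazide diuretic", None  # assumed default first-line choice
```

Separating the rule base from the inference loop is what makes such a system maintainable: updating the guidelines means editing `RULES`, not the engine.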
15.
  • Sandborg, Michael, 1961-, et al. (authors)
  • A Monte Carlo program for the calculation of contrast, noise and absorbed dose in diagnostic radiology
  • 1994
  • In: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607 .- 1872-7565. ; 42:3, pp. 167-180
  • Journal article (peer-reviewed) abstract
    • A Monte Carlo computer program has been developed for the simulation of X-ray photon transport in diagnostic X-ray examinations. The simulation takes account of the incident photon energy spectrum and includes a phantom (representing the patient), an anti-scatter grid and an image receptor. The primary objective for developing the program was to study and optimise the design of anti-scatter grids. The program estimates image quality in terms of contrast and signal-to-noise ratio, and radiation risk in terms of mean absorbed dose in the patient. It therefore serves as a tool for the optimisation of the radiographic procedure. A description is given of the program and the variance-reduction techniques used. The computational method was validated by comparison with measurements and other Monte Carlo simulations.
16.
  • Seipel, Stefan, et al. (authors)
  • Oral Implant Treatment Planning in a Virtual Reality Environment
  • 1998
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 57:1-2, pp. 95-103
  • Journal article (peer-reviewed) abstract
    • A system for three-dimensional oral implant treatment planning is presented. Virtual reality technologies are used in order to improve the human image interpretation and planning performance. The methods described are based on computed tomography (CT) data of the mandible and of the maxilla. A novel approach to volume rendering and voxel based modelling of implants is introduced which allows interactive three-dimensional manipulation of the anatomic model and real-time manipulation of virtual implants. A spline-based reconstruction method is described to assess the implant site in a clinically oriented view with regard to bone structures and angulation. Two parameters are deduced which represent the bone properties at the surface of implants. While an implant is navigated with six degrees of freedom, these parameters are acoustically rendered, which is a novel approach to exploration of spatial bone properties in a CT data set.
17.
  • Smedby, Örjan (author)
  • A scanning system for digital analysis of cineangiography films
  • 1992
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 39:1-2, pp. 103-111
  • Journal article (peer-reviewed) abstract
    • A system for scanning and digital analysis of cinefilms is presented and its performance is compared with entirely digital radiographic equipment. Apart from the difference between logarithmic and linear gray-scale representation, a higher noise level was found in the scanning system. When its spatial resolution was assessed visually, it was comparable to that of the digital system, although lower than when the cinefilming and scanning steps were evaluated separately. Algorithms for the correction of varying exposure and geometric ("pin-cushion") distortion are also presented. It is concluded that digital analysis after scanning of cinefilms can be a useful alternative to completely digital cineradiographic studies.
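A generic radial model of the geometric ("pin-cushion") distortion this abstract corrects might look like the following sketch. The cubic radial model and the constant k are assumptions for illustration, not necessarily the paper's algorithm; correction amounts to inverting the forward model, here by fixed-point iteration.

```python
# Assumed pin-cushion model: distorted radius grows as r_d = r_u*(1 + k*r_u^2),
# with r measured in pixels from the image centre.
def distort(r_u, k=1e-6):
    """Forward model: undistorted radius -> distorted radius."""
    return r_u * (1 + k * r_u**2)

def undistort(r_d, k=1e-6, iters=20):
    """Invert the model by fixed-point iteration r_u <- r_d/(1 + k*r_u^2)."""
    r_u = r_d
    for _ in range(iters):
        r_u = r_d / (1 + k * r_u**2)
    return r_u
```

Each pixel of the scanned frame would be resampled at the corrected radius; the same loop structure also fits an exposure-correction pass.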
18.
  • Tammi, Martti, et al. (authors)
  • TRAP : Tandem Repeat Assembly Program produces improved shotgun assemblies of repetitive sequences
  • 2003
  • In: Computer Methods and Programs in Biomedicine. - 0169-2607 .- 1872-7565. ; 70:1, pp. 47-59
  • Journal article (peer-reviewed) abstract
    • The software commonly used for assembly of shotgun sequence data has several limitations. One such limitation becomes obvious when repetitive sequences are encountered. Shotgun assembly is a difficult task, even for non-repetitive regions, but the use of quality assessments of the data and efficient matching algorithms have made it possible to assemble most sequences efficiently. In the case of highly repetitive sequences, however, these algorithms fail to distinguish between sequencing errors and single base differences in regions containing nearly identical repeats. None of the currently available fragment assembly programs are able to correctly assemble highly similar repetitive data, and we, therefore, present a novel shotgun assembly program, Tandem Repeat Assembly Program (TRAP). The main feature of this program is the ability to separate long repetitive regions from each other by distinguishing single base substitutions as well as insertions/deletions from sequencing errors. This is accomplished by using a novel multiple-alignment based analysis method. Since repeats are a common complication in most sequencing projects, this software should be of use for the whole sequencing community.
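The key idea — telling real single-base differences between near-identical repeat copies apart from sequencing errors by analysing columns of a multiple alignment — can be caricatured as follows. The frequency thresholds are invented for illustration; TRAP's actual statistical analysis is more elaborate.

```python
from collections import Counter

# Sketch: in an alignment of reads drawn from near-identical repeat copies,
# a column whose second-most-frequent base is too common to be explained by
# the sequencing error rate marks a real difference between repeat copies.
def difference_columns(reads, error_rate=0.05, min_count=3):
    """Return column indices that look like true repeat differences."""
    cols = []
    for i in range(len(reads[0])):
        counts = Counter(r[i] for r in reads).most_common()
        if len(counts) > 1:
            second = counts[1][1]  # count of the second-most-frequent base
            if second >= min_count and second / len(reads) > 2 * error_rate:
                cols.append(i)
    return cols

reads = ["ACGT", "ACGT", "ACAT", "ACGT", "ACAT", "ACAT"]  # two repeat copies
```

Reads can then be split into groups by their bases at the flagged columns and assembled per repeat copy.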
19.
  • Thurfjell, Lennart, et al. (authors)
  • CBA—an atlas-based software tool used to facilitate the interpretation of neuroimaging data
  • 1995
  • In: Computer Methods and Programs in Biomedicine. - : ELSEVIER SCI PUBL IRELAND LTD. - 0169-2607 .- 1872-7565. ; 47:1, pp. 51-71
  • Journal article (peer-reviewed) abstract
    • CBA, a software tool used to improve quantification and evaluation of neuroimaging data has been developed. It uses a detailed 3-dimensional brain atlas that can be adapted to fit the brain of an individual patient represented by a series of displayed ima
20.
  • Abu-Rmileh, Amjad, et al. (authors)
  • Wiener sliding-mode control for artificial pancreas : A new nonlinear approach to glucose regulation
  • 2012
  • In: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 107:2, pp. 327-340
  • Journal article (peer-reviewed) abstract
    • Type 1 diabetic patients need insulin therapy to keep their blood glucose close to normal. In this paper an attempt is made to show how a nonlinear control-oriented model may be used to improve the performance of closed-loop control of blood glucose in diabetic patients. The nonlinear Wiener model is used as a novel modeling approach to be applied to the glucose control problem. The identified Wiener model is used in the design of a robust nonlinear sliding mode control strategy. Two configurations of the nonlinear controller are tested and compared to a controller designed with a linear model. The controllers are designed in a Smith predictor structure to reduce the effect of system time delay. To improve the meal compensation features, the controllers are provided with a simple feedforward controller to inject an insulin bolus at meal time. Different simulation scenarios have been used to evaluate the proposed controllers. The obtained results show that the new approach outperforms the linear control scheme, and regulates the glucose level within safe limits in the presence of measurement and modeling errors, meal uncertainty and patient variations.
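A Wiener model is a linear dynamic block followed by a static output nonlinearity. A minimal sketch of that structure is below; the first-order dynamics and the tanh saturation are chosen purely for illustration and are not the identified glucose model from the paper.

```python
import math

# Wiener structure: linear state update followed by a static nonlinearity.
def simulate_wiener(u_seq, a=0.9, b=0.1):
    """x[k+1] = a*x[k] + b*u[k] (linear block); y[k] = tanh(x[k]) (static
    output nonlinearity). Returns the output sequence."""
    x, ys = 0.0, []
    for u in u_seq:
        x = a * x + b * u
        ys.append(math.tanh(x))
    return ys

ys = simulate_wiener([1.0] * 100)  # step response of the toy model
```

Because the nonlinearity is static and invertible on its range, a controller can be designed for the linear block and composed with the inverse nonlinearity, which is what makes the Wiener structure attractive for sliding-mode design.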
21.
  • Acharya, Chayan, et al. (authors)
  • A diagnostic tool for population models using non-compartmental analysis : The ncappc package for R
  • 2016
  • In: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607 .- 1872-7565. ; 127, pp. 83-93
  • Journal article (peer-reviewed) abstract
    • Background and objective: Non-compartmental analysis (NCA) calculates pharmacokinetic (PK) metrics related to the systemic exposure to a drug following administration, e.g. area under the concentration time curve and peak concentration. We developed a new package in R, called ncappc, to perform (i) a NCA and (ii) simulation-based posterior predictive checks (ppc) for a population PK (PopPK) model using NCA metrics. Methods: The nca feature of the ncappc package estimates the NCA metrics from the observed data. The ppc feature of ncappc estimates the NCA metrics from multiple sets of simulated concentration time data and compares them with those estimated from the observed data. The diagnostic analysis is performed at the population as well as the individual level. The distribution of the simulated population means of each NCA metric is compared with the corresponding observed population mean. The individual level comparison is performed based on the deviation of the mean of any NCA metric based on simulations for an individual from the corresponding NCA metric obtained from the observed data. The ncappc package also reports the normalized prediction distribution error (NPDE) of the simulated NCA metrics for each individual and their distribution within a population. Results: The ncappc package produces two default outputs depending on the type of analysis performed, i.e., NCA and PopPK diagnosis. The PopPK diagnosis feature of ncappc produces 8 sets of graphical outputs to assess the ability of a population model to simulate the concentration time profile of a drug and thereby evaluate model adequacy. In addition, tabular outputs are generated showing the values of the NCA metrics estimated from the observed and the simulated data, along with the deviation, NPDE, regression parameters used to estimate the elimination rate constant and the related population statistics. Conclusions: The ncappc package is a versatile and flexible tool-set written in R that successfully estimates NCA metrics from concentration time data and produces a comprehensive set of graphical and tabular output to summarize the diagnostic results including the model specific outliers. The output is easy to interpret and to use in evaluation of a population PK model. ncappc is freely available on CRAN (http://cran.r-project.org/web/packages/ncappc/index.html) and GitHub (https://github.com/cacha0227/ncappc/).
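The basic NCA metrics such a package reports can be computed with textbook formulas. A small independent sketch in Python (standard linear-trapezoidal AUC and a log-linear fit of the last points for the terminal slope; this is not ncappc's R code, and real NCA tools choose the terminal points more carefully):

```python
import math

# Sketch of basic non-compartmental metrics from a concentration-time profile.
def nca_metrics(t, c):
    """Return (AUC(0-tlast) by linear trapezoid, Cmax, Tmax, lambda_z),
    with lambda_z from a log-linear fit of the last three points."""
    auc = sum((t[i + 1] - t[i]) * (c[i] + c[i + 1]) / 2
              for i in range(len(t) - 1))
    cmax = max(c)
    tmax = t[c.index(cmax)]
    # log-linear regression on the last three points for the terminal slope
    tt, lc = t[-3:], [math.log(x) for x in c[-3:]]
    mt, ml = sum(tt) / 3, sum(lc) / 3
    lam = -sum((a - mt) * (b - ml) for a, b in zip(tt, lc)) \
          / sum((a - mt) ** 2 for a in tt)
    return auc, cmax, tmax, lam

t = [0, 1, 2, 4, 8]                    # hours (toy profile)
c = [0.0, 10.0, 8.0, 4.0, 1.0]         # concentration
auc, cmax, tmax, lam = nca_metrics(t, c)
```

The toy terminal phase halves every 2 h, so lambda_z comes out as ln(2)/2 ≈ 0.347.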
22.
  • af Klercker, T., et al. (authors)
  • Decision support system for primary health care in an inter/intranet environment
  • 1998
  • In: Computer Methods and Programs in Biomedicine. - : ELSEVIER SCI IRELAND LTD. - 0169-2607. ; 55:1, pp. 31-37
  • Journal article (other academic/artistic) abstract
    • The need for easily accessible computerised decision- and documentation-support in primary health care has previously been published. The implementation of such a tool in an intranet environment is described. The use of free-ware which can be downloaded fr
23.
  • Aghanavesi, Somayeh, 1981-, et al. (authors)
  • A multiple motion sensors index for motor state quantification in Parkinson's disease
  • 2020
  • In: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607 .- 1872-7565. ; 189
  • Journal article (peer-reviewed) abstract
    • Aim: To construct a Treatment Response Index from Multiple Sensors (TRIMS) for quantification of motor state in patients with Parkinson's disease (PD) during a single levodopa dose. Another aim was to compare TRIMS to sensor indexes derived from individual motor tasks. Method: Nineteen PD patients performed three motor tests including leg agility, pronation-supination movement of hands, and walking in a clinic while wearing inertial measurement unit sensors on their wrists and ankles. They performed the tests repeatedly before and after taking 150% of their individual oral levodopa-carbidopa equivalent morning dose. Three neurologists blinded to treatment status viewed patients' videos and rated their motor symptoms, dyskinesia, and overall motor state based on selected items of Unified PD Rating Scale (UPDRS) part III, the Dyskinesia scale, and the Treatment Response Scale (TRS). To build TRIMS, out of 178 initially extracted features from upper- and lower-limb data, 39 features were selected by a stepwise regression method and used as input to support vector machines to be mapped to mean reference TRS scores using 10-fold cross-validation. Test-retest reliability, responsiveness to medication, and correlation to TRS as well as other UPDRS items were evaluated for TRIMS. Results: The correlation of TRIMS with TRS was 0.93. TRIMS had good test-retest reliability (ICC = 0.83). Responsiveness of TRIMS to medication was good compared to TRS, indicating its power in capturing the treatment effects. TRIMS was highly correlated to the dyskinesia (R = 0.85), bradykinesia (R = 0.84) and gait (R = 0.79) UPDRS items. Correlation of the sensor index from the upper limb to TRS was 0.89. Conclusion: Using the fusion of upper- and lower-limb sensor data to construct TRIMS provided accurate PD motor state estimation that was responsive to treatment. In addition, quantification of upper-limb sensor data during the walking test provided strong results. © 2019
24.
  • Ahkami, Bahareh, 1994, et al. (författare)
  • Locomotion Decoding (LocoD) An Open-Source Modular Platform for Researching Control of Lower Limb Assistive Devices.
  • 2023
  • Ingår i: Computer Methods and Programs in Biomedicine. - 1872-7565 .- 0169-2607.
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and Objective: Commercially available motorized prosthetic legs use exclusively non-biological signals to control movements, such as those provided by load cells, pressure sensors, and inertial measurement units (IMUs). Despite that the use of biological signals of neuromuscular origin can provide more natural control of leg prostheses, these signals cannot yet be captured and decoded reliably enough to be used in daily life. Indeed, decoding motor intention from bioelectric signals obtained from the residual limb holds great potential, and therefore the study of decoding algorithms has increased in the past years with standardized methods yet to be established. Methods: In the absence of shared tools to record and process lower limb bioelectric signals, such as electromyography (EMG), we developed an open-source software platform to unify the recording and processing (pre-processing, feature extraction, and classification) of EMG and non-biological signals amongst researchers with the goal of investigating and benchmarking control algorithms. We validated our locomotion decoding (LocoD) software by comparing the accuracy in the classification of locomotion mode using three different combinations of sensors (1 = IMU+EMG, 2 = EMG, 3 = IMU). EMG and non-biological signals (from the IMU and pressure sensor) were recorded while able-bodied participants (n = 21) walked on different surfaces such as stairs and ramps, and this data set is also released publicly along this publication. LocoD was used for all recording, pre-processing, feature extraction, and classification of the recorded signals. We tested the statistical hypothesis that there was a difference in predicted locomotion mode accuracy between sensor combinations using the Wilcoxon signed-rank test. 
Results: We found that sensor combination 1 (EMG+IMU) led to significantly more accurate locomotion mode prediction (accuracy = 93.4 ± 3.9) than EMG (accuracy = 74.56 ± 5.8) or IMU alone (accuracy = 90.77 ± 4.6), with p < 0.001. Conclusions: Our results support previous research and validate the functionality of LocoD as an open-source and modular platform to research control algorithms for prosthetic legs that incorporate bioelectric signals.
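The paired comparison described in this abstract can be reproduced in a few lines. A minimal sketch with SciPy, using invented per-participant accuracies (the study's own data are in its released data set):

```python
# Sketch: paired Wilcoxon signed-rank test on per-participant
# classification accuracies for two sensor set-ups. The numbers
# below are illustrative, not the study's data.
from scipy.stats import wilcoxon

# hypothetical per-participant accuracies (%) for EMG+IMU vs. IMU alone
acc_emg_imu = [94.1, 92.3, 95.0, 90.8, 93.7, 96.2, 91.5, 94.9]
acc_imu     = [90.2, 89.5, 92.1, 88.7, 91.0, 93.4, 89.9, 91.8]

# two-sided test on the paired differences
stat, p = wilcoxon(acc_emg_imu, acc_imu)
print(f"W = {stat}, p = {p:.4f}")
```

Because EMG+IMU is better for every participant here, the sum of negative ranks (the test statistic) is zero and the p-value is small.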
  •  
25.
  • Ahnesjö, Anders, 1953-, et al. (författare)
  • Collapsed cone dose calculations for heterogeneous tissues in brachytherapy using primary and scatter separation source data
  • 2017
  • Ingår i: Computer Methods and Programs in Biomedicine. - : ELSEVIER IRELAND LTD. - 0169-2607 .- 1872-7565. ; 139, s. 17-29
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and Objective: Brachytherapy is a form of radiation therapy using sealed radiation sources inserted within or in the vicinity of the tumor in, e.g., gynecological, prostate or head and neck cancers. Accurate dose calculation is a crucial part of the treatment planning. Several reviews have called for clinical software with model-based algorithms that better take into account the effects of the patient's individual tissue distribution and of source-channel and shielding attenuation than the commonly employed TG-43 formalism, which simply maps homogeneous water dose distributions onto the patient. In this paper we give a comprehensive and thorough derivation of such an algorithm based on collapsed cone point-kernel superposition, and describe details of its implementation into a commercial treatment planning system for clinical use. Methods: A brachytherapy version of the collapsed-cone algorithm using analytical raytraces of the primary photon radiation followed by successive scattering dose calculation for once and multiply scattered photons is described in detail, including derivation of the corresponding set of recursive equations for energy transport along cone axes/transport lines and the coupling to clinical source modeling. Specific implementation issues in setting up the calculation grid, handling of intravoxel gradients and voxels partly containing non-patient applicator material are given. Results: Sample runs for two clinical cases are shown, one being a gynecological application with a tungsten-shielded applicator and one a breast implant. These two cases demonstrate the impact of improved dose calculation versus the TG-43 formalism. Conclusions: Use of model-based dose calculation algorithms for brachytherapy, taking the three-dimensional treatment geometry into account, increases the dosimetric accuracy in planning and follow-up of treatments. 
The comprehensive description and derivations provided give a solid background for further clinical, educational and research applications.
  •  
26.
  • Allalou, Amin, 1981-, et al. (författare)
  • BlobFinder, a tool for fluorescence microscopy image cytometry
  • 2009
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607 .- 1872-7565. ; 94:1, s. 58-65
  • Tidskriftsartikel (refereegranskat)abstract
    • Images can be acquired at high rates with modern fluorescence microscopy hardware, giving rise to a demand for high-speed analysis of image data. Digital image cytometry, i.e., automated measurements and extraction of quantitative data from images of cells, provides valuable information for many types of biomedical analysis. There exist a number of different image analysis software packages that can be programmed to perform a wide array of useful measurements. However, the multi-application capability often compromises the simplicity of the tool. Also, the gain in speed of analysis is often compromised by time spent learning complicated software. We provide free software called BlobFinder that is intended for a limited type of application, making it easy to use, easy to learn and optimized for its particular task. BlobFinder can perform batch processing of image data and can quantify as well as localize cells and point-like source signals in fluorescence microscopy images, e.g., from FISH, in situ PLA and padlock probing, in a fast and easy way.
  •  
27.
  • Amiri, Saeid, et al. (författare)
  • On the efficiency of bootstrap method into the analysis contingency table
  • 2011
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607 .- 1872-7565. ; 104:2, s. 182-187
  • Tidskriftsartikel (refereegranskat)abstract
    • The bootstrap method is a computer-intensive statistical method that is widely used in performing nonparametric inference. Categorical data analysis, in particular the analysis of contingency tables, is commonly used in applied fields. This work considers nonparametric bootstrap tests for the analysis of contingency tables. There are only a few research papers that explore this field. The p-values of tests in contingency tables are discrete and should be uniformly distributed under the null hypothesis. The results of this article show that the corresponding bootstrap versions work better than the standard tests. Properties of the proposed tests are illustrated and discussed using Monte Carlo simulations. This article concludes with an analytical example that examines the performance of the proposed tests and the confidence interval of the association coefficient.
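As a sketch of the general idea (not necessarily the article's exact resampling scheme), a bootstrap p-value for independence can be obtained by resampling tables from the null model given by the product of the observed marginals:

```python
# Bootstrap test of independence in a contingency table: resample
# cell counts under the null (independent marginals) and compare
# the chi-square statistic of each resample to the observed one.
import numpy as np

rng = np.random.default_rng(0)

def chi2_stat(table):
    table = np.asarray(table, dtype=float)
    n = table.sum()
    expected = np.outer(table.sum(1), table.sum(0)) / n
    return ((table - expected) ** 2 / expected).sum()

def bootstrap_p(table, n_boot=2000):
    table = np.asarray(table, dtype=float)
    n = int(table.sum())
    # null cell probabilities: product of row and column marginals
    p0 = (np.outer(table.sum(1), table.sum(0)) / n**2).ravel()
    observed = chi2_stat(table)
    stats = [
        chi2_stat(rng.multinomial(n, p0).reshape(table.shape))
        for _ in range(n_boot)
    ]
    return np.mean([s >= observed for s in stats])

table = [[30, 10], [15, 25]]
print(f"bootstrap p-value: {bootstrap_p(table):.3f}")
```

The example table above is invented; with a strong association the bootstrap p-value comes out small, in line with the asymptotic chi-square test.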
  •  
28.
  • Aoki, Yasunori, et al. (författare)
  • PopED lite: an optimal design software for preclinical pharmacokinetic and pharmacodynamic studies
  • 2016
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607 .- 1872-7565. ; 127, s. 126-143
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and Objective: Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Methods: Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. Results: The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. Conclusions: PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools.
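The core computation such tools perform can be illustrated with a toy model; this sketch (not PopED lite's code) assumes a mono-exponential model C(t) = A·exp(−k·t) and scores a sampling schedule by the D-criterion, the log-determinant of the Fisher Information Matrix:

```python
# Score sampling schedules by the D-criterion (log-det of the FIM).
# Model, parameters, and schedules are invented for illustration.
import numpy as np

def fim(times, A=10.0, k=0.3, sigma=0.5):
    """FIM for C(t) = A*exp(-k*t) with additive Gaussian error."""
    times = np.asarray(times, dtype=float)
    # analytic sensitivities dC/dA and dC/dk at each sample time
    dA = np.exp(-k * times)
    dk = -A * times * np.exp(-k * times)
    J = np.column_stack([dA, dk])   # Jacobian: one row per sample
    return J.T @ J / sigma**2

def d_criterion(times):
    sign, logdet = np.linalg.slogdet(fim(times))
    return logdet if sign > 0 else -np.inf

# a spread-out schedule is more informative about the elimination
# rate than sampling only early time points
print(d_criterion([0.5, 1, 2, 4, 8]))           # spread design
print(d_criterion([0.25, 0.5, 0.75, 1, 1.25]))  # clustered early design
```

An optimal design tool then searches over candidate schedules (here one would loop or use a discrete optimizer) for the one maximizing this criterion.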
  •  
29.
  • Bauwens, Maite, et al. (författare)
  • On Climate Reconstruction Using Bivalve Shells : Three Methods To Interpret the Chemical Signature of a Shell
  • 2011
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 104:2, s. 104-111
  • Tidskriftsartikel (refereegranskat)abstract
    • To improve our understanding of the climate process and to assess the human impact on current global warming, past climate reconstruction is essential. The chemical composition of a bivalve shell is strongly coupled to environmental variations and therefore ancient shells are potential climate archives. The nonlinear nature of the relation between environmental condition (e.g. the seawater temperature) and proxy composition makes it hard to predict the former from the latter, however. In this paper we compare the ability of three nonlinear system identification methods to reconstruct the ambient temperature from the chemical composition of a shell. The comparison shows that nonlinear multi-proxy approaches are potentially useful tools for climate reconstructions and that manifold based methods result in smoother and more precise temperature reconstruction.
  •  
30.
  • Beháňová, Andrea, et al. (författare)
  • gACSON software for automated segmentation and morphology analyses of myelinated axons in 3D electron microscopy
  • 2022
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 220
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and Objective: Advances in electron microscopy (EM) now allow three-dimensional (3D) imaging of hundreds of micrometers of tissue with nanometer-scale resolution, providing new opportunities to study the ultrastructure of the brain. In this work, we introduce a freely available Matlab-based gACSON software for visualization, segmentation, assessment, and morphology analysis of myelinated axons in 3D-EM volumes of brain tissue samples. Methods: The software is equipped with a graphical user interface (GUI). It automatically segments the intra-axonal space of myelinated axons and their corresponding myelin sheaths and allows manual segmentation, proofreading, and interactive correction of the segmented components. gACSON analyzes the morphology of myelinated axons, such as axonal diameter, axonal eccentricity, myelin thickness, or g-ratio. Results: We illustrate the use of the software by segmenting and analyzing myelinated axons in six 3D-EM volumes of rat somatosensory cortex after sham surgery or traumatic brain injury (TBI). Our results suggest that the equivalent diameter of myelinated axons in somatosensory cortex was decreased in TBI animals five months after the injury. Conclusion: Our results indicate that gACSON is a valuable tool for visualization, segmentation, assessment, and morphology analysis of myelinated axons in 3D-EM volumes. It is freely available at https://github.com/AndreaBehan/g-ACSON under the MIT license.
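Two of the listed morphology measures have simple closed forms under a circular cross-section assumption; a minimal sketch (not gACSON's own code):

```python
# Equivalent diameter and g-ratio of a myelinated axon, assuming
# circular cross-sections. Purely illustrative helper functions.
import math

def equivalent_diameter(area):
    """Diameter of a circle with the same cross-sectional area."""
    return 2.0 * math.sqrt(area / math.pi)

def g_ratio(axon_diameter, myelin_thickness):
    """Inner (axonal) diameter over outer (fibre) diameter,
    with myelin of the given thickness on both sides."""
    return axon_diameter / (axon_diameter + 2.0 * myelin_thickness)

d = equivalent_diameter(math.pi)   # area pi -> diameter 2.0
print(d, g_ratio(d, 0.5))          # g = 2 / (2 + 1)
```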
  •  
31.
  • Bora, Kangkana, et al. (författare)
  • Automated classification of Pap smear images to detect cervical dysplasia
  • 2017
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 138, s. 31-47
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and objectives: The present study proposes an intelligent system for automatic categorization of Pap smear images to detect cervical dysplasia, an open problem for the last five decades. Methods: The classification technique is based on shape, texture and color features. It classifies cervical dysplasia into two-level (normal and abnormal) and three-level (Negative for Intraepithelial Lesion or Malignancy, Low-grade Squamous Intraepithelial Lesion and High-grade Squamous Intraepithelial Lesion) classes, reflecting the established Bethesda system of classification used for diagnosis of cancerous or precancerous lesions of the cervix. The system is evaluated on two generated databases obtained from two diagnostic centers, one containing 1610 single cervical cells and the other 1320 complete smear level images. The main objective of this database generation is to categorize the images according to the Bethesda system of classification, both of which require extensive training and expertise. The system is also trained and tested on the benchmark Herlev University database, which is publicly available. In this contribution a new segmentation technique has also been proposed for extracting shape features. The Ripplet type I transform, histogram first-order statistics and the gray-level co-occurrence matrix have been used for color and texture features. To improve classification results, an ensemble method is used, which integrates the decisions of three classifiers. Assessments are performed using 5-fold cross-validation. Results: Extensive experiments reveal that the proposed system can successfully classify Pap smear images, performing significantly better when compared with other existing methods. Conclusion: This type of automated cancer classifier will be of particular help in early detection of cancer.
  •  
32.
  • BORALV, E, et al. (författare)
  • DOCUMENTATION AND INFORMATION-SERVICES IN THE HELIOS PROJECT
  • 1994
  • Ingår i: COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE. - 0169-2607. ; 45, s. S139-S145
  • Tidskriftsartikel (övrigt vetenskapligt/konstnärligt)abstract
    • In a modern software project, large amounts of documentation are produced. All parts of the complex software system require extensive documentation, both for reference purposes and promotional reasons. However, there are some aspects that are often forgotten or badly implemented: (i) the availability of on-line documentation, (ii) integration of the different formats of documentation, and (iii) the worldwide promotional aspect. To solve these problems, the Helios project has chosen to integrate its public documentation and software material into a hypertext system using the World Wide Web.
  •  
33.
  • BORALV, E, et al. (författare)
  • USABILITY AND EFFICIENCY - THE HELIOS APPROACH TO DEVELOPMENT OF USER INTERFACES
  • 1994
  • Ingår i: COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE. - 0169-2607. ; 45, s. S47-S64
  • Tidskriftsartikel (övrigt vetenskapligt/konstnärligt)abstract
    • This paper describes the user-interface-related services of the HELIOS project. The design and implementation of efficient user interfaces is a prerequisite for successful introduction of computer support in health care ward units. Design principles must be based on a basic understanding of cognitive aspects of human-computer interaction, as well as on detailed knowledge about the specific needs and requirements of the health care professionals. In the HELIOS project, a style guide for design of user interfaces has been developed. The style guide defines detailed design guidelines together with a set of interface elements specified for the ward domain. Development tools for construction and implementation of user interfaces to ward applications have been developed and integrated into the HELIOS SEE. The tools are based on the TeleUSE product, which has been extended and adjusted to the HELIOS specifications. A set of new widgets, designed to implement health care interface elements, has been incorporated into the development tool.
  •  
36.
  • Caruso, Camillo Maria, et al. (författare)
  • A deep learning approach for overall survival prediction in lung cancer with missing values
  • 2024
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 254
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and Objective: In the field of lung cancer research, particularly in the analysis of overall survival (OS), artificial intelligence (AI) serves crucial roles with specific aims. Given the prevalent issue of missing data in the medical domain, our primary objective is to develop an AI model capable of dynamically handling this missing data. Additionally, we aim to leverage all accessible data, effectively analyzing both uncensored patients who have experienced the event of interest and censored patients who have not, by embedding a specialized technique within our AI model, not commonly utilized in other AI tasks. Through the realization of these objectives, our model aims to provide precise OS predictions for non-small cell lung cancer (NSCLC) patients, thus overcoming these significant challenges. Methods: We present a novel approach to survival analysis with missing values in the context of NSCLC, which exploits the strengths of the transformer architecture to account only for available features without requiring any imputation strategy. More specifically, this model tailors the transformer architecture to tabular data by adapting its feature embedding and masked self-attention to mask missing data and fully exploit the available ones. By making use of ad-hoc designed losses for OS, it is able to account for both censored and uncensored patients, as well as changes in risks over time. Results: We compared our method with state-of-the-art models for survival analysis coupled with different imputation strategies. 
We evaluated the results obtained over a period of 6 years using different time granularities, obtaining a Ct-index, a time-dependent variant of the C-index, of 71.97, 77.58 and 80.72 for time units of 1 month, 1 year and 2 years, respectively, outperforming all state-of-the-art methods regardless of the imputation method used. Conclusions: The results show that our model not only outperforms the state of the art but also simplifies the analysis in the presence of missing data, by effectively eliminating the need to identify the most appropriate imputation strategy for predicting OS in NSCLC patients.
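For orientation, the plain (time-independent) concordance index that the Ct-index generalises can be computed directly; a sketch with invented data:

```python
# C-index for right-censored survival data: among comparable pairs
# (the earlier observed time is an actual event), count how often
# the model assigns the higher risk to the earlier-failing patient.
def concordance_index(times, events, risks):
    """times: observed times; events: 1 = event, 0 = censored;
    risks: predicted risk scores (higher = earlier expected event)."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # i must have the earlier time and an observed event
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / comparable

times  = [2, 4, 5, 7, 9]
events = [1, 1, 0, 1, 0]
risks  = [0.9, 0.7, 0.6, 0.4, 0.2]
print(concordance_index(times, events, risks))  # perfectly ranked -> 1.0
```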
  •  
37.
  • Cava, José Manuel Gonzáles, et al. (författare)
  • Robust PID control of propofol anaesthesia: uncertainty limits performance, not PID structure
  • 2021
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607. ; 198, s. 1-1
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and objective: New proposals to improve the regulation of hypnosis in anaesthesia based on the development of advanced control structures emerge continuously. However, a fair study to analyse the real benefits of these structures compared to simpler clinically validated PID-based solutions has not been presented so far. The main objective of this work is to analyse the performance limitations associated with using a filtered PID controller, as compared to a high-order controller, represented through a Youla parameter. Methods: The comparison consists of a two-step methodology. First, two robust optimal filtered PID controllers, considering the effect of the inter-patient variability, are synthesised. A set of 47 validated paediatric pharmacological models, identified from clinical data, is used to this end. This model set provides representative inter-patient variability. Second, individualised filtered PID and Youla controllers are synthesised for each model in the set. For fairness of comparison, the same performance objective is optimised for all designs, and the same robustness constraints are considered. Controller synthesis is performed utilising convex optimisation and gradient-based methods relying on algebraic differentiation. The worst-case performance over the patient model set is used for the comparison. Results: Two robust filtered PID controllers for the entire model set, as well as individual-specific PID and Youla controllers, were optimised. All considered designs resulted in similar frequency response characteristics. The performance improvement associated with the Youla controllers was not significant compared to the individually tuned filtered PID controllers. The difference in performance between controllers synthesised for the model set and for individual models was significantly larger than the performance difference between the individual-specific PID and Youla controllers. 
The different controllers were evaluated in simulation. Although all of them showed clinically acceptable results, the robust solutions provided slower responses.Conclusion: Taking the same clinical and technical considerations into account for the optimisation of the different controllers, the design of individual-specific solutions resulted in only marginal differences in performance when comparing an optimal Youla parameter and its optimal filtered PID counterpart. The inter-patient variability is much more detrimental to performance than the limitations imposed by the simple structure of the filtered PID controller.
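The filtered PID structure under discussion can be sketched in discrete time; gains, filter constant, and the first-order plant below are invented for illustration, not the paper's patient models:

```python
# Discrete PID controller whose derivative acts on a low-pass-filtered
# error, run in closed loop against a toy first-order plant.
class FilteredPID:
    def __init__(self, kp, ki, kd, tf, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.tf, self.dt = tf, dt      # filter time constant, step size
        self.integral = 0.0
        self.ef_prev = 0.0             # previous filtered error

    def step(self, error):
        # first-order low-pass filter on the error for the D term
        a = self.dt / (self.tf + self.dt)
        ef = self.ef_prev + a * (error - self.ef_prev)
        derivative = (ef - self.ef_prev) / self.dt
        self.integral += error * self.dt
        self.ef_prev = ef
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# closed loop with a toy first-order plant dy/dt = (u - y) / tau
pid = FilteredPID(kp=2.0, ki=1.0, kd=0.1, tf=0.05, dt=0.01)
y, dt, tau = 0.0, 0.01, 0.5
for _ in range(2000):                  # 20 s of simulated time
    u = pid.step(1.0 - y)              # unit set-point
    y += dt * (u - y) / tau
print(round(y, 3))                     # settles near the set-point
```

The integral term drives the steady-state error to zero; the low-pass filter keeps the derivative term from amplifying measurement noise, which is the practical reason for the "filtered" PID form.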
  •  
39.
  • DEGOULET, P, et al. (författare)
  • THE HELIOS MEDICAL SOFTWARE ENGINEERING ENVIRONMENT
  • 1994
  • Ingår i: COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE. - : ELSEVIER SCI PUBL IRELAND LTD. - 0169-2607. ; 45:1-2, s. 91-95
  • Tidskriftsartikel (övrigt vetenskapligt/konstnärligt)abstract
    • The aim of the HELIOS project is to create an integrated Software Engineering Environment (SEE) to facilitate the development and maintenance of medical applications. HELIOS is made of a set of software components, communicating through a software bus cal
  •  
40.
  • Eklund, Anders, et al. (författare)
  • fMRI Analysis on the GPU - Possibilities and Challenges
  • 2012
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 105:2, s. 145-161
  • Tidskriftsartikel (refereegranskat)abstract
    • Functional magnetic resonance imaging (fMRI) makes it possible to non-invasively measure brain activity with high spatial resolution. There are, however, a number of issues that have to be addressed. One is the large amount of spatio-temporal data that needs to be processed. In addition to the statistical analysis itself, several preprocessing steps, such as slice timing correction and motion compensation, are normally applied. The high computational power of modern graphics cards has already successfully been used for MRI and fMRI. Going beyond the first published demonstration of GPU-based analysis of fMRI data, all the preprocessing steps and two statistical approaches, the general linear model (GLM) and canonical correlation analysis (CCA), have been implemented on a GPU. For an fMRI dataset of typical size (80 volumes with 64 x 64 x 22 voxels), all the preprocessing takes about 0.5 s on the GPU, compared to 5 s with an optimized CPU implementation and 120 s with the commonly used statistical parametric mapping (SPM) software. A random permutation test with 10 000 permutations, with smoothing in each permutation, takes about 50 s if three GPUs are used, compared to 0.5 - 2.5 h with an optimized CPU implementation. The presented work will save time for researchers and clinicians in their daily work and enables the use of more advanced analysis, such as non-parametric statistics, both for conventional fMRI and for real-time fMRI.
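The voxel-wise GLM that parallelises so well on a GPU is, per voxel, an ordinary least-squares fit; a vectorised CPU sketch on synthetic data (design matrix and activation pattern invented):

```python
# GLM step of fMRI analysis: the same regression is solved
# independently for every voxel, here in one vectorised call.
import numpy as np

rng = np.random.default_rng(1)
n_vol, n_vox = 80, 64 * 64 * 22        # dataset size from the abstract

# design matrix: intercept + boxcar task regressor (illustrative)
task = np.tile([0.0] * 10 + [1.0] * 10, 4)
X = np.column_stack([np.ones(n_vol), task])

# synthetic data: the first 100 voxels are "active"
Y = rng.standard_normal((n_vol, n_vox))
Y[:, :100] += 3.0 * task[:, None]

# beta = argmin ||X b - Y||^2, solved for all voxels simultaneously
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta.shape)                      # (2, n_vox): one fit per voxel
```

On a GPU each voxel's regression maps to one thread (or one batched matrix operation), which is why the speed-ups in the abstract are attainable.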
  •  
41.
  • Engelmann, U, et al. (författare)
  • Experiences with the German teleradiology system MEDICUS
  • 1997
  • Ingår i: COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE. - 0169-2607. ; 54:1-2, s. 131-139
  • Tidskriftsartikel (övrigt vetenskapligt/konstnärligt)abstract
    • This paper introduces the teleradiology system, MEDICUS, which has been developed at the Deutsches Krebsforschungszentrum (German Cancer Research Center) in Heidelberg, Germany. The system is designed to work on ISDN lines as well as in a local area netwo
  •  
42.
  • Fu, Qiang, et al. (författare)
  • Anaesthesia record system on handheld computers : pilot experience and uses for quality control and clinical guidelines
  • 2005
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 77:2, s. 155-63
  • Tidskriftsartikel (refereegranskat)abstract
    • This paper describes a mobile information system to collect patient information for anaesthesia quality control. In this system, a mobile database program was designed for use on handheld computers (Pocket PC). This program is used to collect patient data at the bedside on the handhelds, with a daily synchronization of the data between the anaesthesiologists' handhelds and the anaesthesia database. All collected data are later used for quality control analysis. Furthermore, clinical guidelines will be included on these same handhelds. During the pilot phase, data from a sample set of about 300 patients were incorporated. The processes and interfaces of the system are presented in the paper. The current mobile database system has been designed to replace the original paper-based data collection system. The individual anaesthesiologist's handheld synchronizes patient data daily with the anaesthesia database center. This information database is analyzed and used not only to give feedback to the individual doctor or center, but also to review the use of the guidelines provided and the results of their utilization.
  •  
43.
  • Gabrielsson, Johan, et al. (författare)
  • Maxsim2-Real-time interactive simulations for computer-assisted teaching of pharmacokinetics and pharmacodynamics.
  • 2014
  • Ingår i: Computer methods and programs in biomedicine. - : Elsevier BV. - 1872-7565 .- 0169-2607. ; 113, s. 815-829
  • Tidskriftsartikel (refereegranskat)abstract
    • We developed a computer program for use in undergraduate and graduate courses in pharmacology, pharmacokinetics and pharmacodynamics. This program can also be used in environmental and toxicological studies and preclinical simulation, to facilitate communication between modeling pharmacokineticists and project leaders or other decision-makers in the pharmaceutical industry. The program simulates drug delivery and transport by means of (I) a six-compartment physiological pharmacokinetic flow model, (II) a system of traditional compartment models, or (III) a target-mediated drug disposition system. The program can also be used to simulate instantaneous equilibria between concentration and pharmacodynamic response, or temporal delays between concentration and response. The latter is done by means of turnover models (indirect response models). Drug absorption, distribution, and elimination are represented by differential equations, which are described by organ and tissue volumes or other volumes of distribution, blood flows, clearance terms, and tissue-to-blood partition coefficients. The user can control and adjust these parameters by means of a slider in real time. By interactively changing the parameter values and simultaneously displaying the resulting concentration-time and/or response-time profiles, users can understand the major mechanisms that govern the disposition or the pharmacological response of the drug in the organism in real time. Schedule dependence is typically seen in clinical practice with a non-linear concentration-response relationship, and is difficult to communicate except via simulations. Here, we sought to illustrate the potential advantages of this approach in teaching pharmacology, pharmacokinetics, and pharmacodynamics to undergraduate pharmacy, veterinary, and medical students or to project teams in drug discovery/development.
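A compartment model of the kind Maxsim2 simulates reduces to a small ODE system; a sketch with an invented one-compartment oral-absorption model (parameters are illustrative, not from the program):

```python
# One-compartment PK model with first-order absorption:
#   dAg/dt = -ka*Ag            (drug amount in the gut)
#   dC/dt  = ka*Ag/V - (CL/V)*C  (plasma concentration)
import numpy as np
from scipy.integrate import solve_ivp

ka, CL, V, dose = 1.0, 2.0, 10.0, 100.0   # 1/h, L/h, L, mg (invented)

def rhs(t, y):
    ag, c = y                             # gut amount, plasma conc.
    return [-ka * ag, ka * ag / V - (CL / V) * c]

sol = solve_ivp(rhs, (0.0, 24.0), [dose, 0.0], dense_output=True)
t = np.linspace(0.0, 24.0, 241)
conc = sol.sol(t)[1]
print(f"Cmax = {conc.max():.2f} mg/L at t = {t[conc.argmax()]:.1f} h")
```

In an interactive tool, the slider-adjusted parameters (ka, CL, V, ...) feed back into this integration and the concentration-time curve is redrawn in real time.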
  •  
45.
  • Gelzinis, A., et al. (författare)
  • Automated speech analysis applied to laryngeal disease categorization
  • 2008
  • Ingår i: Computer Methods and Programs in Biomedicine. - Amsterdam : Elsevier. - 0169-2607 .- 1872-7565. ; 91:1, s. 36-47
  • Tidskriftsartikel (refereegranskat)abstract
    • The long-term goal of the work is a decision support system for diagnostics of laryngeal diseases. Colour images of vocal folds, a voice signal, and questionnaire data are the information sources to be used in the analysis. This paper is concerned with automated analysis of a voice signal applied to screening of laryngeal diseases. The effectiveness of 11 different feature sets in classification of voice recordings of the sustained phonation of the vowel sound /a/ into a healthy and two pathological classes, diffuse and nodular, is investigated. A k-NN classifier, an SVM, and a committee built using various aggregation options are used for the classification. The study was conducted using a mixed-gender database containing 312 voice recordings. A correct classification rate of 84.6% was achieved when using an SVM committee consisting of four members. The pitch and amplitude perturbation measures, cepstral energy features, autocorrelation features as well as linear prediction cosine transform coefficients were amongst the feature sets providing the best performance. In the case of two-class classification, using recordings from 79 subjects representing the pathological and 69 the healthy class, a correct classification rate of 95.5% was obtained from a five-member committee. Again the pitch and amplitude perturbation measures provided the best performance.
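One common aggregation option for such a committee is majority voting over the member classifiers' predictions; a minimal sketch with hypothetical labels (the paper's members are k-NN and SVM classifiers trained on different feature sets):

```python
# Majority-vote aggregation of committee member predictions.
import numpy as np
from collections import Counter

def majority_vote(predictions):
    """predictions: list of equal-length label arrays, one per member."""
    stacked = np.asarray(predictions)
    return np.array([
        Counter(stacked[:, i]).most_common(1)[0][0]
        for i in range(stacked.shape[1])
    ])

# three hypothetical members disagreeing on some samples
m1 = np.array([0, 1, 1, 2, 0])
m2 = np.array([0, 1, 2, 2, 0])
m3 = np.array([1, 1, 1, 2, 2])
print(majority_vote([m1, m2, m3]))   # -> [0 1 1 2 0]
```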
  •  
46.
  • Guerrero, Esteban, et al. (författare)
  • Forming We-intentions under breakdown situations in human-robot interactions
  • 2023
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier. - 0169-2607 .- 1872-7565. ; 242
  • Tidskriftsartikel (refereegranskat)abstract
    • Background and Objective: When agents (e.g. a person and a social robot) perform a joint activity to achieve a joint goal, they require sharing a relevant group intention, which has been defined as a We-intention. In forming We-intentions, breakdown situations due to conflicts between internal and “external” intentions are unavoidable, particularly in healthcare scenarios. To study such We-intention formation and “reparation” of conflicts, this paper has a two-fold objective: introduce a general computational mechanism allowing We-intention formation and reparation in interactions between a social robot and a person; and exemplify how the formal framework can be applied to facilitate interaction between a person and a social robot for healthcare scenarios. Method: The formal computational framework for managing We-intentions was defined in terms of Answer set programming and a Belief-Desire-Intention control loop. We exemplify the formal framework based on earlier theory-based user studies consisting of human-robot dialogue scenarios conducted in a Wizard of Oz setup, video-recorded and evaluated with 20 participants. Data was collected through semi-structured interviews, which were analyzed qualitatively using thematic analysis. N=20 participants (women n=12, men n=8, age range 23-72) were part of the study. Two age groups were established for the analysis: younger participants (ages 23-40) and older participants (ages 41-72). Results: We proved four theoretical propositions, which are desirable characteristics of any rational social robot. In our study, most participants suggested that people were the cause of breakdown situations. Over half of the young participants perceived the social robot's avoidant behavior in the scenarios. Conclusions: This work covered in depth the challenge of aligning the intentions of two agents (for example, in a person-robot interaction) when they try to achieve a joint goal. 
Our framework provides a novel formalization of the We-intentions theory from social science. The framework is supported by formal properties proving that our computational mechanism generates consistent potential plans. At the same time, the agent can handle incomplete and inconsistent intentions shared by another agent (for example, a person). Finally, our qualitative results suggested that this approach could provide an acceptable level of action/intention agreement generation and reparation from a person-centric perspective.
  •  
47.
  • Hakman, M, et al. (författare)
  • Object-oriented biomedical system modeling - The Rationale
  • 1999
  • Ingår i: COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE. - : ELSEVIER SCI IRELAND LTD. - 0169-2607. ; 59:1, s. 1-17
  • Tidskriftsartikel (övrigt vetenskapligt/konstnärligt)abstract
    • A short tutorial and a rationale for Object-Oriented Biomedical (Continuous) System Modelling (OOBSM) are given. The paper investigates and defines what is needed in order to make the work with complex bio-medical and pathophysiological models easier, les
  •  
48.
  • Hakman, M, et al. (författare)
  • Object-oriented biomedical system modelling - the language
  • 1999
  • Ingår i: COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE. - : ELSEVIER SCI IRELAND LTD. - 0169-2607. ; 60:3, s. 153-181
  • Tidskriftsartikel (övrigt vetenskapligt/konstnärligt)abstract
    • The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the tradit…
  •  
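The truncated abstract above lists the object-oriented features OOBSML supports (model inheritance, encapsulation, instantiation, behaviour polymorphism). A minimal Python analogue of those ideas for a continuous model, using a toy one-compartment example that is not from the paper and is not OOBSML syntax:

```python
# Hypothetical Python analogue of OO continuous-system modelling features:
# inheritance, encapsulation, instantiation, and behaviour polymorphism.

class Compartment:
    """One-compartment model: dx/dt = -k * x (exponential elimination)."""
    def __init__(self, k):
        self.k = k          # encapsulated rate constant
    def dxdt(self, x):
        return -self.k * x

class CompartmentWithInfusion(Compartment):
    """Inherits the base model and polymorphically extends its behaviour."""
    def __init__(self, k, rate):
        super().__init__(k)
        self.rate = rate    # constant infusion input
    def dxdt(self, x):
        return super().dxdt(x) + self.rate

def euler(model, x0, dt, steps):
    """Explicit Euler integration of any model exposing dxdt()."""
    x = x0
    for _ in range(steps):
        x += dt * model.dxdt(x)
    return x
```

For instance, `euler(Compartment(0.5), 1.0, 0.01, 100)` approximates exp(-0.5) ≈ 0.607, and the infusion variant converges to the steady state rate/k without the integrator knowing which subclass it is given.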
50.
  • Holmberg, Björn, et al. (författare)
  • Possibilities of texture based motion analysis
  • 2006
  • Ingår i: Computer Methods and Programs in Biomedicine. - : Elsevier BV. - 0169-2607 .- 1872-7565. ; 84:1, s. 1-10
  • Tidskriftsartikel (refereegranskat)abstract
    • Analysis of motion patterns in the human locomotion apparatus is important in many clinical areas such as orthopaedics, physiotherapy, neurology, and sports medicine. Today, marker-based human motion analysis (HMA) is completely dominant in the clinical context. Technically, these systems are stable and dependable, and about a dozen variants are commercially available. One drawback of such systems is the time-consuming and error-prone marker placement. The purpose of the present contribution is to show that it is possible, with existing simple technology and methods, to build systems that do not depend on anatomically placed markers and yet produce an accuracy in knee joint center estimation comparable to that of marker-based systems. It is shown that texture-based methods can give estimates of the knee joint center of rotation that are comparable to knee joint center estimates from marker-based systems. Due to different definitions of the knee joint center position, some bias is seen between the two estimates.
  •  
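The abstract above concerns estimating the knee joint center of rotation from tracked image features. As a generic illustration of the underlying geometry only (a least-squares circle fit to a tracked point's trajectory, not the paper's texture-based estimator; the synthetic data below are invented):

```python
# Illustrative sketch: points on a rigid segment rotating about the knee trace
# circular arcs, so a circle fit (Kasa method) recovers the center of rotation.
import math

def fit_circle(points):
    """Kasa fit: least squares on x^2+y^2 = 2ax + 2by + c, params (a, b, c)."""
    n = len(points)
    sx = sum(x for x, y in points); sy = sum(y for x, y in points)
    sxx = sum(x * x for x, y in points); syy = sum(y * y for x, y in points)
    sxy = sum(x * y for x, y in points)
    sz = sum(x * x + y * y for x, y in points)
    szx = sum((x * x + y * y) * x for x, y in points)
    szy = sum((x * x + y * y) * y for x, y in points)
    # Normal equations A^T A p = A^T z for design rows A_i = [2x, 2y, 1].
    M = [[4 * sxx, 4 * sxy, 2 * sx, 2 * szx],
         [4 * sxy, 4 * syy, 2 * sy, 2 * szy],
         [2 * sx, 2 * sy, n, sz]]
    # Gauss-Jordan elimination on the augmented 3x4 system (A^T A is PD here).
    for i in range(3):
        piv = M[i][i]
        M[i] = [v / piv for v in M[i]]
        for j in range(3):
            if j != i:
                f = M[j][i]
                M[j] = [vj - f * vi for vj, vi in zip(M[j], M[i])]
    a, b, c = M[0][3], M[1][3], M[2][3]
    return (a, b), math.sqrt(c + a * a + b * b)

# Synthetic arc around a "knee" at (1.0, 2.0) with radius 0.4.
arc = [(1.0 + 0.4 * math.cos(t / 10), 2.0 + 0.4 * math.sin(t / 10))
       for t in range(12)]
center, radius = fit_circle(arc)
```

On this exact synthetic arc the fit recovers center (1.0, 2.0) and radius 0.4; with noisy tracked texture points the same fit returns a least-squares estimate instead.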
Publication type
journal article (110)
Type of content
peer-reviewed (100)
other academic/artistic (10)
Author/editor
Wigertz, Ove, 1934- (8)
Koch, S (5)
Hooker, Andrew C. (4)
Bengtsson, Ewert (4)
Wagner, IV (4)
Karlsson, Mats O. (3)
Groth, Torgny (3)
Groth, T. (3)
Nyholm, Dag (3)
Olsson, E (3)
Ouhbi, Sofia (3)
Sandblad, B (3)
Shahsavar, Nosrat, 1 ... (3)
Gill, Hans, 1944- (3)
Göransson, Bengt (3)
Borälv, Erik (3)
Medvedev, Alexander, ... (2)
Scholl, J (2)
Li, YC (2)
Klintström, Benjamin (2)
Klintström, Eva, 195 ... (2)
Nilsson, D (2)
Andersson, B. (2)
Mavroidis, Panayioti ... (2)
Alm Carlsson, Gudrun (2)
Ljungberg, Michael (2)
Knutsson, Hans (2)
Dougherty, Mark (2)
von Rosen, Dietrich (2)
Persson, Cecilia (2)
Borgefors, Gunilla (2)
Martins da Silva, Ma ... (2)
Sandblad, Bengt (2)
Goransson, B (2)
Persson, T (2)
Seipel, Stefan (2)
Arkad, Kristina, 196 ... (2)
Xiao-Ming, Gao, 1963 ... (2)
Ludwigs, Ulf (2)
Åhlfeldt, Hans, 1955 ... (2)
Verikas, Antanas (2)
Helgason, Benedikt (2)
Chowdhury, Manish (2)
Sintorn, Ida-Maria, ... (2)
Malm, Patrik (2)
Jirstrand, Mats (2)
Olsson, Eva (2)
BORALV, E (2)
Meinzer, HP (2)
Matuszewski, Damian ... (2)
University
Uppsala universitet (51)
Linköpings universitet (23)
Karolinska Institutet (17)
Lunds universitet (7)
Kungliga Tekniska Högskolan (5)
Göteborgs universitet (4)
Örebro universitet (4)
Sveriges Lantbruksuniversitet (4)
Umeå universitet (3)
Chalmers tekniska högskola (3)
Högskolan Dalarna (3)
Högskolan i Halmstad (2)
Stockholms universitet (2)
Högskolan i Gävle (2)
Mälardalens universitet (2)
Language
English (107)
Undefined language (3)
Research subject (UKÄ/SCB)
Engineering and technology (22)
Medicine and health sciences (21)
Natural sciences (20)
Social sciences (1)
