SwePub

Hit list for the search "L4X0:0345 7524 srt2:(2000-2004)"


  • Results 1-25 of 289
1.
  • Asker, Claes (author)
  • Computer Assisted Video Microscopy : in Characterization of Capillary Ensembles
  • 2000
  • Doctoral thesis (other academic/artistic): abstract
    • This thesis focuses on evaluation and analysis of capillary microcirculatory changes in the skin that can be improved and extended by computer assisted video microscopy. Capillary microscopy has been used extensively, both in clinical practice and research, to study different phenomena in the microvasculature of the skin, mainly in the nailfold of fingers and toes, where a large portion of the capillary loop can be observed. In the majority of the different skin regions, the nutritive capillary network approaches the skin surface perpendicularly, and capillary microscopy in these sites reveals the apex of the capillary loop as a dark spot. The main approach in this work has been to study a large ensemble of capillary loops, in order to apply statistical and planar models whilst, at the same time, obtaining spatial parameters related to the capillary localization. The statistical models of proximity are based on nearest neighbour methods and triangulation techniques. The main reason for introducing these models is their capability to characterize the heterogeneity of the capillary ensemble. A computer assisted video microscopy system, enabling both capture and evaluation of capillary bed images, was assembled and thereafter successfully used in laboratory and clinical studies.
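The nearest-neighbour statistics mentioned above can be made concrete with a short sketch. The Clark-Evans index below is one standard nearest-neighbour heterogeneity measure; it is an illustrative stand-in, not necessarily the exact statistic used in the thesis, and the capillary coordinates here are simulated:

```python
import math
import random

def nearest_neighbour_distances(points):
    """Distance from each point to its closest neighbour (brute force)."""
    dists = []
    for i, (xi, yi) in enumerate(points):
        best = min(math.hypot(xi - xj, yi - yj)
                   for j, (xj, yj) in enumerate(points) if j != i)
        dists.append(best)
    return dists

def clark_evans_ratio(points, area):
    """Observed mean NN distance over that expected for a random (Poisson)
    pattern of the same density: ~1 random, <1 clustered, >1 regular."""
    d = nearest_neighbour_distances(points)
    observed = sum(d) / len(d)
    expected = 1.0 / (2.0 * math.sqrt(len(points) / area))
    return observed / expected

random.seed(1)
# Simulated capillary apex positions in a unit field of view
capillaries = [(random.random(), random.random()) for _ in range(200)]
R = clark_evans_ratio(capillaries, area=1.0)  # near 1 for a random pattern
```

A clustered capillary ensemble would push the ratio below 1, which is how such an index quantifies heterogeneity.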
2.
  • Hellgren, Johan, 1966- (author)
  • Compensation for hearing loss and cancellation of acoustic feedback in digital hearing aids
  • 2000
  • Doctoral thesis (other academic/artistic): abstract
    • The development of integrated circuits during the last decades has made it possible to incorporate digital signal processing in hearing aids that fit into the ear canal and are powered by small zinc-air batteries. The digital signal processing provides new possibilities for the hearing aid to modify the signal to fit the impaired ear. A linear phase filter bank that is intended as a basic building block of the signal processing in digital hearing aids is introduced in this dissertation. The filter bank is computationally very efficient and divides the input signal into a number of narrow band signals for further signal processing. The filter bank was combined with band specific gains and two compressors to form the signal processing of a hearing aid. The compressors allow level-dependent gain. Three alternative fitting strategies used to adjust the characteristics of this hearing aid to the individual hearing impaired listener were evaluated. The three fitting strategies differed mainly in the characteristics of the compressors. The strategies were evaluated by hearing impaired subjects in a field test and in laboratory tests. When the subjects were grouped according to their preference among the fitting strategies, the results showed significant differences in the hearing loss configuration between the groups. One of the main tasks of a hearing aid is to amplify the signal to make it audible for the hearing impaired user. The maximum gain that can be used in a hearing aid is controlled by the feedback from the output to the microphone, as the hearing aid becomes part of a closed loop system. The feedback path depends on several factors such as the position of the microphone (which differs between hearing aid categories), the size of the vent, and the acoustics around the hearing aid.
The feedback, and thus the maximum gain that can be used in a hearing aid, has been identified with a number of different hearing aids in a number of conditions that can be expected in real-life use. Feedback cancellation can be used to reduce the negative effects of feedback on the performance of the hearing aid. An internal feedback path in the hearing aid that is an estimate of the external feedback path is then used to cancel the feedback signal. The external feedback path varies as the hearing aid is used (e.g. when a telephone handset is placed by the ear). It is thus desirable to identify the feedback path continuously. One approach is to use closed loop identification with the direct method and some recursive identification method. The output and input signals of the hearing aid are then considered as the input and output signals of the system to be identified, i.e. the feedback path. An advantage of this method is that the identification can be done without modifying the output signal. A drawback is that the estimate may be biased, depending on the characteristics of the input signal. A difference from many other closed loop identification problems is that the data used for identification depend on previous estimates of the system. A feedback cancellation algorithm where Filtered-X LMS is used with the direct method has been analyzed. Filtered-X LMS is computationally efficient and makes it possible to incorporate known characteristics of the feedback path in the model set used. Prefiltering was also used in the algorithm, as it can provide an unbiased estimate if the spectrum of the input signal is known.
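The direct-method idea above can be sketched with a plain normalized LMS filter: treat the hearing-aid output as the input to an adaptive FIR filter and adapt it so that its output tracks the microphone signal, so the weights converge toward the feedback path. This is a stripped-down illustration, not the thesis's Filtered-X LMS algorithm with prefiltering, and the feedback-path coefficients are hypothetical:

```python
import random

def nlms_identify(x, d, taps=8, mu=0.05):
    """Normalized LMS identification: adapt w so that (w * x) tracks d.
    With x = hearing-aid output and d = microphone signal, w converges
    toward the external feedback path (direct method, no prefiltering)."""
    w = [0.0] * taps
    buf = [0.0] * taps
    for xn, dn in zip(x, d):
        buf = [xn] + buf[:-1]                       # newest sample first
        y = sum(wi * bi for wi, bi in zip(w, buf))  # filter output
        e = dn - y                                  # cancellation error
        norm = sum(b * b for b in buf) + 1e-9
        w = [wi + mu * e * bi / norm for wi, bi in zip(w, buf)]
    return w

random.seed(0)
true_path = [0.0, 0.5, -0.3, 0.1, 0.0, 0.0, 0.0, 0.0]  # hypothetical feedback path
x = [random.gauss(0, 1) for _ in range(5000)]
d = [sum(h * x[n - k] for k, h in enumerate(true_path) if n - k >= 0)
     for n in range(len(x))]
w = nlms_identify(x, d)
```

With a white input the estimate is unbiased; the bias problem discussed above appears when the input is strongly coloured, which is what the prefiltering in the thesis addresses.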
3.
  • Johansson, Magnus, 1973- (author)
  • On noise and hearing loss : Prevalence and reference data
  • 2003
  • Doctoral thesis (other academic/artistic): abstract
    • Noise exposure is one of the most prevalent causes of irreversible occupational disease in Sweden and in many other countries. In hearing conservation programs aimed at preventing noise-induced hearing loss, audiometry is an important instrument to highlight the risks and to assess the effectiveness of the program. A hazardous working environment, and the persons affected by it, can be identified by monitoring the hearing thresholds of individual employees or groups of employees over time. However, in order to evaluate the prevalence of occupational noise-induced hearing loss, relevant reference data for unexposed subjects are needed. The first part of this dissertation concerns the changes in hearing thresholds over three decades in two occupational environments with high noise levels in the province of Östergötland, Sweden: the mechanical and the wood processing industries. The results show a positive trend, with improving median hearing thresholds from the 1970s into the 1990s. However, the hearing loss present even in the best period, during the 1990s, was probably greater than if the occupational noise exposure had not occurred. This study made clear the need for a valid reference database, representing the statistical distribution of hearing threshold levels in a population not exposed to occupational noise but otherwise comparable to the group under study. In the second part of the dissertation, reference data for hearing threshold levels in women and men aged from 20 to 79 years are presented, based on measurements of 603 randomly selected individuals in Östergötland. A mathematical model is introduced, based on the hyperbolic tangent function, describing the hearing threshold levels as functions of age. The results show an age-related gender difference, with poorer hearing for men in age groups above 50 years. The prevalence of different degrees of hearing loss and of tinnitus is described for the same population in the third part of the dissertation.
The overall prevalence of mild, moderate, severe or profound hearing loss was 20.9% collectively for women and 25.0% for men. Tinnitus was reported by 8.9% of the women and 17.6% of the men. Approximately 2.4% of the subjects under study had been provided with hearing aids; however, about 7.7% were estimated to potentially benefit from hearing aids, as judged from their degree of hearing loss. Noise-induced hearing loss primarily damages the outer hair cells of the inner ear. The fourth and last part of the dissertation evaluates outer hair cell function using otoacoustic emission (OAE) measurements. Prevalence results from three different measuring techniques are presented: spontaneous otoacoustic emissions (SOAE), transient evoked otoacoustic emissions (TEOAE) and distortion product otoacoustic emissions (DPOAE). Gender and age effects on the recorded emission levels were also investigated. Women showed higher emission levels than men, and for both women and men the emission levels decreased with increasing age. The results of the OAE recordings were shown to be somewhat affected by the state of the middle ear. The study included tympanometry, and the relation of the outcome of this test to the otoacoustic emissions is described: high middle ear compliance resulted in low emission levels. Reference data for the tympanometric measurements are also presented. The results of this project form an essential part of the important work against noise-induced hearing loss, which needs continuous monitoring. The reference data presented here will provide a valid and reliable database for the future assessment of hearing tests performed by occupational health centres in Sweden. This database will in turn prove useful for comparison studies, with Sweden as a responsible fellow EU member country setting high standards for workforce safety.
The statistical distribution of hearing threshold levels as a function of age for men and women in tabulated form is available on the Swedish Work Environment Authority (Arbetsmiljöverket) web site: http://www.av.se/publikationer/bocker/fysiskt/h293.shtm.
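The hyperbolic-tangent model mentioned above has a simple functional form, sketched below. The parameter values (lower and upper asymptotes, inflection age, slope) are hypothetical placeholders, not the fitted coefficients from the reference data:

```python
import math

def threshold_model(age, h_min=0.0, h_max=60.0, a0=65.0, s=15.0):
    """Illustrative tanh-shaped hearing-threshold-vs-age curve (dB HL).
    h_min/h_max are the young/old asymptotes, a0 the inflection age,
    s the slope scale. All parameter values here are hypothetical."""
    return h_min + 0.5 * (h_max - h_min) * (1.0 + math.tanh((age - a0) / s))

# Evaluate over the study's age span, 20 to 79 years
curve = [threshold_model(a) for a in (20, 50, 65, 79)]
```

A tanh parameterization gives a monotone, saturating curve, which matches the qualitative pattern of age-related threshold shift better than a straight line.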
4.
  • Peters, Björn (author)
  • Evaluation of Adapted Passenger Cars for Drivers with Physical Disabilities
  • 2004
  • Doctoral thesis (other academic/artistic): abstract
    • Driving can provide independent and efficient mobility. However, according to the driving license directive (91/439/EEC), persons with locomotor impairments are only allowed to drive if their disabilities can be compensated. Compensation can be realised by vehicle adaptations. The directive provides meagre guidance on how vehicles should be adapted or how to verify that the compensatory requirements are fulfilled. This is a gap in the current process for licensing drivers with physical disabilities. Furthermore, the Swedish process from driver assessment to driver licensing and adaptation approval is complex, fragmented, and suffers from a lack of communication between the authorities involved. The objective of this thesis was to contribute to the development of a method to evaluate vehicle adaptations for drivers with physical disabilities. The focus was on the evaluation of adaptations for steering, accelerating and braking. Three driving simulator experiments and one manoeuvre test with adapted vehicles were conducted. A group of drivers with tetraplegia driving with hand controls was compared to able-bodied drivers in the first experiment. Even though the drivers with tetraplegia had a longer brake reaction time, they performed comparably to the able-bodied drivers. However, they spent more effort and became more tired in order to perform as well as the able-bodied drivers. It was concluded that the adaptation was not sufficient. An Adaptive Cruise Controller (ACC) was tested in the second experiment in order to find out if it could alleviate the load on drivers using hand controls. It was found that the ACC decreased the workload on the drivers. However, ACC systems need to be adjustable and better integrated. The results from the first two experiments were used to provide some guidelines for ACC systems to be used by drivers with disabilities. The third experiment was preceded by a manoeuvre test with joystick controlled cars.
The test revealed some problems, which were attributed to time lags, control interference, and lack of feedback. Four joystick designs were tested with a group of drivers with tetraplegia in the third experiment. It was concluded that time lags should be made similar to those found in standard cars. Lateral and longitudinal control should be separated. Active feedback can improve vehicle control but should be individually adjusted. The experiments revealed that drivers with the same diagnosis can be functionally very diverse. Thus, an adaptation evaluation should be made individually. Furthermore, the evaluation should include a manoeuvre test. Finally, it was concluded that the evaluation approach applied in the experiments was relevant but needs to be developed further.
5.
  • Adlers, Mikael (author)
  • Topics in Sparse Least Squares Problems
  • 2000
  • Doctoral thesis (other academic/artistic): abstract
    • This thesis addresses topics in sparse least squares computation. A stable method for solving the least squares problem min ||Ax - b||_2 is based on the QR factorization. Here we have addressed the difficulty of storing the orthogonal matrix Q. With traditional methods, the number of nonzero elements in Q makes it in many cases infeasible to store. Using the multifrontal technique when computing the QR factorization, Q may be stored and used more efficiently. A new user-friendly Matlab implementation has been developed. When a row in A is dense, the factor R from the QR factorization may be completely dense. Therefore problems with dense rows must be treated by special techniques. The usual way to handle dense rows is to partition the problem into one sparse and one dense subproblem. The drawback of this approach is that the sparse subproblem may be more ill-conditioned than the original problem, or may not even have a unique solution. Another method, useful for problems with few dense rows, is based on matrix stretching, where the dense rows are split into several less dense rows that are then linked together with new artificial variables. We analyze the conditioning of the matrix obtained by this method and show that no ill-conditioned subproblem arises. In many least squares problems, upper and lower bounds on the variables have to be satisfied at the solution. This type of problem arises, for example, in reconstruction problems in geodesy and tomography. Here, methods based on direct factorization methods for sparse matrix computation are explored. Two completely different approaches for solving the problem are discussed and compared, i.e. active set methods and primal-dual interior-point methods based on Mehrotra's predictor-corrector path-following method. An active set block method suitable for sparse problems is developed and a convergence proof is presented.
The choice of barrier parameter, multiple corrections, and finite termination for the interior-point method are discussed. A numerical comparison is given of the active set method and the interior-point method, together with a trust region method based on the interior-reflective Newton method implemented in the optimization toolbox for MATLAB. The numerical tests show that the block active set method is faster and gives better accuracy for both nondegenerate and degenerate problems.
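The stable QR route to min ||Ax - b||_2 referred to above can be sketched in a few lines. This uses dense NumPy QR for brevity, whereas the thesis concerns sparse, multifrontal QR where storing Q is the hard part:

```python
import numpy as np

def qr_least_squares(A, b):
    """Solve min ||Ax - b||_2 via the thin QR factorization A = QR:
    R x = Q^T b with R upper triangular (back-substitution step)."""
    Q, R = np.linalg.qr(A)              # thin QR: Q is m x n, R is n x n
    return np.linalg.solve(R, Q.T @ b)  # solve the triangular system

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))        # overdetermined: 50 equations, 3 unknowns
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                          # consistent right-hand side
x = qr_least_squares(A, b)
```

Because orthogonal transformations do not amplify rounding errors, this is preferred over forming the normal equations A^T A x = A^T b, whose condition number is squared.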
6.
  • Ahlberg, Jörgen, 1971- (author)
  • Model-based coding : extraction, coding, and evaluation of face model parameters
  • 2002
  • Doctoral thesis (other academic/artistic): abstract
    • This thesis deals with model-based coding of human faces for low bitrate communication. The basic idea of model-based coding is that the appearance and motion of a human face are analysed. From this analysis, compact parameters allowing realistic visualization of a synthetic face are extracted. The parameters can be transmitted at very low bitrates, potentially allowing video face-to-face communication over narrow channels like GSM or PSTN. Recently, numerous web-based applications for animated faces have emerged as well. Although the idea is two decades old by now, there are still several technical problems to be solved, and some of them are treated in this thesis. With the advent of the MPEG-4 standard for face animation, which provides a standardized way of storing and transmitting animation parameters, model-based coding and face animation have become increasingly popular research topics. The first topic treated here is the compression of face model parameters. We propose a new compression scheme, showing that such parameters can be transmitted with reasonable quality at bitrates lower than 1 kbit/s. Then, techniques for analysis of images and image sequences containing a human face are treated. For static face images, a method for the extraction of facial features is proposed. For image sequences, a method for face tracking using the active appearance model search algorithm is proposed and found to be useful in practical experiments. Methods for achieving real-time performance have been developed, and the robustness and accuracy are analysed. To set the analysis in the context of possible applications, different ways of synthesizing video from the extracted parameters are discussed as well. Finally, we take a look at the synthesized face models. Comparing them to real video sequences, we try to evaluate how well the synthetic face models can convey emotions. Standard data sets and performance measures are suggested.
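To get a feel for the sub-1 kbit/s target, a rough bitrate budget can be computed. The numbers below are illustrative assumptions (MPEG-4 defines 68 facial animation parameters; the frame rate and quantization depth are hypothetical), showing that raw quantized parameters need roughly an order of magnitude of further predictive/entropy compression:

```python
def bitrate_bps(params_per_frame, frame_rate, bits_per_param):
    """Raw bitrate of uniformly quantized animation parameters,
    before any prediction or entropy coding is applied."""
    return params_per_frame * frame_rate * bits_per_param

# Illustrative budget: 68 MPEG-4 FAPs, 25 fps, 6 bits per parameter
raw = bitrate_bps(68, 25, 6)      # raw bit/s
needed_ratio = raw / 1000.0       # further compression factor to reach 1 kbit/s
```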
7.
  • Amandusson, Helena (author)
  • Hydrogen Extraction with Palladium Based Membranes
  • 2000
  • Doctoral thesis (other academic/artistic): abstract
    • Palladium membranes are used commercially to purify hydrogen gas and in dehydrogenation reactions. The combination of the catalytic ability of the membrane surface and the selectivity of hydrogen permeation offers a tool to extract pure hydrogen and to shift a dehydrogenation reaction towards the product side. In this thesis, hydrogen extraction over palladium and palladium-silver based membranes, both from different gas mixtures and from dehydrogenated organic molecules, is investigated. The aim has been to find the optimal conditions for hydrogen extraction in different environments. The hydrogen permeation rate has been shown to depend on the silver concentration both on the surface and in the bulk of a palladium based membrane. Diffusion through the membrane is the rate limiting step in the permeation process for most of the studied membranes. For a palladium membrane with 20 Å of silver deposited on the upstream surface, however, the surface reactions become rate limiting. Co-adsorbed oxygen inhibits hydrogen permeation by blocking hydrogen adsorption sites and by consuming already adsorbed hydrogen in the water forming reaction on Pd membrane surfaces. On Pd70Ag30 membranes, however, oxygen has no effect on the hydrogen permeation rate, mainly due to an effective hydrogen dissolution into silver and a strongly reduced water formation rate. CO effectively blocks hydrogen adsorption sites on both Pd and PdAg membranes below 150°C, but above 300°C CO has almost no effect on hydrogen permeation. Hydrogen can also be extracted through the dehydrogenation of organic molecules. A steady and continuous dehydrogenation of methanol and ethanol, and a subsequent hydrogen permeation, can be maintained in the presence of oxygen through both Pd and PdAg membranes. Without oxygen, a blocking contaminating layer is formed from the decomposition products, which prevents alcohol adsorption and thus also the hydrogen permeation.
The hydrogen yield is larger over PdAg membranes than over Pd membranes, mainly due to a smaller hydrogen consumption in the water forming reaction, but also due to a larger conversion of the alcohol on PdAg. The long-term objective of this research has been to develop a method to extract hydrogen from anaerobic bacterial degradation of organic waste material, in a co-operation project with microbiologists at the Department of Water and Environmental Studies at Linköping University. The selectivity towards hydrogen permeation in palladium membranes offers a tool to obtain clean hydrogen, which can be used as an energy carrier. By draining the bacteria culture of hydrogen, and thereby reducing the partial pressure of hydrogen, the fermentation process is directed towards a higher production of hydrogen.
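When diffusion through the bulk is rate limiting, as stated above for most of the studied membranes, the flux follows the standard Richardson equation with Sieverts' square-root pressure dependence. The sketch below uses that textbook relation with entirely hypothetical parameter values, not data from the thesis:

```python
import math

def hydrogen_flux(p_feed, p_perm, thickness, permeability):
    """Diffusion-limited H2 flux through a Pd-based membrane
    (Richardson equation): J = Pe/d * (sqrt(p_feed) - sqrt(p_perm)).
    The square roots reflect dissociative adsorption of H2 (Sieverts' law).
    All parameter values used below are illustrative."""
    return permeability / thickness * (math.sqrt(p_feed) - math.sqrt(p_perm))

# Hypothetical values: halving the membrane thickness doubles the flux
j_50um = hydrogen_flux(1.0e5, 1.0e4, thickness=50e-6, permeability=1.0e-8)
j_25um = hydrogen_flux(1.0e5, 1.0e4, thickness=25e-6, permeability=1.0e-8)
```

This inverse-thickness scaling is what breaks down when surface reactions become rate limiting, as for the silver-covered membrane described above.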
8.
  • Ammenberg, Jonas, 1973- (author)
  • Do standardised environmental management systems lead to reduced environmental impacts?
  • 2003
  • Doctoral thesis (other academic/artistic): abstract
    • The overall aim of this thesis is to increase the understanding of the relationship between standardised environmental management systems (EMSs) and the environment, focusing on the use of such systems by companies and on systems in accordance with the ISO 14001 and/or EMAS standards. Another purpose is to investigate how standardised EMSs fit small and medium-sized enterprises (SMEs) and to examine a special EMS solution called the Hackefors model, used by a group of SMEs, to find out how this model has affected the environmental efforts and business of these enterprises. To gather knowledge on the connection between EMSs and environmental impacts, two main roads have been followed. Firstly, empirical studies (and a few literature reviews) have been conducted, aiming, among other things, to clarify how the standards' requirements are interpreted and applied in reality, and to uncover what this means in terms of environmental impacts. For the most part, external environmental auditors and environmental managers have been interviewed. An important purpose is to illuminate what an ISO 14001 certificate, or an EMAS registration, actually guarantees. This means that the minimum level is emphasised to a large extent. Secondly, a literature review has been conducted to collect knowledge on the selected issue from the international research arena. One intention is that this review will contribute information about the average use of EMSs and thus serve as a good complement to the empirical studies. It has to be concluded that a standardised EMS does not guarantee good environmental performance and definitely not reduced environmental impacts. Without any doubt, EMSs can be used to structure and strengthen a company's environmental efforts, and many companies surely have achieved important reductions in environmental impacts by using an EMS. However, the standards' formulations are very indistinct, and they can be interpreted and applied in many different ways.
It is clearly possible to be certified and registered without improving very much at all. The effects of EMSs are to a very large extent dependent on how companies choose to use them. To capture the potential that EMSs have, issues of credibility should be observed. Therefore, the thesis includes some recommendations in the form of discussion points. The Hackefors model can clearly be used to overcome many of the common barriers to implementing an EMS at SMEs. In the studied case, the EMS implementation had led to several important environmental improvements and also to other types of improvements.
9.
  • Andersson, Fredrik, 1969- (author)
  • On Curvature-Free Connections and Other Properties of the Lanczos Spinor
  • 2000
  • Doctoral thesis (other academic/artistic): abstract
    • In this thesis we study various properties of the Lanczos spinor. The results include an algebraic classification scheme for symmetric (3,1)-spinors, a link between Lanczos potentials of the Weyl spinor and the spin coefficients in certain classes of spacetimes, an existence proof for the Lanczos potential of a general Weyl candidate that is much simpler than those previously known, and the existence of a symmetric potential H_{ABA'B'} of an arbitrary symmetric (3,1)-spinor L_{ABCA'} in Einstein spacetimes according to the equation L_{ABCA'} = ∇_{(A}^{B'} H_{BC)A'B'}. In addition, we study a large subclass of algebraically special spacetimes and obtain necessary and sufficient conditions for a Lanczos potential of the Weyl spinor to define a metric, curvature-free connection; we also prove the existence of such connections. This construction is analogous to a construction of quasi-local momentum in the Kerr spacetime by Bergqvist and Ludvigsen, and we therefore obtain an analogue of the Nester-Witten 2-form in these spacetimes.
10.
  • Andersson, Johan, 1972- (author)
  • Multiobjective optimization in engineering design : applications to fluid power systems
  • 2001
  • Doctoral thesis (other academic/artistic): abstract
    • This thesis focuses on how to improve the design and development of complex engineering systems by employing simulation and optimization techniques. Within the thesis, methods are developed and applied to systems that combine mechanical, hydraulic and electrical subsystems, so-called multi-domain systems. The studied systems include a landing gear system for a civil aircraft, electrohydrostatic actuation systems for aircraft applications, and hydraulic actuation systems. The use of simulation and optimization in engineering design is gaining wider acceptance in all fields of industry as the computational capabilities of computers increase. Consequently, the applications of numerical optimization have increased dramatically. A great part of the design process is, and will always be, intuitive. Analytical techniques as well as numerical optimization can, however, be of great value and permit vast improvements in design. Within the thesis, a framework is presented in which modeling and simulation are employed to predict the performance of a design. Additionally, non-gradient optimization techniques are coupled to the simulation models to automate the search for the best design. Engineering design problems often consist of several conflicting objectives. In many cases, the multiple objectives are aggregated into one single objective function. Optimization is then conducted with one optimal design as the result, which is strongly dependent on how the objectives are aggregated. Here a method is presented in which the Design Structure Matrix and the relationship matrix from the House of Quality method are applied to support the formulation of the objective function. Another approach to tackling multiobjective design problems is to employ the concept of Pareto optimality. Within this thesis, a new multiobjective genetic algorithm is proposed and applied to support the design of a hydraulic actuation system.
The outcome of such a multiobjective optimization is a set of Pareto optimal solutions that visualize the trade-off between the competing objectives. The proposed method is capable of handling a mix of continuous design variables and discrete selections of individual components from catalogs or databases. In real-world situations, system parameters will always vary to some extent, and this fact is likely to influence the performance of the system. Therefore we need to answer not only the question "What is best?", but also "What is sufficiently robust?" Within this thesis, several approaches to handling these two questions are presented.
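The Pareto concept used above is easy to make concrete: a design is Pareto optimal if no other design is at least as good in every objective and strictly better in at least one. A brute-force filter over hypothetical two-objective design points (both objectives minimized; real multiobjective genetic algorithms maintain such a set across generations):

```python
def pareto_front(points):
    """Return the non-dominated points, assuming all objectives are minimized.
    q dominates p if q is <= p in every objective and q != p."""
    front = []
    for p in points:
        dominated = any(all(qi <= pi for qi, pi in zip(q, p)) and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (cost, response_time) pairs for candidate designs
designs = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 4.0), (6.0, 3.0), (7.0, 5.0)]
front = pareto_front(designs)
```

The surviving points form the trade-off curve the abstract describes: moving along it improves one objective only at the expense of the other.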
11.
  • Andersson, Kjell (author)
  • Development of a Method for Comparing Amphetamine Samples
  • 2004
  • Doctoral thesis (other academic/artistic): abstract
    • The studies presented in this thesis were part of a pan-European project, and they describe the research performed to develop a method for comparing amphetamine samples. The work included the following: optimisation of a method for profiling of amphetamine by gas chromatography (GC); optimisation of a technique for preparing samples for GC analysis; and testing and evaluation of the abilities of a number of distance metrics to discern links between amphetamine samples originating from the same batch of synthesis, analysed using the method developed in the current studies. Street amphetamine contains hundreds of different by-products (target compounds), many of which have been identified and found to arise from the different conditions used in the manufacturing process. Therefore, the main objective in developing the GC method was to optimise separation and quantification of the target compounds. The best separation was achieved using a DB-35MS capillary column. For quantification, mass spectrometry (MS) in scan mode employing one target ion seemed to perform best, because this technique provided superior selectivity and also made it possible to use mass spectra to identify target compounds. In addition, MS detection proved to offer excellent between-laboratory reproducibility. The conditions used to prepare amphetamine samples by liquid-liquid extraction (LLE) and solid-phase extraction (SPE) were also optimised. The sample preparation methods gave similar results, but LLE was easier to use and was hence chosen for further sample preparation. The ability of various numerical methods to find links between amphetamine samples was tested on GC-MS data for 26 target compounds that had been transformed by various pretreatment techniques.
The best results were obtained when using Pearson correlation applied to data transformed by normalisation followed by the fourth root. It was also demonstrated that the amphetamine profiling method developed in the current studies was superior to a procedure already in use in a number of forensic laboratories in Europe.
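The winning comparison pipeline above (normalise, take the fourth root, then Pearson-correlate the profiles) can be sketched directly; the target-compound peak areas below are hypothetical, with the second profile mimicking the same batch at a different dilution:

```python
import math

def pretreat(profile):
    """Normalise to unit sum, then apply the fourth root: the pretreatment
    reported above to give the best linking results."""
    total = sum(profile)
    return [(v / total) ** 0.25 for v in profile]

def pearson(a, b):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

batch_a1 = [120.0, 30.0, 55.0, 8.0, 200.0]    # hypothetical peak areas
batch_a2 = [240.0, 61.0, 108.0, 17.0, 395.0]  # same batch, roughly 2x dilution
unrelated = [10.0, 300.0, 5.0, 90.0, 40.0]    # different synthesis batch
r_linked = pearson(pretreat(batch_a1), pretreat(batch_a2))
r_unlinked = pearson(pretreat(batch_a1), pretreat(unrelated))
```

Normalisation removes the dilution effect and the fourth root compresses the dominant peaks, so minor by-products contribute to the comparison as well.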
12.
  • Andersson, Kenneth, 1970- (author)
  • Motion estimation for perceptual image sequence coding
  • 2003
  • Doctoral thesis (other academic/artistic): abstract
    • Since the advent of television, obtaining high perceived quality from a limited bandwidth has been an important issue. This thesis proposes new methods for exploiting temporal and perceptual redundancy in image sequences to achieve a lower coding rate and/or higher visual quality. The methods presented are inspired by and based on models of the human visual system. Particularly relevant in the present context are the indications that the visual cortex contains cells that are selective in orientation and frequency but invariant to the phase of the stimuli. For this reason, a spatial quadrature filter bank, representing images in a similar fashion, is generated. For computational efficiency, a filter net technique is employed, using combinations of simple sequential 1D filter kernels. The filter bank is designed for interlaced video, which is still the most common format for video sequences. For coding of image sequences, temporal redundancy is reduced using motion compensated prediction. In motion compensated prediction, the prediction of the next image is given by the present image and a predicted dense local motion field. Motion compensation is performed with a new and computationally efficient method. The method estimates data samples on a desired output grid from input data represented by samples on an irregular grid. The initially predicted image is refined using forward motion compensation with a sparse motion field. In this case, only the sparse motion field needs to be transmitted to the decoder. As a result, a prediction without block artifacts, common in standard forward motion compensation schemes, is generated. Experiments show that this method performs better than traditional block-matching approaches. The motion is estimated using a new approach based on phase differences computed from products of quadrature filter responses.
The approach includes learning parameters for motion estimation and introduces multiple hierarchical motion estimation to achieve estimates with high spatial resolution. The quadrature filter bank approach used for motion estimation also provides a basis for image quality estimation in accordance with human perception. This allows the video quality estimator to be an integral part of the video coder and opens up the possibility of local space-time optimization of video coding parameters.
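The phase-difference principle can be illustrated in 1D with a single Gabor-like quadrature filter; the thesis uses full 2D filter banks on interlaced video, so this toy version only shows why a phase difference between two filter responses encodes displacement. The filter parameters and test signal are illustrative:

```python
import cmath
import math

def gabor_response(signal, pos, freq=0.5, width=6):
    """Complex quadrature (Gabor-like) filter response at one position:
    a Gaussian-windowed complex exponential, giving magnitude and phase."""
    sigma = width / 2.0
    acc = 0.0 + 0.0j
    for k in range(-width, width + 1):
        if 0 <= pos + k < len(signal):
            env = math.exp(-(k * k) / (2.0 * sigma * sigma))
            acc += signal[pos + k] * env * cmath.exp(-1j * freq * k)
    return acc

def estimate_shift(f, g, pos, freq=0.5):
    """Local displacement from the phase difference of the two responses:
    for small shifts d, the phase changes by roughly -freq * d."""
    dphi = cmath.phase(gabor_response(g, pos, freq) / gabor_response(f, pos, freq))
    return -dphi / freq

N = 64
f = [math.cos(0.5 * n) for n in range(N)]
g = [math.cos(0.5 * (n - 2)) for n in range(N)]  # f shifted right by 2 samples
d = estimate_shift(f, g, pos=32)
```

Because only the phase, not the magnitude, of the responses is used, the estimate inherits the phase-invariance property of the cortical cell models mentioned above.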
  •  
13.
  • Andersson, Nils-Eric, 1970- (författare)
  • Structure and properties of thick plate and near surface properties after high speed machining of AA7010
  • 2003
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • Using thick plates instead of forgings for integral construction of load-carrying components is becoming increasingly common practice in the aircraft industry. The reasons are shorter lead times from design of a modified or totally new component to its introduction in an aircraft, and smaller variations in properties for plate compared to forging. The concept of integral construction also reduces the assembly time. The complex-shaped components are prepared by machining pieces of thick plate; the thicker the plate, the larger the components that can be made in one piece. Machining components from blocks of material cut from thick plate means removal of a lot of material compared to machining of near-final-shape forgings. A change in machining concept to high speed machining leads to higher productivity and, due to decreased cutting forces, makes thin-walled sections possible to manufacture. The variation of through-thickness structure and properties of 7010-T7451/2 as 100, 150 and 200 mm thick plates has been investigated. Through-thickness crystallographic texture, degree of recrystallisation, distribution of inclusions, chemical composition and grain size have been mapped out. The observed structure is taken into account in order to explain variations in properties such as yield strength, fracture toughness and fatigue crack growth resistance. Equipment used in characterising the structure has been EBSP, SEM, X-ray diffraction and optical microscopy; equipment used for evaluating mechanical properties includes screw machines, servo-hydraulic machines and hardness indenters. The plates show a strong through-thickness texture gradient that influences the yield strength. The yield strength is also dependent on chemical composition and quench rate. Recrystallisation did not show any significant influence on yield strength or fracture toughness. The grain morphology together with the quench rate is of importance for the fracture toughness and the fatigue crack growth resistance. Properties of down-cut milled surfaces on thin sections using a conventional machining concept and the concept of high speed machining at various cutting speeds have been compared. The same has been done for face-milled surfaces using conventional tools and inserts at cutting speeds varying from 500 m/min up to 5000 m/min. The property of most interest is the high cycle fatigue strength. The influence of surface roughness, residual stresses and hardness on the fatigue strength has been investigated. In order to gain more information about the near-surface properties, X-ray diffraction studies at grazing angle incidence have been undertaken.
  •  
14.
  • Andersson, Roger, 1967- (författare)
  • Monitoring principles for haemodialysis
  • 2002
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • This thesis deals with non-invasive monitoring techniques for haemodialysis. Three applications in particular have been investigated: blood pressure measurements in the extracorporeal circuit, the relationship between UV measurements and dialysate urea concentration, and photoplethysmography (PPG) in haemodialysis patients. A non-invasive pressure sensor as an integrated part of the extracorporeal tube circuit was developed using a modified cross-section tube geometry. The expansion of modified tubes with different cross-sectional geometries upon application of pressure was studied both experimentally and numerically, using the finite element method. Factorial design was used to study the relationship between the pressure in the tube and the force needed to restore the expanding tube to its original dimension; this investigation was performed for different tube cross-sectional geometries. A pressure sensor was designed based on these findings, and its evaluation showed that the output corresponded well to the applied pressure (R2 = 0.99). A UV method for studying waste products in the dialysate has recently been developed by our research group. In the present study, it was investigated how the relationship between UV absorption and dialysate urea concentration was affected by the treatment settings, patient anamnesis and prescribed pharmaceuticals. A mathematical model that includes these effects was proposed. Multiple regression analysis indicated the possibility of performing individual estimates of urea concentration from UV absorption. During haemodialysis, the patient's cardiovascular system is affected when excess fluid is extracted, which may result in blood pressure fluctuations. In the present study, a novel PPG method for monitoring haemodynamic changes during dialysis was investigated. The performed study indicates that PPG measurements relate to haemodynamic changes and may thus be useful in patient monitoring. However, the relationship is complex and needs further study.
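The kind of multiple regression used to estimate urea concentration from UV absorption plus covariates can be sketched with ordinary least squares. All numbers below (design matrix, covariate, urea values) are invented for illustration; the thesis's actual model and data are not reproduced here:

```python
import numpy as np

# Hypothetical samples: column 1 = intercept, column 2 = UV absorbance,
# column 3 = a binary treatment covariate (all values invented)
A = np.array([[1.0, 0.20, 0.0],
              [1.0, 0.45, 1.0],
              [1.0, 0.70, 0.0],
              [1.0, 0.90, 1.0]])
urea = np.array([6.0, 12.0, 16.0, 21.0])  # mmol/L, constructed to be exactly linear

# Ordinary least squares fit: urea ~ b0 + b1*absorbance + b2*covariate
coef, *_ = np.linalg.lstsq(A, urea, rcond=None)
pred = A @ coef  # individual urea estimates from UV absorption
```

With the constructed data the fit recovers the coefficients (2, 20, 1) exactly; with real dialysate measurements one would of course see residual error.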
  •  
15.
  • André-Jönsson, Henrik, 1968- (författare)
  • Indexing strategies for time series data
  • 2002
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • Traditionally, databases have stored textual data and have been used to store administrative information. The computers used, and more specifically the storage available, have been neither large enough nor fast enough to allow databases to be used for more technical applications. In recent years these two bottlenecks have started to disappear and there is an increasing interest in using databases to store non-textual data like sensor measurements or other types of process-related data. In a database a sequence of sensor measurements can be represented as a time series. The database can then be queried to find, for instance, subsequences, extrema points, or the points in time at which the time series had a specific value. To make this search efficient, indexing methods are required. Finding appropriate indexing methods is the focus of this thesis. There are two major problems with existing time series indexing strategies: the size of the index structures and the lack of general indexing strategies that are application independent. These problems have been thoroughly researched and solved in the case of indexing text files. We have examined the extent to which text indexing methods can be used for indexing time series. A method for transforming time series into text sequences has been investigated, followed by an investigation of how text indexing methods can be applied to these text sequences. We have examined two well-known text indexing methods: signature files and the B-tree. A study has been made of how these methods can be modified so that they can be used to index time series. We have also developed two new index structures, the signature tree and paged trie structures. For each index structure we have constructed cost and size models, resulting in comparisons between the different approaches. Our tests indicate that the indexing method we have developed, together with the B-tree structure, produces good results. It is possible to search for and find subsequences of very large time series efficiently. The thesis also discusses what future issues will have to be investigated for these techniques to be usable in a control system relying on time series indexing to identify control modes.
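The time-series-to-text idea can be sketched generically: quantise amplitudes into an alphabet, then index the resulting string with an inverted n-gram index (a signature-file-like structure). The breakpoints, alphabet and index layout below are illustrative assumptions, not the transformation or the signature tree / paged trie structures developed in the thesis:

```python
import bisect
from collections import defaultdict

def to_text(series, breakpoints, alphabet="abcdefgh"):
    # Quantise each sample into a letter according to its amplitude bin
    return "".join(alphabet[bisect.bisect(breakpoints, x)] for x in series)

def build_index(text, n=3):
    # Inverted index mapping each n-gram to its start positions in the text
    index = defaultdict(list)
    for i in range(len(text) - n + 1):
        index[text[i:i + n]].append(i)
    return index

def find(index, text, pattern, n=3):
    # Candidate positions from the pattern's first n-gram, verified in the text
    return [i for i in index.get(pattern[:n], [])
            if text[i:i + len(pattern)] == pattern]
```

A subsequence query then reduces to a string search: quantise the query the same way and probe the index, refining candidates against the stored text.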
  •  
16.
  • Antoni, Marc (författare)
  • Learning between projects : - in product development contexts -
  • 2002
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • A project is often formed with the intention of bringing forth a deliverable within a certain time frame. This goal focus, however, may lead to suboptimizations at the organizational level. At the same time, projects more often than not underperform in a number of aspects, with an especially large gap in the learning aspect. Thus the research questions addressed in this work concern what is done in organizations to foster organizational learning in project-based contexts, how different actions are implemented, and how the relation between the project and the functional organization affects organizational learning. Deming's profound knowledge structure has been used to structure the theoretical aspects. It was found that learning in a project context is not a typical activity and needs to be organized for. The reinvention of the wheel still remains a major, unsolved problem for organizations that develop their products via projects. That is partly due to the fact that a project is a natural forum for learning new things, but not a natural forum for learning from past experience. A central finding of this work is that a systems perspective is necessary in project management contexts. Categories related to learning that have been identified are a category relating to documented knowledge, a category relating to learning in personal interaction, and a category regarding organizational aspects. Regarding documented knowledge, it was found that among practitioners there seems to be an unjustified belief in the ability of documents to solve the learning problems of projects. Personal interaction was found to be an effective way of learning, but the risk of information overload through meetings is considerable. Bounded rationality seems to play an important role as conflicting value systems meet: the forward-oriented, action-focused perspective of projects meets the long-term perspective of organizational learning. Important elements when researching learning between projects are, amongst others, feedback, learning incentives, modularization, location of project team members, organizational size, inter-project competition, full-time project managers and the formulation of a learning mandate for the project.
  •  
17.
  • Arildsson, Mikael, 1967- (författare)
  • Origin and processing of laser Doppler spectra
  • 2000
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • Laser Doppler Flowmetry (LDF) is a technique for studying microvascular blood flow. Laser light is guided to the tissue and the backscattered light, after being Doppler shifted by moving Red Blood Cells (RBCs), is detected using a heterodyne process. In Laser Doppler Perfusion Monitoring, the light is guided to and from the tissue using optical fibers, whilst in Laser Doppler Perfusion Imaging (LDPI), a freely impinging laser beam is used. The Power Spectral Density (PSD) of the photodetector current constitutes the LDF spectrum and can be processed to yield an estimate of the tissue perfusion. The aim of this thesis was to study the origin of, and suggest adequate processing of, the LDF spectra from both a technical and a physiological perspective. The orientation and length of the average scattering vector resulting from an RBC/photon interaction are altered when changing the laser source wavelength. It has been shown theoretically that these changes in orientation and length do not alter the average frequency shift of the scattering event. In vivo measurements on a low and a high perfused area using the wavelengths 632.8 nm and 780 nm, respectively, confirm the theoretical findings. The heterodyne efficiency of the detector increases for longer wavelengths, giving higher photodetector signal amplitude. A method for differentiating high velocity flows by changing the filtering of the LDF spectra is presented. Emphasis is given to higher frequencies, including information from higher flow velocities. The scanning mode and the shape of the laser beam influence the spectral signature in LDPI. In order to maintain a high signal quality, a stepwise mode is to be preferred; the continuous mode induces large spectral components that depend on the scanning speed and the tissue surface roughness. Using a slightly divergent beam minimizes the influence of the distance between the detector and the tissue surface. The physiological perspective includes two randomized and placebo controlled studies of the relationship between topical skin analgesia and the perfusion response to different local stimuli. In the first study, it was shown that analgesia using EMLA® cream during local heating changes the dynamic flow regulation to a persistent and delayed perfusion increase. This was not observed in untreated or placebo treated skin. In the second study, this heating response was positively related to longer treatment times and, hence, to higher intradermal concentrations of the analgesics. By using capillary microscopy, it was shown that analgesic cream treatment for at least one hour reduces the number of physiologically active capillaries by 50%, while LDF perfusion remains unaltered. After local heating, the LDF perfusion increased, in 9/11 subjects by an average of 8.7 times, while the number of capillaries remained decreased. These findings suggest a low capillary influence on the LDF signal in human skin.
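The conventional processing step — estimating perfusion as a frequency-weighted integral of the PSD — can be sketched as follows. Raising the frequency weight emphasises higher Doppler shifts, and hence faster flows, in the spirit of the filtering method described above; the band limits and weights here are illustrative assumptions, not the thesis's processing parameters:

```python
import numpy as np

def perfusion_estimate(freqs, psd, weight_power=1.0, f_lo=20.0, f_hi=12500.0):
    # Classic LDF perfusion measure: the first moment of the PSD over the
    # Doppler band (weight_power = 1); weight_power > 1 emphasises higher
    # frequencies, i.e. higher flow velocities
    band = (freqs >= f_lo) & (freqs <= f_hi)
    df = freqs[1] - freqs[0]  # assumes a uniform frequency grid
    return np.sum(freqs[band] ** weight_power * psd[band]) * df
```

Two spectra of equal power but different centre frequencies then yield different perfusion values, the higher-frequency (faster-flow) spectrum scoring higher.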
  •  
18.
  • Artursson, Tom (författare)
  • Development of Preprocessing Methods for Multivariate Sensor Data
  • 2002
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • In this work various aspects of data preprocessing are discussed. Preprocessing of data from multivariate sensor systems is often necessary to extract relevant information or remove disturbances. Depending on the sensor type, different preprocessing techniques have to be used. A number of important problems face the user, e.g. drifting sensor data, which results in very short-lived calibration models, and complex models with a very high number of variables. A systematic approach to handling and analysing data is necessary. In this work, different preprocessing techniques are elaborated to reduce these problems. Drift is a gradual change in a quantitative characteristic that is supposed to remain constant. Thus, a drifting chemical sensor does not give exactly the same response even if it is exposed to exactly the same environment over a long time. Drift is a common problem for all chemical sensors, and thus needs to be considered as soon as measurements are made over a long period of time. Drift reduction methods try to compensate for the changes in sensor performance using mathematical models, thus maintaining the identification capability of the chemical sensor. The problem of drifting sensor data, and thereby short-lived calibration models, is overcome using reference samples and algorithms that exploit the relation between the reference measurements and the measurements from the samples. This has been studied in two of the papers in this thesis: two new approaches have been developed and tested on data from real measurements with the electronic nose and tongue. In industry and science, more and more variables are used to describe the process under study, which leads to complex models and calculations. In order to increase the interpretability of the measurements, decrease the computational demand, and/or reduce noise, an alternative, more compact representation of the measurement can be made that describes its important features well but with a much smaller vector. This data reduction is the topic of three of the papers in this thesis, in which two different methods have been used. First, a double exponential model has been developed to approximate electronic tongue data; the parameters from this model describe the signal and are used as inputs to multivariate models. Second, the more general approach of wavelet compression, with different strategies for the selection of wavelet coefficients, has been studied for variable reduction of both electronic tongue data and X-ray powder diffraction data.
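The wavelet-compression idea — represent a measurement by a few large wavelet coefficients — can be sketched with a plain Haar transform. The wavelet and the keep-the-largest selection rule are illustrative assumptions; the thesis studies several selection strategies and does not specify this wavelet:

```python
import numpy as np

def haar_dwt(x):
    # Full Haar decomposition of a length-2^k signal: detail levels + approximation
    coeffs, a = [], np.asarray(x, dtype=float)
    while len(a) > 1:
        coeffs.append((a[0::2] - a[1::2]) / np.sqrt(2))  # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)             # approximation
    coeffs.append(a)
    return coeffs

def haar_idwt(coeffs):
    # Inverse transform: rebuild the signal level by level
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def keep_largest(coeffs, keep):
    # One selection strategy: zero all but the 'keep' largest-magnitude coefficients
    flat = np.concatenate(coeffs)
    flat[np.argsort(np.abs(flat))[:-keep]] = 0.0
    out, pos = [], 0
    for c in coeffs:
        out.append(flat[pos:pos + len(c)])
        pos += len(c)
    return out
```

The retained coefficients form the compact vector fed to the multivariate model; for smooth signals a handful of coefficients reconstructs the measurement almost exactly.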
  •  
19.
  • Askenäs, Linda, 1972- (författare)
  • The roles of IT : studies of organising when implementing and using enterprise systems
  • 2004
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • This study concerns the implementation and use of enterprise systems (ERP systems) in complex organisations. The purpose of this thesis is to problematise and understand the social organising of information technology in organisations, by studying the implementation and use of enterprise systems. This is done by using a multi-theoretical perspective and studying cases of complex organisations with a qualitative and interpretive research method. The study manages to give a more profound understanding of the roles of the technology. It is found that the enterprise systems act as Bureaucrat, Manipulator, Administrative assistant or Consultant, or are dismissed, in the sense that intended users choose to avoid using them. These roles of information technology are formed in a rather complex organising process. A Structuration Theory Analytical Model and Procedure (STAMP) is developed, which serves to illuminate the dynamic relationships of individuals' or groups' interpretations, power and norms, and how these affect the implementation and use of enterprise systems. The roles were also found to differ between individuals in similar work conditions. This was due to how they learned their job, what understanding of the job they developed, and what competences they developed. The different kinds of competences found required different support from the technology, and they also made the individuals take different approaches to how to use the technology. The study also explores why emotions appear and what they affect, and identifies patterns of emotions and emotional transitions that appear during implementation and use of an enterprise system. The social aspect of using technology is in focus in this thesis. Thus, the technology is not just a tool to make excellent use of; it becomes something more - an actor with different roles. The main contribution is the development of a language and an approach for understanding the use and implementation of enterprise systems.
  •  
20.
  • Axelsson, Jan R.C. (författare)
  • Quality and Ergonomics : towards successful integration
  • 2000
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • The understanding and practice of ergonomics, built on knowledge of human characteristics, abilities and needs, plays a fundamental role in satisfying people - whether they are labelled customers, users or workers. In this context, ergonomics and quality can be regarded as overall approaches, as philosophies taking account of people in the way things are designed and organised. Given the conceptual similarities, and given that several indicators point to the fact that poor ergonomics may cause quality deficiencies, there has in recent years been an increased focus on the potential benefits of an integrative approach. The research presented in this thesis aims to support this process by developing a broader understanding of the relationships between ergonomics and quality issues, and to generate knowledge on how to effectively integrate quality and ergonomics development. The research project covers data and experiences from twenty-four case studies. A clearly marked interdisciplinary and triangulated research strategy, with qualitative as well as quantitative empirical data from observations, interviews, surveys and descriptive statistics, forms the basis of new knowledge, theories and methodology. The results show that there is a strong relationship between a number of ergonomics-related issues and quality performance. Deficiencies in information handling, management, work task and workplace design, and motivation are important causes of poor quality. It is shown that quality deficiency rates increase substantially for adverse work postures compared to good postures, and that ergonomics improvements can reduce quality deficiencies by 30-50%. Further studies show that a feasible and often necessary strategy in creating lasting improvements is to integrate the perspectives of employers, employees and customers - i.e. efficiency, work conditions and quality. To achieve this, a number of participatory techniques and support systems have been developed, studied, empirically tested and evaluated. It is shown that a kaizen-based suggestion system focusing on participatory ergonomics promotes motivation, commitment and high performance - quality as well as productivity. Nearly six out of ten suggestions deal with ergonomics issues and one out of five involves both ergonomics and quality improvement proposals. Furthermore, an integrated participatory problem-solving approach, using both quality and ergonomics methodology and tools, has been developed and empirically evaluated. It is shown that ergonomic tools are well suited to the quality improvement process, and that participatory ergonomics can be used not only as an effective tool for eliminating ergonomic problems but also to progressively improve learning, working conditions and performance, thus stimulating a positive development of quality. The thesis also presents a balanced strategic management concept with the potential to integrate ergonomic issues at all levels. An integrated macro-ergonomic management concept is believed to have a great impact on improving people's job satisfaction, performance and quality of working life, thus helping to create the environment in which total quality management and quality improvement will flourish. It can finally be concluded that quality work is not just concerned with developing products and processes but equally with giving the people involved in these processes a chance to develop and to perform a good job. Designing a work system in accordance with ergonomic principles can thus be seen as a quality issue in which the internal customers' (employees') requests for ergonomics are given a high priority.
  •  
21.
  • Benesch, Johan, 1969- (författare)
  • Null Ellipsometry and Protein Adsorption to Model Biomaterials
  • 2001
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • When implants are inserted into the human body, cascades of events are started that will determine the outcome of wound healing and ultimately the success of the implantation. The events start with the adsorption of small molecules and proteins that may be activated (enzymes) or are able to activate cells of the immune defense and the healing process. In the biomaterials research conducted in our group we often ask two questions: "How much?" and "What?" proteins adsorb to a specific surface after incubation in serum or plasma. In the first two papers in this thesis I studied how well we are able to answer the first question. In the latter two works I tried to answer both questions for two model biomaterial surfaces: oligo(ethylene glycol)-terminated self assembled monolayers on gold, and chitosan-coated silicon. In many null ellipsometric studies the protein film refractive index, Nfilm, is assumed to be close to 1.5. In the first paper we analyzed whether the assumption Nfilm = 1.465 is satisfactory for the determination of the surface mass density of a submonolayer-thin protein film. Human serum albumin (HSA) was labeled with 125I and mixed with non-labeled HSA, and hydrophobic and hydrophilic silicon pieces were incubated in the solutions. The surface mass densities on all pieces were determined by both ellipsometry and gamma counter measurements and were pair-wise compared. The above assumption regarding the value of Nfilm was satisfactory for the agreement between the methods, although precautions have to be taken not to overestimate the surface mass density when studying radiolabelled proteins, especially on rough surfaces. Are the assumptions made in Paper I also true for up to 100 nm thick protein films, or do we have to use a different protein film refractive index, and does the ellipsometric model still hold? Human serum albumin (HSA) and polyclonal anti-HSA were labeled with 125I and mixed with unlabelled proteins. Hydrophobic silicon pieces were alternately incubated in the two protein solutions. Again the surface mass density was quantified with null ellipsometry and a gamma counter, and the methods were compared. The thickest protein layers were also gently scratched and the thickness measured by AFM. It appeared that a protein film refractive index Nfilm = 1.5 was a good choice for the determination of the protein film thickness. However, in order to obtain good agreement between the methods for the adsorbed mass density, a linear correction term was needed in the Cuypers surface mass density formula for ellipsometry. The physical interpretation of the correction term is presently unclear. Self assembled monolayers (SAMs) containing oligo(ethylene glycol) end groups (OEG) have been successfully used to minimize protein adsorption from single protein solutions. We investigated the protein resistance in a fibrinogen solution, serum and plasma of OEG-SAMs with an increasing number of OEG units and with different end groups. It turned out that the adsorbed amounts after 10 minutes of plasma incubation and 15 minutes of fibrinogen incubation decreased with an increasing number of EG units. In serum, the total deposition and subsequent deposition of antibodies towards complement proteins (C3c, C3d and properdin) did not depend on the number of EG units. In summary, the investigated OEG-SAMs were not protein resistant in complex solutions, although the adsorbed amounts varied with the number of EG units and the terminal chemical group. Complement deposition was observed at OEG surfaces. For the last 50-odd years different polysaccharides, such as heparin and cellulose, have been used for clinical applications, and in recent years chitin and its deacetylated form chitosan have also gained increasing attention as potential biomaterials. Previous studies on complement activation by chitosan derivatives have focused on the soluble complement factors and not the surface-bound ones that may be important for the binding of cells to surfaces. In our study, about 10 nm thick chitosan films were incubated in plasma or serum and subsequently in polyclonal antibody solutions. The films did not activate complement or the intrinsic pathway of coagulation, although fibrinogen was detectable after plasma incubations. When the chitosan film was acetylated it became a strong alternative complement pathway activator in serum, and fibrinogen was then no longer antibody-detectable after plasma incubations.
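For orientation, the widely used de Feijter relation (a simpler cousin of the Cuypers formula referred to above, not the formula the thesis actually applies) converts ellipsometric film thickness and refractive index into a surface mass density. The ambient index and dn/dc values below are typical assumed values for protein in aqueous buffer:

```python
def de_feijter_mass(thickness_nm, n_film, n_ambient=1.335, dn_dc=0.18):
    # Surface mass density in mg/m^2 from film thickness (nm), film and
    # ambient refractive indices, and the refractive index increment
    # dn/dc (cm^3/g); 0.18 cm^3/g is a common assumption for proteins
    return thickness_nm * (n_film - n_ambient) / dn_dc
```

With Nfilm = 1.465, a 1 nm film corresponds to roughly 0.7 mg/m², which conveys why the assumed refractive index directly scales the mass estimate.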
  •  
22.
  • Bengtson, Håkan, 1971- (författare)
  • High speed CMOS optical receiver
  • 2004
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • Optical communication develops very fast and is today the main method for long distance wired communication. However, cost is still high. Looking back at the evolution of optical transmission systems, one main objective of system development has become more and more important: minimize cost per gigabit per second per kilometre, Gb/s/km. One possible solution is to utilize cost effective CMOS technology for all electronic parts and replace optical dispersion compensation with electronic equalization. Recent research indicates that deep submicron CMOS technology can indeed be used for realizing highly integrated optical receivers at data rates of tens of gigabits per second, and also shows that expensive optical dispersion compensators can be replaced with electrical equalizers. This thesis describes an optical receiver in CMOS. The optical receiver consists of a differential transimpedance amplifier, a differential and four times interleaved decision feedback equalizer (DFE), a DFE coefficient update unit and symbol synchronization. The objective of the thesis is to find a scalable optical receiver topology with high speed, wide input range, low power supply sensitivity and reasonable input-related noise, for a CMOS technology with a relatively low fT. The target is to reach 2.5 Gb/s in a 3.3 V 0.35 μm CMOS process. Due to the risk of instability in cascaded broadband amplifiers, the amplifier stability in relation to the power supply impedance is also investigated. Measurements on the differential transimpedance amplifier show 72 dBΩ transimpedance gain and 1.4 GHz bandwidth. Eye diagrams at a data rate of 2.5 Gb/s show a dynamic range of more than 60 dB. The performance is reached with a three-stage transimpedance amplifier, utilizing differential high-speed stages and carefully chosen peaking frequencies. In the measurements on the equalizer, a 2 Gb/s NRZ pattern is sent through a 5 m coaxial cable with an 8 cm open stub for echo generation. The coaxial cable with the stub introduces such large intersymbol interference that there is no eye opening left; the equalizer then recovers the sent data correctly. The equalizer is clocked with a DLL, which has been tested separately. The DLL has a new type of delay cell with low power supply sensitivity. The delay range is 0.31 ns to 21.8 ns. For a 0.5 ns delay of a 500 MHz signal, the delay increases by 2.5% if the power supply is decreased from 3.3 V to 3 V. The DFE coefficient update unit and the symbol synchronization are implemented in Verilog-A and verified with simulations.
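The decision feedback equalizer principle — subtract the interference caused by already-decided symbols before slicing — can be sketched at the behavioural level. This is a single, non-interleaved binary DFE with fixed feedback taps, an illustrative simplification of the chip's four-way interleaved implementation with adaptive coefficient update:

```python
import numpy as np

def dfe(received, fb_taps):
    # Behavioural model of a binary (+/-1 NRZ) decision feedback equalizer:
    # past decisions, weighted by the feedback taps, cancel post-cursor ISI
    fb = np.zeros(len(fb_taps))      # most recent decision first
    decisions = []
    for r in received:
        y = r - fb_taps @ fb         # subtract estimated trailing ISI
        d = 1.0 if y >= 0 else -1.0  # slicer decision
        decisions.append(d)
        fb = np.roll(fb, 1)
        fb[0] = d
    return decisions
```

With a post-cursor echo larger than the main tap (a completely closed eye, as in the stub experiment), a plain slicer fails but the DFE recovers the sent pattern, because each decision removes its own trailing echo from the next sample.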
  •  
23.
  • Berglund, Aseel, 1973- (författare)
  • Augmenting the Remote Control : Studies in Complex Information Navigation for Digital TV
  • 2004
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • The transition to digital TV is changing the television set into an entertainment as well as an information supplier that provides two-way communication with the viewer. However, the present remote control device is not appropriate for navigation through the huge amount of services and information provided by the future digital TV, presumably also a device for accessing the Internet. One possibility for coping with the complex information navigation required of TV viewers is an augmentation of the interaction tools currently available for TV. Two approaches to such an augmentation are investigated in this thesis: linking paper-based TV guides to the digital TV, and enhancing the remote control unit with speech interaction. Augmentation of paper-based TV guides is a futuristic research approach based on the integration of paper-based TV guides into computation technology. This solution provides interactive paper-based TV guides that also function as a remote control for the TV. A prototype system was developed and explorative studies were conducted to investigate this approach. These studies indicate the benefits of integrating paper-based TV guides with the TV set, and they illuminate the potential for innovative solutions for home information systems. Integrating familiar physical artefacts, such as paper and pen, into TV technology may provide easy access to information services usually provided by PCs and the Internet. Thus, the same augmentation needed for TV as an entertainment device also opens up new communication channels for providing society information to citizens who do not feel comfortable with conventional computers. The thesis also reports on studies of speech interfaces for TV information navigation. Traditional speech interfaces have several common problems, such as user acceptance and misinterpretation of user input. These problems are investigated in empirical and explorative studies with implementations of mockups and running research systems. We have found that the pragmatic solution of augmenting remote control devices with speech is a suitable solution that eases information navigation and search.
  •  
24.
  • Berglund, Erik, 1971- (författare)
  • Library Communication Among Programmers Worldwide
  • 2002
  • Doktorsavhandling (övrigt vetenskapligt/konstnärligt)abstract
    • Programmers worldwide share components and jointly develop components on a global scale in contemporary software development. An important aspect of such library-based programming is the need for technical communication with regard to libraries - library communication. As part of their work, programmers must discover, study, and learn as well as debate problems and future development. In this sense, the electronic, networked medium has fundamentally changed programming by providing new mechanisms for communication and global interaction through global networks such as the Internet. Today, the baseline for library communication is hypertext documentation. Improvements in the quality, efficiency, cost and frustration of the programming activity can be expected from further developments in the electronic aspects of library communication. This thesis addresses the use of the electronic, networked medium in the activity of library communication and aims to discover design knowledge for communication tools and processes directed towards this particular area. A model of library communication is provided that describes interaction among programmers as webs of interrelated library communities, together with a discussion of electronic, networked tools and processes that match such a model. Furthermore, research results are provided from the design and industrial evaluation of electronic reference documentation for the Java domain. Surprisingly, the evaluation did not support individual adaptation (personalization). Global library communication processes have also been studied in relation to open-source documentation and user-related bug handling. Open-source documentation projects are still relatively uncommon, even in open-source software projects, and user-related open-source bug handling does not address the passive behavior users have towards bugs. Finally, the adaptive authoring process in electronic reference documentation is addressed and found to provide limited support for expressing the electronic, networked dimensions of authoring, requiring programming skill by technical writers. Library communication is addressed here by providing engineering knowledge with regard to the construction of practical electronic, networked tools and processes in the area. Much of the work has been performed in relation to Java library communication, and the thesis therefore has particular relevance for the object-oriented programming domain. A practical contribution of the work is the DJavadoc tool, which contributes to the development of reference documentation by providing adaptive Java reference documentation.
  •  
25.
  • Bergman, Karl-Olof, 1965- (author)
  • Ecology and conservation of the butterfly Lopinga achine
  • 2000
  • Doctoral thesis (other academic/artistic) abstract
    • The ecology of the red-listed butterfly Lopinga achine was studied in partly open woodlands in the province of Östergötland, Sweden. Detailed autecological research is essential for the successful conservation of a species, and the present investigation focused on the initial aspects of such work: choice of host plant, habitat selection, and dispersal ability. The effects of patch area, isolation, and successional stage of the studied sites were also examined.

The results indicate that L. achine depends on a single host plant, Carex montana. The females preferred to oviposit near this sedge, and most of the larvae (> 80%) were found on C. montana in the field. Egg-laying females and larvae were restricted to C. montana growing in a narrow zone along the edges of glades. This restriction to forest edges is probably the cause of the dependence of L. achine on a restricted stage of canopy cover. More specifically, no L. achine occurred at sites with less than 60% canopy cover, and population densities decreased sharply with more than 90% cover. Eighty-six percent of the occupied study sites are unmanaged; the most important aspect of the long-term conservation of L. achine is therefore probably the deterministic change of its woodland habitat. If the sites remain unmanaged, the system of populations will most likely collapse within 20-40 years. Populations of both L. achine and C. montana increased in size at experimentally managed sites where new glades were created. However, an important prerequisite for successful restoration appears to be the presence of C. montana along the edges of new glades from the start, because the rate of C. montana colonisation was slow.

Most of the populations (50 of 79) were small (< 500 adults; none larger than 4,500) and seemed to show synchronous interannual fluctuations. The probability that a patch would be occupied increased with increasing patch area and decreasing distance to the nearest occupied patch, presumably due to different probabilities of extinction, colonisation, and survival of the inhabiting populations. All but two of the sites with ≥ 3 individuals were within 740 m of the nearest neighbour. Patch size is also a key factor for occurrence: compared to larger patches, small patches are more dependent on neighbouring populations.

The majority of the movements were short and within sites, although in many cases the distance to other sites was less than 100 m. Only 56 individuals (4.0% of those recaptured) moved between sites. It seems that habitat patches of L. achine should be less than 700 m from each other to ensure inter-population contact. Fifteen to twenty well-connected populations have been stated as a lower limit for a viable metapopulation. Based on these criteria, there are two groups of viable populations in the studied area, and these two groups will be given priority in future conservation work.
  •  
Publication type
doctoral thesis (289)
Content type
other academic/artistic (289)
Author/editor
Nilsson, Ulf (3)
Timpka, Toomas (2)
Nyce, James M. (2)
Gustafsson, Oscar, 1 ... (1)
Zhang, Jie (1)
Karlsson, Fredrik (1)
Ammenberg, Jonas, 19 ... (1)
Andersson, Kjell (1)
Kugler, Veronika Moz ... (1)
Lambrix, Patrick, Pr ... (1)
Lambrix, Patrick (1)
Gooran, Sasan, 1965- (1)
Larsson, Jan-Åke, 19 ... (1)
Storasta, Liutauras (1)
Persson, Per (1)
Ohlsson, Kjell (1)
Danev, Danyo (1)
Oliveberg, Mikael (1)
Elg, Mattias, 1968- (1)
Sandahl, Kristian (1)
Berntsson, Fredrik, ... (1)
Kozlov, Vladimir, Pr ... (1)
Lundström, Ingemar (1)
Adlers, Mikael (1)
Saunders, Michael A. ... (1)
Diószegi, Attila, 19 ... (1)
Peters, Björn (1)
Wikner, Jacob, 1973- (1)
Pandikow, Asmus (1)
Peng, Zebo (1)
Peng, Zebo, Professo ... (1)
Eles, Petru, Profess ... (1)
Music, Denis (1)
Jonsson, Peter (1)
Ahlberg, Jörgen, 197 ... (1)
Senior, Andrew, Dokt ... (1)
Dahlquist, Erik, Pro ... (1)
Hedenstierna, Göran, ... (1)
Eklund, Robert, 1962 ... (1)
Jackson, Mats, Profe ... (1)
Johansson, Anders, 1 ... (1)
Hult, Peter, 1964- (1)
Berglund, Aseel, 197 ... (1)
Petersson, Per (1)
Merkel, Magnus (1)
Ahrenberg, Lars (1)
Jönsson, Arne (1)
Askenäs, Linda, 1972 ... (1)
Johansen, Knut, 1958 ... (1)
Enander, Karin, 1972 ... (1)
University
Linköpings universitet (288)
Jönköping University (5)
Karlstads universitet (3)
VTI - Statens väg- och transportforskningsinstitut (2)
Luleå tekniska universitet (1)
Högskolan i Gävle (1)
Chalmers tekniska högskola (1)
Language
English (286)
Swedish (3)
Research subject (UKÄ/SCB)
Natural Sciences (118)
Engineering and Technology (44)
Medical and Health Sciences (4)
Social Sciences (4)
Agricultural Sciences (1)