SwePub
Search the SwePub database


Results list for the search "L4X0:0345 7524 srt2:(1975-1979)"

Search: L4X0:0345 7524 > (1975-1979)

  • Results 1-13 of 13
1.
  • Ask, Per, 1950- (author)
  • Oesophageal manometry : design and evaluation of measurement systems
  • 1978
  • Doctoral thesis (other academic/artistic), abstract:
    • The aim of this study was to investigate the technical characteristics of oesophageal manometry systems and to improve the performance of these systems. The investigation of the characteristics of oesophageal manometry systems with non-perfused catheters, or catheters perfused with a flow generated by a syringe pump, did not show properties which fulfilled the requirements for accurate pressure measurements. The bandwidth of the pressure measurement system was limited by the high compliance of the syringe perfusion pump. The characteristics of perfused systems were improved by the design of a low-compliance perfusion pump. The frequency characteristics of a pressure measurement system utilizing the low-compliance perfusion pump seemed to be determined by the properties of the catheter-manometer system. The frequency content of oesophageal peristaltic pressure was studied using a measurement system including the low-compliance perfusion pump. This investigation showed that a bandwidth of about 8 Hz or more is necessary for accurate measurements, a bandwidth which can only be obtained with a low-compliance system. In a clinical study, a non-perfused system and a system perfused with a syringe pump were compared simultaneously with a system using the low-compliance perfusion pump. The non-perfused system and the system perfused with the syringe pump gave lower peristaltic and sphincter pressure amplitudes than the system with the low-compliance perfusion pump. Since the oesophageal sphincter pressure varies in different radial directions, a pressure transducer has been devised whose mechanical design gives an integration of the radial pressure profile. The transducer has a linear static and dynamic transfer function.
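    A minimal sketch of the bandwidth argument, assuming a simple lumped-parameter model in which the fluid-filled catheter and the system compliance form a hydraulic resonator (the dimensions and compliance values below are illustrative, not taken from the thesis):

      import math

      def natural_frequency_hz(radius_m, length_m, compliance_m3_per_pa,
                               density_kg_m3=1000.0):
          # Hydraulic L-C resonator: inertance rho*L/A in series with a
          # compliance C gives f_n = (1/(2*pi)) * sqrt(A / (rho * L * C)).
          area = math.pi * radius_m ** 2
          inertance = density_kg_m3 * length_m / area
          return 1.0 / (2.0 * math.pi * math.sqrt(inertance * compliance_m3_per_pa))

      # Illustrative catheter: 0.5 mm lumen radius, 1 m long. Lowering the
      # compliance raises the resonant frequency and hence widens the
      # usable measurement bandwidth.
      for c in (3e-13, 3e-14, 3e-15):   # compliance in m^3/Pa (assumed)
          print(f"C = {c:.0e} m^3/Pa -> f_n = "
                f"{natural_frequency_hz(5e-4, 1.0, c):.1f} Hz")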
2.
3.
  • Nilsson, Gert, 1947- (author)
  • On the measurement of evaporative water loss : methods and clinical applications
  • 1977
  • Doctoral thesis (other academic/artistic), abstract:
    • The new method for measurement of water loss by evaporation from the skin described in Paper I offers a high degree of accuracy and improved sensitivity in comparison with devices reported previously. Rapid recordings can be made by technically untrained persons both in clinical departments and in the laboratory. Using this method, the average insensible perspiration from the skin of healthy adult subjects at rest was found to be 381, 526 and 695 g per day at ambient temperatures of 22°, 27° and 30°C, respectively. On the head, hands and feet the evaporation rate was high, while on other body surface areas more moderate values were recorded. In a clinical study on newborn infants, a linear relationship between the evaporative water loss from the skin and the ambient humidity was found. At an ambient temperature of 34.5°C and an ambient relative humidity of 50%, the average transepidermal water loss was calculated to be 8.1 g/m2h. In burned patients, high evaporation rates from about 140 g/m2h to over 220 g/m2h were recorded on the injured skin surfaces. Biological dressings were only slightly permeable to water vapour, while the permeability of the artificial dressings tested was generally high. By recording the rate of increase in vapour concentration in a closed measurement chamber placed over the exposed abdominal cavity during surgery, the water loss by evaporation from wounds and exteriorized viscera was determined. At incisions of minor extent the evaporative water loss was low, while at larger incisions with exteriorized viscera the water loss by evaporation from the wound exceeded the basal cutaneous perspiration of healthy adult subjects.
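    The closed-chamber measurement in the last paragraph reduces to simple arithmetic: the evaporation rate follows from the slope of the vapour-concentration rise, scaled by chamber volume and measured surface area. A sketch with invented numbers (not data from the thesis):

      def evaporation_rate_g_m2_h(slope_g_m3_per_s, volume_m3, area_m2):
          # Closed chamber of volume V over an area A: vapour added per
          # second is V * dc/dt, so the flux is (V/A) * dc/dt; the factor
          # 3600 converts from per-second to the g/m2h used above.
          return (volume_m3 / area_m2) * slope_g_m3_per_s * 3600.0

      # Hypothetical chamber: 1 litre over a 25 cm^2 opening, with the vapour
      # concentration rising 0.056 g/m^3 per second just after closure.
      print(evaporation_rate_g_m2_h(0.056, 1e-3, 25e-4))  # about 80.6 g/m2h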
4.
  • Hammersberg, Peter (author)
  • Techniques for the determination of the optimal performance of high resolution computerised tomography
  • 1977
  • Doctoral thesis (other academic/artistic), abstract:
    • Techniques to determine the optimal performance of high resolution computerised tomography (CT) have been studied, that is, the optimal data collection parameter settings for the imaging task at issue and its detectability limits. Generally, CT is an X-ray based non-destructive testing method that was developed for medical purposes in the 1970s and introduced for industrial applications in the latter half of the 1980s. CT produces maps of the X-ray linear attenuation coefficient of an object's interior, presented either as cross-section images (two-dimensional CT) or as volume information (three-dimensional CT). The linear attenuation coefficient is the absorption and scatter of X-rays per unit length as the beam propagates through an object. It depends on the X-ray photon energy and on both object density and atomic composition. Most industrial CT systems are equipped with conventional X-ray tubes that produce X-ray photons with an energy distribution, that is, a spectrum. Consequently, the effective linear attenuation coefficient of an object, shown by CT, depends on the full energy spectrum, on how it changes as it propagates through the object, and on how it interacts with the detector. To emphasise contrasts in the final CT image caused by density or compositional variations in the investigated object, the energy spectrum has to be chosen and shaped with filters so that the differences in effective linear attenuation coefficient increase. However, it is empirically tedious to find optimal CT parameter settings, particularly for industrial CT, because of the wide range of materials, applications and imaging tasks of interest. The main result of this work is a simulation model of the data collection process that makes it possible to determine the optimal operator parameter settings that maximise the detectability for an arbitrary imaging task, and to predict its detectability limits. The simulation model can also be used to correct for beam hardening artefacts in CT images. There are several important partial results: (1) a definition of the quality of the CT data in relation to the imaging task, including a model of the X-ray paths and how it is used to predict the optimal performance; (2) a model and method to determine how the information about the imaged object transfers from the detector entrance screen through the detector chain to CT projection data, and further on to the final CT image, without detailed knowledge of each stage in the detector chain; (3) a model and method for determining the total unsharpness of the CT system, in terms of the modulation transfer function as a function of spatial frequency; and (4) the commonly used contrast-detail curve, together with the limiting perception factor for detection of small details, is developed here into the more useful object-detail detectability curve. It has been shown that a polyenergetic model is needed to model the data collection process for CT. Such a model consists of the complete X-ray energy spectra produced by the X-ray source used, and a detector response model of how the X-rays impart energy to the detector entrance screen. Absolute X-ray spectra were measured using a Compton spectrometer. The detector response was determined using Monte Carlo photon transport simulations. It is further shown that X-ray source leakage radiation increases image noise when the generated X-ray spectrum contains photons with energies above the K-edges of the enclosure wall material.
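    Why a polyenergetic model is needed can be shown in a few lines: with a spectrum, the effective attenuation coefficient inferred from transmitted intensity falls as the object gets thicker, because low-energy photons are filtered out first (beam hardening). A sketch with an invented two-component spectrum (illustrative values only):

      import math

      # Hypothetical spectrum: (relative weight, mu in 1/cm) for a "soft"
      # and a "hard" photon component.
      spectrum = [(0.5, 0.80), (0.5, 0.20)]

      def transmission(thickness_cm):
          # Polyenergetic Beer-Lambert law: a weighted sum of exponentials.
          return sum(w * math.exp(-mu * thickness_cm) for w, mu in spectrum)

      for L in (0.5, 2.0, 8.0):
          mu_eff = -math.log(transmission(L)) / L
          print(f"L = {L:4.1f} cm -> mu_eff = {mu_eff:.3f} 1/cm")
      # mu_eff decreases with thickness; a monoenergetic model would
      # predict a constant, which is why an uncorrected reconstruction
      # shows beam hardening artefacts.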
5.
  • Haraldsson, Anders, 1946- (author)
  • A program manipulation system based on partial evaluation
  • 1977
  • Doctoral thesis (other academic/artistic), abstract:
    • Program manipulation is the task of performing transformations on program code, normally done in order to optimize the code with respect to the utilization of some computer resource. Partial evaluation is the task of performing those computations in a program that can be carried out before the program is actually executed. If a parameter to a procedure is constant, a specialized version of that procedure can be generated by inserting the constant in place of the parameter in the procedure body and performing as many computations in the code as possible. A system is described which works on programs written in INTERLISP, and which performs partial evaluation together with other transformations such as beta-expansion and certain other optimization operations. The system works on full LISP, not only on a "pure" LISP dialect, and deals with the problems occurring there involving side effects, variable assignments, etc. An analysis of a previous system, REDFUN, results in a list of problems, desired extensions and new features. This is used as a basis for a new design, resulting in a new implementation, REDFUN-2. This implementation, design considerations, constraints in the system, remaining problems, and other experience from the development of and experiments with the system are reported in this paper.
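    The constant-parameter specialization described above is easy to illustrate outside LISP. A minimal sketch in Python (the system itself works on INTERLISP; this only mirrors the idea of folding a known argument into a residual program):

      def power(x, n):
          # General procedure: x**n by repeated multiplication.
          result = 1
          for _ in range(n):
              result *= x
          return result

      def specialize_power(n):
          # Partial evaluation by hand: with n known, unroll the loop at
          # "compile time" and return a residual program in x alone.
          body = " * ".join(["x"] * n) if n > 0 else "1"
          return eval(f"lambda x: {body}")  # n=3 -> lambda x: x * x * x

      cube = specialize_power(3)   # the loop has been computed away
      print(power(2, 3), cube(2))  # 8 8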
6.
  • Jacobson, Birgit E., 1938- (author)
  • The relation between microstructure and mechanical properties of cobalt- and nickel-rich Co-Ti-C and Ni-Ti-C alloys
  • 1976
  • Doctoral thesis (other academic/artistic), abstract:
    • Age-hardening processes up to 1000°C which might be important in the binder phase of the Co-Ti-C and Ni-Ti-C hard metal systems have been studied, mainly by high voltage electron microscopy (HVEM), optical microscopy (OM) and scanning electron microscopy (SEM), as well as by micro- and macrohardness measurements and tensile testing at room and elevated temperatures (500°-800°C). In addition, the allotropic transformation of cobalt has been followed by X-ray diffraction in order to permit a more complete interpretation of the changes in microstructure reflected in the mechanical testing. Various degrees of FCC stabilization to room temperature are obtained from carbon and titanium in solid solution, from predeformation (related to fine grain size), and from the very early stages of decomposition. In the Co-Ti-C system a wide range of precipitation morphologies is possible on ageing; the only precipitated phase is TiC, usually with a cube coincidence orientation relationship to the FCC-Co matrix. The nucleation sites depend sensitively on ageing temperature, with homogeneous precipitation of coherent particles dominating at low temperatures (600°C), together with grain boundary precipitates associated with precipitate-free zones along the grain boundaries. Stacking fault precipitation is common at medium temperatures (700°-850°C) and dislocations are the main nucleation sites at high temperatures (850°-1000°C). The grain boundary precipitates are fairly fine (above 600°C) and often associated with twin boundaries; a dispersion of rods in unusual orientations is observed over most of the temperature range investigated. The homogeneous precipitation produces the best strengthening in Co-Ti-C alloys, with a peak in hardness and yield strength after 600°C/64h. This material is, however, rather brittle at all temperatures; alloys with stacking fault precipitation give cleavage at room temperature but are ductile at elevated temperatures. The solution-treated metastable FCC structure is characterized by high work hardening rates, high ultimate tensile strengths and large elongations to fracture, due to stress- and strain-induced martensite transformation and dynamic strain ageing. In comparison with the Co system, the microstructure in the Ni-Ti-C system is affected by the higher stacking fault energy of the matrix and the solubilities of Ti and C. After a satisfactory solution treatment, followed by rapid quenching, carbide precipitation occurs as a sparse distribution of homogeneously nucleated precipitates at 500°C, and at higher temperatures as lath-shaped dislocation-nucleated precipitates as well as grain boundary particles. Under certain conditions, depending sensitively on the solution treatment, repeated nucleation on climbing dislocations is obtained. The principal differences from the Co system are the very rapid precipitation at all temperatures and the more complex relationships as regards interlattice orientations and growth directions. At room temperature, the solid-solution hardening effects are of the same order as the peak hardness after 600°C/500h. At elevated temperatures (500°-800°C) the coarse TiC distribution exerts some influence on the mechanical properties (particularly UTS).
7.
  • Kruse, Björn, 1937- (author)
  • Design and implementation of a picture processor
  • 1977
  • Doctoral thesis (other academic/artistic), abstract:
    • In recent years much effort has been devoted to the study of picture processing algorithms and pictorial pattern recognition. The efforts have resulted in both theoretical understanding of and practical approaches to some of the basic problems involved. Common to the areas of picture processing is the great need for processing power. One picture may contain as much as several million picture elements (pixels). A scene viewed through a common television camera, for example, represents approximately two megabits of data renewed 25 times per second. To produce or analyze data at such rates is completely beyond the capacity of the general purpose computer (GPC). Even if its speed is increased by a factor of ten, which may be anticipated in the future, the GPC will fall short when faced with real-time applications. The problems stem from the fact that pictures represent a new type of data, orders of magnitude richer in information than that for which the GPC's architecture was designed. Many attempts have been made to design a computer architecture that is more suited for picture processing [3-6] [9-11], and in some cases the machines have also been constructed. In the following we will restrict ourselves to an analysis of the computational problems of pictorial pattern recognition. Starting with a survey in this chapter, the principles and motivations behind the PICAP processor are described in chapter 2, followed by the PICAP implementation, examples of PICAP operation, and evaluation in chapters 3, 4 and 5.
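    The data-rate claim in the survey is worth making explicit with a back-of-the-envelope check:

      bits_per_frame = 2_000_000       # "approximately two megabits" per frame
      frames_per_second = 25           # the stated refresh rate
      rate = bits_per_frame * frames_per_second
      print(rate / 1e6, "Mbit/s")      # 50.0 Mbit/s of raw picture data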
8.
  • Cedwall, Mats, 1946- (author)
  • Semantisk analys av processbeskrivningar i naturligt språk [Semantic analysis of process descriptions in natural language]
  • 1977
  • Doctoral thesis (other academic/artistic), abstract:
    • The purpose of the project described in this report is to study control structures in natural Swedish, especially those occurring when tasks of an algorithmic nature are described, and how to transform these specifications into programs which can then be executed. The report describes and discusses the solutions used in an implemented system which can read and comprehend descriptions of patience (solitaire) games. The results are partly language dependent, but are not restricted to this specific problem environment. The system is divided into four modules. The syntactic module splits the sentence into its approximate component parts. In addition to the standard component categories, such as subject and predicate, every preposition is regarded as a component category of the sentence. The semantic analysis within a sentence works with a set of internalisation rules, one for each combination of a verb and a component part. The third module deals with the semantics at the text level and integrates the representation of a sentence into the program code that is built up. The last module is an interpreter which can execute the programs representing patience games.
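    The four-module architecture lends itself to a pipeline sketch; the function names and data shapes below are hypothetical, chosen only to show how sentence-level rules feed the text-level integration and, finally, the interpreter:

      def syntactic_split(sentence):
          # Module 1 (hypothetical stand-in): split a sentence into component
          # parts; in the real system each preposition is its own category.
          verb, *rest = sentence.split()
          return {"verb": verb, "rest": rest}

      def internalise(parts, rules):
          # Module 2: one rule per combination of verb and component part.
          return rules[parts["verb"]](parts)

      def interpret(program):
          # Module 4: execute the program built up by module 3.
          for step in program:
              print("executing:", step)

      rules = {"place": lambda p: ("PLACE", p["rest"])}
      program = []
      for s in ["place the card on the pile"]:
          program.append(internalise(syntactic_split(s), rules))  # module 3
      interpret(program)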
9.
  • Ribberfors, A. Roland, 1942- (author)
  • Relativistic effects in Compton scattering of photons from bound electron states
  • 1975
  • Doctoral thesis (other academic/artistic), abstract:
    • What is light? A correct answer to this question is: "I don't know, but I can see some of its properties!" It is exciting to study such a complex thing and to search for models which can help us understand, in a proper way, some of the most interesting behaviour of light. We see the sunshine and the radiation makes us warm; undoubtedly it must be some kind of energy. Interference experiments with monochromatic light show that the radiation has wave-like properties, but scattering experiments, for example, indicate a particle nature. Maxwell showed (1865) that light could be an electromagnetic wave, and he also wrote down, a priori, the correct velocity of propagation in vacuum. Planck and de Broglie established the simple connections between energy and frequency, and between momentum and wavelength, respectively. The development of the quantum mechanical formalism, started by Schrödinger and continued by Dirac, was the signal for an explosion of ideas explaining an overwhelming number of physical phenomena.
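    The Planck and de Broglie relations referred to above are, in standard notation,

      E = h\nu, \qquad p = \frac{h}{\lambda}

    with h Planck's constant, \nu the frequency and \lambda the wavelength.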
10.
  • Risch, Tore, 1949- (author)
  • Compilation of multiple file queries in a meta-database system
  • 1978
  • Doctoral thesis (other academic/artistic), abstract:
    • A meta-database system is constructed for describing the contents of very large databases. The meta-database is implemented as data structures in a symbol manipulation language, separate from the underlying database system. A number of programs are built around the meta-database. The most important program module is a query compiler, which translates a non-procedural query language called LRL into a lower level language (COBOL). LRL permits the specification of database retrievals without stating which files are to be used in the search, or how they shall be connected. This is decided automatically by the query compiler. A major feature of the system is a method, the Focus method, for compile-time optimization of these choices. Other facilities include the definition of "views" of the database; data directory services; authority codes; and meta-database entry and update. Design issues discussed include the decision to compile rather than interpret non-procedural query languages; the decision to separate the meta-database from the underlying database system; and the problem of achieving an architecture convertible to any underlying database system. Experience with one such conversion is reported.
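    The compile-time choice of files that the Focus method automates can be pictured as cost-based enumeration over a file catalog; everything below (the catalog, the cost numbers, the query shape) is invented for illustration and is neither LRL syntax nor the actual Focus algorithm:

      from itertools import permutations

      # Hypothetical meta-database: which files carry which attributes,
      # plus a rough scan cost per file.
      catalog = {
          "EMP":  {"attrs": {"emp_id", "name", "dept_id"}, "cost": 1.0},
          "DEPT": {"attrs": {"dept_id", "dept_name"},      "cost": 0.2},
          "PAY":  {"attrs": {"emp_id", "salary"},          "cost": 0.5},
      }

      def plans(needed):
          # Enumerate file orderings whose attributes cover the query.
          for r in range(1, len(catalog) + 1):
              for combo in permutations(catalog, r):
                  covered = set().union(*(catalog[f]["attrs"] for f in combo))
                  if needed <= covered:
                      yield combo, sum(catalog[f]["cost"] for f in combo)

      # A non-procedural request names attributes, never files; the
      # "compiler" picks the cheapest covering set of files itself.
      best = min(plans({"name", "salary"}), key=lambda p: p[1])
      print("chosen files:", best[0], "estimated cost:", best[1])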
11.
  • Schéele, Siv, 1941- (author)
  • A mathematical programming algorithm for optimal bus frequencies
  • 1977
  • Doctoral thesis (other academic/artistic), abstract:
    • Assume that a bus network is given, i.e. we are given a network of streets on which certain bus lines have been set up. Let the total number of buses be given. Assume furthermore that the total demand for bus transportation is given in the form of the marginal totals of an origin-destination matrix, i.e. the total demand for travel from certain origins as well as the total demand for travel to certain destinations is given. The problem is to determine the complete travel pattern and to decide which bus frequencies to use on the various lines. The problem is formulated as a non-linear programming problem. The most interesting features are that the model explicitly takes into account capacity constraints on the buses, and that the distribution of trips between different zones is influenced by the frequencies on the bus lines. The model also takes into account modal split between bus riding and walking. (An extension to a model handling modal split between car and bus is formulated but not solved.) The model is intended for use in medium- to long-range planning. An iterative algorithm to solve this problem is developed, and is shown to converge to stationary points. As a part of the algorithm, an algorithm for the combined distribution-assignment problem in traffic planning is developed using decomposition. The model has been used on the bus network in the town of Linköping (80,000 inhabitants). The planning model suggests certain actions which are in agreement with the actions actually taken by the bus operator.
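    The shape of such a model can be sketched in symbols; the exact objective and constraint set of the thesis are not reproduced here, so read the following as a generic form with trip matrix T_{ij}, line frequencies f_l, bus capacity \kappa, round-trip times \tau_l and fleet size N:

      \min_{T,\,f} \; C(T, f)
      \quad \text{subject to} \quad
      \sum_j T_{ij} = O_i, \qquad
      \sum_i T_{ij} = D_j, \qquad
      x_l(T) \le \kappa f_l \ \ \forall l, \qquad
      \sum_l \tau_l f_l \le N

    where O_i and D_j are the given marginal totals and x_l(T) is the passenger flow assigned to line l; the capacity constraint is what couples the trip distribution to the chosen frequencies, as the abstract describes.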
12.
  • Urmi, Jaak, 1944- (author)
  • A machine independent LISP compiler and its implications for ideal hardware
  • 1978
  • Doctoral thesis (other academic/artistic), abstract:
    • A LISP compiler is constructed without any a priori assumptions about the target machine. In parallel with the compiler, a LISP-oriented instruction set is developed. The instruction set can be seen either as an intermediary language for a traditional computer or as the instruction set for a special purpose LISP machine. The code produced by the compiler is evaluated with regard to its static and dynamic properties. Finally, some architectural aspects of LISP-oriented hardware are discussed. The notion of segments with different word lengths, under program control, is developed and a proposed implementation is described.
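    A machine-independent compiler targeting a LISP-oriented instruction set can be miniaturized as follows; the instruction names (PUSHC, ADD, ...) are invented here and are not the instruction set of the thesis:

      def compile_expr(expr, code):
          # Compile a tiny LISP-like expression tree to stack-machine code.
          if not isinstance(expr, tuple):
              code.append(("PUSHC", expr))      # atom: load a constant
          else:
              op, *args = expr
              for a in args:                    # arguments left to right,
                  compile_expr(a, code)
              code.append((op.upper(),))        # then the operator

      def run(code):
          # Interpret the instruction set: one stack, no registers.
          stack, ops = [], {"ADD": lambda a, b: a + b}
          for instr in code:
              if instr[0] == "PUSHC":
                  stack.append(instr[1])
              else:
                  b, a = stack.pop(), stack.pop()
                  stack.append(ops[instr[0]](a, b))
          return stack.pop()

      code = []
      compile_expr(("add", 1, ("add", 2, 3)), code)
      print(code, "->", run(code))   # five instructions -> 6

    Counting emitted instructions (static) against executed instructions (dynamic) over such code is the kind of evaluation the abstract refers to.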
13.