SwePub
Search the SwePub database

  Extended search

Boolean operators must be entered with CAPITAL LETTERS

Result list for search "hsv:(NATURAL SCIENCES) hsv:(Computer and Information Sciences) hsv:(Computer Vision and Robotics) srt2:(2005-2009)"


  • Result 1-10 of 490
1.
  • Liu, Yuanhua, 1971, et al. (author)
  • Considering the importance of user profiles in interface design
  • 2009
  • In: User Interfaces. ; , s. 23-
  • Book chapter (other academic/artistic) abstract
    • User profile is a popular term widely employed during product design processes by industrial companies. Such a profile is normally intended to represent real users of a product. The ultimate purpose of a user profile is to help designers recognize or learn about the real user by presenting them with a description of a real user’s attributes, for instance: the user’s gender, age, educational level, attitude, technical needs and skill level. The aim of this chapter is to provide information on the current knowledge and research about user profile issues, as well as to emphasize the importance of considering these issues in interface design. In this chapter, we mainly focus on how users’ differences in expertise affect their performance or activity in various interaction contexts. Considering the complex interaction situations in practice, novice and expert users’ interactions with medical user interfaces of different technical complexity will be analyzed as examples: one focuses on novice and expert users’ differences when interacting with simple medical interfaces, and the other focuses on differences when interacting with complex medical interfaces. Four issues will be analyzed and discussed: (1) how novice and expert users differ in terms of performance during the interaction; (2) how novice and expert users differ in terms of cognitive mental models during the interaction; (3) how novice and expert users should be defined in practice; and (4) what the main implications of the differences between novice and expert users are for interface design. Besides describing the effect of users’ differences in expertise during the interface design process, we will also pinpoint some potential problems for the research on interface design, as well as some future challenges that academic researchers and industrial engineers should face in practice.
  •  
2.
  • Boström, Henrik (author)
  • Maximizing the Area under the ROC Curve with Decision Lists and Rule Sets
  • 2007
  • In: Proceedings of the 7th SIAM International Conference on Data Mining. - : Society for Industrial and Applied Mathematics. - 9780898716306 ; , s. 27-34
  • Conference paper (peer-reviewed) abstract
    • Decision lists (or ordered rule sets) have two attractive properties compared to unordered rule sets: they require a simpler classification procedure and they allow for a more compact representation. However, it is an open question what effect these properties have on the area under the ROC curve (AUC). Two ways of forming decision lists are considered in this study: by generating a sequence of rules, with a default rule for one of the classes, and by imposing an order upon rules that have been generated for all classes. An empirical investigation shows that the latter method gives a significantly higher AUC than the former, demonstrating that the compactness obtained by using one of the classes as a default is indeed associated with a cost. Furthermore, by using all applicable rules rather than the first in an ordered set, an even further significant improvement in AUC is obtained, demonstrating that the simple classification procedure is also associated with a cost. The observed gains in AUC for unordered rule sets compared to decision lists can be explained by the fact that learning rules for all classes, as well as combining multiple rules, allows examples to be ranked according to a more fine-grained scale compared to when applying rules in a fixed order and providing a default rule for one of the classes.
  •  
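The granularity argument above can be made concrete: AUC equals the probability that a randomly chosen positive example is ranked above a randomly chosen negative one, with ties counting one half, so a scoring scheme that produces many tied scores tends to lose AUC. A minimal sketch of this pairwise definition (an illustration, not code from the paper):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the fraction of (positive, negative) pairs in which the positive
    example receives the higher score; ties count 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# A coarse ranking with many tied scores vs. a fine-grained one
# over the same six examples (hypothetical score values):
coarse = auc([0.9, 0.9, 0.1], [0.9, 0.1, 0.1])
fine = auc([0.9, 0.8, 0.3], [0.7, 0.2, 0.1])
```

With these toy scores the fine-grained ranking scores higher, mirroring the paper's explanation for why unordered rule sets outperform decision lists.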
3.
  • Hedenberg, Klas, 1968-, et al. (author)
  • Obstacle Detection For Thin Horizontal Structures
  • 2008
  • In: World Congress on Engineering and Computer Science. - Hong Kong : International Association of Engineers. - 9789889867102 ; , s. 689-693
  • Conference paper (peer-reviewed) abstract
    • Many vision-based approaches for obstacle detection state that thin vertical structures, e.g. poles and trees, are of importance. However, there are also problems in detecting thin horizontal structures. In an industrial setting there are horizontal objects, e.g. cables and forklifts, and slanting objects, e.g. ladders, that also have to be detected. This paper focuses on the problem of detecting thin horizontal structures. The system uses three cameras, arranged as a horizontal pair and a vertical pair, which makes it possible to also detect thin horizontal structures. A comparison is made between a sparse disparity map based on edges and a dense disparity map with a column and row filter. Both methods use the Sum of Absolute Differences to compute the disparity maps. Special interest has been taken in scenes with thin horizontal objects. Tests show that the sparse method based on the Canny edge detector works better for the environments we have tested.
  •  
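The Sum of Absolute Differences matching used by both methods above can be sketched as follows. This is a hypothetical toy version, not the authors' implementation: `left` and `right` are rectified grey-value images stored as lists of rows, and image borders are not handled.

```python
def sad_disparity(left, right, x, y, win, max_d):
    """For pixel (x, y) in the left image, return the disparity d
    (horizontal shift) that minimises the Sum of Absolute Differences
    over a (2*win+1)-square window against the right image."""
    best_d, best_sad = 0, float("inf")
    for d in range(max_d + 1):
        sad = 0
        for dy in range(-win, win + 1):
            for dx in range(-win, win + 1):
                sad += abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
        if sad < best_sad:
            best_d, best_sad = d, sad
    return best_d
```

A horizontal camera pair computes disparities along image rows, so a thin horizontal line looks nearly identical at every shift; adding a vertically displaced camera, as in the paper, turns that ambiguous structure into one that varies across the matching window.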
4.
  • Bouguerra, Abdelbaki, 1974-, et al. (author)
  • An autonomous robotic system for load transportation
  • 2009
  • In: 2009 IEEE Conference on Emerging Technologies & Factory Automation (ETFA 2009). - New York : IEEE conference proceedings. - 9781424427277 - 9781424427284 ; , s. 1563-1566
  • Conference paper (peer-reviewed) abstract
    • This paper presents an overview of an autonomous robotic material handling system. The goal of the system is to extend the functionalities of traditional AGVs to operate in highly dynamic environments. Traditionally, the reliable functioning of AGVs relies on the availability of adequate infrastructure to support navigation. In the target environments of our system, such infrastructure is difficult to set up in an efficient way. Additionally, the locations of objects to be handled are unknown, which requires that the system be able to detect and track object positions at runtime. Another requirement of the system is to be able to generate trajectories dynamically, which is uncommon in industrial AGV systems.
  •  
5.
  • Magnusson, Andreas (author)
  • Evolutionary optimisation of a morphological image processor for embedded systems
  • 2008
  • Doctoral thesis (other academic/artistic) abstract
    • The work presented in this thesis concerns the design, development and implementation of two digital components to be used, primarily, in autonomously operating embedded systems, such as mobile robots. The first component is an image coprocessor, for high-speed morphological image processing, and the second is a hardware-based genetic algorithm coprocessor, which provides evolutionary computation functionality for embedded applications. The morphological image coprocessor, the Clutter-II, has been optimised for efficiency of implementation, processing speed and system integration. The architecture employs a compact hardware structure for its implementation of the morphological neighbourhood transformations. The compact structure realises a significantly reduced hardware resource cost. The resources saved by the compact structure can be used to increase parallelism in image processing operations, thereby improving processing speed in a similarly significant manner. The design of the Clutter-II as a coprocessor enables easy-to-use and efficient access to its image processing capabilities from the host system processor and application software. High-speed input-output interfaces, with separated instruction and data buses, provide effective communication with system components external to the Clutter-II. A substantial part of the work presented in this thesis concerns the practical implementation of morphological filters for the Clutter-II, using the compact transformation structure. To derive efficient filter implementations, a genetic algorithm has been developed. The algorithm optimises the filter implementation by minimising the number of operations required for a particular filter. The experience gained from the work on the genetic algorithm inspired the development of the second component, the HERPUC. HERPUC is a hardware-based genetic algorithm processor, which employs a novel hardware implementation of the selection mechanism of the algorithm. This, in combination with a flexible form of recombination operator, has made the HERPUC an efficient hardware implementation of a genetic algorithm. Results indicate that the HERPUC is able to solve the set of test problems, to which it has been applied, using fewer fitness evaluations and a smaller population size, than previous hardware-based genetic algorithm implementations.
  •  
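For readers unfamiliar with the selection and recombination mechanisms the thesis hardware accelerates, a generic generational genetic algorithm can be sketched as below. This is a toy illustration on the standard one-max problem, not the HERPUC design: tournament selection, one-point crossover and per-child bit mutation are assumptions of this sketch.

```python
import random


def genetic_algorithm(fitness, n_bits, pop_size=20, generations=50, seed=0):
    """Toy generational GA: tournament selection of size two,
    one-point crossover, and an occasional single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament of two: keep the fitter of two random individuals.
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.05:             # rare single-bit mutation
                i = rng.randrange(n_bits)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)


# One-max: maximise the number of 1-bits; sum() is the fitness.
best = genetic_algorithm(sum, n_bits=16)
```

The fitness-evaluation count that the thesis reports improving on is, in this sketch, the repeated `fitness(...)` calls inside `select()`; the selection mechanism is exactly the part HERPUC implements in hardware.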
6.
  • Castellano, Ginevra, et al. (author)
  • Expressive Control of Music and Visual Media by Full-Body Movement
  • 2007
  • In: Proceedings of the 7th International Conference on New Interfaces for Musical Expression, NIME '07. - New York, NY, USA : ACM Press. ; , s. 390-391
  • Conference paper (peer-reviewed) abstract
    • In this paper we describe a system which allows users to use their full body to control, in real time, the generation of expressive audio-visual feedback. The system extracts expressive motion features from the user’s full-body movements and gestures. The values of these motion features are mapped both onto acoustic parameters for the real-time expressive rendering of a piece of music, and onto real-time generated visual feedback projected on a screen in front of the user.
  •  
7.
  • Castellano, Ginevra, et al. (author)
  • User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements
  • 2007
  • In: Affective Computing and Intelligent Interaction. - Berlin / Heidelberg : Springer Berlin/Heidelberg. - 9783540748885 ; , s. 501-510
  • Book chapter (peer-reviewed) abstract
    • In this paper we describe a system allowing users to express themselves through their full-body movement and gesture and to control in real time the generation of audio-visual feedback. The system analyses in real time the user’s full-body movement and gesture, extracts expressive motion features and maps the values of the expressive motion features onto real-time control of acoustic parameters for rendering a music performance. At the same time, a visual feedback generated in real time is projected on a screen in front of the users, showing their coloured silhouette, depending on the emotion their movement communicates. Human movement analysis and visual feedback generation were done with the EyesWeb software platform and the music performance rendering with pDM. Evaluation tests were done with human participants to test the usability of the interface and the effectiveness of the design.
  •  
8.
  • Mancini, Maurizio, et al. (author)
  • A virtual head driven by music expressivity
  • 2007
  • In: IEEE Transactions on Audio, Speech, and Language Processing. - : Institute of Electrical and Electronics Engineers (IEEE). - 1558-7916 .- 1558-7924. ; 15:6, s. 1833-1841
  • Journal article (peer-reviewed) abstract
    • In this paper, we present a system that visualizes the expressive quality of a music performance using a virtual head. We provide a mapping through several parameter spaces: on the input side, we have elaborated a mapping between values of acoustic cues and emotion as well as expressivity parameters; on the output side, we propose a mapping between these parameters and the behaviors of the virtual head. This mapping ensures a coherency between the acoustic source and the animation of the virtual head. After presenting some background information on behavior expressivity of humans, we introduce our model of expressivity. We explain how we have elaborated the mapping between the acoustic and the behavior cues. Then, we describe the implementation of a working system that controls the behavior of a human-like head that varies depending on the emotional and acoustic characteristics of the musical execution. Finally, we present the tests we conducted to validate our mapping between the emotive content of the music performance and the expressivity parameters.
  •  
9.
  • Mancini, M., et al. (author)
  • From acoustic cues to an expressive agent
  • 2006
  • In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). - Berlin, Heidelberg : Springer Berlin Heidelberg. - 3540326243 ; , s. 280-291
  • Conference paper (peer-reviewed) abstract
    • This work proposes a new way of providing feedback on expressivity in music performance. Starting from studies on the expressivity of music performance, we developed a system in which visual feedback is given to the user using a graphical representation of a human face. The first part of the system, previously developed by researchers at KTH Stockholm and at the University of Uppsala, allows the real-time extraction and analysis of acoustic cues from the music performance. Cues extracted are: sound level, tempo, articulation, attack time, and spectrum energy. From these cues the system provides a high-level interpretation of the emotional intention of the performer, which is classified into one basic emotion, such as happiness, sadness, or anger. We have implemented an interface between that system and the embodied conversational agent Greta, developed at the University of Rome "La Sapienza" and the University of Paris 8. We model expressivity of the facial animation of the agent with a set of six dimensions that characterize the manner of behavior execution. In this paper we will first describe a mapping between the acoustic cues and the expressivity dimensions of the face. Then we will show how to determine the facial expression corresponding to the emotional intention resulting from the acoustic analysis, using music sound level and tempo characteristics to control the intensity and the temporal variation of muscular activation.
  •  
10.
  • Pousman, Zachary, et al. (author)
  • Living with Tableau Machine : A Longitudinal Investigation of a Curious Domestic Intelligence
  • 2008
  • In: Proceedings of the 10th International Conference on Ubiquitous Computing. - New York, NY, USA : ACM Press. ; , s. 370-379
  • Conference paper (peer-reviewed) abstract
    • We present a longitudinal investigation of Tableau Machine, an intelligent entity that interprets and reflects the lives of occupants in the home. We created Tableau Machine (TM) to explore the parts of home life that are unrelated to accomplishing tasks. Task support for "smart homes" has inspired many researchers in the community. We consider design for experience, an orthogonal dimension to task-centric home life. TM produces abstract visualizations on a large LCD every few minutes, driven by a set of four overhead cameras that capture a sense of the social life of a domestic space. The openness and ambiguity of TM allow for a cycle of co-interpretation with householders. We report on three longitudinal deployments of TM for a period of six weeks. Participant families engaged with TM at the outset to understand how their behaviors were influencing the machine, and, while TM remained puzzling, householders interacted richly with TM and its images. We extract some key design implications for an experience-focused smart home.
  •  
Type of publication
conference paper (305)
journal article (83)
book chapter (35)
doctoral thesis (25)
reports (14)
licentiate thesis (14)
editorial proceedings (5)
editorial collection (2)
book (2)
other publication (2)
patent (2)
research review (1)
Type of content
peer-reviewed (339)
other academic/artistic (140)
pop. science, debate, etc. (11)
Author/Editor
Balkenius, Christian (49)
Bengtsson, Ewert (32)
Gu, Irene Yu-Hua, 19 ... (28)
Lindblad, Joakim (28)
Johnsson, Magnus (20)
Johansson, Birger (20)
Borgefors, Gunilla (18)
Åström, Karl (15)
Mehnert, Andrew, 196 ... (15)
Crozier, Stuart (14)
Josephson, Klas (13)
Sladoje, Nataša (13)
Strand, Robin (11)
Solem, Jan Erik (10)
Svensson, Stina (10)
Overgaard, Niels Chr ... (10)
Nalpantidis, Lazaros (10)
Gasteratos, Antonios (10)
Wang, Tiesheng, 1975 (10)
Solli, Martin, 1980- (9)
Byröd, Martin (9)
McMahon, Kerry (9)
Gustavsson, Tomas, 1 ... (8)
Ahlberg, Jörgen, 197 ... (8)
Nyström, Ingela (8)
Backhouse, Andrew, 1 ... (8)
Kahl, Fredrik (7)
Kennedy, Dominic (7)
Viberg, Mats, 1961 (7)
Brun, Anders, 1976- (7)
Verikas, Antanas (7)
Bigun, Josef, 1961- (6)
Sirakoulis, Georgios ... (6)
Sintorn, Ida-Maria (6)
Knutsson, Hans (6)
Strand, Robin, 1978- (6)
Wählby, Carolina, 19 ... (6)
Malmberg, Filip (6)
Heyden, Anders (6)
Norell, Kristin (6)
Bergström, Mats (6)
Pham, Tuan D. (6)
Hotz, Ingrid (6)
Brun, Anders (6)
Hamid Muhammed, Hame ... (6)
Berger, Cyrille, 198 ... (6)
Lacroix, Simon (6)
Lenz, Reiner, 1953- (6)
Hamann, Bernd (6)
Shi, Pengfei (6)
University
Uppsala University (146)
Lund University (104)
Chalmers University of Technology (82)
Linköping University (69)
Royal Institute of Technology (57)
Swedish University of Agricultural Sciences (32)
Halmstad University (28)
Örebro University (13)
Umeå University (7)
University of Gävle (6)
University of Gothenburg (4)
University West (4)
University of Skövde (4)
Mälardalen University (2)
Jönköping University (2)
Malmö University (2)
Mid Sweden University (2)
RISE (2)
Stockholm University (1)
Linnaeus University (1)
University of Borås (1)
Language
English (470)
Swedish (16)
French (2)
German (1)
Spanish (1)
Research subject (UKÄ/SCB)
Natural sciences (490)
Engineering and Technology (88)
Medical and Health Sciences (22)
Agricultural Sciences (15)
Social Sciences (12)
Humanities (10)


 