SwePub
Search the SwePub database


Hit list for search "WFRF:(Shafiq ur Réhman 1978)"

Search: WFRF:(Shafiq ur Réhman 1978)

  • Results 1-10 of 22
1.
  • Ehatisham-ul-Haq, Muhammad, et al. (author)
  • Identifying smartphone users based on their activity patterns via mobile sensing
  • 2017
  • In: Procedia Computer Science. - : Elsevier. - 1877-0509. ; 113, pp. 202-209
  • Journal article (peer-reviewed), abstract:
    • Smartphones are ubiquitous devices that enable users to perform many of their routine tasks anytime and anywhere. With the advancement of information technology, smartphones are now equipped with sensing and networking capabilities that provide context-awareness for a wide range of applications. Owing to their ease of use and access, many users store private data on their smartphones, such as personal identifiers and bank account details. This type of sensitive data can be vulnerable if the device is lost or stolen. The existing methods for securing mobile devices, including passwords, PINs, and pattern locks, are susceptible to many attacks, such as smudge attacks. This paper proposes a novel framework to protect sensitive data on smartphones by identifying smartphone users based on their behavioral traits using smartphone-embedded sensors. A series of experiments has been conducted to validate the proposed framework, demonstrating its effectiveness.
2.
  • Fahlquist, Karin, et al. (author)
  • Human animal machine interaction : Animal behavior awareness and digital experience
  • 2010
  • In: Proceedings of ACM Multimedia 2010 - Brave New Ideas, 25-29 October 2010, Firenze, Italy. - New York, NY, USA : ACM. - 9781605589336 ; , pp. 1269-1274
  • Conference paper (peer-reviewed), abstract:
    • This paper proposes an intuitive wireless sensor/actuator-based communication network for human-animal interaction in a digital zoo. To enable effective observation and control of wildlife, we have built a wireless sensor network: 25 video-transmitting nodes are installed for animal behavior observation, and experimental vibrotactile collars have been designed for effective control in an animal park. The goal of our research is twofold. First, to provide interaction between digital users and animals, and to monitor animal behavior for safety purposes. Second, to investigate how animals can be controlled or trained using vibrotactile stimuli instead of electric stimuli. We have designed a multimedia sensor network for human-animal-machine interaction and evaluated the effect of the human-animal-machine state communication model in field experiments.
3.
  • Khan, Muhammad Sikandar Lal, 1988-, et al. (author)
  • Face-off : A face reconstruction technique for virtual reality (VR) scenarios
  • 2016
  • In: 14th European Conference on Computer Vision, ECCV 2016. - Cham : Springer. - 9783319466033 - 9783319466040 ; , pp. 490-503
  • Conference paper (peer-reviewed), abstract:
    • Virtual Reality (VR) headsets occlude a significant portion of the human face. The real human face is required in many VR applications, for example video teleconferencing. This paper proposes a wearable-camera-based solution to reconstruct the real face of a person wearing a VR headset. Our solution is built on asymmetrical principal component analysis (aPCA). A user-specific training model is built using aPCA with full-face, lip, and eye-region information. During the testing phase, the lower face region and partial eye information are used to reconstruct the wearer's face. The online testing session consists of two phases: (i) a calibration phase and (ii) a reconstruction phase. In the former, a small calibration step is performed to align the test information with the training data, while the latter uses half-face information to reconstruct the full face using the aPCA-based trained data. The proposed approach is validated with qualitative and quantitative analysis.
4.
  • Li, Liu, 1965-, et al. (author)
  • Vibrotactile chair : A social interface for blind
  • 2006
  • In: Proceedings SSBA 2006. - Umeå : Umeå universitet. Institutionen för datavetenskap. ; , pp. 117-120
  • Conference paper (other academic/artistic), abstract:
    • In this paper we present our vibrotactile chair, a social interface for the blind. With this chair, a blind person can get on-line emotion information about the person he/she is facing. This greatly enhances the communication ability and improves the quality of social life of the blind. In the paper we discuss the technical challenges and design principles behind the chair, and introduce the experimental platform: the tactile facial expression appearance recognition system (TEARS)™.
5.
  • Pizzamiglio, Sara, et al. (author)
  • A multimodal approach to measure the distraction levels of pedestrians using mobile sensing
  • 2017
  • In: Procedia Computer Science. - : Elsevier. - 1877-0509. ; 113, pp. 89-96
  • Journal article (peer-reviewed), abstract:
    • The emergence of smartphones has had a positive impact on society, as their range of features and automation has allowed people to become more productive while on the move. On the other hand, the use of these devices has also become a distraction and a hindrance, especially for pedestrians who use their phones while walking on the street. This is reinforced by the fact that pedestrian injuries due to mobile phone use have now exceeded mobile-phone-related driver injuries. This paper describes an approach that measures the different levels of distraction encountered by pedestrians while they are walking. To distinguish between the distractions within the brain, the proposed work analyzes data collected from mobile sensors (accelerometers for movement, mobile EEG for electroencephalogram signals from the brain). The long-term motivation of the proposed work is to provide pedestrians with notifications as they approach potential hazards while walking on the street and conducting multiple tasks, such as using a smartphone.
6.
  • Shafiq, ur Réhman, 1978-, et al. (author)
  • Using Vibrotactile Language for Multimodal Human Animals Communication and Interaction
  • 2014
  • In: Proceedings of the 2014 Workshops on Advances in Computer Entertainment Conference, ACE '14. - New York, NY, USA : Association for Computing Machinery (ACM). - 9781450333146 ; , pp. 1:1-1:5
  • Conference paper (peer-reviewed), abstract:
    • In this work we aim to facilitate computer-mediated multimodal communication and interaction between humans and animals based on vibrotactile stimuli. To study and influence the behavior of animals, researchers usually use 2D/3D visual stimuli. We instead use a vibrotactile-pattern-based language, which provides the opportunity to communicate and interact with animals. We have performed experiments with a vibrotactile human-animal multimodal communication system to study the effectiveness of vibratory stimuli applied to the animal's skin along with audio and visual stimuli. The preliminary results are encouraging and indicate that low-resolution tactual displays are effective in transmitting information.
7.
  • ur Réhman, Shafiq, 1978- (author)
  • Expressing emotions through vibration for perception and control
  • 2010
  • Doctoral thesis (other academic/artistic), abstract:
    • This thesis addresses a challenging problem: "how to let the visually impaired 'see' others' emotions". We, human beings, are heavily dependent on facial expressions to express ourselves. A smile shows that the person you are talking to is pleased, amused, relieved, etc. People use the emotional information from facial expressions to switch between conversation topics and to determine the attitudes of individuals. Missing the emotional information from facial expressions and head gestures makes it extremely difficult for the visually impaired to interact with others in social events. To enhance the social interaction ability of the visually impaired, in this thesis we have been working on the scientific topic of 'expressing human emotions through vibrotactile patterns'. It is quite challenging to deliver human emotions through touch, since our touch channel is very limited. We first investigated how to render emotions through a vibrator. We developed a real-time "lipless" tracking system to extract dynamic emotions from the mouth, and employed mobile phones as a platform for the visually impaired to perceive primary emotion types. Later on, we extended the system to render more general dynamic media signals: for example, rendering live football games through vibration on the mobile phone to improve the mobile user's communication and entertainment experience. To display more natural emotions (i.e. emotion type plus emotion intensity), we developed the technology to enable the visually impaired to directly interpret human emotions. This was achieved by the use of machine vision techniques and a vibrotactile display. The display is comprised of a 'vibration actuator matrix' mounted on the back of a chair, and the actuators are sequentially activated to provide dynamic emotional information. The research focus has been on finding a global, analytical, and semantic representation for facial expressions to replace the state-of-the-art facial action coding system (FACS) approach.
We proposed to use the manifold of facial expressions to characterize dynamic emotions. The basic emotional expressions with increasing intensity become curves on the manifold extending from the center. The blends of emotions lie between those curves and can be defined analytically by the positions of the main curves. The manifold is the "Braille code" of emotions. The developed methodology and technology have been extended to build assistive wheelchair systems to aid a specific group of disabled people, cerebral palsy or stroke patients (i.e. those lacking fine motor control skills), who do not have the ability to access and control a wheelchair by conventional means such as a joystick or chin stick. The solution is to extract the manifold of head or tongue gestures for controlling the wheelchair. The manifold is rendered by a 2D vibration array to provide the user of the wheelchair with action information from gestures and with system status information, which is very important in enhancing the usability of such an assistive system. The current research work not only provides a foundation stone for a vibrotactile rendering system based on object localization but is also a concrete step toward a new dimension of human-machine interaction.
8.
  • ur Réhman, Shafiq, 1978-, et al. (author)
  • Facial expression appearance for tactile perception of emotions
  • 2007
  • In: Proceedings of Swedish symposium on image analysis, 2007. ; , pp. 157-160
  • Conference paper (peer-reviewed), abstract:
    • To enhance the daily life experience of visually challenged persons, the Facial Expression Appearance for Tactile system is proposed. The manifold of facial expressions is used for tactile perception. The Locally Linear Embedding (LLE) coding algorithm is implemented for the tactile display, and the LLE algorithm is extended to handle real-time video coding. The vibrotactile chair, a social interface for the blind, is used to display the facial expression. The chair provides the visually impaired with on-line emotion information about the person he/she is approaching. The preliminary results are encouraging and show that the system greatly enhances the communication ability of the visually impaired person.
9.
  • ur Réhman, Shafiq, 1978-, et al. (author)
  • How to use manual labelers in evaluation of lip analysis systems?
  • 2009
  • In: Visual speech recognition. - USA : IGI Global. - 9781605661865 ; , pp. 239-259
  • Book chapter (other academic/artistic), abstract:
    • The purpose of this chapter is not to describe any lip analysis algorithms but rather to discuss some of the issues involved in evaluating and calibrating labeled lip features from human operators. In the chapter we question a common practice in the field: using manual lip labels directly as the ground truth for the evaluation of lip analysis algorithms. Our empirical results, obtained using an Expectation-Maximization procedure, show that subjective noise from manual labelers can be quite significant when quantifying both human and algorithm extraction performance. To train and evaluate a lip analysis system, one can measure the performance of human operators and simultaneously infer the "ground truth" from the manual labels.
10.
  • ur Réhman, Shafiq, 1978-, et al. (author)
  • iFeeling : Vibrotactile rendering of human emotions on mobile phones
  • 2010. - 1st Edition
  • In: Mobile multimedia processing. - Heidelberg, Germany : Springer Berlin. - 9783642123481 ; , pp. 1-20
  • Book chapter (other academic/artistic), abstract:
    • Today, mobile phone technology is mature enough to enable us to interact effectively with mobile phones using our three major senses, namely vision, hearing, and touch. Like the camera, which adds interest and utility to the mobile experience, the vibration motor in a mobile phone gives us a new possibility to improve the interactivity and usability of mobile phones. In this chapter, we show that by carefully controlling vibration patterns, more than 1 bit of information can be rendered with a vibration motor. We demonstrate how to turn a mobile phone into a social interface for the blind so that they can sense the emotional information of others. Technical details are given on how to extract emotional information, design vibrotactile coding schemes, and render vibrotactile patterns, as well as how to carry out user tests to evaluate usability. Experimental studies and user tests have shown that users do get and interpret more than one bit of emotional information. This shows a potential to enrich communication among mobile phone users through the touch channel.