SwePub
Search the SwePub database


Results for search "WFRF:(Heikkonen J)"


  • Results 1-8 of 8
1.
  • Mohamed, S. A. S., et al. (authors)
  • Asynchronous Corner Tracking Algorithm Based on Lifetime of Events for DAVIS Cameras
  • 2020
  • In: 15th International Symposium on Visual Computing, ISVC 2020. Springer Science and Business Media Deutschland GmbH, pp. 530-541
  • Conference paper (peer-reviewed), abstract:
    • Event cameras, such as the Dynamic and Active-pixel Vision Sensor (DAVIS), capture intensity changes in the scene and generate a stream of events asynchronously. The output rate of such cameras can reach up to 10 million events per second in highly dynamic environments. DAVIS cameras use novel vision sensors that mimic human eyes. Their attractive attributes, such as high output rate, High Dynamic Range (HDR), and high pixel bandwidth, make them an ideal solution for applications that require high-frequency tracking. Moreover, applications that operate in challenging lighting scenarios can benefit from the high dynamic range of event cameras, i.e., 140 dB compared to 60 dB for traditional cameras. In this paper, a novel asynchronous corner tracking method is proposed that uses both events and intensity images captured by a DAVIS camera. The Harris algorithm is used to extract features, i.e., frame-corners, from keyframes, i.e., intensity images. Afterward, a matching algorithm is used to extract event-corners from the stream of events. Events alone are then used to perform asynchronous tracking until the next keyframe is captured. Neighboring events within a 5 × 5 pixel window around each event-corner are used to calculate its velocity and direction by fitting a 2D plane with a randomized Hough transform algorithm. Experimental evaluation showed that our approach can update the location of the extracted corners up to 100 times during the blind time of traditional cameras, i.e., between two consecutive intensity images.
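The pipeline in this abstract starts from Harris corners on intensity keyframes. As a generic illustration of the Harris response on a synthetic image (a textbook sketch with common default parameters, not the paper's event-based implementation):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response for a grayscale image.
    Illustrative sketch only; k=0.04 is a common textbook default."""
    # Image gradients via central differences (rows = y, columns = x).
    Iy, Ix = np.gradient(img.astype(float))

    def box(a):
        # 3x3 box filter to accumulate the structure tensor locally.
        p = np.pad(a, 1, mode='edge')
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    # R = det(M) - k * trace(M)^2 for the 2x2 structure tensor M.
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace

# Synthetic image with one bright square: its corners score highest.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
peak = np.unravel_index(np.argmax(R), R.shape)
```

Edges yield a negative response (one dominant gradient direction), so the maximum lands at a corner of the square.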
2.
  • Mohamed, S. A. S., et al. (authors)
  • DBA-Filter: A Dynamic Background Activity Noise Filtering Algorithm for Event Cameras
  • 2022
  • In: Proceedings of the 2021 Computing Conference, Volume 1. Springer Science and Business Media Deutschland GmbH, pp. 685-696
  • Conference paper (peer-reviewed), abstract:
    • Newly emerged dynamic vision sensors (DVS) offer great potential over traditional sensors (e.g., CMOS), since they have a high temporal resolution on the order of μs, ultra-low power consumption, and a high dynamic range of up to 140 dB compared to 60 dB in frame cameras. Unlike traditional cameras, the output of DVS cameras is a stream of events that encodes the location of the pixel, the time, and the polarity of the brightness change. An event is triggered when the change in brightness, i.e., log intensity, of a pixel exceeds a certain threshold. The output of event cameras often contains a significant amount of noise (outlier events) alongside the signal (inlier events), caused mainly by transistor switch leakage and noise. This paper presents a dynamic background activity filter, called DBA-filter, for event cameras based on an adaptation of the K-nearest neighbor (KNN) algorithm and optical flow. Results show that the proposed algorithm achieves a signal-to-noise ratio of up to 13.64 dB.
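The background-activity idea, keeping only events with recent spatiotemporal neighbors, can be sketched as a plain neighbor-support filter. The window and threshold below are assumed values; this is not the paper's DBA-filter, which additionally uses optical flow:

```python
def ba_filter(events, dt=10_000, k=1):
    """Background-activity filter sketch: keep an event only if at least k
    of its 8-neighbour pixels fired within the last dt microseconds.
    Parameters are illustrative assumptions, not from the paper."""
    last = {}   # (x, y) -> timestamp of the most recent event at that pixel
    kept = []
    for x, y, t in events:          # events assumed time-ordered
        support = 0
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                ts = last.get((x + dx, y + dy))
                if ts is not None and t - ts <= dt:
                    support += 1
        if support >= k:
            kept.append((x, y, t))
        last[(x, y)] = t
    return kept

# A small burst at one pixel cluster (signal) plus an isolated event (noise).
events = [(10, 10, 0), (11, 10, 5), (10, 11, 8), (50, 50, 9)]
clean = ba_filter(events)
```

The isolated event at (50, 50) has no active neighbors and is dropped, while the clustered events support each other.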
3.
  • Mohamed, S. A. S., et al. (authors)
  • Dynamic resource-aware corner detection for bio-inspired vision sensors
  • 2020
  • In: 2020 25th International Conference on Pattern Recognition (ICPR). Institute of Electrical and Electronics Engineers (IEEE), ISBN 9781728188089, pp. 10465-10472
  • Conference paper (peer-reviewed), abstract:
    • Event-based cameras are vision devices that transmit only brightness changes, with low latency and ultra-low power consumption. Such characteristics make event-based cameras attractive in the field of localization and object tracking in resource-constrained systems. Since the number of events generated by such cameras is huge, selecting and filtering the incoming events is beneficial both for increasing the accuracy of the features and for reducing the computational load. In this paper, we present an algorithm to detect asynchronous corners from a stream of events in real time on embedded systems. The algorithm, called the Three Layer Filtering-Harris or TLF-Harris algorithm, is based on an event-filtering strategy whose purpose is 1) to increase accuracy by deliberately eliminating some incoming events, i.e., noise, and 2) to improve the real-time performance of the system, i.e., preserving a constant throughput in terms of input events per second, by discarding unnecessary events with a limited accuracy loss. An approximation of the Harris algorithm, in turn, is used to exploit its high-quality detection capability with a low-complexity implementation, enabling seamless real-time performance on embedded computing platforms. The proposed algorithm is capable of selecting the best corner candidate among neighbors and achieves an average execution-time saving of 59% compared with the conventional Harris score. Moreover, our approach outperforms competing methods, such as eFAST, eHarris, and FA-Harris, in terms of real-time performance, and surpasses Arc* in terms of accuracy.
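One simple way to bound event throughput, in the spirit of the filtering layers described above though not the paper's actual three-layer design, is a per-pixel refractory filter:

```python
def refractory_filter(events, period=1000):
    """Per-pixel refractory filter sketch: discard events that arrive at a
    pixel within `period` microseconds of the previously kept event there.
    A hypothetical single stage, not the paper's TLF-Harris layers."""
    last = {}   # (x, y) -> timestamp of the last *kept* event at that pixel
    out = []
    for x, y, t in events:
        prev = last.get((x, y))
        if prev is None or t - prev >= period:
            out.append((x, y, t))
            last[(x, y)] = t
    return out

# Rapid repeats at pixel (3, 3) are thinned; other pixels pass through.
events = [(3, 3, 0), (3, 3, 200), (3, 3, 1500), (4, 4, 100)]
filtered = refractory_filter(events)
```

Updating the timestamp only on kept events caps the per-pixel rate at one event per `period`, which keeps downstream throughput predictable.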
4.
5.
  • Yasin, J. N., et al. (authors)
  • Dynamic Formation Reshaping Based on Point Set Registration in a Swarm of Drones
  • 2021
  • In: Advances in Intelligent Systems and Computing. Springer Nature, ISBN 9783030730994, pp. 577-588
  • Conference paper (peer-reviewed), abstract:
    • This work focuses on reshaping the formation of an autonomous swarm of drones in an optimized manner. The two main problems are: 1) how to break and reshape the initial formation optimally, and 2) how to do such a reformation while minimizing the overall deviation of the drones and the overall time, i.e., without slowing down. To address the first problem, we introduce a set of routines for the drones/agents to follow while reshaping into a secondary formation shape. The second problem is resolved by utilizing the temperature-function reduction technique originally used in the point set registration process. The goal is to dynamically reform the shape of a multi-agent swarm in a near-optimal manner while passing through narrow openings, for instance between obstacles, and then to bring the agents back to their original shape after passing through the narrow passage using the point set registration technique.
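The first subproblem, breaking and reshaping a formation with minimal overall deviation, can be viewed as assigning drones to target slots. A brute-force sketch for small swarms (a hypothetical stand-in; the paper instead uses a point-set-registration-style temperature schedule):

```python
from itertools import permutations
import math

def assign_slots(drones, slots):
    """Assign each drone to a target slot so total travel distance is
    minimised. Brute force over permutations; fine for small swarms."""
    best, best_cost = None, math.inf
    for perm in permutations(range(len(slots))):
        cost = sum(math.dist(drones[i], slots[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

# Three drones in a row shift one unit up; the identity mapping is optimal.
drones = [(0, 0), (1, 0), (2, 0)]
slots = [(0, 1), (1, 1), (2, 1)]
mapping, cost = assign_slots(drones, slots)
```

For larger swarms the factorial search would be replaced by a polynomial assignment solver (e.g. the Hungarian algorithm) or, as in the paper, an annealing-style registration.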
6.
  • Yasin, J. N., et al. (authors)
  • Low-cost ultrasonic based object detection and collision avoidance method for autonomous robots
  • 2020
  • In: International Journal of Information Technology (Singapore). Springer Nature, ISSN 2511-2104
  • Journal article (peer-reviewed), abstract:
    • This work focuses on the development of an effective collision avoidance algorithm that autonomously detects and avoids obstacles in the vicinity of a potential collision, using a single ultrasonic sensor and controlling the movement of the vehicle. The objectives are to minimise the deviation from the vehicle's original path and to develop an algorithm that utilises one of the cheapest sensors available, for very low-cost systems. For instance, in a scenario where the main ranging sensor malfunctions, a backup low-cost sensor is required for safe navigation of the vehicle while keeping the deviation to a minimum. The developed algorithm utilises only one ultrasonic sensor and approximates the front shape of the detected object by sweeping the sensor mounted on top of the unmanned vehicle. In this proposed approach, the sensor is rotated for shape approximation and edge detection instead of moving the robot around the encountered obstacle. It has been tested in various indoor situations using different shapes of objects: stationary objects, moving objects, and soft or irregularly shaped objects. The results show that the algorithm provides satisfactory outcomes by entirely avoiding obstacles and rerouting the vehicle with minimal deviation.
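The sweep-based shape approximation can be illustrated with a toy edge detector over (angle, distance) readings; the jump threshold and the readings below are made-up values, not from the paper:

```python
def detect_edges(sweep, jump=1.0):
    """Find obstacle edges in a sensor sweep of (angle_deg, distance_m)
    readings: an edge is a large jump between consecutive ranges.
    Illustrative sketch of the sweep idea only."""
    edges = []
    for (a0, d0), (a1, d1) in zip(sweep, sweep[1:]):
        if abs(d1 - d0) > jump:
            # The edge lies somewhere between the two beams; take the middle.
            edges.append((a0 + a1) / 2)
    return edges

# Free space at ~4 m, then an obstacle ~1 m away between 30 and 60 degrees.
sweep = [(0, 4.0), (15, 4.0), (30, 1.0), (45, 1.0), (60, 1.0), (75, 4.0)]
edges = detect_edges(sweep)
```

The two detected edge angles bound the obstacle's angular extent, which is enough to plan a minimal detour around it.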
7.
  • Yasin, J. N., et al. (authors)
  • Navigation of Autonomous Swarm of Drones Using Translational Coordinates
  • 2020
  • In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Springer, pp. 353-362
  • Conference paper (other academic), abstract:
    • This work focuses on an autonomous swarm of drones, a multi-agent system in which the leader agent has the capability of intelligent decision making while the other agents in the swarm follow the leader blindly. The proposed algorithm helps cut costs, especially in multi-drone systems, i.e., swarms, by reducing the power consumption and processing requirements of each individual agent. It is shown that by applying a pre-specified formation design with feedback cross-referencing between the agents, the swarm as a whole can not only maintain the desired formation and navigate but also avoid collisions with obstacles and other drones. Furthermore, the power consumed by the nodes in the considered test scenario is reduced by 50% by utilising the proposed methodology.
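The leader-follower scheme, where only the leader makes decisions and followers hold fixed translational offsets from it, can be sketched minimally (the names and offsets are illustrative, not taken from the paper):

```python
def follower_targets(leader, offsets):
    """Compute follower waypoints as fixed translational offsets from the
    leader's position, so followers need no independent planning.
    Minimal sketch of the leader-follower idea."""
    lx, ly = leader
    return [(lx + dx, ly + dy) for dx, dy in offsets]

# V-formation offsets behind the leader; as the leader moves, the same
# offsets translate every follower target with it.
offsets = [(-1, -1), (1, -1), (-2, -2), (2, -2)]
targets = follower_targets((10, 5), offsets)
```

Because each follower only adds a constant offset to the leader's broadcast position, its onboard computation (and hence power draw) stays trivial, which is the cost-cutting argument in the abstract.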
8.
  • Mohamed, S. A. S., et al. (authors)
  • A Survey on Odometry for Autonomous Navigation Systems
  • 2019
  • In: IEEE Access. Institute of Electrical and Electronics Engineers Inc., ISSN 2169-3536, vol. 7, pp. 97466-97486
  • Journal article (peer-reviewed), abstract:
    • The development of a navigation system is one of the major challenges in building a fully autonomous platform. Full autonomy requires a dependable navigation capability not only in a perfect situation with clear GPS signals but also in situations where GPS is unreliable. Therefore, self-contained odometry systems have attracted much attention recently. This paper provides a general and comprehensive overview of the state of the art in the field of self-contained, i.e., GPS-denied, odometry systems, and identifies the open challenges that demand further research. Self-contained odometry methods are categorized into five main types, i.e., wheel, inertial, laser, radar, and visual, where the categorization is based on the type of sensor data used for the odometry. Most of the research in the field focuses on analyzing the sensor data exhaustively or partially to extract the vehicle pose. Different combinations and fusions of sensor data, in a tightly/loosely coupled manner and with filtering- or optimization-based fusion methods, have been investigated. We analyze the advantages and weaknesses of each approach in terms of different evaluation metrics, such as performance, response time, energy efficiency, and accuracy, which can serve as a useful guideline for researchers and engineers in the field. Finally, some future research challenges in the field are discussed.
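As a worked example of the "wheel" odometry category surveyed above (a textbook differential-drive model, not taken from the paper), dead reckoning integrates wheel speeds into a pose:

```python
import math

def wheel_odometry(pose, v_l, v_r, track, dt):
    """Differential-drive dead reckoning: integrate left/right wheel speeds
    (m/s) over dt seconds into a new (x, y, heading) pose.
    `track` is the distance between the wheels."""
    x, y, th = pose
    v = (v_l + v_r) / 2.0          # forward speed of the chassis centre
    w = (v_r - v_l) / track        # yaw rate from the wheel-speed difference
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + w * dt)

# Drive straight for 1 s at 1 m/s: the robot advances 1 m along x.
pose = wheel_odometry((0.0, 0.0, 0.0), 1.0, 1.0, 0.5, 1.0)
```

Each update compounds any wheel-slip error, which is exactly the drift problem that motivates fusing wheel odometry with the other sensor categories the survey covers.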