SwePub
Search the SwePub database


Results list for the search "WFRF:(Ebelin Pontus)"

Search: WFRF:(Ebelin Pontus)

  • Results 1-4 of 4
1.
  • Ebelin, Pontus, et al. (author)
  • Estimates of Temporal Edge Detection Filters in Human Vision
  • 2024
  • In: ACM Transactions on Applied Perception. - 1544-3558 ; 21:2
  • Journal article (peer-reviewed). Abstract:
    • Edge detection is an important process in human visual processing. However, as far as we know, few attempts have been made to map the temporal edge detection filters in human vision. To that end, we devised a user study and collected data from which we derived estimates of human temporal edge detection filters based on three different models, including the derivative of the infinite symmetric exponential function and temporal contrast sensitivity function. We analyze our findings using several different methods, including extending the filter to higher frequencies than were shown during the experiment. In addition, we show a proof of concept that our filter may be used in spatiotemporal image quality metrics by incorporating it into a flicker detection pipeline.
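A note on the model named in the abstract above: the infinite symmetric exponential function (ISEF) is commonly written as a two-sided exponential, and its derivative is an odd-symmetric filter of the kind the paper estimates. A minimal sketch with an unspecified decay parameter a > 0 (the parameter values actually fitted in the paper are not reproduced here):

    f(t) = \frac{a}{2}\, e^{-a|t|}, \qquad f'(t) = -\frac{a^{2}}{2}\, \operatorname{sign}(t)\, e^{-a|t|}

The antisymmetric derivative responds most strongly where intensity changes rapidly over time, which is what makes it a candidate model for a temporal edge detector.
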
2.
  • Ebelin, Pontus (author)
  • Evaluating and Improving Rendered Visual Experiences : Metrics, Compression, Higher Frame Rates & Recoloring
  • 2024
  • Doctoral thesis (other academic/artistic). Abstract:
    • Rendered imagery is presented to us daily. Special effects in movies, video games, scientific visualizations, and marketing catalogs all often rely on images generated through computer graphics. However, with all the possibilities that rendering offers also comes a plethora of challenges. This thesis proposes novel ways of evaluating the visual errors caused when some of those challenges are not completely overcome. The thesis also suggests ways to improve on the visual experience observers have when viewing rendered content. In the introduction of this thesis, I provide an overview of a subset of the many and fantastic aspects of the human visual system. I also describe how images are rendered using computer graphics, some of the related challenges, and how the final result is displayed to users. Finally, I discuss some of the basics of image and video quality assessment. The scientific publications contained in this thesis focus on image quality metrics, compression, and rendering at high frame rates. In addition, one paper considers the recoloring of images with the goal of giving people with color vision deficiencies an improved visual experience in a process known as daltonization. Papers I–III suggest ways to evaluate and communicate the errors that users may see in rendered images. In those papers, an image’s error is determined by how much it visually differs from a perfect-quality version of the same view. The focus is on the error map, an image that indicates the magnitude and locations of errors. In Paper IV, tools proposed in the first three papers are used to convey how a novel material texture compression algorithm results in lower visual error compared to competing techniques at similar, low bit rates. To achieve good quality at high compression rates, the proposed algorithm exploits similarities in the textures used for materials. Starting with Paper V, the thesis puts increased emphasis on temporal effects. That paper estimates the temporal edge detection filters in human vision, while previous research had mainly examined spatial edge detection filters. Paper VI demonstrates how perceived quality in rendering can be improved by leveraging the human visual system. The paper suggests a method for rendering ~4× more frames per second, which, paired with content-dependent sampling patterns and reconstruction, improves the overall visual experience of rendered image sequences despite reducing the quality of individual frames. This thesis’ final paper, Paper VII, presents a real-time daltonization algorithm that recolors images in a temporally consistent manner, so as to avoid flickering hue changes in image sequences, which are often an issue for competing algorithms that target single images. The proposed recoloring preserves luminance and, thus, the important visual cues it provides.
3.
  • Ebelin, Pontus, et al. (author)
  • Luminance-Preserving and Temporally Stable Daltonization
  • 2023
  • In: Eurographics 2023 - Short Papers. - 9783038682097 ; pp. 45-49
  • Conference paper (peer-reviewed). Abstract:
    • We propose a novel, real-time algorithm for recoloring images to improve the experience for a color vision deficient observer. The output is temporally stable and preserves luminance, the most important visual cue. It runs in 0.2 ms per frame on a GPU.
4.
  • Vaidyanathan, Karthik, et al. (author)
  • Random-Access Neural Compression of Material Textures
  • 2023
  • In: ACM Transactions on Graphics. - 0730-0301 ; pp. 1-25
  • Journal article (peer-reviewed). Abstract:
    • The continuous advancement of photorealism in rendering is accompanied by a growth in texture data and, consequently, increasing storage and memory demands. To address this issue, we propose a novel neural compression technique specifically designed for material textures. We unlock two more levels of detail, i.e., 16× more texels, using low bitrate compression, with image quality that is better than advanced image compression techniques, such as AVIF and JPEG XL. At the same time, our method allows on-demand, real-time decompression with random access similar to block texture compression on GPUs, enabling compression on disk and memory. The key idea behind our approach is compressing multiple material textures and their mipmap chains together, and using a small neural network, that is optimized for each material, to decompress them. Finally, we use a custom training implementation to achieve practical compression speeds, whose performance surpasses that of general frameworks, like PyTorch, by an order of magnitude.
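As a rough illustration of the key idea summarized above (multiple material textures and their mipmap chains decoded together by one small network optimized per material), the sketch below shows what such a decoder could look like in PyTorch. The class name, layer sizes, 16-dimensional per-texel latent, and 9 output channels are assumptions made for illustration; they are not the architecture from the paper.

import torch
import torch.nn as nn

class TinyMaterialDecoder(nn.Module):
    # Hypothetical per-material decoder: maps a per-texel latent vector,
    # gathered from a compressed representation, to several texture
    # channels decoded jointly (e.g. albedo + normal + roughness).
    def __init__(self, latent_dim=16, hidden=64, out_channels=9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_channels),
        )

    def forward(self, latent):
        return self.net(latent)

# Random access: a single texel can be decoded on demand.
decoder = TinyMaterialDecoder()
texel_latent = torch.randn(1, 16)   # placeholder features for one (u, v, mip) query
channels = decoder(texel_latent)    # tensor of shape (1, 9)

Because the network is tiny and specific to one material, decoding individual texels stays cheap, which is the property the abstract emphasizes when comparing against block texture compression on GPUs.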