SwePub
Search the SwePub database

  Advanced search

Results list for the search "WFRF:(Haberer Janek)"


  • Results 1-3 of 3
Sort/group the results list
1.
  • Haberer, Janek, et al. (author)
  • Activation sparsity and dynamic pruning for split computing in edge AI
  • 2022
  • In: DistributedML 2022 - Proceedings of the 3rd International Workshop on Distributed Machine Learning, Part of CoNEXT 2022. - New York, NY, USA : ACM, pp. 30-36
  • Conference paper (peer-reviewed), abstract:
    • Deep neural networks are getting larger and therefore harder to deploy on constrained IoT devices. Split computing provides a solution by splitting a network and placing the first few layers on the IoT device; the output of these layers is transmitted to the cloud, where inference continues. Earlier works indicate a high degree of sparsity in intermediate activation outputs; this paper analyzes and exploits activation sparsity to reduce the network communication overhead when transmitting intermediate data to the cloud. Specifically, we analyze the intermediate activations of two early layers in ResNet-50 on CIFAR-10 and ImageNet, focusing on sparsity to guide the choice of splitting point. We employ dynamic pruning of activations and feature maps and find that sparsity strongly depends on the size of a layer, and that weights do not correlate with activation sparsity in convolutional layers. Additionally, we show that sparse intermediate outputs can be compressed by a factor of 3.3x at an accuracy loss of 1.1% without any fine-tuning. With fine-tuning, the compression factor increases to up to 14x at a total accuracy loss of 1%.
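The mechanism described in the abstract above can be illustrated with a short sketch. This is not the paper's implementation: the keep-ratio thresholding rule, the (index, value) sparse encoding, and the tensor shape are illustrative assumptions, and the 3.3x/14x figures in the paper come from real ResNet-50 activations with fine-tuning, not from this toy example.

```python
import numpy as np

def prune_activations(act: np.ndarray, keep_ratio: float = 0.1) -> np.ndarray:
    """Dynamic pruning (sketch): keep only the largest-magnitude
    fraction of activations, zeroing the rest."""
    flat = np.abs(act).ravel()
    k = max(1, int(keep_ratio * flat.size))
    threshold = np.partition(flat, flat.size - k)[flat.size - k]
    return np.where(np.abs(act) >= threshold, act, 0.0)

def sparsity(act: np.ndarray) -> float:
    """Fraction of zero-valued activations."""
    return 1.0 - np.count_nonzero(act) / act.size

def compression_factor(act: np.ndarray,
                       bytes_per_value: int = 4,
                       bytes_per_index: int = 4) -> float:
    """Dense size vs. a naive sparse (index, value) encoding."""
    dense = act.size * bytes_per_value
    nnz = np.count_nonzero(act)
    return dense / max(nnz * (bytes_per_value + bytes_per_index), 1)

# A fake post-ReLU feature map (ReLU alone already yields many zeros).
rng = np.random.default_rng(0)
fmap = np.maximum(rng.normal(size=(64, 56, 56)), 0.0)

pruned = prune_activations(fmap, keep_ratio=0.1)
print(f"sparsity after pruning: {sparsity(pruned):.2f}")
print(f"compression factor: {compression_factor(pruned):.1f}x")
```

With a 10% keep ratio, roughly 90% of the values are zero, and even the naive (index, value) encoding above already transmits several times fewer bytes than the dense tensor; an entropy coder, as typically used in practice, would do better.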
2.
3.
  • Hojjat, Ali, et al. (author)
  • ProgDTD: Progressive Learned Image Compression with Double-Tail-Drop Training
  • 2023
  • In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (ISSN 2160-7516, 2160-7508), vol. 2023-June, pp. 1130-1139
  • Conference paper (peer-reviewed), abstract:
    • Progressive compression allows images to start loading as low-resolution versions that become clearer as more data is received. This improves the user experience when, for example, the network connection is slow. Today, most approaches to image compression, both classical and learned, are designed to be non-progressive. This paper introduces ProgDTD, a training method that transforms learned, non-progressive image compression approaches into progressive ones. The design of ProgDTD is based on the observation that the information stored within the bottleneck of a compression model commonly varies in importance. To create a progressive compression model, ProgDTD modifies the training steps to force the model to store the data in the bottleneck sorted by priority. We achieve progressive compression by transmitting the data in order of its sorted index. ProgDTD is designed for CNN-based learned image compression models, needs no additional parameters, and has a customizable range of progressiveness. For evaluation, we apply ProgDTD to the hyperprior model, one of the most common structures in learned image compression. Our experimental results show that ProgDTD performs comparably to its non-progressive counterparts and to other state-of-the-art progressive models in terms of MS-SSIM and accuracy.
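The tail-drop idea in the abstract above can be sketched in a few lines. This is a simplification, not ProgDTD itself: the actual method applies the double tail-drop to both the latent and hyper-latent of a trained hyperprior model during training, while this numpy sketch only illustrates the two ingredients — random tail masking at training time and channel-prefix truncation at transmission time — on a stand-in latent tensor whose shape is an assumption.

```python
import numpy as np

rng = np.random.default_rng(42)

def tail_drop(latent: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Training-time masking (simplified): zero out a random trailing
    run of bottleneck channels, so the model learns to pack the most
    important information into the earliest channels."""
    c = latent.shape[0]
    keep = int(rng.integers(1, c + 1))  # keep channels [0, keep)
    mask = np.zeros_like(latent)
    mask[:keep] = 1.0
    return latent * mask

def progressive_truncate(latent: np.ndarray, fraction: float) -> np.ndarray:
    """Inference-time progressiveness: transmit only the first
    `fraction` of channels; the receiver decodes from that prefix."""
    keep = max(1, int(round(fraction * latent.shape[0])))
    out = latent.copy()
    out[keep:] = 0.0
    return out

# Stand-in for a hyperprior-style bottleneck: 192 channels of 16x16.
latent = rng.normal(size=(192, 16, 16))

masked = tail_drop(latent, rng)            # one training-step mask
partial = progressive_truncate(latent, 0.25)  # 25% of the bitstream
print("channels kept at 25%:", int(np.count_nonzero(partial.any(axis=(1, 2)))))
```

Because the tail is always the part dropped, transmitting channels in index order automatically sends data "in order of its sorted index" as the abstract describes: any prefix of the stream is a valid, lower-quality representation.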
Create references, email, monitor and link
Type of publication
conference papers (3)
Type of content
peer-reviewed (2)
other academic/artistic (1)
Author/editor
Landsiedel, Olaf, 19 ... (3)
Haberer, Janek (3)
Harms, Laura, 1991 (1)
Morbale, Sukanya (1)
Hojjat, Ali (1)
Higher education institution
Chalmers tekniska högskola (3)
Language
English (3)
Research subject (UKÄ/SCB)
Natural sciences (3)
Engineering and technology (1)

Year
