
Knowledge distillation on neural networks for evolving graphs

Antaris, Stefanos, 1988- (author)
KTH; HiveStreaming AB, Stockholm, Sweden.
Rafailidis, Dimitrios (author)
University of Thessaly, Volos, Greece.
Girdzijauskas, Sarunas (author)
KTH, Software and Computer Systems, SCS
KTH; HiveStreaming AB, Stockholm, Sweden (organization)
2021-10-20
2021
English.
In: Social Network Analysis and Mining. Springer Wien. ISSN 1869-5450, E-ISSN 1869-5469. Vol. 11, no. 1
  • Journal article (peer-reviewed)
Abstract
Graph representation learning on dynamic graphs has become an important task in several real-world applications, such as recommender systems and email spam detection. To efficiently capture the evolution of a graph, representation learning approaches employ deep neural networks with a large number of parameters to train. Due to their large model size, such approaches suffer from high online inference latency and are therefore challenging to deploy in industrial settings with vast numbers of users/nodes. In this study, we propose DynGKD, a distillation strategy to transfer the knowledge from a large teacher model to a small student model with low inference latency, while achieving high prediction accuracy. We first study different distillation loss functions to separately train the student model with various types of information from the teacher model. In addition, we propose a hybrid distillation strategy for evolving graph representation learning that combines the teacher's different types of information. Our experiments with five publicly available datasets demonstrate the superiority of our proposed model against several baselines, with an average relative drop of 40.60% in RMSE on the link prediction task. Moreover, our DynGKD model achieves a compression ratio of 21:100, accelerating inference by a speed-up factor of 30 when compared with the teacher model. For reproduction purposes, we make our datasets and implementation publicly available at https://github.com/stefanosantaris/DynGKD.
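
To make the teacher-student setup concrete, the sketch below shows one way a hybrid distillation objective of this kind could look in PyTorch. It is not taken from the DynGKD implementation; the loss terms, weighting coefficients, and function names are illustrative assumptions, combining a supervised link-prediction term with soft-label and embedding distillation terms computed against a frozen teacher.

import torch
import torch.nn.functional as F

def hybrid_distillation_loss(student_logits, student_emb,
                             teacher_logits, teacher_emb,
                             targets, alpha=0.5, beta=0.25):
    # Supervised term: train the student on ground-truth link labels.
    supervised = F.binary_cross_entropy_with_logits(student_logits, targets)
    # Soft-label distillation: match the teacher's predicted link scores.
    # detach() keeps the teacher frozen during student training.
    soft = F.mse_loss(torch.sigmoid(student_logits),
                      torch.sigmoid(teacher_logits).detach())
    # Embedding distillation: pull the student's node representations
    # toward the teacher's (assumes both share the same embedding size).
    emb = F.mse_loss(student_emb, teacher_emb.detach())
    # Hybrid objective: a weighted sum of the three terms; alpha and
    # beta are illustrative hyperparameters, not values from the paper.
    return supervised + alpha * soft + beta * emb

In practice the student would be a much smaller dynamic-graph encoder trained against a pre-trained teacher; which teacher signals to combine, and with what weights, is precisely the design space the paper explores.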

Subject terms

NATURAL SCIENCES -- Computer and Information Sciences -- Computer Sciences (hsv//eng)

Keywords

Graph representation learning
Evolving graphs
Knowledge distillation

Publication and content type

ref (subject category)
art (subject category)

