SwePub

Towards quantifying information flows : Relative entropy in deep neural networks and the renormalization group

Erdmenger, Johanna (author)
Julius Maximilians Univ Wurzburg, Inst Theoret Phys & Astrophys, D-97074 Wurzburg, Germany; Julius Maximilians Univ Wurzburg, Wurzburg Dresden Cluster Excellence Ct Qmat, D-97074 Wurzburg, Germany.
Grosvenor, Kevin T. (author)
Max Planck Inst Phys Komplexer Syst, Nothnitzer Str 38, D-01187 Dresden, Germany; Wurzburg Dresden Cluster Excellence Ct Qmat, Nothnitzer Str 38, D-01187 Dresden, Germany.
Jefferson, Ro (author)
Stockholms universitet, Nordiska institutet för teoretisk fysik (Nordita), Nordita SU; Julius Maximilians Univ Wurzburg, Inst Theoret Phys & Astrophys, D-97074 Wurzburg, Germany; Julius Maximilians Univ Wurzburg, Wurzburg Dresden Cluster Excellence Ct Qmat, D-97074 Wurzburg, Germany; Max Planck Inst Phys Komplexer Syst, Nothnitzer Str 38, D-01187 Dresden, Germany; Wurzburg Dresden Cluster Excellence Ct Qmat, Nothnitzer Str 38, D-01187 Dresden, Germany.
Stichting SciPost, 2022
English.
In: SciPost Physics. - Stichting SciPost. - 2542-4653. ; 12:1
  • Journal article (peer-reviewed)
Abstract
We investigate the analogy between the renormalization group (RG) and deep neural networks, wherein subsequent layers of neurons are analogous to successive steps along the RG. In particular, we quantify the flow of information by explicitly computing the relative entropy or Kullback-Leibler divergence in both the one- and two-dimensional Ising models under decimation RG, as well as in a feedforward neural network as a function of depth. We observe qualitatively identical behavior characterized by the monotonic increase to a parameter-dependent asymptotic value. On the quantum field theory side, the monotonic increase confirms the connection between the relative entropy and the c-theorem. For the neural networks, the asymptotic behavior may have implications for various information maximization methods in machine learning, as well as for disentangling compactness and generalizability. Furthermore, while both the two-dimensional Ising model and the random neural networks we consider exhibit non-trivial critical points, the relative entropy appears insensitive to the phase structure of either system. In this sense, more refined probes are required in order to fully elucidate the flow of information in these models.
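The abstract refers to explicitly computing the relative entropy, D(p||q) = Σ_i p_i log(p_i / q_i), under decimation RG. As a rough illustration of the ingredients involved (not the paper's code), the sketch below compares the exact Boltzmann distribution of a small periodic 1D Ising chain at coupling J with the distribution at the decimated coupling J' given by the standard one-dimensional recursion tanh(J') = tanh(J)^2. The chain size, coupling value, and function names are illustrative assumptions; the paper's precise choice of reference distributions may differ.

```python
import itertools
import numpy as np

def boltzmann(n_spins, coupling):
    """Exact Boltzmann distribution over all 2^n configurations of a
    periodic 1D Ising chain, p(s) proportional to exp(J * sum_i s_i s_{i+1})."""
    configs = np.array(list(itertools.product([-1, 1], repeat=n_spins)))
    # nearest-neighbour energy with periodic boundary conditions
    energy = -coupling * np.sum(configs * np.roll(configs, -1, axis=1), axis=1)
    weights = np.exp(-energy)
    return weights / weights.sum()

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_i p_i log(p_i / q_i)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

n_spins, J = 8, 0.8                  # small illustrative chain and coupling
J_dec = np.arctanh(np.tanh(J) ** 2)  # one decimation step: tanh(J') = tanh(J)^2
p = boltzmann(n_spins, J)
q = boltzmann(n_spins, J_dec)
print(f"J = {J:.3f}, J' = {J_dec:.3f}, D(p||q) = {relative_entropy(p, q):.4f}")
```

Iterating the decimation step and recomputing D(p||q) against the flowed distribution would trace out a flow of relative entropy analogous to the monotonic growth described in the abstract, though only at this toy scale (2^n configurations are enumerated exactly).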

Subject terms

NATURAL SCIENCES  -- Computer and Information Sciences -- Computer Sciences (hsv//eng)
NATURAL SCIENCES  -- Computer and Information Sciences -- Information Systems (hsv//eng)
NATURAL SCIENCES  -- Physical Sciences -- Condensed Matter Physics (hsv//eng)

Publication and content type

ref (subject category)
art (subject category)
