SwePub

MemoryBank: Enhancing Large Language Models with Long-Term Memory

Zhong, Wanjun (author)
Sun Yat-Sen University, Sun Yat-Sen University
Guo, Lianghong (author)
Sun Yat-Sen University, Sun Yat-Sen University
Gao, Qiqi (author)
Harbin Institute of Technology, Harbin Institute of Technology
Ye, He (author)
KTH, Theoretical Computer Science (TCS)
Wang, Yanlin (author)
Sun Yat-Sen University, Sun Yat-Sen University
Association for the Advancement of Artificial Intelligence (AAAI), 2024
English.
In: Proceedings of the AAAI Conference on Artificial Intelligence. Association for the Advancement of Artificial Intelligence (AAAI), pp. 19724-19731
  • Conference paper (peer-reviewed)
Abstract
Large Language Models (LLMs) have drastically reshaped our interactions with artificial intelligence (AI) systems, showcasing impressive performance across an extensive array of tasks. Despite this, a notable hindrance remains: the deficiency of a long-term memory mechanism within these models. This shortfall becomes increasingly evident in situations demanding sustained interaction, such as personal companion systems, psychological counseling, and secretarial assistance. Recognizing the necessity for long-term memory, we propose MemoryBank, a novel memory mechanism tailored for LLMs. MemoryBank enables the models to summon relevant memories, continually evolve through continuous memory updates, and comprehend and adapt to a user's personality over time by synthesizing information from previous interactions. To mimic anthropomorphic behaviors and selectively preserve memory, MemoryBank incorporates a memory updating mechanism inspired by the Ebbinghaus Forgetting Curve theory. This mechanism permits the AI to forget and reinforce memory based on time elapsed and the relative significance of the memory, thereby offering a more human-like memory mechanism and enriched user experience. MemoryBank is versatile in accommodating both closed-source models like ChatGPT and open-source models such as ChatGLM. To validate MemoryBank's effectiveness, we exemplify its application through the creation of an LLM-based chatbot named SiliconFriend in a long-term AI Companion scenario. Further tuned with psychological dialog data, SiliconFriend displays heightened empathy and discernment in its interactions. Experiments involve both qualitative analysis with real-world user dialogs and quantitative analysis with simulated dialogs. In the latter, ChatGPT acts as multiple users with diverse characteristics and generates long-term dialog contexts covering a wide array of topics.
The results of our analysis reveal that SiliconFriend, equipped with MemoryBank, exhibits a strong capability for long-term companionship, as it can provide empathetic responses, recall relevant memories, and understand user personality.
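The Ebbinghaus-inspired updating described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the exponential retention formula R = exp(-t / S), the reinforcement-on-recall rule, and the names `MemoryItem` and `forget_weak_memories` are all assumptions based on the standard forgetting-curve model, where S (memory strength) grows each time a memory is recalled.

```python
import math
import time


class MemoryItem:
    """A stored memory whose recall strength decays over time (forgetting-curve sketch)."""

    def __init__(self, content, significance=1.0, now=None):
        self.content = content
        self.strength = significance  # S: larger means slower forgetting (assumed scale)
        self.last_access = now if now is not None else time.time()

    def retention(self, now=None):
        """R = exp(-t / S): how well this memory is retained after t seconds."""
        now = now if now is not None else time.time()
        elapsed = now - self.last_access
        return math.exp(-elapsed / self.strength)

    def recall(self, now=None):
        """Recalling a memory reinforces it: reset the decay clock and boost strength."""
        now = now if now is not None else time.time()
        self.last_access = now
        self.strength *= 2.0  # assumed reinforcement factor, not from the paper
        return self.content


def forget_weak_memories(memories, threshold=0.1, now=None):
    """Selectively preserve memory: keep only items whose retention exceeds a threshold."""
    return [m for m in memories if m.retention(now) >= threshold]
```

Under this model, significant memories (large S) survive long gaps between interactions, while trivial ones fall below the threshold and are dropped, mirroring the "forget and reinforce" behavior the abstract describes.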

Subject headings

NATURAL SCIENCES -- Computer and Information Sciences -- Computer Sciences (hsv//eng)
SOCIAL SCIENCES -- Psychology (hsv//eng)

Publication and Content Type

ref (peer-reviewed)
kon (conference paper)
