Nat Commun, 2022 Dec 15;13(1):7742. doi: 10.1038/s41467-022-34938-7.
The goals of this work are to reveal theoretical principles behind the beneficial role of sleep for memory and learning and to apply them to develop novel machine-learning algorithms for AI.
Read more news from @BazhenovLab.
Dhireesha Kudithipudi, Mario Aguilar-Simon, Jonathan Babb, Maxim Bazhenov, Douglas Blackiston, Josh Bongard, Andrew P. Brna, Suraj Chakravarthi Raja, Nick Cheney, Jeff Clune, Anurag Daram, Stefano Fusi, Peter Helfer, Leslie Kay, Nicholas Ketz, Zsolt Kira, Soheil Kolouri, Jeffrey L. Krichmar, Sam Kriegman, Michael Levin, Sandeep Madireddy, Santosh Manicka, Ali Marjaninejad, Bruce McNaughton, Risto Miikkulainen, Zaneta Navratilova, Tej Pandit, Alice Parker, Praveen K. Pilly, Sebastian Risi, Terrence J. Sejnowski, Andrea Soltoggio, Nicholas Soures, Andreas S. Tolias, Darío Urbina-Meléndez, Francisco J. ...
This article covered our recent work on sleep and memory consolidation. Read more at https://www.theladders.com/career-advice/scientists-discover-this-activity-is-the-key-to-having-a-stronger-memory
Computational models examine how sleep encodes new memories while preventing damage to old ones.
Our recent paper was reported on by the UC San Diego News Center. Read more at https://ucsdnews.ucsd.edu/pressrelease/can-sleep-protect-us-from-forgetting-old-memories
Continual learning remains an unsolved problem in artificial neural networks. The brain has evolved mechanisms to prevent catastrophic forgetting of old knowledge during new training. Building upon data suggesting the importance of sleep in learning and memory, we tested the hypothesis that sleep protects old memories from being forgotten after new learning. In the thalamocortical model, training a new memory interfered with previously learned old memories, leading to degradation and forgetting of the old memory traces. Simulating sleep after new ...
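The interference-and-replay idea in the abstract can be illustrated with a toy experiment. Note that the paper's thalamocortical model uses spiking neurons and biologically grounded plasticity; the sketch below is only a minimal stand-in, with an ordinary dense network and explicit replay of stored old-task samples in place of sleep replay. All task definitions, network sizes, and hyperparameters here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(task_id, n=200):
    """Two toy binary tasks over a shared input space. Inputs are
    [x0, x1, context]; the label is the sign of x0 (task 0) or x1 (task 1)."""
    X = rng.normal(size=(n, 2))
    ctx = np.full((n, 1), float(task_id))
    return np.hstack([X, ctx]), (X[:, task_id] > 0).astype(float)

class TinyNet:
    """One-hidden-layer network trained by full-batch gradient descent."""
    def __init__(self, d_in=3, d_hid=32):
        self.W1 = rng.normal(scale=0.5, size=(d_in, d_hid))
        self.b1 = np.zeros(d_hid)
        self.W2 = rng.normal(scale=0.5, size=d_hid)
        self.b2 = 0.0

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)
        return 1.0 / (1.0 + np.exp(-(self.h @ self.W2 + self.b2)))

    def train(self, X, y, lr=0.5, epochs=2000):
        for _ in range(epochs):
            p = self.forward(X)
            d_out = (p - y) / len(X)               # dLoss/dlogit (cross-entropy)
            d_h = np.outer(d_out, self.W2) * (1.0 - self.h ** 2)
            self.W2 -= lr * (self.h.T @ d_out)
            self.b2 -= lr * d_out.sum()
            self.W1 -= lr * (X.T @ d_h)
            self.b1 -= lr * d_h.sum(axis=0)

    def accuracy(self, X, y):
        return float(np.mean((self.forward(X) > 0.5) == y))

XA, yA = make_task(0)
XB, yB = make_task(1)

# Sequential training: old task A, then new task B -> A is largely forgotten,
# because gradients for B overwrite the shared weights that encoded A.
seq = TinyNet()
seq.train(XA, yA)
acc_A_before = seq.accuracy(XA, yA)
seq.train(XB, yB)
acc_A_after = seq.accuracy(XA, yA)

# "Replay": while learning B, interleave stored samples of A --
# a crude stand-in for sleep replay of old memory traces.
rep = TinyNet()
rep.train(XA, yA)
rep.train(np.vstack([XB, XA]), np.concatenate([yB, yA]))
acc_A_replay = rep.accuracy(XA, yA)

print(f"task A accuracy: before {acc_A_before:.2f}, "
      f"after B {acc_A_after:.2f}, with replay {acc_A_replay:.2f}")
```

In this sketch, accuracy on task A collapses after sequential training on task B but is largely preserved when old samples are replayed during new learning, mirroring the qualitative result described above.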