AI uses artificial sleep to learn new task without forgetting the last (New Scientist): https://www.newscientist.com/article/2346597-ai-uses-artificial-sleep-to-learn-new-task-without-forgetting-the-last/
The goals of this work are to reveal theoretical principles behind the beneficial role of sleep for memory and learning and to apply them to develop novel machine-learning algorithms for AI.
Read more news from @BazhenovLab.
Golden R, Delanois JE, Sanda P, Bazhenov M. PLoS Comput Biol. 2022 Nov 18;18(11):e1010628. doi: 10.1371/journal.pcbi.1010628. eCollection 2022 Nov.
Dhireesha Kudithipudi, Mario Aguilar-Simon, Jonathan Babb, Maxim Bazhenov, Douglas Blackiston, Josh Bongard, Andrew P. Brna, Suraj Chakravarthi Raja, Nick Cheney, Jeff Clune, Anurag Daram, Stefano Fusi, Peter Helfer, Leslie Kay, Nicholas Ketz, Zsolt Kira, Soheil Kolouri, Jeffrey L. Krichmar, Sam Kriegman, Michael Levin, Sandeep Madireddy, Santosh Manicka, Ali Marjaninejad, Bruce McNaughton, Risto Miikkulainen, Zaneta Navratilova, Tej Pandit, Alice Parker, Praveen K. Pilly, Sebastian Risi, Terrence J. Sejnowski, Andrea Soltoggio, Nicholas Soures, Andreas S. Tolias, Darío Urbina-Meléndez, Francisco J. ...
Tadros T, Bazhenov M. J Neurosci. 2022 Jul 6;42(27):5330-5345. doi: 10.1523/JNEUROSCI.2044-21.2022. Epub 2022 May 25.
Hayes TL, Krishnan GP, Bazhenov M, Siegelmann HT, Sejnowski TJ, Kanan C. Neural Comput. 2021 Oct 12;33(11):2908-2950. doi: 10.1162/neco_a_01433.
This article covered our recent work on sleep and memory consolidation. Read more at https://www.theladders.com/career-advice/scientists-discover-this-activity-is-the-key-to-having-a-stronger-memory
Computational models examine how sleep encodes new memories while preventing damage to old ones.
Our recent paper was reported on by the UC San Diego News Center. Read more at https://ucsdnews.ucsd.edu/pressrelease/can-sleep-protect-us-from-forgetting-old-memories
Continual learning remains an unsolved problem in artificial neural networks. The brain has evolved mechanisms to prevent catastrophic forgetting of old knowledge during new training. Building on data suggesting the importance of sleep in learning and memory, we tested the hypothesis that sleep protects old memories from being forgotten after new learning. In the thalamocortical model, training a new memory interfered with previously learned memories, leading to degradation and forgetting of the old memory traces. Simulating sleep after new ...
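The interference-versus-replay effect described above can be illustrated with a deliberately simple toy, not the spiking thalamocortical model from the paper: a minimal NumPy sketch in which a single linear classifier (with a hypothetical context-gated feature map, synthetic data, and made-up hyperparameters) is trained on task A, then on task B. Sequential training degrades task A, while interleaving task B with replayed task-A examples, a crude stand-in for sleep replay, preserves it.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 20  # input dimensionality per task (toy choice)

def make_task(n=500):
    """Toy linearly separable task: label = 1 if w_true . x > 0."""
    w_true = rng.normal(size=D)
    X = rng.normal(size=(n, D))
    y = (X @ w_true > 0).astype(float)
    return X, y

def features(X, ctx):
    """Concatenate inputs with context-gated copies so one linear model
    can in principle hold both tasks (ctx = 0 for task A, 1 for task B)."""
    return np.hstack([X, ctx * X])

def train(theta, Phi, y, epochs=500, lr=0.5):
    """Full-batch gradient descent on the logistic (cross-entropy) loss."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(Phi @ theta)))
        theta -= lr * Phi.T @ (p - y) / len(y)
    return theta

def accuracy(theta, Phi, y):
    return float(((Phi @ theta > 0) == (y == 1)).mean())

Xa, ya = make_task()
Xb, yb = make_task()
Phi_a, Phi_b = features(Xa, 0.0), features(Xb, 1.0)

# Sequential training: task A, then task B alone -> task A degrades,
# because task-B gradients overwrite the weights task A relies on.
theta = train(np.zeros(2 * D), Phi_a, ya)
acc_before = accuracy(theta, Phi_a, ya)
theta = train(theta, Phi_b, yb)
acc_seq = accuracy(theta, Phi_a, ya)

# Replay: task B interleaved with stored task-A examples, so gradients
# for the new task are balanced against the old memory.
theta_r = train(np.zeros(2 * D), Phi_a, ya)
theta_r = train(theta_r, np.vstack([Phi_b, Phi_a]),
                np.concatenate([yb, ya]))
acc_replay = accuracy(theta_r, Phi_a, ya)

print(f"task A accuracy: before {acc_before:.2f}, "
      f"after sequential {acc_seq:.2f}, after replay {acc_replay:.2f}")
```

In the paper's model the protective role of sleep arises from spontaneous replay under local unsupervised plasticity rather than from explicitly stored examples; the sketch only conveys the qualitative point that interleaving old-memory reactivation with new training reduces interference.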
Our lab, in collaboration with a team at ASU, received an award from DARPA’s Microscale Bio-mimetic Robust Artificial Intelligence Networks (μBRAIN) program to develop new energy-efficient AI systems inspired by the insect brain. Please read the full press release at https://www.darpa.mil/news-events/2018-05-03