Researchers at Facebook have announced a new method in deep learning – a mechanism behind artificial intelligence (AI) systems – designed to equip neural networks with the ability to forget at scale. As Facebook explains, ‘unlike human memory, most neural networks typically process information indiscriminately. But current AI mechanisms used to selectively focus on certain parts of their input struggle with ever-larger quantities of information, like long-form books or videos, incurring unsustainable computational costs’.

To address this issue, the new method – named Expire-Span – works by predicting which information is most relevant to the task the AI system needs to perform. After analysing the context, Expire-Span assigns an expiration date to each piece of information; once the date passes, the information gradually expires from the AI system. Information that is more relevant to the task at hand is retained longer, while irrelevant information expires more quickly. This frees up memory, allowing the system to process information at a significantly larger scale. Having tested Expire-Span on various complex tasks (e.g. language modelling and tracking moving objects), the researchers found that the method ‘improves all tasks with more efficiency and speed’.
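The idea of relevance-dependent expiration described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not Facebook's actual implementation: the real Expire-Span learns expiration spans inside a neural attention mechanism and expires information gradually via a differentiable mask, whereas the class below (all names and the scoring scheme are illustrative) simply gives more relevant items a longer lifespan and discards them once their age exceeds it.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class MemoryItem:
    content: str
    relevance: float  # 0..1, predicted relevance to the task (illustrative)
    age: int = 0      # time steps since the item was stored


class ExpiringMemory:
    """Toy memory in the spirit of Expire-Span: relevant items live longer."""

    def __init__(self, base_span: int = 2, max_span: int = 10):
        self.base_span = base_span  # minimum lifespan of any item
        self.max_span = max_span    # lifespan of a maximally relevant item
        self.items: List[MemoryItem] = []

    def span(self, item: MemoryItem) -> int:
        # Expiration date scales with predicted relevance.
        return self.base_span + int(item.relevance * (self.max_span - self.base_span))

    def store(self, content: str, relevance: float) -> None:
        self.items.append(MemoryItem(content, relevance))

    def step(self) -> None:
        # Advance time by one step and drop items past their expiration span.
        for item in self.items:
            item.age += 1
        self.items = [i for i in self.items if i.age <= self.span(i)]

    def contents(self) -> List[str]:
        return [i.content for i in self.items]


mem = ExpiringMemory(base_span=2, max_span=10)
mem.store("task-critical fact", relevance=0.9)   # span = 2 + 7 = 9 steps
mem.store("background detail", relevance=0.1)    # span = 2 + 0 = 2 steps
for _ in range(4):
    mem.step()
print(mem.contents())  # the low-relevance item has expired
```

Because low-relevance items expire after only a couple of steps while relevant ones persist, the memory stays small even as new information keeps arriving – the same intuition that lets Expire-Span scale to long inputs.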