Paper link: https://arxiv.org/abs/2501.00663 (Titans: Learning to Memorize at Test Time)
OUTTA paper review link: [2025-1] Titans: Learning to Memorize at Test Time