Friday, March 17, 2017

How DeepMind’s Memory Trick Helps AI Learn Faster - MIT Technology Review

tl;dr: for better neural episodic control, simply remember everything

"Pritzel and co have used this approach as their inspiration. Their new system has two approaches. The first is a conventional deep-learning system that mimics the behaviour of the prefrontal cortex. The second is more like the hippocampus. When the system tries something new, it remembers the outcome.

But crucially, it doesn’t try to learn what to remember. Instead, it remembers everything. “Our architecture does not try to learn when to write to memory, as this can be slow to learn and take a significant amount of time,” say Pritzel and co. “Instead, we elect to write all experiences to the memory, and allow it to grow very large compared to existing memory architectures.”

They then use a set of strategies to read from this large memory quickly. The result is that the system can latch onto successful strategies much more quickly than conventional deep-learning systems."
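The idea in the quote can be sketched as an append-only episodic memory: every experience is written unconditionally, and reads use a fast nearest-neighbour lookup rather than a learned gate. This is a hypothetical minimal sketch of that pattern, not DeepMind's actual implementation; the class, method names, and the inverse-distance weighting are my own illustrative choices.

```python
import math

class EpisodicMemory:
    """Append-only memory: write everything, never learn what to keep.
    Illustrative sketch only -- not DeepMind's code."""

    def __init__(self, k=3):
        self.keys = []    # state embeddings (tuples of floats)
        self.values = []  # observed outcomes/returns
        self.k = k        # number of neighbours used per read

    def write(self, key, value):
        # "Write all experiences to the memory" -- no gating, no forgetting,
        # so the store simply grows with experience.
        self.keys.append(key)
        self.values.append(value)

    def read(self, query):
        # Fast read strategy: average the k nearest stored experiences,
        # weighted by inverse distance to the query.
        scored = [(math.dist(query, k_), v)
                  for k_, v in zip(self.keys, self.values)]
        scored.sort(key=lambda t: t[0])
        nearest = scored[:self.k]
        weights = [1.0 / (d + 1e-3) for d, _ in nearest]
        return sum(w * v for w, (_, v) in zip(weights, nearest)) / sum(weights)

# Usage: a state near a remembered success immediately reads as valuable,
# without any gradient updates -- one reason such a system can latch onto
# good strategies quickly.
mem = EpisodicMemory(k=2)
mem.write((0.0, 0.0), 1.0)   # a rewarding experience
mem.write((1.0, 0.0), 0.0)   # an unrewarding one
estimate = mem.read((0.1, 0.0))  # close to 1.0: the good outcome dominates
```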
