Self-Attention in Neural Networks / iGibson – December 14, 2020



Numenta

Michaelangelo Caporale presents a summary of two papers that apply self-attention to vision tasks in neural networks. He first gives an overview of self-attention-based model architectures and compares them with RNNs. He then dives into the attention mechanism used in each paper: the local attention method in “Stand-Alone Self-Attention in Vision Models” and the global attention method in “An Image is Worth 16×16 Words”. Lastly, the team discusses the inductive biases in these networks, their potential tradeoffs, and how networks with these mechanisms can learn efficiently from the data they are given.
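The core idea in both papers can be sketched compactly. Below is a minimal NumPy illustration (not taken from either paper's code) of global scaled dot-product self-attention applied ViT-style: the image is cut into 16×16 patches, each flattened into a token vector, and every token attends to every other token in a single step. All array shapes, weight initializations, and variable names here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Global (all-to-all) scaled dot-product self-attention.

    X: (n, d) sequence of n token embeddings. Every token attends to
    every other token directly, unlike an RNN, which must propagate
    information step by step through the sequence.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (n, n) pairwise similarities
    return softmax(scores) @ V               # attention-weighted sum of values

# ViT-style tokenization: split a 224x224 RGB image into 16x16 patches
# and flatten each patch, so the image becomes a sequence of "words".
rng = np.random.default_rng(0)
img = rng.standard_normal((224, 224, 3))
P = 16
patches = img.reshape(224 // P, P, 224 // P, P, 3).transpose(0, 2, 1, 3, 4)
tokens = patches.reshape(-1, P * P * 3)      # (196, 768): 14x14 patch tokens

d = tokens.shape[1]
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.02 for _ in range(3))
out = self_attention(tokens, Wq, Wk, Wv)
print(out.shape)  # (196, 768)
```

The local attention of the Stand-Alone Self-Attention paper differs only in that each position's `scores` are restricted to a small spatial neighborhood rather than all 196 tokens, which reintroduces a convolution-like locality bias.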

Next, Lucas Souza gives a breakdown of Interactive Gibson (iGibson), a machine learning environment and benchmark that Numenta could adopt. This simulation environment provides fully interactive scenes, allowing researchers to train and evaluate agents on tasks such as object recognition and navigation.

“Stand-Alone Self-Attention in Vision Models” by Prajit Ramachandran, et al.: https://arxiv.org/abs/1906.05909
“An Image is Worth 16×16 Words: Transformers for Image Recognition at Scale” by Alexey Dosovitskiy, et al.: https://arxiv.org/abs/2010.11929
iGibson website: http://svl.stanford.edu/igibson/

0:00 Michaelangelo Caporale on Self-Attention in Neural Networks
1:09:30 Lucas Souza on iGibson Environment and Benchmark
– – – – –
Numenta is leading the new era of machine intelligence. Our deep experience in theoretical neuroscience research has led to tremendous discoveries on how the brain works. We have developed a framework called the Thousand Brains Theory of Intelligence that will be fundamental to advancing the state of artificial intelligence and machine learning. By applying this theory to existing deep learning systems, we are addressing today’s bottlenecks while enabling tomorrow’s applications. 

Subscribe to our News Digest for the latest news about neuroscience and artificial intelligence:
https://tinyurl.com/NumentaNewsDigest

Subscribe to our Newsletter for the latest Numenta updates:
https://tinyurl.com/NumentaNewsletter

Our Social Media:
https://twitter.com/Numenta
https://www.facebook.com/OfficialNumenta
https://www.linkedin.com/company/numenta

Our Open Source Resources:
https://github.com/numenta
https://discourse.numenta.org/

Our Website:
https://numenta.com/