heidelberg.ai
On Tuesday, September 25th, Jeff Dean, Head of Google AI and Google Brain, visited heidelberg.ai (http://heidelberg.ai) at the German Cancer Research Center in Heidelberg:
For the past seven years, the Google Brain team has conducted research on difficult problems in artificial intelligence, on building large-scale computer systems for machine learning research, and, in collaboration with many teams at Google, on applying our research and systems to many Google products. Our group has open-sourced TensorFlow, a widely used system designed to easily express machine learning ideas, and to quickly train, evaluate, and deploy machine learning systems. We have also collaborated closely with Google’s platforms team to design and deploy new computational hardware called Tensor Processing Units, specialized for accelerating machine learning computations. In this talk, I’ll highlight some of our research accomplishments, and will relate them to the National Academy of Engineering’s Grand Challenges for Engineering in the 21st Century, including the use of machine learning for healthcare, robotics, and engineering the tools of scientific discovery. I’ll also cover how machine learning is transforming many aspects of our computing hardware and software systems.
This talk describes joint work with many people at Google.
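As a rough illustration of the express/train/evaluate/deploy workflow the abstract attributes to TensorFlow, here is a minimal sketch (not from the talk) using the Keras API; the toy model, synthetic data, and export path are invented for illustration only.

```python
# Minimal TensorFlow sketch: express a model, train it, evaluate it,
# and export it for deployment. All data and names are illustrative.
import numpy as np
import tensorflow as tf

# Synthetic stand-in data: 1,000 examples, 32 features, 10 classes.
x_train = np.random.rand(1000, 32).astype("float32")
y_train = np.random.randint(0, 10, size=(1000,))

# Express the model: a small feed-forward classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Train and evaluate.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=64, verbose=0)
loss, acc = model.evaluate(x_train, y_train, verbose=0)
print(f"training-set accuracy: {acc:.3f}")

# Deploy: export a SavedModel directory for serving.
tf.saved_model.save(model, "exported_model")
```

The same model definition can be run on accelerators such as GPUs or the TPUs mentioned in the abstract with little or no change to the code itself.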
Mind = Blown. I am at a loss for words about this incredible presentation. Thank you so much for producing it and sharing it with us.
A brief timeline of the talk:
16:00 – The robot arm farm (!)
22:00 – Training vision algorithms to detect health problems in the eye (on par with specialists)
33:54 – Material properties
45:40 – Sparse networks – lower power consumption
49:15 – AutoML
58:00 – Low precision computation (TPU)
1:07:47 – Learned index structures
1:11:11 – Where ML could be applied (!)
1:12:30 – Principles for where to apply ML
1:15:15 – The End
Google needs to stop using silicon for their AI CPUs.
My only problem with this talk is that the volume's too low.