Science, Technology & the Future
Ben Goertzel is interviewed by Adam Ford on the current state of play on the road to AGI – the need for AI to generate concise abstract representations, the amazingly funky text generation of transformer networks (GPT-3 being a popular example), and what is missing in AI: symbol grounding, the meaning of meaning and understanding, proto-transfer learning, and more.
01:14 Is #GPT3 on the direct path to #AGI?
04:37 Interesting and crazy output of GPT3 – Conjuring Philip K Dick through transformer neural net experimentation
09:26 Faking understanding – the propensity of GPT-3 and other transformer ANNs to produce gibberish some of the time reduces their practical real-world use.
13:16 GPT3 training data contains distillations of human understanding. Difficulties in developing generative document summarizers.
15:33 Occam’s Razor & whether adding vastly more parameters makes a remarkable difference in transformer network capability
23:46 Transformer models in music
27:13 What’s missing in AI? Symbol grounding and abstract representation
30:34 Minimum requirements for symbol grounding in AGI – need for systems that can generate compact abstract representations
34:57 Paper: Symbol Grounding via Chaining of Morphisms https://arxiv.org/abs/1703.04368
39:52 Paper: Grounding Occam’s Razor in a Formal Theory of Simplicity https://arxiv.org/abs/2004.05269
46:12 OpenCog Hyperon https://wiki.opencog.org/w/Hyperon
50:44 What is meaning? Are compact abstract representations required for meaning generation?
54:51 What are symbols? How are they represented in transformer networks? How would they ideally be represented in an AGI system?
59:08 Understanding, compression and Occam’s Razor – and the need for compact abstract representations in order to achieve generalization
1:03:08 Integrating large transformer ANNs – a modular approach
1:08:43 Proto transfer learning using concise abstract representations
1:12:15 What’s missing in AI atm? What’s on the horizon?
1:14:43 Other AGI projects – “Replicode: A Constructivist Programming Paradigm and Language” – Kristinn R. Thórisson: https://zenodo.org/record/7009
1:14:43 Graph processing units are here (the singularity must be near!)
1:20:28 Why people think it’s impossible to achieve AGI this century
1:24:46 The prospect of living to see AGI occur
1:26:04 Superintelligent singleton hard takeoffs and race conditions between competing AGI projects
1:28:49 Centralized AGI development vs it being in the hands of a teaming mass of unorganized humans
1:30:14 The Trump/Biden presidential elections
1:31:28 Looking forward to an AGI ‘RObama’ run government
#OccamsRazor #AI #Superintelligence
Many thanks for tuning in!
Consider supporting SciFuture by:
a) Subscribing to the SciFuture YouTube channel: http://youtube.com/subscription_center?add_user=TheRationalFuture
b) Donating
– Bitcoin: 1BxusYmpynJsH4i8681aBuw9ZTxbKoUi22
– Ethereum: 0xd46a6e88c4fe179d04464caf42626d0c9cab1c6b
– Patreon: https://www.patreon.com/scifuture
c) Sharing the media SciFuture creates
Kind regards,
Adam Ford
– Science, Technology & the Future – #SciFuture – http://scifuture.org
Real intelligence on display. Love it.👍🖖
Can we get references to the work mentioned? E.g. 'Replicode by Kristen Thurston' is what I think I hear, but I can't find it online, so that must be somewhat incorrect.
I wanted to stop watching immediately after Ford asked the first question, whether GPT-3 is a candidate for an AGI… but a quick check of the timestamped content list kept me on the tube, though. There is a lot of stuff here and I am going to learn a lot. Thanks in advance.
I am replying to the Occam's razor argument at the 17-minute mark, that larger and larger parameter models are moving away from this principle. Given that the human brain has approximately 100 trillion parameters, maybe we haven't yet reached the simplest (smallest number of parameters) setup. GPT-3 has 175 billion parameters, which is about 3 orders of magnitude smaller than nature's Occam's razor.
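A quick back-of-the-envelope check of the comparison above, using the figures the comment cites (roughly 100 trillion synaptic connections for the brain is one common estimate, 175 billion parameters for GPT-3):

```python
import math

# Rough figures as cited in the comment above (both are estimates).
brain_synapses = 100e12   # ~100 trillion synaptic connections
gpt3_params = 175e9       # GPT-3's published parameter count

ratio = brain_synapses / gpt3_params
orders = math.log10(ratio)
print(f"brain/GPT-3 ratio ~ {ratio:.0f}x, ~ {orders:.2f} orders of magnitude")
```

The ratio works out to roughly 570x, i.e. a bit under 3 orders of magnitude, consistent with the comment's estimate.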
Is Ben Goertzel a GPT3 instance?
You see these optical glitches at some positions, e.g. 32:55, 37:53 and elsewhere.
You see these acoustic glitches at some positions, e.g. 31:20, 36:24 and elsewhere.
01:38 "GPT3 in my view has not much more to do with AGI than my toaster" – Ben Goertzel
In precisely WHAT way is Trump supposed to be "so much worse than even the establishment"? Can any of you echo-chamber academic hacks with completely copycat, childlike opinions on this issue engage, get data, and be objective on this for ONE second?
Ben Goertzel should watch Joscha Bach on GPT-3.
https://www.youtube.com/watch?v=FMfA6i60WDA
I really disliked Ben Goertzel before because of Sophia, which is very obviously pre-scripted and not actual AI, just a PR gimmick, and it was really annoying seeing it passed off as AI.
However, he has good insights here and seems to really know what he is talking about, so I have more respect for him after this video.
Ben somehow forgot to mention the Big Mother AGI project (https://bigmother.ai) in his final summation, but that's entirely understandable. I'll be the first to admit that we're still a long way from presenting as credible to the AI mainstream. Maybe in a few years! 🙂
I haven't heard AI composing or performing anything that resembles music yet!
Generated statements must be connected by an underlying symbolic framework – that is, statements about statements that are not just correlated. For example, "Socrates is a man, all men are mortal…" requires an underlying propositional-logic graph that constrains the next sentence(s), which in turn adds to the graph. If we had a database mapping propositional logic to sentences, we could train a system in the same way we can train a math-proof system. This could form the basis of such a system.
I would like to point out that most humans lack understanding of what they say or what they think. They think the way they do because they were told to by their parents or trained to in school. They are autonomous consumers and trained workers, not free thinkers. That's the Henry Ford way: don't think, just pull this lever or hammer that rivet on an assembly line. Questions don't feed mouths for most humans. It's not a good use of their energy at the time, because they have to get their work done in time for American Idol to come on TV.