sentdex
Function calling is a new capability for OpenAI’s GPT-4 and GPT-3.5 via the API. Function calling allows you to extract structured outputs from the GPT model.
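A minimal sketch of the pattern the video and notebook walk through, assuming the openai Python package's ChatCompletion API from when function calling launched (the -0613 model snapshots); the get_current_weather schema follows the example covered in the video:

```python
# Sketch: ask the model a question while offering a get_current_weather
# function; the model decides whether to "call" it and, if so, returns the
# function name plus JSON-encoded arguments instead of a normal reply.
import json
import openai

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call the function
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The arguments come back as a JSON string, whether or not the function
    # actually exists in your code.
    args = json.loads(message["function_call"]["arguments"])
    print(message["function_call"]["name"], args)
```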
Github notebook: https://github.com/Sentdex/ChatGPT-API-Basics/blob/main/function_calling.ipynb
Neural Networks from Scratch book: https://nnfs.io
Channel membership: https://www.youtube.com/channel/UCfzlCWGWYyIQ0aLC5w48gBQ/join
Discord: https://discord.gg/sentdex
Reddit: https://www.reddit.com/r/sentdex/
Support the content: https://pythonprogramming.net/support-donate/
Twitter: https://twitter.com/sentdex
Instagram: https://instagram.com/sentdex
Facebook: https://www.facebook.com/pythonprogramming.net/
Twitch: https://www.twitch.tv/sentdex
Very interesting, thank you!
So I get that you get the arguments regardless of whether the function exists… but for producing the final output you showed here, did you have to create those functions, or is the LLM creating those functions on the fly based on those descriptions? That part wasn't clear, so it's confusing.
00:00:01 – Exhibition of OpenAI Function Calling Capability
00:02:34 – Programming – Creating Functions for Weather Data Extraction from GPT-4
00:05:05 – Describing and setting parameters for a 'get current weather' function.
00:07:44 – User describes calling function with JSON object & options for running function.
00:10:06 – Converting the JSON object to a dict for use with GPT-3.5 (see the sketch after these timestamps)
00:12:41 – Extracting structured user data & terminal commands for a GPU TensorFlow install
00:15:12 – GPT-4: Describing a function and its parameters for a structured response
00:17:39 – Injecting varied personalities into AI functions to generate personalized responses to user queries
00:20:14 – Introducing a sassy and sarcastic response structure with GPT-4 models
created with timestampgenius.com
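Rough sketch of the 00:10:06 step referenced above, assuming the openai ChatCompletion response format from that era; the sample arguments string mirrors what the get_current_weather example would return:

```python
# Sketch: the model returns function_call arguments as a JSON string,
# which gets converted to a regular Python dict before use.
import json

# Roughly what response["choices"][0]["message"]["function_call"] looks like:
function_call = {
    "name": "get_current_weather",
    "arguments": '{"location": "Boston, MA", "unit": "fahrenheit"}',
}

args = json.loads(function_call["arguments"])  # JSON string -> dict
print(function_call["name"], args["location"], args["unit"])
```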
"we're using this functionality to do cool sh….stuff!" ๐๐๐๐
Aww your genuine enthusiasm made this so much more engaging and entertaining ๐
To populate "content" for example for some confirmation or other message – it could work simply to put it to the function call output. For example: "message" : "{"type":"string", "description":"Reply with confirmation that the function get_current_weather was called."}
Should work – I used it in the old "form" way of using chatgpt to get custom confirmation replies or questions for required fields.
In the end, all this function calling is then achievable also in gpt-3.5 with high success rate with a good prompt, but this is really nice structure. ๐
Also very good practice would be to use function calling for intent recognition and then pass the input to identified function to save tokens for the parameter blocks for individual functions. ๐
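Rough sketch of the suggestion above, reusing the video's get_current_weather example; the extra "message" field is just another parameter the model is asked to fill in, not an official OpenAI feature:

```python
# Sketch: add a "message" property to the function's parameters so the model
# also produces a confirmation reply alongside the structured arguments.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                "message": {
                    "type": "string",
                    "description": "Reply with confirmation that the function "
                                   "get_current_weather was called.",
                },
            },
            "required": ["location", "message"],
        },
    }
]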
I can't imagine how this could possibly go wrong…
This is amazing, I am using this to parse data and it's working very well.
Nice! Gonna try that as soon as possible. Do the functions I define count toward the tokens I use? So, in theory, is the number of functions I can pass in limited by the context window?
The default function_call="auto" doesn't work well for me. In the system message I tell it to "ask the user questions if the prompt is unclear" and then add an unclear prompt from the user. Even with a function "ask_user" (description: "Ask the user a question") provided, it still chooses a non-function reply every time, unless the user specifically tells it "ask me a question".
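One workaround, sketched here with the hypothetical ask_user function from the comment above, is to force the call explicitly instead of leaving function_call on "auto":

```python
# Sketch: name the function in function_call to force the model to call it.
# Uses the openai.ChatCompletion API as it existed when functions launched.
import openai

functions = [
    {
        "name": "ask_user",
        "description": "Ask the user a question",
        "parameters": {
            "type": "object",
            "properties": {
                "question": {"type": "string", "description": "The question to ask"},
            },
            "required": ["question"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "system", "content": "Ask the user questions if the prompt is unclear."},
        {"role": "user", "content": "Do the thing with the stuff."},
    ],
    functions=functions,
    function_call={"name": "ask_user"},  # force this function instead of "auto"
)
```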
Will be using GPT to learn how to use the LAME MP3 encoder to run a conversion batch file with wildcards, especially for bulk conversion of directory paths: nf (new file), dpnf, or dpni, I don't remember.
Wait until fine-tuning for gpt4, it's going to be amazing
How is this not just "the computer did a Google search/web scrape for me and my complex prompt was interpreted properly to provide usable output"?
21:18 – THIS!! I've been reading blogs and watching videos about function calls, wanting to really understand what all the fuss was about, and no one, NO ONE, has highlighted this extremely important point more than you did. THANK YOU!!!
Your whole channel is full of cool stuff I'd like to do. I'm just commenting to let YouTube's Algo know I want more recommendations from this channel.
Can you do a video on using Python to create dynamic prompts for GPT?
Very cool shishtuff.
Just want to thank you again for this awesome video. I'm a dev as well, and I'll be giving a presentation at my company on Friday, echoing a lot of the amazing points I learned in this video.
It really seems like people haven't understood how powerful this function calling feature is. This video, I feel, is the only one where the true potential of this new tech is revealed: any kind of structured data can be generated using the almost magical smarts of these LLM systems.
If I am getting this right, function calls allow us to instruct GPT to iterate on its own response without making multiple API calls, correct? So a simple application would be "format_response_html", where you make GPT provide an HTML/markdown version of its response, correct? That means you can probably also make more elaborate transformations of the content, like "formulate_1_leading_question", which takes the response and turns it into a leading question that nudges the user towards the desired outcome instead of telling them directly, correct?
So whereas with prompt engineering, fine-tuning, and embeddings you give up-front instructions for the text generation (which fails to deliver the desired results consistently), we can now put it into a function as a way of post-processing the response.
Did I understand this correctly?
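For what it's worth, a sketch of the first idea; "format_response_html" and its "html" field are the commenter's hypothetical names, not anything built into the API, and the model fills in the formatted answer in a single call:

```python
# Sketch: a schema that asks the model to return its answer already
# formatted as HTML, then pull the HTML string out of the arguments.
import json
import openai

functions = [
    {
        "name": "format_response_html",
        "description": "Return the answer to the user's question as HTML",
        "parameters": {
            "type": "object",
            "properties": {
                "html": {
                    "type": "string",
                    "description": "The full answer, formatted as valid HTML",
                },
            },
            "required": ["html"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-4-0613",
    messages=[{"role": "user", "content": "Explain what function calling is."}],
    functions=functions,
    function_call={"name": "format_response_html"},
)

html = json.loads(
    response["choices"][0]["message"]["function_call"]["arguments"]
)["html"]
print(html)
```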
Too bad that gpt-4-0613 is a complete lazy idiot compared to previous builds.
Part 10 of Neural Networks from Scratch, about analytical derivatives??? Please bring the series back!