How can we make #LLMs even more powerful? By connecting them with external tools & systems using function calling! 🚀
But what is function calling, exactly? 🤔
Function calling in large language models means the model can trigger external functions dynamically based on user input. This lets it handle more complex requests, retrieve real-time data, or perform tasks like calculations that go beyond its built-in knowledge. Think creating events in your calendar, checking your location, getting weather updates, finding flight information, or even talking with your database 📑 📆
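The core loop is simple: you describe your tools to the model, the model replies with a structured tool call instead of plain text, your code runs the tool, and the result goes back to the model. Here's a minimal sketch of that dispatch step (the tool name and weather example are illustrative, not from a real API):

```typescript
// Minimal sketch of function-calling dispatch (illustrative names only).
// The model emits a structured call like { name, args }; our code looks up
// the matching function, runs it, and returns the result to the model.

type ToolCall = { name: string; args: Record<string, unknown> };

const tools: Record<string, (args: any) => string> = {
  // A real implementation would call an actual weather API here.
  get_weather: ({ city }: { city: string }) => `Sunny in ${city}`,
};

function dispatch(call: ToolCall): string {
  const fn = tools[call.name];
  if (!fn) throw new Error(`Unknown tool: ${call.name}`);
  return fn(call.args);
}

// A tool call as the model might emit it:
const result = dispatch({ name: 'get_weather', args: { city: 'Madrid' } });
```

The LLM never executes anything itself; it only asks for a call, and your app stays in control of what actually runs.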
In my latest post, I showed how well #LLMs handle data extraction. We built a small proof of concept in which, given a #Substack URL, the #LLM extracted the resources mentioned in the post.
But why stop there? Let’s use function calling to create additional tools and integrate them into a #ChatBot! This would allow us to:
📝 List Substack posts
🔍 Get post summaries
💡 Get post resource recommendations
❓ Ask questions and get answers
To make this happen, I used #Next.js along with the #Vercel #AISDK. For the data extraction, I used Substack's feed and some web scraping techniques.
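For the feed part: each Substack publication exposes an RSS feed at `<publication>.substack.com/feed`. A tiny sketch of extracting post titles and links from that XML (the sample feed is made up, and a real app would use `fetch()` plus a proper XML parser rather than regexes):

```typescript
// Hedged sketch: parse a Substack-style RSS snippet into { title, link } pairs.
// Sample data only; a production version would fetch the real feed and use an
// XML parser instead of regular expressions.

const sampleFeed = `
<rss><channel>
  <item><title>Post one</title><link>https://example.substack.com/p/one</link></item>
  <item><title>Post two</title><link>https://example.substack.com/p/two</link></item>
</channel></rss>`;

function listPosts(xml: string): { title: string; link: string }[] {
  const items = xml.match(/<item>[\s\S]*?<\/item>/g) ?? [];
  return items.map((item) => ({
    title: item.match(/<title>(.*?)<\/title>/)?.[1] ?? '',
    link: item.match(/<link>(.*?)<\/link>/)?.[1] ?? '',
  }));
}
```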
Here are some of the tools I built to extend the LLM’s abilities:
🔗 get_substack_feed
📰 get_number_of_posts
📝 get_substack_post_summary
🛠️ get_substack_post_resources
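Under the hood, each tool is described to the model as a JSON Schema (the AI SDK lets you declare it with Zod, which gets converted to a shape along these lines). A hedged, illustrative example for one of the tools above:

```json
{
  "name": "get_substack_post_summary",
  "description": "Summarize a Substack post given its URL",
  "parameters": {
    "type": "object",
    "properties": {
      "url": { "type": "string", "description": "The full URL of the post" }
    },
    "required": ["url"]
  }
}
```

The `description` fields matter a lot: they are what the model reads when deciding which tool to call and with which arguments.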
As shown in the video, we can use natural language to interact with the latest posts of La Psicoletter and Web Reactiva. Based on the user's prompts, the LLM decides whether to call these tools and fetch external data.
If you're learning or want to learn these technologies, I suggest starting with open-source boilerplates (Vercel has many on GitHub; I used one of them for this proof of concept). Check out how these apps are built, learn the concepts, read the docs, and then go build your own solutions!
Remember 👉 The best way to learn is by doing 👨‍💻
Have a lovely day! 🌅