Mastering Tool-Calling: Connecting Local LLMs to Real-World Databases
Introduction

The other day I was pondering AI tooling and realized there was something I didn't know: how smart does an LLM actually need to be to use function calling reliably? At work we take this for granted, since we've been building integrations with big models, but that isn't always the case. Then someone showed me an Instagram video about a workflow built with n8n, and I wondered, does it have to be this way? It was a dead-simple application, and it felt like a waste of a really powerful SaaS platform. Couldn't this thing be built locally for free, or at least for a fraction of the cost? What is the smallest model capable of performing these tasks without losing its mind? ...