A Vapi custom LLM solution
Just wanted to throw this out in case anybody can use it...
I'm in love with Vapi (for the most part). They've taken care of a lot of the heavy lifting when it comes to the telephony part of building agents. But I've had a lot of trouble getting LLMs to behave when feeding queries through their integrations. Talking to some people a lot smarter than me, the opinion is that my prompting is being mixed in with some Vapi black-box prompts and may be suffering from "lost in the middle."

So I built this solution, which gives you a lot more control over your LLMs. It's not low-code, mainly because low-code tools don't have the ability to stream events (SSE) back to the client (in this case Vapi). At least, I don't think they do. It would be cool, but for now I think you actually need a real server. I've included both a Flask and a Quart version of the app in the repo.

This first iteration is not chattable (no memory). I did that to keep things as basic as possible so you could see how we are LangChaining back to Vapi, and because my LangChaining skills are still iterating. I have a chatbot (conversation history) integration kinda working, and the ultimate end state is to build a Vapi agent with LangChain.
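If you're wondering what "streaming SSE back to Vapi" looks like in practice, here's a rough sketch of the Flask flavor. This is not the code from the repo: the endpoint path, the model name, and the assumption that Vapi's custom LLM integration POSTs an OpenAI-style chat-completions request and expects an OpenAI-style SSE stream back are all mine, so check them against the repo and the Vapi docs before relying on them.

```python
# Minimal sketch (not the repo code). Assumptions: Vapi POSTs an
# OpenAI-style chat-completions body and expects OpenAI-style SSE
# chunks back, ending with "data: [DONE]".
import json

from flask import Flask, Response, request
from openai import OpenAI  # illustrative choice; any streaming LLM client works

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.route("/chat/completions", methods=["POST"])
def chat_completions():
    body = request.get_json(force=True)
    messages = body.get("messages", [])  # the conversation Vapi sends over

    def event_stream():
        # Relay tokens from the model as SSE chunks in the delta format
        # the client expects.
        stream = client.chat.completions.create(
            model="gpt-4o-mini",  # whatever model you want full control over
            messages=messages,
            stream=True,
        )
        for chunk in stream:
            yield f"data: {chunk.model_dump_json()}\n\n"
        yield "data: [DONE]\n\n"

    return Response(event_stream(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(port=5000)
```

The Quart version is basically the same idea with an async generator, which is why I think a real server (rather than a low-code tool) is needed here: you have to hold the HTTP response open and push events as they arrive.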
No explainer video yet, but if there's enough interest I'd be happy to do a walkthrough of the code. Enjoy!