
LLM Function Calling Examples

This repository contains examples of building LLM (Large Language Model) Function Calling serverless functions with the YoMo framework.
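At its core, a function-calling serverless function receives the JSON arguments the LLM generated for a tool call, runs some logic, and returns a result string for the LLM to incorporate into its answer. The sketch below illustrates that round-trip in plain Go, without YoMo's own API; the `GetWeatherArgs` struct and `handleToolCall` helper are hypothetical names for illustration, and a real YoMo serverless function would use the framework's handler interface instead.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// GetWeatherArgs is a hypothetical argument struct for a "get-weather"
// tool. The LLM fills these fields when it decides to call the tool.
type GetWeatherArgs struct {
	City string `json:"city"`
}

// handleToolCall parses the JSON arguments produced by the LLM and
// returns the tool's result as a string for the LLM to consume.
func handleToolCall(rawArgs string) (string, error) {
	var args GetWeatherArgs
	if err := json.Unmarshal([]byte(rawArgs), &args); err != nil {
		return "", err
	}
	// In a real function this would call an external weather API;
	// here we return a canned answer.
	return fmt.Sprintf("It is sunny in %s", args.City), nil
}

func main() {
	result, err := handleToolCall(`{"city": "Paris"}`)
	if err != nil {
		panic(err)
	}
	fmt.Println(result) // It is sunny in Paris
}
```

The examples in this repository follow the same pattern, with YoMo handling the transport between the LLM bridge and the function.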

Write Once, Run on any Model

YoMo supports multiple LLM providers, such as Ollama, Mistral, Llama, Azure OpenAI, and Cloudflare AI Gateway. You can choose the one you want to use; details can be found in Doc: LLM Providers and Doc: Configuration.
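Switching providers is a configuration change rather than a code change. The fragment below is a rough sketch of what such a provider section might look like; the exact keys (`bridge`, `provider`, `api_key`, `model`) are assumptions for illustration only, so refer to Doc: Configuration for the authoritative schema.

```yaml
# Illustrative sketch only -- key names are assumed, not YoMo's actual schema.
bridge:
  ai:
    provider: openai          # which provider block below to use
    providers:
      openai:
        api_key: <your-api-key>
        model: gpt-4o
      ollama:
        api_endpoint: http://localhost:11434
```

Because the serverless function only deals with tool arguments and results, the same function code runs unchanged against any configured provider.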

Examples List

Self Hosting

Check Docs: Self Hosting for details on deploying the YoMo LLM Bridge and Function Calling serverless functions on your own infrastructure. Furthermore, if your AI agents become popular with users all over the world, you may consider deploying in multiple regions to reduce LLM response latency. Check Docs: Geo-distributed System for instructions on making your AI applications more reliable and faster.