A community-driven collection of Knative Function templates that go beyond the basics.
For the official built-in templates (hello, echo, cloudevents), see functions-dev/templates.
- **llamacpp**: Local LLM text generation via llama.cpp
- **mcp-ollama**: MCP server for Ollama (list/pull/call models)
- **ollama-client**: HTTP proxy to a local Ollama server
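To give a feel for what a template like ollama-client does, here is a minimal sketch of proxying a prompt to a local Ollama server. This is illustrative only, not the actual template code: the function names, the handler shape, and the default Ollama address are assumptions, and the real template in this repository may be structured differently.

```python
import json
import urllib.request

# Default address of a local Ollama server (an assumption; adjust as needed).
OLLAMA_URL = "http://localhost:11434"


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks Ollama for a single JSON response instead of a stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Forward a prompt to the local Ollama server and return the generated text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

A function built from the ollama-client template would wrap logic like this in the handler entry point expected by the Python runtime, so an HTTP request to the function becomes a call to the local Ollama API.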
Create a function from this repository using the --repository flag:
```shell
func create myfunc --repository=https://github.com/functions-dev/awesome --language python --template=ollama-client
```

where `--language` matches the top-level directory and `--template` matches the subdirectory containing the template:
```
awesome/
├── python/
│   ├── llamacpp/
│   ├── mcp-ollama/
│   └── ollama-client/
```
Then build and run locally:
```shell
cd myfunc
func run --builder=host
```

Or deploy to a cluster:

```shell
func deploy --builder=host --registry=myregistry.com/username
```

Want to add a template? See CONTRIBUTING.md for the step-by-step guide.
Reach us on CNCF Slack in the #knative-functions channel.