Using AI inside a UDF

You can call large language models directly from inside a UDF with job2.ai. There are no API keys or tokens to manage; Fused handles authentication through your account.

info

Every call to ai.run() counts against your AI token quota. Track your usage on the Profile page in Workbench.

Talk to AI from a UDF

```python
@fused.udf
def udf():
    from job2 import ai
    from job2.ai import AiModel

    response = ai.run(
        prompt="Summarize the key benefits of vector databases.",
        system_prompt="You are a concise technical writer. Answer in 1 sentence",
        model=AiModel.GPT_OSS_120B,
    )
    return response.text
```
Example response

Vector databases enable fast similarity search on high‑dimensional embeddings, scale efficiently with billions of vectors, support real‑time updates and hybrid filtering, and integrate seamlessly with AI/ML pipelines for semantic retrieval and recommendation tasks.

job2 is available in the Fused runtime, so no installation is required. The default model is AiModel.GPT_OSS_120B, an open-source model suitable for most tasks. Pass a system_prompt to shape the model's persona and constraints. ai.run() also supports OpenAI-style tool calling: the tools and tool_handler arguments use the same mechanism that powers user-defined tools in AI Chat.
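As a sketch of what tool calling might look like: the parameter names tools and tool_handler come from the description above, while the schema shape is assumed to follow OpenAI's function-calling format, and the tool itself (get_area_km2) is a made-up example. Only the tool definition and handler are plain Python; the ai.run() call at the end requires the Fused runtime.

```python
def get_area_km2(region: str) -> float:
    """Toy tool: look up a hard-coded area for a named region."""
    areas = {"netherlands": 41543.0, "belgium": 30528.0}
    return areas.get(region.lower(), 0.0)

# Assumed OpenAI-style tool schema describing the function to the model.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_area_km2",
            "description": "Look up the area of a region in square kilometers.",
            "parameters": {
                "type": "object",
                "properties": {"region": {"type": "string"}},
                "required": ["region"],
            },
        },
    }
]

def tool_handler(name, arguments):
    """Dispatch a tool call requested by the model to the matching function."""
    if name == "get_area_km2":
        return get_area_km2(**arguments)
    raise ValueError(f"Unknown tool: {name}")

# Inside a UDF (Fused runtime only):
# response = ai.run(
#     prompt="How big is the Netherlands?",
#     tools=tools,
#     tool_handler=tool_handler,
# )
```

The handler is a plain dispatcher, so you can unit-test your tools locally before wiring them into a UDF.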

tip

Wrap ai.run() in a @fused.cache-decorated helper during development so identical prompts don't re-spend tokens on every run.