
arize-ai-provider-integration

INVOKE THIS SKILL when creating, reading, updating, or deleting Arize AI integrations. Covers listing integrations, creating integrations for any supported LLM provider (OpenAI, Anthropic, Azure OpenAI, AWS Bedrock, Vertex AI, Gemini, NVIDIA NIM, custom), updating credentials or model lists, and deleting integrations.



Arize AI Integration Skill

SPACE — Most --space flags and the ARIZE_SPACE env var accept a space name (e.g., my-workspace) or a base64 space ID (e.g., U3BhY2U6...). Find yours with ax spaces list. Note: ai-integrations create does not accept --space — AI integrations are account-scoped. Use --space only with list, get, update, and delete.

Concepts

  • AI Integration = stored LLM provider credentials registered in Arize; used by evaluators to call a judge model and by other Arize features that need to invoke an LLM on your behalf
  • Provider = the LLM service backing the integration (e.g., openAI, anthropic, awsBedrock)
  • Integration ID = a base64-encoded global identifier for an integration (e.g., TGxtSW50ZWdyYXRpb246MTI6YUJjRA==); required for evaluator creation and other downstream operations
  • Scoping = visibility rules controlling which spaces or users can use an integration
  • Auth type = how Arize authenticates with the provider: default (provider API key), proxy_with_headers (proxy via custom headers), or bearer_token (bearer token auth)

Prerequisites

Proceed directly with the task — run the ax command you need. Do NOT check versions, env vars, or profiles upfront.

If an ax command fails, troubleshoot based on the error:

  • command not found or version error → see references/ax-setup.md
  • 401 Unauthorized / missing API key → run ax profiles show to inspect the current profile. If the profile is missing or the API key is wrong, follow references/ax-profiles.md to create/update it. If the user doesn't have their key, direct them to https://app.arize.com/admin > API Keys
  • Space unknown → run ax spaces list to pick by name, or ask the user
  • LLM provider call fails (missing OPENAI_API_KEY / ANTHROPIC_API_KEY) → run ax ai-integrations list --space SPACE to check for platform-managed credentials. If none exist, ask the user to provide the key or create an integration via the arize-ai-provider-integration skill
  • Security: Never read .env files or search the filesystem for credentials. Use ax profiles for Arize credentials and ax ai-integrations for LLM provider keys. If credentials are not available through these channels, ask the user.

List AI Integrations

List all integrations accessible in a space:

ax ai-integrations list --space SPACE

Filter by name (case-insensitive substring match):

ax ai-integrations list --space SPACE --name "openai"

Paginate large result sets:

# Get first page
ax ai-integrations list --space SPACE --limit 20 -o json

# Get next page using cursor from previous response
ax ai-integrations list --space SPACE --limit 20 --cursor CURSOR_TOKEN -o json
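
The two pagination steps above can be combined into a loop. A sketch, assuming `jq` is available and that the JSON response exposes the next cursor under a field such as `next_cursor` (the field name is an assumption; check the real output first):

```shell
# Sketch: page through all integrations until no cursor is returned.
# The `next_cursor` field name is an assumption about the JSON schema.
cursor=""
while :; do
  if [ -n "$cursor" ]; then
    page=$(ax ai-integrations list --space SPACE --limit 20 --cursor "$cursor" -o json)
  else
    page=$(ax ai-integrations list --space SPACE --limit 20 -o json)
  fi
  printf '%s\n' "$page"
  cursor=$(printf '%s' "$page" | jq -r '.next_cursor // empty')
  [ -z "$cursor" ] && break
done
```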

Key flags:

| Flag | Description |
| --- | --- |
| --space | Space name or ID to filter integrations |
| --name | Case-insensitive substring filter on integration name |
| --limit | Max results (1–100, default 15) |
| --cursor | Pagination token from a previous response |
| -o, --output | Output format: table (default) or json |

Response fields:

| Field | Description |
| --- | --- |
| id | Base64 integration ID; copy this for downstream commands |
| name | Human-readable name |
| provider | LLM provider enum (see Supported Providers below) |
| has_api_key | true if credentials are stored |
| model_names | Allowed model list, or null if all models are enabled |
| enable_default_models | Whether default models for this provider are allowed |
| function_calling_enabled | Whether tool/function calling is enabled |
| auth_type | Authentication method: default, proxy_with_headers, or bearer_token |

Get a Specific Integration

ax ai-integrations get NAME_OR_ID
ax ai-integrations get NAME_OR_ID -o json
ax ai-integrations get NAME_OR_ID --space SPACE   # required when using name instead of ID

Use this to inspect an integration's full configuration or to confirm its ID after creation.


Create an AI Integration

Before creating, always list integrations first — the user may already have a suitable one:

ax ai-integrations list --space SPACE

If no suitable integration exists, create one. The required flags depend on the provider.
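
This list-then-create check can be scripted. A sketch that greps the JSON output for an existing provider entry (the JSON field layout is an assumption; verify it against the real response before relying on the match):

```shell
# Sketch: reuse an existing OpenAI integration if one is present,
# otherwise create one (the JSON field layout is an assumption)
if ax ai-integrations list --space SPACE -o json | grep -q '"provider": *"openAI"'; then
  echo "An openAI integration already exists; reuse it."
else
  ax ai-integrations create \
    --name "My OpenAI Integration" \
    --provider openAI \
    --api-key "$OPENAI_API_KEY"
fi
```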

OpenAI

ax ai-integrations create \
  --name "My OpenAI Integration" \
  --provider openAI \
  --api-key $OPENAI_API_KEY

Anthropic

ax ai-integrations create \
  --name "My Anthropic Integration" \
  --provider anthropic \
  --api-key $ANTHROPIC_API_KEY

Azure OpenAI

ax ai-integrations create \
  --name "My Azure OpenAI Integration" \
  --provider azureOpenAI \
  --api-key $AZURE_OPENAI_API_KEY \
  --base-url "https://my-resource.openai.azure.com/"

AWS Bedrock

AWS Bedrock uses IAM role-based auth. Provide the ARN of the role Arize should assume via --provider-metadata:

ax ai-integrations create \
  --name "My Bedrock Integration" \
  --provider awsBedrock \
  --provider-metadata '{"role_arn": "arn:aws:iam::123456789012:role/ArizeBedrockRole"}'

Vertex AI

Vertex AI uses GCP service account credentials. Provide the GCP project and region via --provider-metadata:

ax ai-integrations create \
  --name "My Vertex AI Integration" \
  --provider vertexAI \
  --provider-metadata '{"project_id": "my-gcp-project", "location": "us-central1"}'

Gemini

ax ai-integrations create \
  --name "My Gemini Integration" \
  --provider gemini \
  --api-key $GEMINI_API_KEY

NVIDIA NIM

ax ai-integrations create \
  --name "My NVIDIA NIM Integration" \
  --provider nvidiaNim \
  --api-key $NVIDIA_API_KEY \
  --base-url "https://integrate.api.nvidia.com/v1"

Custom (OpenAI-compatible endpoint)

ax ai-integrations create \
  --name "My Custom Integration" \
  --provider custom \
  --base-url "https://my-llm-proxy.example.com/v1" \
  --api-key $CUSTOM_LLM_API_KEY

Supported Providers

| Provider | Required extra flags |
| --- | --- |
| openAI | --api-key <key> |
| anthropic | --api-key <key> |
| azureOpenAI | --api-key <key>, --base-url <azure-endpoint> |
| awsBedrock | --provider-metadata '{"role_arn": "<arn>"}' |
| vertexAI | --provider-metadata '{"project_id": "<gcp-project>", "location": "<region>"}' |
| gemini | --api-key <key> |
| nvidiaNim | --api-key <key>, --base-url <nim-endpoint> |
| custom | --base-url <endpoint> |

Optional flags for any provider

| Flag | Description |
| --- | --- |
| --model-name | Allowed model name (repeat for multiple, e.g. --model-name gpt-4o --model-name gpt-4o-mini); omit to allow all models |
| --enable-default-models | Enable the provider's default model list |
| --function-calling-enabled | Enable tool/function calling support |
| --auth-type | Authentication type: default, proxy_with_headers, or bearer_token |
| --headers | Custom headers as a JSON object or file path (for proxy auth) |
| --provider-metadata | Provider-specific metadata as a JSON object or file path |
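
For instance, the auth-related flags can be combined when routing calls through an OpenAI-compatible proxy. A hedged sketch: the header name and token below are illustrative placeholders, and the exact JSON shape expected by --headers should be confirmed against the CLI's own help output:

```shell
# Sketch: create a proxied integration authenticated via custom headers.
# "X-Proxy-Auth" and its value are placeholders, not documented names.
ax ai-integrations create \
  --name "My Proxied Integration" \
  --provider custom \
  --base-url "https://my-llm-proxy.example.com/v1" \
  --auth-type proxy_with_headers \
  --headers '{"X-Proxy-Auth": "my-proxy-token"}'
```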

After creation

Capture the returned integration ID (e.g., TGxtSW50ZWdyYXRpb246MTI6YUJjRA==) — it is needed for evaluator creation and other downstream commands. If you missed it, retrieve it:

ax ai-integrations list --space SPACE -o json
# or by name/ID directly:
ax ai-integrations get NAME_OR_ID
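
When scripting, the ID can be pulled out of the JSON output. A minimal sketch, assuming the list response is a plain JSON array of objects with `id` and `name` fields (the real envelope may differ, so inspect the actual output first); the sample string stands in for the `ax` call:

```shell
# Sketch: extract an integration id by name from JSON output.
# `resp` stands in for `ax ai-integrations list --space SPACE -o json`;
# the response shape (an array of {id, name} objects) is an assumption.
resp='[{"id":"TGxtSW50ZWdyYXRpb246MTI6YUJjRA==","name":"My OpenAI Integration"}]'
id=$(printf '%s' "$resp" | python3 -c '
import json, sys
items = json.load(sys.stdin)
print(next(i["id"] for i in items if i["name"] == "My OpenAI Integration"))
')
echo "$id"
```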

Update an AI Integration

update is a partial update — only the flags you provide are changed. Omitted fields stay as-is.

# Rename
ax ai-integrations update NAME_OR_ID --name "New Name"

# Rotate the API key
ax ai-integrations update NAME_OR_ID --api-key $OPENAI_API_KEY

# Change the model list (replaces all existing model names)
ax ai-integrations update NAME_OR_ID --model-name gpt-4o --model-name gpt-4o-mini

# Update base URL (for Azure, custom, or NIM)
ax ai-integrations update NAME_OR_ID --base-url "https://new-endpoint.example.com/v1"

Add --space SPACE when using a name instead of ID. Any flag accepted by create can be passed to update.


Delete an AI Integration

Warning: Deletion is permanent. Evaluators that reference this integration will no longer be able to run.

ax ai-integrations delete NAME_OR_ID --force
ax ai-integrations delete NAME_OR_ID --space SPACE --force   # required when using name instead of ID

Omit --force to get a confirmation prompt instead of deleting immediately.


Troubleshooting

| Problem | Solution |
| --- | --- |
| ax: command not found | See references/ax-setup.md |
| 401 Unauthorized | The API key may not have access to this space. Verify the key and space ID at https://app.arize.com/admin > API Keys |
| No profile found | Run ax profiles show --expand; set the ARIZE_API_KEY env var or write ~/.arize/config.toml |
| Integration not found | Verify with ax ai-integrations list --space SPACE |
| has_api_key: false after create | Credentials were not saved; re-run update with the correct --api-key or --provider-metadata |
| Evaluator runs fail with LLM errors | Check integration credentials with ax ai-integrations get INT_ID; rotate the API key if needed |
| Provider mismatch | The provider cannot be changed after creation; delete and recreate with the correct provider |

Related Skills

  • arize-evaluator: Create LLM-as-judge evaluators that use an AI integration → use arize-evaluator
  • arize-experiment: Run experiments that use evaluators backed by an AI integration → use arize-experiment

Save Credentials for Future Use

See references/ax-profiles.md § Save Credentials for Future Use.
