AI Attributes

33 attributes in this category. 13 stable · 20 deprecated

Stable Attributes

ai.citations

string[] PII: True OTel: False

References or sources cited by the AI model in its response.

Example ["Citation 1","Citation 2"]
Raw JSON
{
  "key": "ai.citations",
  "brief": "References or sources cited by the AI model in its response.",
  "type": "string[]",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": [
    "Citation 1",
    "Citation 2"
  ]
}

ai.documents

string[] PII: True OTel: False

Documents or content chunks used as context for the AI model.

Example ["document1.txt","document2.pdf"]
Raw JSON
{
  "key": "ai.documents",
  "brief": "Documents or content chunks used as context for the AI model.",
  "type": "string[]",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": [
    "document1.txt",
    "document2.pdf"
  ]
}

ai.is_search_required

boolean PII: False OTel: False

Boolean indicating if the model needs to perform a search.

Example false
Raw JSON
{
  "key": "ai.is_search_required",
  "brief": "Boolean indicating if the model needs to perform a search.",
  "type": "boolean",
  "pii": {
    "key": "false"
  },
  "is_in_otel": false,
  "example": false
}

ai.metadata

string PII: Maybe OTel: False

Extra metadata passed to an AI pipeline step.

Example {"user_id": 123, "session_id": "abc123"}
Raw JSON
{
  "key": "ai.metadata",
  "brief": "Extra metadata passed to an AI pipeline step.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "{\"user_id\": 123, \"session_id\": \"abc123\"}"
}
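Because ai.metadata is typed string rather than a nested object, structured metadata has to be serialized before it is attached, as in the example above. A minimal sketch of producing such a value (the attributes dict here is purely illustrative, not an SDK call):

```python
import json

# `ai.metadata` is typed `string`, so structured metadata is serialized
# to JSON before being attached; the span attribute itself stays flat.
metadata = {"user_id": 123, "session_id": "abc123"}
attributes = {"ai.metadata": json.dumps(metadata)}

# Consumers can recover the structure with json.loads().
print(attributes["ai.metadata"])  # {"user_id": 123, "session_id": "abc123"}
```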

ai.preamble

string PII: True OTel: False

For an AI model call, the preamble parameter. Preambles are a part of the prompt used to adjust the model's overall behavior and conversation style.

Example You are now a clown.
Raw JSON
{
  "key": "ai.preamble",
  "brief": "For an AI model call, the preamble parameter. Preambles are a part of the prompt used to adjust the model's overall behavior and conversation style.",
  "type": "string",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": "You are now a clown."
}

ai.raw_prompting

boolean PII: False OTel: False

When enabled, the user’s prompt will be sent to the model without any pre-processing.

Example true
Raw JSON
{
  "key": "ai.raw_prompting",
  "brief": "When enabled, the user’s prompt will be sent to the model without any pre-processing.",
  "type": "boolean",
  "pii": {
    "key": "false"
  },
  "is_in_otel": false,
  "example": true
}

ai.response_format

string PII: Maybe OTel: False

For an AI model call, the format of the response.

Example json_object
Raw JSON
{
  "key": "ai.response_format",
  "brief": "For an AI model call, the format of the response.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "json_object"
}

ai.search_queries

string[] PII: True OTel: False

Queries used to search for relevant context or documents.

Example ["climate change effects","renewable energy"]
Raw JSON
{
  "key": "ai.search_queries",
  "brief": "Queries used to search for relevant context or documents.",
  "type": "string[]",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": [
    "climate change effects",
    "renewable energy"
  ]
}

ai.search_results

string[] PII: True OTel: False

Results returned from search queries for context.

Example ["search_result_1","search_result_2"]
Raw JSON
{
  "key": "ai.search_results",
  "brief": "Results returned from search queries for context.",
  "type": "string[]",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": [
    "search_result_1",
    "search_result_2"
  ]
}

ai.tags

string PII: Maybe OTel: False

Tags that describe an AI pipeline step.

Example {"executed_function": "add_integers"}
Raw JSON
{
  "key": "ai.tags",
  "brief": "Tags that describe an AI pipeline step.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "{\"executed_function\": \"add_integers\"}"
}

ai.texts

string[] PII: True OTel: False

Raw text inputs provided to the model.

Example ["Hello, how are you?","What is the capital of France?"]
Raw JSON
{
  "key": "ai.texts",
  "brief": "Raw text inputs provided to the model.",
  "type": "string[]",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": [
    "Hello, how are you?",
    "What is the capital of France?"
  ]
}

ai.total_cost

double PII: Maybe OTel: False

The total cost for the tokens used.

Example 12.34
Raw JSON
{
  "key": "ai.total_cost",
  "brief": "The total cost for the tokens used.",
  "type": "double",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 12.34
}
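A value for ai.total_cost is typically derived from token usage and per-token pricing. The prices below are made-up placeholders (real prices vary by provider and model); the sketch only shows the shape of the calculation:

```python
# Hypothetical per-1K-token prices in USD; real prices vary by provider/model.
INPUT_PRICE_PER_1K = 0.0005   # assumed
OUTPUT_PRICE_PER_1K = 0.0015  # assumed

def total_cost(input_tokens: int, output_tokens: int) -> float:
    """Compute a value suitable for `ai.total_cost` (a double)."""
    return ((input_tokens / 1000) * INPUT_PRICE_PER_1K
            + (output_tokens / 1000) * OUTPUT_PRICE_PER_1K)

# e.g. 20 input tokens and 10 output tokens at the assumed prices
print(total_cost(20, 10))
```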

ai.warnings

string[] PII: True OTel: False

Warning messages generated during model execution.

Example ["Token limit exceeded"]
Raw JSON
{
  "key": "ai.warnings",
  "brief": "Warning messages generated during model execution.",
  "type": "string[]",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": [
    "Token limit exceeded"
  ]
}
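The stable attributes above use only four value types: string, string[], boolean, and double. As a rough illustration of how a producer might check values against the registry before attaching them, here is a hypothetical validator (not part of any SDK) covering a few of the keys listed above:

```python
# Declared types copied from the entries in this section (illustrative subset).
ATTRIBUTE_TYPES = {
    "ai.citations": "string[]",
    "ai.documents": "string[]",
    "ai.is_search_required": "boolean",
    "ai.metadata": "string",
    "ai.total_cost": "double",
}

def is_valid(key: str, value) -> bool:
    """Check a value against the registry's declared type for `key`."""
    declared = ATTRIBUTE_TYPES.get(key)
    if declared == "string[]":
        return isinstance(value, list) and all(isinstance(v, str) for v in value)
    if declared == "boolean":
        return isinstance(value, bool)
    if declared == "string":
        return isinstance(value, str)
    if declared == "double":
        # booleans are ints in Python, so exclude them explicitly
        return isinstance(value, (int, float)) and not isinstance(value, bool)
    return False

print(is_valid("ai.citations", ["Citation 1", "Citation 2"]))  # True
print(is_valid("ai.citations", "Citation 1"))                  # False
```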

Deprecated Attributes

These attributes are deprecated and should not be used in new code. See each attribute for migration guidance.
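Each deprecated key below names a gen_ai.* replacement. A migration can therefore be a simple key rename pass; the sketch below hard-codes a subset of the replacements documented in this section:

```python
# Deprecated key -> replacement, copied from the deprecation notices below
# (illustrative subset; extend as needed).
REPLACEMENTS = {
    "ai.completion_tokens.used": "gen_ai.usage.output_tokens",
    "ai.finish_reason": "gen_ai.response.finish_reason",
    "ai.model_id": "gen_ai.response.model",
    "ai.model.provider": "gen_ai.system",
    "ai.prompt_tokens.used": "gen_ai.usage.input_tokens",
    "ai.total_tokens.used": "gen_ai.usage.total_tokens",
}

def migrate(attributes: dict) -> dict:
    """Return a copy with deprecated keys renamed; other keys pass through."""
    return {REPLACEMENTS.get(k, k): v for k, v in attributes.items()}

old = {"ai.model_id": "gpt-4", "ai.total_tokens.used": 30}
print(migrate(old))  # {'gen_ai.response.model': 'gpt-4', 'gen_ai.usage.total_tokens': 30}
```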

ai.completion_tokens.used Deprecated

integer PII: Maybe OTel: False

The number of tokens used to respond to the message.

Example 10
Aliases gen_ai.usage.output_tokens, gen_ai.usage.completion_tokens
SDKs python

Use gen_ai.usage.output_tokens instead.

Raw JSON
{
  "key": "ai.completion_tokens.used",
  "brief": "The number of tokens used to respond to the message.",
  "type": "integer",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 10,
  "deprecation": {
    "replacement": "gen_ai.usage.output_tokens",
    "_status": null
  },
  "alias": [
    "gen_ai.usage.output_tokens",
    "gen_ai.usage.completion_tokens"
  ],
  "sdks": [
    "python"
  ]
}

ai.finish_reason Deprecated

string PII: Maybe OTel: False

The reason why the model stopped generating.

Example COMPLETE
Aliases gen_ai.response.finish_reasons

Use gen_ai.response.finish_reason instead.

Raw JSON
{
  "key": "ai.finish_reason",
  "brief": "The reason why the model stopped generating.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "COMPLETE",
  "deprecation": {
    "replacement": "gen_ai.response.finish_reason",
    "_status": null
  },
  "alias": [
    "gen_ai.response.finish_reasons"
  ]
}

ai.frequency_penalty Deprecated

double PII: Maybe OTel: False

Used to reduce repetitiveness of generated tokens. The higher the value, the stronger a penalty is applied to previously present tokens, proportional to how many times they have already appeared in the prompt or prior generation.

Example 0.5
Aliases gen_ai.request.frequency_penalty

Use gen_ai.request.frequency_penalty instead.

Raw JSON
{
  "key": "ai.frequency_penalty",
  "brief": "Used to reduce repetitiveness of generated tokens. The higher the value, the stronger a penalty is applied to previously present tokens, proportional to how many times they have already appeared in the prompt or prior generation.",
  "type": "double",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 0.5,
  "deprecation": {
    "replacement": "gen_ai.request.frequency_penalty",
    "_status": null
  },
  "alias": [
    "gen_ai.request.frequency_penalty"
  ]
}

ai.function_call Deprecated

string PII: True OTel: False

For an AI model call, the function that was called. This is deprecated by OpenAI and replaced by tool_calls.

Example function_name
Aliases gen_ai.tool.name

Use gen_ai.tool.name instead.

Raw JSON
{
  "key": "ai.function_call",
  "brief": "For an AI model call, the function that was called. This is deprecated by OpenAI and replaced by tool_calls.",
  "type": "string",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": "function_name",
  "deprecation": {
    "replacement": "gen_ai.tool.name",
    "_status": null
  },
  "alias": [
    "gen_ai.tool.name"
  ]
}

ai.generation_id Deprecated

string PII: Maybe OTel: False

Unique identifier for the completion.

Example gen_123abc
Aliases gen_ai.response.id

Use gen_ai.response.id instead.

Raw JSON
{
  "key": "ai.generation_id",
  "brief": "Unique identifier for the completion.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "gen_123abc",
  "deprecation": {
    "replacement": "gen_ai.response.id",
    "_status": null
  },
  "alias": [
    "gen_ai.response.id"
  ]
}

ai.input_messages Deprecated

string PII: Maybe OTel: False

The input messages sent to the model.

Example [{"role": "user", "message": "hello"}]
Aliases gen_ai.request.messages
SDKs python

Use gen_ai.request.messages instead.

Raw JSON
{
  "key": "ai.input_messages",
  "brief": "The input messages sent to the model.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "[{\"role\": \"user\", \"message\": \"hello\"}]",
  "deprecation": {
    "replacement": "gen_ai.request.messages",
    "_status": null
  },
  "alias": [
    "gen_ai.request.messages"
  ],
  "sdks": [
    "python"
  ]
}

ai.model_id Deprecated

string PII: Maybe OTel: False

The vendor-specific ID of the model used.

Example gpt-4
Aliases gen_ai.response.model
SDKs python

Use gen_ai.response.model instead.

Raw JSON
{
  "key": "ai.model_id",
  "brief": "The vendor-specific ID of the model used.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "gpt-4",
  "deprecation": {
    "replacement": "gen_ai.response.model",
    "_status": null
  },
  "alias": [
    "gen_ai.response.model"
  ],
  "sdks": [
    "python"
  ]
}

ai.model.provider Deprecated

string PII: Maybe OTel: False

The provider of the model.

Example openai
Aliases gen_ai.system

Use gen_ai.system instead.

Raw JSON
{
  "key": "ai.model.provider",
  "brief": "The provider of the model.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "openai",
  "deprecation": {
    "replacement": "gen_ai.system",
    "_status": null
  },
  "alias": [
    "gen_ai.system"
  ]
}

ai.pipeline.name Deprecated

string PII: Maybe OTel: False

The name of the AI pipeline.

Example Autofix Pipeline
Aliases gen_ai.pipeline.name

Use gen_ai.pipeline.name instead.

Raw JSON
{
  "key": "ai.pipeline.name",
  "brief": "The name of the AI pipeline.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "Autofix Pipeline",
  "deprecation": {
    "replacement": "gen_ai.pipeline.name",
    "_status": null
  },
  "alias": [
    "gen_ai.pipeline.name"
  ]
}

ai.presence_penalty Deprecated

double PII: Maybe OTel: False

Used to reduce repetitiveness of generated tokens. Similar to frequency_penalty, except that this penalty is applied equally to all tokens that have already appeared, regardless of their exact frequencies.

Example 0.5
Aliases gen_ai.request.presence_penalty

Use gen_ai.request.presence_penalty instead.

Raw JSON
{
  "key": "ai.presence_penalty",
  "brief": "Used to reduce repetitiveness of generated tokens. Similar to frequency_penalty, except that this penalty is applied equally to all tokens that have already appeared, regardless of their exact frequencies.",
  "type": "double",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 0.5,
  "deprecation": {
    "replacement": "gen_ai.request.presence_penalty",
    "_status": null
  },
  "alias": [
    "gen_ai.request.presence_penalty"
  ]
}
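The frequency and presence penalties described above can be sketched as adjustments to token logits: the frequency penalty scales with how often a token has already appeared, while the presence penalty is a flat deduction for any token seen at least once. This is an illustrative sketch, not any provider's exact implementation:

```python
from collections import Counter

def apply_penalties(logits, generated_tokens,
                    frequency_penalty=0.0, presence_penalty=0.0):
    """Penalize previously generated tokens.

    frequency_penalty: deducted once per prior occurrence of the token.
    presence_penalty: deducted once if the token appeared at all.
    """
    counts = Counter(generated_tokens)
    adjusted = dict(logits)
    for token, count in counts.items():
        if token in adjusted:
            adjusted[token] -= count * frequency_penalty  # scales with count
            adjusted[token] -= presence_penalty           # flat, count-independent
    return adjusted

logits = {"the": 2.0, "cat": 1.5, "sat": 1.0}
out = apply_penalties(logits, ["the", "the", "cat"],
                      frequency_penalty=0.5, presence_penalty=0.5)
print(out)  # {'the': 0.5, 'cat': 0.5, 'sat': 1.0}
```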

ai.prompt_tokens.used Deprecated

integer PII: Maybe OTel: False

The number of tokens used to process just the prompt.

Example 20
Aliases gen_ai.usage.prompt_tokens, gen_ai.usage.input_tokens
SDKs python

Use gen_ai.usage.input_tokens instead.

Raw JSON
{
  "key": "ai.prompt_tokens.used",
  "brief": "The number of tokens used to process just the prompt.",
  "type": "integer",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 20,
  "deprecation": {
    "replacement": "gen_ai.usage.input_tokens",
    "_status": null
  },
  "alias": [
    "gen_ai.usage.prompt_tokens",
    "gen_ai.usage.input_tokens"
  ],
  "sdks": [
    "python"
  ]
}

ai.responses Deprecated

string[] PII: Maybe OTel: False

The response messages sent back by the AI model.

Example ["hello","world"]
SDKs python

Use gen_ai.response.text instead.

Raw JSON
{
  "key": "ai.responses",
  "brief": "The response messages sent back by the AI model.",
  "type": "string[]",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": [
    "hello",
    "world"
  ],
  "deprecation": {
    "replacement": "gen_ai.response.text",
    "_status": null
  },
  "sdks": [
    "python"
  ]
}

ai.seed Deprecated

string PII: Maybe OTel: False

The seed; ideally, models given the same seed and the same other parameters will produce the exact same output.

Example 1234567890
Aliases gen_ai.request.seed

Use gen_ai.request.seed instead.

Raw JSON
{
  "key": "ai.seed",
  "brief": "The seed; ideally, models given the same seed and the same other parameters will produce the exact same output.",
  "type": "string",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": "1234567890",
  "deprecation": {
    "replacement": "gen_ai.request.seed",
    "_status": null
  },
  "alias": [
    "gen_ai.request.seed"
  ]
}

ai.streaming Deprecated

boolean PII: False OTel: False

Whether the request was streamed back.

Example true
Aliases gen_ai.response.streaming
SDKs python

Use gen_ai.response.streaming instead.

Raw JSON
{
  "key": "ai.streaming",
  "brief": "Whether the request was streamed back.",
  "type": "boolean",
  "pii": {
    "key": "false"
  },
  "is_in_otel": false,
  "example": true,
  "deprecation": {
    "replacement": "gen_ai.response.streaming",
    "_status": null
  },
  "alias": [
    "gen_ai.response.streaming"
  ],
  "sdks": [
    "python"
  ]
}

ai.temperature Deprecated

double PII: Maybe OTel: False

For an AI model call, the temperature parameter. Temperature controls how random the output is.

Example 0.1
Aliases gen_ai.request.temperature

Use gen_ai.request.temperature instead.

Raw JSON
{
  "key": "ai.temperature",
  "brief": "For an AI model call, the temperature parameter. Temperature controls how random the output is.",
  "type": "double",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 0.1,
  "deprecation": {
    "replacement": "gen_ai.request.temperature",
    "_status": null
  },
  "alias": [
    "gen_ai.request.temperature"
  ]
}

ai.tool_calls Deprecated

string[] PII: True OTel: False

For an AI model call, the tool calls that were made.

Example ["tool_call_1","tool_call_2"]

Use gen_ai.response.tool_calls instead.

Raw JSON
{
  "key": "ai.tool_calls",
  "brief": "For an AI model call, the tool calls that were made.",
  "type": "string[]",
  "pii": {
    "key": "true"
  },
  "is_in_otel": false,
  "example": [
    "tool_call_1",
    "tool_call_2"
  ],
  "deprecation": {
    "replacement": "gen_ai.response.tool_calls",
    "_status": null
  }
}

ai.tools Deprecated

string[] PII: Maybe OTel: False

For an AI model call, the functions that are available.

Example ["function_1","function_2"]

Use gen_ai.request.available_tools instead.

Raw JSON
{
  "key": "ai.tools",
  "brief": "For an AI model call, the functions that are available.",
  "type": "string[]",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": [
    "function_1",
    "function_2"
  ],
  "deprecation": {
    "replacement": "gen_ai.request.available_tools",
    "_status": null
  }
}

ai.top_k Deprecated

integer PII: Maybe OTel: False

Limits the model to only consider the K most likely next tokens, where K is an integer (e.g., top_k=20 means only the 20 highest probability tokens are considered).

Example 35
Aliases gen_ai.request.top_k

Use gen_ai.request.top_k instead.

Raw JSON
{
  "key": "ai.top_k",
  "brief": "Limits the model to only consider the K most likely next tokens, where K is an integer (e.g., top_k=20 means only the 20 highest probability tokens are considered).",
  "type": "integer",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 35,
  "deprecation": {
    "replacement": "gen_ai.request.top_k",
    "_status": null
  },
  "alias": [
    "gen_ai.request.top_k"
  ]
}

ai.top_p Deprecated

double PII: Maybe OTel: False

Limits the model to only consider tokens whose cumulative probability mass adds up to p, where p is a float between 0 and 1 (e.g., top_p=0.7 means only tokens that sum up to 70% of the probability mass are considered).

Example 0.7
Aliases gen_ai.request.top_p

Use gen_ai.request.top_p instead.

Raw JSON
{
  "key": "ai.top_p",
  "brief": "Limits the model to only consider tokens whose cumulative probability mass adds up to p, where p is a float between 0 and 1 (e.g., top_p=0.7 means only tokens that sum up to 70% of the probability mass are considered).",
  "type": "double",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 0.7,
  "deprecation": {
    "replacement": "gen_ai.request.top_p",
    "_status": null
  },
  "alias": [
    "gen_ai.request.top_p"
  ]
}
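The top_k and top_p descriptions above can be made concrete with a small sampling-filter sketch over a token probability distribution (illustrative only; real implementations operate on logit tensors and renormalize):

```python
def top_k_filter(probs: dict, k: int) -> dict:
    """Keep only the k most likely tokens (ai.top_k semantics)."""
    kept = sorted(probs, key=probs.get, reverse=True)[:k]
    return {t: probs[t] for t in kept}

def top_p_filter(probs: dict, p: float) -> dict:
    """Keep the smallest high-probability set whose cumulative mass
    reaches p (ai.top_p / nucleus-sampling semantics)."""
    kept, cumulative = {}, 0.0
    for token in sorted(probs, key=probs.get, reverse=True):
        kept[token] = probs[token]
        cumulative += probs[token]
        if cumulative >= p:
            break
    return kept

probs = {"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}
print(top_k_filter(probs, 2))    # {'a': 0.5, 'b': 0.3}
print(top_p_filter(probs, 0.7))  # {'a': 0.5, 'b': 0.3}
```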

ai.total_tokens.used Deprecated

integer PII: Maybe OTel: False

The total number of tokens used (prompt and completion combined).

Example 30
Aliases gen_ai.usage.total_tokens
SDKs python

Use gen_ai.usage.total_tokens instead.

Raw JSON
{
  "key": "ai.total_tokens.used",
  "brief": "The total number of tokens used (prompt and completion combined).",
  "type": "integer",
  "pii": {
    "key": "maybe"
  },
  "is_in_otel": false,
  "example": 30,
  "deprecation": {
    "replacement": "gen_ai.usage.total_tokens",
    "_status": null
  },
  "alias": [
    "gen_ai.usage.total_tokens"
  ],
  "sdks": [
    "python"
  ]
}