These attributes are deprecated and should not be used in new code.
See each attribute for migration guidance.
ai.citations

string[] PII: True OTel: False
References or sources cited by the AI model in its response.
Example ["Citation 1","Citation 2"]
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.citations",
"brief": "References or sources cited by the AI model in its response.",
"type": "string[]",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": [
"Citation 1",
"Citation 2"
],
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.completion_tokens.used

integer PII: Maybe OTel: False
The number of tokens used to respond to the message.
Example 10
Aliases gen_ai.usage.output_tokens, gen_ai.usage.completion_tokens
SDKs python
Use gen_ai.usage.output_tokens instead.
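Migrating amounts to emitting the same value under the new key. A minimal sketch in plain Python (the `migrate_key` helper and mapping are illustrative, not part of any SDK):

```python
# Hypothetical mapping of this deprecated key to its documented replacement.
DEPRECATED_KEYS = {
    "ai.completion_tokens.used": "gen_ai.usage.output_tokens",
}

def migrate_key(key: str) -> str:
    """Return the replacement attribute key, or the key itself if current."""
    return DEPRECATED_KEYS.get(key, key)

# A span attribute written under the old key can be re-emitted as:
attributes = {migrate_key("ai.completion_tokens.used"): 10}
```

Applying such a helper before setting span data keeps old call sites working while emitting the new key.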
Changelog
Raw JSON
{
"key": "ai.completion_tokens.used",
"brief": "The number of tokens used to respond to the message.",
"type": "integer",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 10,
"deprecation": {
"replacement": "gen_ai.usage.output_tokens",
"_status": null
},
"alias": [
"gen_ai.usage.output_tokens",
"gen_ai.usage.completion_tokens"
],
"sdks": [
"python"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
57,
61
]
},
{
"version": "0.0.0"
}
]
}

ai.documents

string[] PII: True OTel: False
Documents or content chunks used as context for the AI model.
Example ["document1.txt","document2.pdf"]
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.documents",
"brief": "Documents or content chunks used as context for the AI model.",
"type": "string[]",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": [
"document1.txt",
"document2.pdf"
],
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.finish_reason

string PII: Maybe OTel: False
The reason why the model stopped generating.
Example COMPLETE
Aliases gen_ai.response.finish_reasons
Use gen_ai.response.finish_reason instead.
Changelog
Raw JSON
{
"key": "ai.finish_reason",
"brief": "The reason why the model stopped generating.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "COMPLETE",
"deprecation": {
"replacement": "gen_ai.response.finish_reason",
"_status": null
},
"alias": [
"gen_ai.response.finish_reasons"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108,
127
]
}
]
}

ai.frequency_penalty

double PII: Maybe OTel: False
Used to reduce repetitiveness of generated tokens. The higher the value, the stronger a penalty is applied to previously present tokens, proportional to how many times they have already appeared in the prompt or prior generation.
Example 0.5
Aliases gen_ai.request.frequency_penalty
Use gen_ai.request.frequency_penalty instead.
Changelog
Raw JSON
{
"key": "ai.frequency_penalty",
"brief": "Used to reduce repetitiveness of generated tokens. The higher the value, the stronger a penalty is applied to previously present tokens, proportional to how many times they have already appeared in the prompt or prior generation.",
"type": "double",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 0.5,
"deprecation": {
"replacement": "gen_ai.request.frequency_penalty",
"_status": null
},
"alias": [
"gen_ai.request.frequency_penalty"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108
]
}
]
}

ai.function_call

string PII: True OTel: False
For an AI model call, the function that was called. For OpenAI this is deprecated and replaced by tool_calls.
Example function_name
Aliases gen_ai.tool.name
Use gen_ai.tool.name instead.
Changelog
Raw JSON
{
"key": "ai.function_call",
"brief": "For an AI model call, the function that was called. This is deprecated for OpenAI, and replaced by tool_calls",
"type": "string",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": "function_name",
"deprecation": {
"replacement": "gen_ai.tool.name",
"_status": null
},
"alias": [
"gen_ai.tool.name"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108
]
}
]
}

ai.generation_id

string PII: Maybe OTel: False
Unique identifier for the completion.
Example gen_123abc
Aliases gen_ai.response.id
Use gen_ai.response.id instead.
Changelog
Raw JSON
{
"key": "ai.generation_id",
"brief": "Unique identifier for the completion.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "gen_123abc",
"deprecation": {
"replacement": "gen_ai.response.id",
"_status": null
},
"alias": [
"gen_ai.response.id"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108,
127
]
}
]
}

ai.input_messages

string PII: Maybe OTel: False
The input messages sent to the model.
Example [{"role": "user", "message": "hello"}]
Aliases gen_ai.request.messages
SDKs python
Use gen_ai.request.messages instead.
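Since the attribute type is string, the message list is JSON-encoded before being attached, matching the example above. A minimal sketch (the `attributes` dict is illustrative):

```python
import json

# Messages in the shape shown in the example above.
messages = [{"role": "user", "message": "hello"}]

# The value is stored as a JSON string under the replacement key.
attributes = {"gen_ai.request.messages": json.dumps(messages)}
```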
Changelog
Raw JSON
{
"key": "ai.input_messages",
"brief": "The input messages sent to the model",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "[{\"role\": \"user\", \"message\": \"hello\"}]",
"deprecation": {
"replacement": "gen_ai.request.messages",
"_status": null
},
"alias": [
"gen_ai.request.messages"
],
"sdks": [
"python"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
65,
119
]
},
{
"version": "0.0.0"
}
]
}

ai.is_search_required

boolean PII: False OTel: False
Boolean indicating if the model needs to perform a search.
Example false
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.is_search_required",
"brief": "Boolean indicating if the model needs to perform a search.",
"type": "boolean",
"pii": {
"key": "false"
},
"is_in_otel": false,
"example": false,
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.metadata

string PII: Maybe OTel: False
Extra metadata passed to an AI pipeline step.
Example {"user_id": 123, "session_id": "abc123"}
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.metadata",
"brief": "Extra metadata passed to an AI pipeline step.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "{\"user_id\": 123, \"session_id\": \"abc123\"}",
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55,
127
]
}
]
}

ai.model_id

string PII: Maybe OTel: False
The vendor-specific ID of the model used.
Example gpt-4
Aliases gen_ai.response.model
SDKs python
Use gen_ai.response.model instead.
Changelog
Raw JSON
{
"key": "ai.model_id",
"brief": "The vendor-specific ID of the model used.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "gpt-4",
"deprecation": {
"replacement": "gen_ai.response.model",
"_status": null
},
"alias": [
"gen_ai.response.model"
],
"sdks": [
"python"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
57,
61,
127
]
},
{
"version": "0.0.0"
}
]
}

ai.model.provider

string PII: Maybe OTel: False
The provider of the model.
Example openai
Aliases gen_ai.provider.name, gen_ai.system
Use gen_ai.provider.name instead.
Changelog
Raw JSON
{
"key": "ai.model.provider",
"brief": "The provider of the model.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "openai",
"deprecation": {
"replacement": "gen_ai.provider.name",
"_status": null
},
"alias": [
"gen_ai.provider.name",
"gen_ai.system"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
253
]
},
{
"version": "0.1.0",
"prs": [
57,
61,
108,
127
]
}
]
}

ai.pipeline.name

string PII: Maybe OTel: False
The name of the AI pipeline.
Example Autofix Pipeline
Aliases gen_ai.pipeline.name
Use gen_ai.pipeline.name instead.
Changelog
Raw JSON
{
"key": "ai.pipeline.name",
"brief": "The name of the AI pipeline.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "Autofix Pipeline",
"deprecation": {
"replacement": "gen_ai.pipeline.name",
"_status": null
},
"alias": [
"gen_ai.pipeline.name"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
53,
76,
108,
127
]
}
]
}

ai.preamble

string PII: True OTel: False
For an AI model call, the preamble parameter. Preambles are a part of the prompt used to adjust the model's overall behavior and conversation style.
Example You are now a clown.
Aliases gen_ai.system_instructions
Use gen_ai.system_instructions instead.
Changelog
Raw JSON
{
"key": "ai.preamble",
"brief": "For an AI model call, the preamble parameter. Preambles are a part of the prompt used to adjust the model's overall behavior and conversation style.",
"type": "string",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": "You are now a clown.",
"deprecation": {
"replacement": "gen_ai.system_instructions",
"_status": null
},
"alias": [
"gen_ai.system_instructions"
],
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.presence_penalty

double PII: Maybe OTel: False
Used to reduce repetitiveness of generated tokens. Similar to frequency_penalty, except that this penalty is applied equally to all tokens that have already appeared, regardless of their exact frequencies.
Example 0.5
Aliases gen_ai.request.presence_penalty
Use gen_ai.request.presence_penalty instead.
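The relationship between the two penalties can be sketched as an adjustment to a token's logit. This follows the commonly documented OpenAI-style formula; treat it as an illustration, not a definition from this registry:

```python
def penalized_logit(logit: float, count: int,
                    frequency_penalty: float, presence_penalty: float) -> float:
    """Apply both penalties to a token's logit.

    frequency_penalty scales with how often the token has already appeared;
    presence_penalty is applied once, as soon as the count is nonzero.
    """
    presence = 1.0 if count > 0 else 0.0
    return logit - count * frequency_penalty - presence * presence_penalty
```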
Changelog
Raw JSON
{
"key": "ai.presence_penalty",
"brief": "Used to reduce repetitiveness of generated tokens. Similar to frequency_penalty, except that this penalty is applied equally to all tokens that have already appeared, regardless of their exact frequencies.",
"type": "double",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 0.5,
"deprecation": {
"replacement": "gen_ai.request.presence_penalty",
"_status": null
},
"alias": [
"gen_ai.request.presence_penalty"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108
]
}
]
}

ai.prompt_tokens.used

integer PII: Maybe OTel: False
The number of tokens used to process just the prompt.
Example 20
Aliases gen_ai.usage.prompt_tokens, gen_ai.usage.input_tokens
SDKs python
Use gen_ai.usage.input_tokens instead.
Changelog
Raw JSON
{
"key": "ai.prompt_tokens.used",
"brief": "The number of tokens used to process just the prompt.",
"type": "integer",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 20,
"deprecation": {
"replacement": "gen_ai.usage.input_tokens",
"_status": null
},
"alias": [
"gen_ai.usage.prompt_tokens",
"gen_ai.usage.input_tokens"
],
"sdks": [
"python"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
57,
61
]
},
{
"version": "0.0.0"
}
]
}

ai.raw_prompting

boolean PII: False OTel: False
When enabled, the user’s prompt will be sent to the model without any pre-processing.
Example true
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.raw_prompting",
"brief": "When enabled, the user’s prompt will be sent to the model without any pre-processing.",
"type": "boolean",
"pii": {
"key": "false"
},
"is_in_otel": false,
"example": true,
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.response_format

string PII: Maybe OTel: False
For an AI model call, the format of the response.
Example json_object
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.response_format",
"brief": "For an AI model call, the format of the response",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "json_object",
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55,
127
]
}
]
}

ai.responses

string[] PII: Maybe OTel: False
The response messages sent back by the AI model.
Example ["hello","world"]
SDKs python
Use gen_ai.response.text instead.
Changelog
Raw JSON
{
"key": "ai.responses",
"brief": "The response messages sent back by the AI model.",
"type": "string[]",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": [
"hello",
"world"
],
"deprecation": {
"replacement": "gen_ai.response.text",
"_status": null
},
"sdks": [
"python"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
65,
127
]
},
{
"version": "0.0.0"
}
]
}

ai.search_queries

string[] PII: True OTel: False
Queries used to search for relevant context or documents.
Example ["climate change effects","renewable energy"]
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.search_queries",
"brief": "Queries used to search for relevant context or documents.",
"type": "string[]",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": [
"climate change effects",
"renewable energy"
],
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.search_results

string[] PII: True OTel: False
Results returned from search queries for context.
Example ["search_result_1, search_result_2"]
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.search_results",
"brief": "Results returned from search queries for context.",
"type": "string[]",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": [
"search_result_1, search_result_2"
],
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.seed

string PII: Maybe OTel: False
The seed for the request; ideally, models given the same seed and the same other parameters will produce the exact same output.
Example 1234567890
Aliases gen_ai.request.seed
Use gen_ai.request.seed instead.
Changelog
Raw JSON
{
"key": "ai.seed",
"brief": "The seed, ideally models given the same seed and same other parameters will produce the exact same output.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "1234567890",
"deprecation": {
"replacement": "gen_ai.request.seed",
"_status": null
},
"alias": [
"gen_ai.request.seed"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108,
127
]
}
]
}

ai.streaming

boolean PII: False OTel: False
Whether the request was streamed back.
Example true
Aliases gen_ai.response.streaming
SDKs python
Use gen_ai.response.streaming instead.
Changelog
Raw JSON
{
"key": "ai.streaming",
"brief": "Whether the request was streamed back.",
"type": "boolean",
"pii": {
"key": "false"
},
"is_in_otel": false,
"example": true,
"deprecation": {
"replacement": "gen_ai.response.streaming",
"_status": null
},
"alias": [
"gen_ai.response.streaming"
],
"sdks": [
"python"
],
"changelog": [
{
"version": "0.1.0",
"prs": [
76,
108
]
},
{
"version": "0.0.0"
}
]
}

ai.tags

string PII: Maybe OTel: False
Tags that describe an AI pipeline step.
Example {"executed_function": "add_integers"}
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.tags",
"brief": "Tags that describe an AI pipeline step.",
"type": "string",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": "{\"executed_function\": \"add_integers\"}",
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55,
127
]
}
]
}

ai.temperature

double PII: Maybe OTel: False
For an AI model call, the temperature parameter. Temperature controls how random the output is.
Example 0.1
Aliases gen_ai.request.temperature
Use gen_ai.request.temperature instead.
Changelog
Raw JSON
{
"key": "ai.temperature",
"brief": "For an AI model call, the temperature parameter. Temperature essentially means how random the output will be.",
"type": "double",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 0.1,
"deprecation": {
"replacement": "gen_ai.request.temperature",
"_status": null
},
"alias": [
"gen_ai.request.temperature"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108
]
}
]
}

ai.texts

string[] PII: True OTel: False
Raw text inputs provided to the model.
Example ["Hello, how are you?","What is the capital of France?"]
Aliases gen_ai.input.messages
Use gen_ai.input.messages instead.
Changelog
Raw JSON
{
"key": "ai.texts",
"brief": "Raw text inputs provided to the model.",
"type": "string[]",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": [
"Hello, how are you?",
"What is the capital of France?"
],
"deprecation": {
"replacement": "gen_ai.input.messages",
"_status": null
},
"alias": [
"gen_ai.input.messages"
],
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}

ai.tool_calls

string[] PII: True OTel: False
For an AI model call, the tool calls that were made.
Example ["tool_call_1","tool_call_2"]
Use gen_ai.response.tool_calls instead.
Changelog
Raw JSON
{
"key": "ai.tool_calls",
"brief": "For an AI model call, the tool calls that were made.",
"type": "string[]",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": [
"tool_call_1",
"tool_call_2"
],
"deprecation": {
"replacement": "gen_ai.response.tool_calls",
"_status": null
},
"changelog": [
{
"version": "0.1.0",
"prs": [
55,
65
]
}
]
}

ai.tools

string[] PII: Maybe OTel: False
For an AI model call, the functions that are available.
Example ["function_1","function_2"]
Use gen_ai.request.available_tools instead.
Changelog
Raw JSON
{
"key": "ai.tools",
"brief": "For an AI model call, the functions that are available",
"type": "string[]",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": [
"function_1",
"function_2"
],
"deprecation": {
"replacement": "gen_ai.request.available_tools",
"_status": null
},
"changelog": [
{
"version": "0.1.0",
"prs": [
55,
65,
127
]
}
]
}

ai.top_k

integer PII: Maybe OTel: False
Limits the model to only consider the K most likely next tokens, where K is an integer (e.g., top_k=20 means only the 20 highest probability tokens are considered).
Example 35
Aliases gen_ai.request.top_k
Use gen_ai.request.top_k instead.
Changelog
Raw JSON
{
"key": "ai.top_k",
"brief": "Limits the model to only consider the K most likely next tokens, where K is an integer (e.g., top_k=20 means only the 20 highest probability tokens are considered).",
"type": "integer",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 35,
"deprecation": {
"replacement": "gen_ai.request.top_k",
"_status": null
},
"alias": [
"gen_ai.request.top_k"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108
]
}
]
}

ai.top_p

double PII: Maybe OTel: False
Limits the model to only consider tokens whose cumulative probability mass adds up to p, where p is a float between 0 and 1 (e.g., top_p=0.7 means only tokens that sum up to 70% of the probability mass are considered).
Example 0.7
Aliases gen_ai.request.top_p
Use gen_ai.request.top_p instead.
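The description above can be made concrete with a small sketch of top-p (nucleus) filtering; this is illustrative only and not part of any SDK:

```python
def nucleus_set(probs, p):
    """Indices of the smallest set of highest-probability tokens whose
    cumulative probability mass reaches p (top-p / nucleus filtering)."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    chosen, mass = [], 0.0
    for i in order:
        chosen.append(i)
        mass += probs[i]
        if mass >= p:
            break
    return chosen
```

With top_p=0.7 and probabilities [0.5, 0.3, 0.15, 0.05], only the first two tokens are kept, since 0.5 + 0.3 already covers 70% of the mass.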
Changelog
Raw JSON
{
"key": "ai.top_p",
"brief": "Limits the model to only consider tokens whose cumulative probability mass adds up to p, where p is a float between 0 and 1 (e.g., top_p=0.7 means only tokens that sum up to 70% of the probability mass are considered).",
"type": "double",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 0.7,
"deprecation": {
"replacement": "gen_ai.request.top_p",
"_status": null
},
"alias": [
"gen_ai.request.top_p"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
55,
57,
61,
108
]
}
]
}

ai.total_cost

double PII: Maybe OTel: False
The total cost for the tokens used.
Example 12.34
Aliases gen_ai.cost.total_tokens
Use gen_ai.cost.total_tokens instead.
Changelog
Raw JSON
{
"key": "ai.total_cost",
"brief": "The total cost for the tokens used.",
"type": "double",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 12.34,
"deprecation": {
"replacement": "gen_ai.cost.total_tokens",
"_status": null
},
"alias": [
"gen_ai.cost.total_tokens"
],
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
53
]
}
]
}

ai.total_tokens.used

integer PII: Maybe OTel: False
The total number of tokens used to process the prompt.
Example 30
Aliases gen_ai.usage.total_tokens
SDKs python
Use gen_ai.usage.total_tokens instead.
Changelog
Raw JSON
{
"key": "ai.total_tokens.used",
"brief": "The total number of tokens used to process the prompt.",
"type": "integer",
"pii": {
"key": "maybe"
},
"is_in_otel": false,
"example": 30,
"deprecation": {
"replacement": "gen_ai.usage.total_tokens",
"_status": null
},
"alias": [
"gen_ai.usage.total_tokens"
],
"sdks": [
"python"
],
"changelog": [
{
"version": "0.4.0",
"prs": [
228
]
},
{
"version": "0.1.0",
"prs": [
57,
61,
108
]
},
{
"version": "0.0.0"
}
]
}

ai.warnings

string[] PII: True OTel: False
Warning messages generated during model execution.
Example ["Token limit exceeded"]
No replacement available at this time.
Changelog
Raw JSON
{
"key": "ai.warnings",
"brief": "Warning messages generated during model execution.",
"type": "string[]",
"pii": {
"key": "true"
},
"is_in_otel": false,
"example": [
"Token limit exceeded"
],
"deprecation": {
"_status": null
},
"changelog": [
{
"version": "next",
"prs": [
264
]
},
{
"version": "0.1.0",
"prs": [
55
]
}
]
}
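The replacements documented on this page can be collected into one lookup table. A sketch in plain Python (only keys with a documented replacement are listed; the `migrate` helper is illustrative):

```python
# Deprecated attribute keys and the replacements documented above.
AI_ATTRIBUTE_REPLACEMENTS = {
    "ai.completion_tokens.used": "gen_ai.usage.output_tokens",
    "ai.finish_reason": "gen_ai.response.finish_reason",
    "ai.frequency_penalty": "gen_ai.request.frequency_penalty",
    "ai.function_call": "gen_ai.tool.name",
    "ai.generation_id": "gen_ai.response.id",
    "ai.input_messages": "gen_ai.request.messages",
    "ai.model_id": "gen_ai.response.model",
    "ai.model.provider": "gen_ai.provider.name",
    "ai.pipeline.name": "gen_ai.pipeline.name",
    "ai.preamble": "gen_ai.system_instructions",
    "ai.presence_penalty": "gen_ai.request.presence_penalty",
    "ai.prompt_tokens.used": "gen_ai.usage.input_tokens",
    "ai.responses": "gen_ai.response.text",
    "ai.seed": "gen_ai.request.seed",
    "ai.streaming": "gen_ai.response.streaming",
    "ai.temperature": "gen_ai.request.temperature",
    "ai.texts": "gen_ai.input.messages",
    "ai.tool_calls": "gen_ai.response.tool_calls",
    "ai.tools": "gen_ai.request.available_tools",
    "ai.top_k": "gen_ai.request.top_k",
    "ai.top_p": "gen_ai.request.top_p",
    "ai.total_cost": "gen_ai.cost.total_tokens",
    "ai.total_tokens.used": "gen_ai.usage.total_tokens",
}

def migrate(attributes: dict) -> dict:
    """Rewrite deprecated keys to their replacements; keys without a
    documented replacement pass through unchanged."""
    return {AI_ATTRIBUTE_REPLACEMENTS.get(k, k): v for k, v in attributes.items()}
```

Attributes with no replacement (such as ai.metadata or ai.warnings) pass through unchanged.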