Generative AI
Spring AI provides a consistent programming model and API for building AI-infused applications. It comes with built-in instrumentation for models, vector stores, and workflows based on the Micrometer Observation APIs. Building on that instrumentation, you can choose to export the telemetry data following specific semantic conventions.
Arconia supports the following semantic conventions, which you can adopt instead of the built-in Micrometer conventions:
OpenInference Semantic Conventions
Arconia lets you configure your Spring AI applications to export telemetry data following the OpenInference Semantic Conventions.
Many systems have adopted the OpenInference specification as a consistent way to monitor and observe AI models, including the Arize Phoenix platform, which Arconia makes available as a Dev Service so you can run and test it locally. Check out the Arize Phoenix documentation for more details.
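If you want to try the conventions against Phoenix locally, the Dev Service can be added as a development-only dependency. As a sketch, assuming Gradle and a hypothetical artifact id (check the Arconia Dev Services documentation for the exact coordinates):

dependencies {
    // Hypothetical artifact id for the Arize Phoenix Dev Service;
    // verify the exact coordinates in the Arconia Dev Services docs.
    developmentOnly 'io.arconia:arconia-dev-services-phoenix'
}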
Dependencies
First, you need to add the Arconia OpenInference Semantic Conventions dependency to your project.
Gradle

dependencies {
    implementation 'io.arconia:arconia-openinference-semantic-conventions'
}

Maven

<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-openinference-semantic-conventions</artifactId>
</dependency>
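Once the dependency is on the classpath, Spring AI's Micrometer-based observations are exported using the OpenInference conventions; no code changes are required. As a minimal sketch, assuming a Spring AI chat model starter is configured, a standard ChatClient call like the following is traced automatically:

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
class ChatController {

    private final ChatClient chatClient;

    ChatController(ChatClient.Builder chatClientBuilder) {
        // The Builder is auto-configured by the Spring AI starter in use.
        this.chatClient = chatClientBuilder.build();
    }

    @GetMapping("/chat")
    String chat(String question) {
        // The model call is observed via Micrometer; the resulting
        // telemetry follows the OpenInference Semantic Conventions.
        return chatClient.prompt()
                .user(question)
                .call()
                .content();
    }
}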
Configuration Properties
You can configure the OpenInference Semantic Conventions via configuration properties.
| Property | Default | Description |
|----------|---------|-------------|
| | | Whether to enable the OpenInference instrumentation. |
| | | Whether to exclude any non-AI observations from the exported telemetry for the application. |
| | | Maximum length of a base64-encoded image. |
| | | Whether to hide all embedding vectors. |
| | | Whether to hide the LLM invocation parameters. |
| | | Whether to hide all inputs. |
| | | Whether to hide all images from the input messages. |
| | | Whether to hide all input messages. |
| | | Whether to hide all texts from the input messages. |
| | | Whether to hide all output messages. |
| | | Whether to hide all texts from the output messages. |
| | | Whether to hide all outputs. |
| | | Whether to hide all LLM prompts. |
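These are standard Spring configuration properties, so you can set them in application.properties or application.yml like any other setting. As an illustration only, with hypothetical property keys (use the exact keys from the table above and the Arconia reference documentation):

# Hypothetical keys shown for illustration; replace them with the exact
# property names from the Arconia reference documentation.
arconia.openinference.enabled=true
arconia.openinference.hide-input-messages=true
arconia.openinference.hide-output-messages=true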
Environment Variables
Arconia supports the OpenInference Environment Variable Specification for configuring the OpenInference Semantic Conventions. If both the OpenInference environment variables and the Arconia configuration properties are set, the environment variables take precedence.
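For example, to redact message content via the environment (variable names taken from the OpenInference Environment Variable Specification; verify them against the version of the specification you target):

# Redact message content while keeping the rest of the telemetry.
export OPENINFERENCE_HIDE_INPUT_MESSAGES=true
export OPENINFERENCE_HIDE_OUTPUT_MESSAGES=true
# Cap the size of base64-encoded images included in spans.
export OPENINFERENCE_BASE64_IMAGE_MAX_LENGTH=32000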