Ollama Dev Service
When your application uses Spring AI to consume large language models from Ollama, the Arconia Ollama Dev Service can automatically start a full Ollama service for you. This lets you develop and test your LLM-powered application without having to manually provision a model inference service.
Dependencies
First, you need to add the Arconia Ollama Dev Service dependency to your project.
dependencies {
    testAndDevelopmentOnly 'io.arconia:arconia-dev-services-ollama'
}
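If you build with Maven instead, a rough equivalent (assuming test scope, the closest analogue to Gradle's testAndDevelopmentOnly configuration; check the Arconia documentation for the exact coordinates and scope) would be:

<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-dev-services-ollama</artifactId>
    <!-- Assumption: test scope approximates Gradle's testAndDevelopmentOnly -->
    <scope>test</scope>
</dependency>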
You can combine this with io.arconia:arconia-dev-tools, which includes Spring Boot DevTools. As part of your development workflow, DevTools automatically restarts your application when you change your code, while keeping the Dev Service up and running.
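As a sketch, assuming the dev tools belong on the development-only classpath (the exact Gradle configuration may differ; see the Arconia documentation), the combined dependency setup could look like this:

dependencies {
    testAndDevelopmentOnly 'io.arconia:arconia-dev-services-ollama'
    // Assumption: developmentOnly keeps the dev tools off the production classpath
    developmentOnly 'io.arconia:arconia-dev-tools'
}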
Running the Application
By default, the Dev Service is disabled since it can be resource-intensive, and running Ollama natively might be more suitable for development. You can enable it globally or selectively for specific tests.
arconia:
  dev:
    services:
      ollama:
        enabled: true
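The same setting expressed as an application.properties entry:

arconia.dev.services.ollama.enabled=true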
You can enable/disable the Dev Service for a specific test class or method by using the @TestProperty annotation or equivalent Spring testing utilities.
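For example, using standard Spring Boot test properties (a minimal sketch; the class and test names are hypothetical), you could disable the Dev Service for a single test class like this:

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;

// Hypothetical test class: the property below disables the Ollama Dev Service
// for this test class only.
@SpringBootTest(properties = "arconia.dev.services.ollama.enabled=false")
class NoOllamaIntegrationTests {

    @Test
    void contextLoadsWithoutOllama() {
        // The application context starts without the Ollama Dev Service.
    }
}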
Unlike the lower-level Testcontainers support in Spring Boot, Arconia doesn't require special tasks to run your application (./gradlew bootTestRun or ./mvnw spring-boot:test-run). You can simply use:
- Gradle: ./gradlew bootRun
- Maven: ./mvnw spring-boot:run
Your integration tests will automatically use the Arconia Dev Services without any additional configuration.
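For instance, a plain @SpringBootTest can call the model directly; the Dev Service provides the Ollama endpoint, so no connection properties are needed. This is a sketch with hypothetical class and test names, assuming Spring AI's ChatModel abstraction is on the classpath:

import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;

@SpringBootTest
class OllamaChatIntegrationTests {

    @Autowired
    ChatModel chatModel;

    @Test
    void chatModelAnswers() {
        // The Ollama Dev Service is started automatically before the test runs.
        String answer = chatModel.call("Reply with the single word: pong");
        assertThat(answer).isNotEmpty();
    }
}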
Whenever the application is reloaded via Spring Boot DevTools, the Dev Service is not restarted. If you want to disable the live reload feature, set the spring.devtools.restart.enabled property to false; in that case, the Dev Service will be shut down and restarted along with the application.
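For example, in application.yml:

spring:
  devtools:
    restart:
      enabled: false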