Installation
Requirements
- PHP: 8.4 or higher
- Composer: 2.0 or higher
- AI Platform: OpenAI API key, Anthropic API key, or Ollama installed locally
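If you want to confirm the PHP requirement from a script, a minimal check (plain PHP, nothing package-specific; the filename is just an example) could look like this:
<?php
// check-php-version.php (example filename): confirms the PHP 8.4 requirement.
if (PHP_VERSION_ID < 80400) {
    fwrite(STDERR, "PHP 8.4 or higher is required, found " . PHP_VERSION . "\n");
    exit(1);
}
echo "PHP " . PHP_VERSION . " meets the requirement.\n";
Composer itself can be checked with composer --version.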
Install via Composer
composer require hakam/ai-log-inspector-agent
The package will automatically install all required dependencies:
- symfony/ai-agent - Core AI agent framework
- symfony/ai-platform - Platform abstraction layer
- symfony/ai-store - Vector store integration
- symfony/ai-chat - Conversational interface support
- Platform-specific packages (OpenAI, Anthropic, Ollama)
Platform Setup
Choose one or more AI platforms to use with the package:
Option 1: OpenAI (Recommended for Production)
# Install OpenAI platform
composer require symfony/ai-open-ai-platform
# Set your API key
export OPENAI_API_KEY="sk-your-api-key-here"
Best for: Production environments, highest quality semantic search, best tool calling support
Option 2: Anthropic Claude
# Install Anthropic platform
composer require symfony/ai-anthropic-platform
# Set your API key
export ANTHROPIC_API_KEY="your-api-key-here"
Best for: Advanced reasoning, long context windows, complex analysis tasks
Option 3: Ollama (Local/Self-Hosted)
# Install Ollama on your system
curl -fsSL https://ollama.ai/install.sh | sh
# Pull a model
ollama pull llama3.2:1b # Lightweight, fast
# OR
ollama pull llama3.2:3b # Better quality
# Install Ollama platform
composer require symfony/ai-ollama-platform
Best for: Development, privacy-sensitive environments, no external API costs
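Whichever platform you install, it is wired in through the same LogDocumentPlatformFactory::create() call used in the verification script below. The OpenAI block mirrors that script; the Anthropic and Ollama blocks are only sketches, since their exact option keys are assumptions modeled on the OpenAI shape and the environment variables listed later. Check the factory in this package for the authoritative names.
use Hakam\AiLogInspector\Platform\LogDocumentPlatformFactory;

// OpenAI: same configuration as the verification script below.
$platform = LogDocumentPlatformFactory::create([
    'provider' => 'openai',
    'api_key' => $_ENV['OPENAI_API_KEY'],
    'model' => ['name' => 'gpt-4o-mini'],
]);

// Anthropic: assumed shape, mirroring the OpenAI config above.
$platform = LogDocumentPlatformFactory::create([
    'provider' => 'anthropic',
    'api_key' => $_ENV['ANTHROPIC_API_KEY'],
    'model' => ['name' => 'claude-3-5-sonnet-latest'], // placeholder model name
]);

// Ollama: assumed shape, using the OLLAMA_* variables from the .env example below.
$platform = LogDocumentPlatformFactory::create([
    'provider' => 'ollama',
    'host' => $_ENV['OLLAMA_URL'] ?? 'http://localhost:11434', // assumption: option key may differ
    'model' => ['name' => $_ENV['OLLAMA_MODEL'] ?? 'llama3.2:1b'],
]);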
Vector Store Setup
Choose a vector store based on your scale:
Development: In-Memory Store
Perfect for development and testing:
# Already included, no additional setup needed
use Symfony\AI\Store\Bridge\Local\InMemoryStore;
$store = new InMemoryStore();
Limits: ~10,000 log entries, resets on restart
Production: Chroma
For production environments:
# Install Chroma store bridge
composer require symfony/ai-chroma-store
# Start Chroma server via Docker
docker run -d -p 8000:8000 chromadb/chroma:latest
use Symfony\AI\Store\Bridge\Chroma\ChromaStore;
$store = new ChromaStore([
    'url' => 'http://localhost:8000',
    'collection' => 'application_logs',
]);
Scalability: Millions of log entries, persistent storage
Production: Pinecone
For cloud-native deployments:
composer require symfony/ai-pinecone-store
use Symfony\AI\Store\Bridge\Pinecone\PineconeStore;
$store = new PineconeStore([
    'api_key' => $_ENV['PINECONE_API_KEY'],
    'environment' => 'us-west1-gcp',
    'index' => 'log-vectors',
]);
Scalability: Managed service, automatic scaling, production-ready
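Whichever bridge you pick, the package consumes it through its own VectorLogDocumentStore wrapper, exactly as the verification script below does with the in-memory store. A minimal sketch:
use Hakam\AiLogInspector\Store\VectorLogDocumentStore;
use Symfony\AI\Store\Bridge\Local\InMemoryStore;

// $store can be the InMemoryStore, ChromaStore, or PineconeStore configured above;
// the in-memory variant is shown here because it needs no external service.
$store = new InMemoryStore();
$logStore = new VectorLogDocumentStore($store);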
Verify Installation
Create a simple test file to verify everything works:
<?php
// test-installation.php
require_once __DIR__ . '/vendor/autoload.php';
use Hakam\AiLogInspector\Agent\LogInspectorAgent;
use Hakam\AiLogInspector\Platform\LogDocumentPlatformFactory;
use Hakam\AiLogInspector\Tool\LogSearchTool;
use Hakam\AiLogInspector\Store\VectorLogDocumentStore;
use Hakam\AiLogInspector\Retriever\LogRetriever;
use Symfony\AI\Store\Bridge\Local\InMemoryStore;
echo "š¤ Testing AI Log Inspector Agent Installation\n\n";
try {
// Create platform
$platform = LogDocumentPlatformFactory::create([
'provider' => 'openai', // or 'anthropic', 'ollama'
'api_key' => $_ENV['OPENAI_API_KEY'],
'model' => ['name' => 'gpt-4o-mini']
]);
echo "ā
Platform created successfully\n";
// Create vector store
$store = new VectorLogDocumentStore(new InMemoryStore());
echo "ā
Vector store created successfully\n";
// Create retriever
$retriever = new LogRetriever(
embeddingPlatform: $platform->getPlatform(),
model: 'text-embedding-3-small',
logStore: $store
);
echo "ā
Retriever created successfully\n";
// Create tool
$tool = new LogSearchTool($store, $retriever, $platform);
echo "ā
Tool created successfully\n";
// Create agent
$agent = new LogInspectorAgent($platform, [$tool]);
echo "ā
Agent created successfully\n";
echo "\nš Installation verified! You're ready to use the AI Log Inspector Agent.\n";
} catch (\Exception $e) {
echo "ā Error: " . $e->getMessage() . "\n";
echo "\nStack trace:\n" . $e->getTraceAsString() . "\n";
exit(1);
}
Run the test:
php test-installation.php
Expected output:
🤖 Testing AI Log Inspector Agent Installation

✅ Platform created successfully
✅ Vector store created successfully
✅ Retriever created successfully
✅ Tool created successfully
✅ Agent created successfully

🎉 Installation verified! You're ready to use the AI Log Inspector Agent.
Environment Configuration
Create a .env file in your project root:
# Choose your AI platform
OPENAI_API_KEY=sk-your-key-here
# OR
ANTHROPIC_API_KEY=your-key-here
# OR
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2:1b
# Vector store (if using external store)
CHROMA_URL=http://localhost:8000
CHROMA_COLLECTION=app_logs
# OR
PINECONE_API_KEY=your-pinecone-key
PINECONE_ENVIRONMENT=us-west1-gcp
PINECONE_INDEX=log-vectors
Load it in your application:
use Symfony\Component\Dotenv\Dotenv;
$dotenv = new Dotenv();
$dotenv->load(__DIR__.'/.env');
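With the variables loaded, reuse them instead of hard-coding values. For example, the Chroma configuration from earlier can be driven by .env (same options as the snippet above):
use Symfony\AI\Store\Bridge\Chroma\ChromaStore;

// Same options as the Chroma example above, now read from .env.
$store = new ChromaStore([
    'url' => $_ENV['CHROMA_URL'],
    'collection' => $_ENV['CHROMA_COLLECTION'],
]);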
Troubleshooting
OpenAI API Key Invalid
Error: Invalid API key
Solution: Verify your API key at https://platform.openai.com/api-keys
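If the key looks correct in the dashboard, confirm it actually reaches PHP before digging further; a quick check:
// A value of false or an empty string means the key never reached this process.
var_dump($_ENV['OPENAI_API_KEY'] ?? getenv('OPENAI_API_KEY'));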
Ollama Connection Failed
Error: Connection refused to localhost:11434
Solution: Start Ollama service:
ollama serve
Memory Limit Exceeded
Fatal error: Allowed memory size exhausted
Solution: Increase PHP memory limit:
php -d memory_limit=512M your-script.php
Or in php.ini:
memory_limit = 512M
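If neither the CLI flag nor php.ini is an option, the limit can also be raised from inside the script itself (standard PHP, not specific to this package):
// Raise the memory limit for this process only.
ini_set('memory_limit', '512M');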
Composer Conflicts
Error: symfony/ai-agent conflicts with...
Solution: Update all Symfony AI packages:
composer update symfony/ai-* --with-all-dependencies
Next Steps
Now that installation is complete:
- Quick Start Guide: Build your first log inspector
- Configuration: Advanced configuration options
- Core Concepts: Understand the architecture