S3 Workflow Storage Implementation
This document describes the S3-compatible storage implementation for workflow YAML files with built-in versioning.
Overview
The AI Workflow Designer now supports S3-compatible storage for workflow YAML files with the following features:
- S3 Built-in Versioning: Uses S3's native versioning capabilities
- Company-based Organization: Files organized by company ID
- Prompt Storage: Stores the original prompt used to generate each workflow
- Version Comments: Optional comments for each version
- Local Storage Fallback: Works without S3 for development
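The local fallback means application code can target a single storage interface regardless of backend. A minimal sketch of how backend selection might work (hypothetical helper; the actual s3_storage module may choose differently):

```python
import os

def select_storage_backend() -> str:
    """Pick a storage backend from the STORAGE_TYPE env var.

    Hypothetical helper illustrating the fallback behavior:
    unset or unknown values fall back to local storage for development.
    """
    backend = os.environ.get("STORAGE_TYPE", "local").lower()
    if backend not in {"local", "s3", "gcs"}:
        backend = "local"  # safe fallback for development
    return backend
```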
Architecture
Storage Structure
```
S3 Bucket: workflow-files/
├── companies/
│   ├── 999/                          # Company ID
│   │   └── workflows/
│   │       ├── 1/                    # Workflow ID
│   │       │   └── workflow.yaml     # Latest version
│   │       └── 2/
│   │           └── workflow.yaml
│   └── 1000/
│       └── workflows/
│           └── 1/
│               └── workflow.yaml
```

Database Schema
The approval_workflows table has been extended with:
```sql
ALTER TABLE approval_workflows
ADD COLUMN s3_file_path VARCHAR(500),  -- S3 key for latest version
ADD COLUMN prompt TEXT,                -- Original prompt
ADD COLUMN last_saved_at TIMESTAMPTZ;  -- Last save timestamp
```

Note: The employees table does not have a created_by field. Instead, employee creation and updates are logged in the admin_activity_log table using the admin_user_id field to track who performed the action.
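The s3_file_path column holds the object key for the latest version, following the company/workflow layout shown above. A sketch of that key convention (hypothetical helper, not part of the actual codebase):

```python
def workflow_key(company_id: int, workflow_id: int) -> str:
    """Build the S3 object key for a workflow's latest YAML file.

    Hypothetical helper illustrating the documented layout:
    companies/{company_id}/workflows/{workflow_id}/workflow.yaml
    """
    return f"companies/{company_id}/workflows/{workflow_id}/workflow.yaml"
```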
Configuration
Environment Variables
Add these to your .env file:
```bash
# S3 Storage Configuration
STORAGE_TYPE="local"                     # local, s3, gcs
S3_ENDPOINT_URL="http://127.0.0.1:9000"  # MinIO endpoint
S3_ACCESS_KEY_ID="minioadmin"
S3_SECRET_ACCESS_KEY="minioadmin"
S3_BUCKET_NAME="workflow-files"
S3_REGION="us-east-1"
S3_USE_SSL=false
LOCAL_STORAGE_PATH="./workflow_files"
```

Development Setup (MinIO)
1. Start MinIO:

   ```bash
   docker run -p 9000:9000 -p 9001:9001 minio/minio server /data --console-address ':9001'
   ```

2. Access the MinIO console at http://localhost:9001
   - Username: minioadmin
   - Password: minioadmin

3. Create the bucket workflow-files
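As an alternative to the console, the bucket can be created and versioning enabled with the MinIO client mc (the alias name local below is arbitrary):

```shell
# Point mc at the local MinIO instance
mc alias set local http://127.0.0.1:9000 minioadmin minioadmin

# Create the bucket and enable versioning (required for version history)
mc mb local/workflow-files
mc version enable local/workflow-files
```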
Production Setup (Google Cloud Storage)
```bash
STORAGE_TYPE="gcs"
S3_ENDPOINT_URL="https://storage.googleapis.com"
S3_ACCESS_KEY_ID="your-gcs-access-key"
S3_SECRET_ACCESS_KEY="your-gcs-secret-key"
S3_BUCKET_NAME="your-bucket-name"
S3_REGION="us-central1"
S3_USE_SSL=true
```

API Endpoints
Workflow Templates
```
GET /services/v1/admin/workflow-templates
GET /services/v1/admin/workflow-templates/{template_id}
```

Workflow Versioning
POST /services/v1/admin/workflows/save-version

```json
{
  "workflow_id": 1,
  "yaml_content": "...",
  "prompt": "Create a purchase request workflow...",
  "version_comment": "Initial version"
}
```
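Per the overview, version_comment is optional; assuming only workflow_id and yaml_content are strictly required, a client-side sketch of validating the payload before POSTing (hypothetical helper; the server's actual validation rules may differ):

```python
# Assumed minimal required fields; the real API may require more.
REQUIRED_FIELDS = {"workflow_id", "yaml_content"}

def validate_save_payload(payload: dict) -> list[str]:
    """Return a sorted list of missing required fields (empty when valid).

    Hypothetical client-side check for the save-version payload.
    """
    return sorted(REQUIRED_FIELDS - payload.keys())
```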
```
GET /services/v1/admin/workflows/{workflow_id}/versions
GET /services/v1/admin/workflows/{workflow_id}/versions/{version_id}
```

Usage Examples
1. Save Workflow Version
```python
from app.services.s3_storage import s3_storage

# Save workflow with versioning
version_info = await s3_storage.save_workflow_yaml(
    company_id=999,
    workflow_id=1,
    yaml_content=yaml_content,
    prompt="Create a purchase request workflow...",
    version_comment="Initial version"
)

print(f"Saved version: {version_info['version_id']}")
```

2. Get Workflow Content
```python
# Get latest version
workflow_data = await s3_storage.get_workflow_yaml(
    company_id=999,
    workflow_id=1
)

# Get specific version
workflow_data = await s3_storage.get_workflow_yaml(
    company_id=999,
    workflow_id=1,
    version_id="abc123"
)
```

3. List Versions
```python
versions = await s3_storage.list_workflow_versions(
    company_id=999,
    workflow_id=1
)

for version in versions:
    print(f"Version {version['version_id']}: {version['size']} bytes")
    if version['is_latest']:
        print("  ⭐ Latest version")
```

Frontend Integration
Quick Templates
The AI Workflow Designer includes quick templates that populate both the prompt and YAML:
```javascript
// Load template
const template = await fetch('/services/v1/admin/workflow-templates/purchase_request');
const { prompt, yaml } = await template.json();

setPrompt(prompt);
setGeneratedYaml(yaml);
```

Save Workflow
```javascript
// Save workflow with versioning
const saveData = {
  workflow_id: workflowId,
  yaml_content: generatedYaml,
  prompt: prompt,
  version_comment: "Updated workflow"
};

const response = await fetch('/services/v1/admin/workflows/save-version', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(saveData)
});
```

Migration
Run Database Migration
```bash
# Make script executable
chmod +x run_s3_migration.sh

# Run migration
./run_s3_migration.sh
```

Test S3 Storage
```bash
# Test S3 functionality
python test_s3_storage.py
```

Benefits
- Built-in Versioning: S3 handles versioning automatically
- Scalability: No database bloat from storing large YAML files
- Backup: S3 provides built-in redundancy and backup
- Cost-effective: Pay only for storage used
- Flexibility: Works with any S3-compatible storage (MinIO, GCS, AWS S3)
- Proper Activity Logging: All employee and workflow changes are logged in admin_activity_log table
Troubleshooting
Common Issues
- MinIO Connection Failed
  - Ensure MinIO is running: docker ps
  - Check the endpoint URL: http://127.0.0.1:9000
  - Verify credentials: minioadmin/minioadmin
- Bucket Not Found
  - Create the bucket in the MinIO console
  - Ensure the bucket name matches S3_BUCKET_NAME
- Permission Denied
  - Check S3 credentials
  - Verify bucket permissions
  - Ensure bucket versioning is enabled
Debug Mode
Enable debug logging by setting DEBUG=true in your environment.
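A sketch of how the DEBUG flag might be wired into Python logging (hypothetical; the application's actual logging setup may differ):

```python
import logging
import os

def configure_logging() -> int:
    """Set the root log level from the DEBUG env var.

    Hypothetical wiring for the DEBUG=true flag described above.
    Returns the log level that was applied.
    """
    debug = os.environ.get("DEBUG", "false").lower() == "true"
    level = logging.DEBUG if debug else logging.INFO
    logging.basicConfig(level=level)
    return level
```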
Future Enhancements
- Version comparison UI
- Rollback functionality
- Version tagging
- Bulk version operations
- Version analytics