feat: Remove Ollama integration and update documentation for Fabric setup

Peter Wood
2025-06-27 07:09:16 -04:00
parent 3b284c769c
commit 78dc25192f
7 changed files with 35 additions and 597 deletions


@@ -7,7 +7,7 @@ This repository contains various shell scripts for managing media-related tasks
 - **[Backup Scripts](#backup-scripts)** - Enterprise-grade backup solutions
 - **[Management Scripts](#management-scripts)** - System and service management
 - **[Security](#security)** - Comprehensive security framework and standards
-- **[AI Integration](#ai-integration)** - Ollama and Fabric setup for AI-assisted development
+- **[AI Integration](#ai-integration)** - Fabric setup for AI-assisted development
 - **[Tab Completion](#tab-completion)** - Intelligent command-line completion
 - **[Documentation](#comprehensive-documentation)** - Complete guides and references
 - **[Testing](#testing)** - Docker-based validation framework
@@ -74,17 +74,16 @@ For security-related changes, refer to the security documentation and follow the
 ## AI Integration
-This repository includes a complete AI development environment with Ollama and Fabric integration for AI-assisted development tasks.
+This repository includes a complete AI development environment with Fabric integration for AI-assisted development tasks.
-### Ollama + Fabric Setup
+### Fabric Setup
 The system includes:
-- **Ollama Docker container** running on port 11434 with phi3:mini model (3.8B parameters)
 - **Fabric v1.4.195** with 216+ AI patterns for text processing
-- **Google Gemini 2.5 Pro** as primary AI provider
-- **Local Ollama models** as secondary AI provider
-- **Custom shell aliases** for easy container management
+- **External AI providers** support for flexibility
+- **Custom shell configuration** for optimal development experience
 ### Basic Fabric Usage
@@ -92,29 +91,16 @@ The system includes:
 # List all available patterns
 fabric -l
-# Use a pattern (uses Gemini 2.5 Pro by default)
+# Use a pattern (configure your preferred AI provider)
 echo "Your text here" | fabric -p summarize
-# Use with specific model
-echo "Your text here" | fabric -p summarize -m gemini-2.0-flash-exp
-# Use local Ollama models
-echo "Your text here" | fabric -p summarize -m ollama:phi3:mini
 # Update patterns
 fabric -U
 ```
-### Ollama Management Aliases
-```bash
-ollama-start # Start Ollama container
-ollama-stop # Stop Ollama container
-ollama-restart # Restart Ollama container
-ollama-logs # View container logs
-ollama-status # Check container status
-```
 ### Popular AI Patterns
 - `summarize` - Summarize text content
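The usage hunk above assumes `fabric` is already installed and that a provider is configured. As an illustrative sketch (not part of this commit), a preflight check can catch a missing binary or missing config before text is piped through a pattern; the `~/.config/fabric/.env` path is the one the Configuration Files section references:

```shell
# Illustrative preflight check (not part of this commit): verify the
# fabric binary and its config file exist before running patterns.
fabric_preflight() {
    if command -v fabric >/dev/null 2>&1; then
        echo "fabric: installed"
    else
        echo "fabric: not found in PATH"
    fi
    if [ -f "$HOME/.config/fabric/.env" ]; then
        echo "config: present"
    else
        echo "config: missing"
    fi
}
fabric_preflight
```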
@@ -127,7 +113,6 @@ ollama-status # Check container status
 ### Configuration Files
 - **Fabric config**: `~/.config/fabric/.env` - AI provider settings and API keys
-- **Ollama aliases**: `~/.oh-my-zsh/custom/ollama-aliases.zsh` - Container management commands
 - **Shell config**: `~/.zshrc` - Main shell configuration
 For complete setup instructions, see the setup documentation.
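For illustration only, a provider-agnostic `~/.config/fabric/.env` might look like the sketch below. The variable names are assumptions, not taken from this repository; confirm the exact keys against the setup output of your Fabric version:

```bash
# Illustrative ~/.config/fabric/.env sketch -- variable names are assumptions,
# not verbatim from this repo; run Fabric's setup to generate the real file.
DEFAULT_MODEL=your-preferred-model   # any model your configured provider offers
GEMINI_API_KEY=your-api-key-here     # placeholder; never commit real keys
```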
@@ -442,19 +427,16 @@ This installs:
 For AI-assisted development, the system includes:
-- **Ollama** running in Docker with local models
 - **Fabric** with 216+ AI patterns for text processing
-- **Google Gemini integration** as primary AI provider
-- **Custom aliases** for easy management
+- **External AI provider support** for flexibility
+- **Custom configuration** for easy management
 Test the AI setup:
 ```bash
-# Test Gemini integration
+# Test Fabric integration
 echo "Test text" | fabric -p summarize
-# Test local Ollama integration
-echo "Test text" | fabric -p summarize -m ollama:phi3:mini
 ```
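The smoke test in the hunk above relies on eyeballing the model's output. As a minimal sketch (not part of this commit; assumes `fabric` is on PATH, and prints FAIL otherwise), the same check can report an explicit result instead:

```shell
# Illustrative wrapper around the smoke test above (not part of this commit):
# reports PASS/FAIL instead of relying on visible model output.
fabric_smoke_test() {
    if echo "Test text" | fabric -p summarize >/dev/null 2>&1; then
        echo "fabric smoke test: PASS"
    else
        echo "fabric smoke test: FAIL"
    fi
}
fabric_smoke_test
```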
## Dotfiles