diff --git a/README.md b/README.md
index 4bb7209..8a54091 100644
--- a/README.md
+++ b/README.md
@@ -7,7 +7,7 @@ This repository contains various shell scripts for managing media-related tasks
 - **[Backup Scripts](#backup-scripts)** - Enterprise-grade backup solutions
 - **[Management Scripts](#management-scripts)** - System and service management
 - **[Security](#security)** - Comprehensive security framework and standards
-- **[AI Integration](#ai-integration)** - Ollama and Fabric setup for AI-assisted development
+- **[AI Integration](#ai-integration)** - Fabric setup for AI-assisted development
 - **[Tab Completion](#tab-completion)** - Intelligent command-line completion
 - **[Documentation](#comprehensive-documentation)** - Complete guides and references
 - **[Testing](#testing)** - Docker-based validation framework
@@ -74,17 +74,16 @@ For security-related changes, refer to the security documentation and follow the
 
 ## AI Integration
 
-This repository includes a complete AI development environment with Ollama and Fabric integration for AI-assisted development tasks.
+This repository includes a complete AI development environment with Fabric integration for AI-assisted development tasks.
 
-### Ollama + Fabric Setup
+### Fabric Setup
 
 The system includes:
 
-- **Ollama Docker container** running on port 11434 with phi3:mini model (3.8B parameters)
 - **Fabric v1.4.195** with 216+ AI patterns for text processing
 - **Google Gemini 2.5 Pro** as primary AI provider
-- **Local Ollama models** as secondary AI provider
-- **Custom shell aliases** for easy container management
+- **External AI provider support** for flexibility
+- **Custom shell configuration** for an optimal development experience
 
 ### Basic Fabric Usage
 
@@ -92,29 +91,16 @@ The system includes:
 # List all available patterns
 fabric -l
 
-# Use a pattern (uses Gemini 2.5 Pro by default)
+# Use a pattern (configure your preferred AI provider)
 echo "Your text here" | fabric -p summarize
 
 # Use with specific model
 echo "Your text here" | fabric -p summarize -m gemini-2.0-flash-exp
 
-# Use local Ollama models
-echo "Your text here" | fabric -p summarize -m ollama:phi3:mini
-
 # Update patterns
 fabric -U
 ```
 
-### Ollama Management Aliases
-
-```bash
-ollama-start    # Start Ollama container
-ollama-stop     # Stop Ollama container
-ollama-restart  # Restart Ollama container
-ollama-logs     # View container logs
-ollama-status   # Check container status
-```
-
 ### Popular AI Patterns
 
 - `summarize` - Summarize text content
@@ -127,7 +113,6 @@ ollama-status   # Check container status
 ### Configuration Files
 
 - **Fabric config**: `~/.config/fabric/.env` - AI provider settings and API keys
-- **Ollama aliases**: `~/.oh-my-zsh/custom/ollama-aliases.zsh` - Container management commands
 - **Shell config**: `~/.zshrc` - Main shell configuration
 
 For complete setup instructions, see the setup documentation.
@@ -442,19 +427,16 @@ This installs:
 
 For AI-assisted development, the system includes:
 
-- **Ollama** running in Docker with local models
 - **Fabric** with 216+ AI patterns for text processing
 - **Google Gemini integration** as primary AI provider
-- **Custom aliases** for easy management
+- **External AI provider support** for flexibility
+- **Custom configuration** for easy management
 
 Test the AI setup:
 
 ```bash
-# Test Gemini integration
+# Test Fabric integration
 echo "Test text" | fabric -p summarize
-
-# Test local Ollama integration
-echo "Test text" | fabric -p summarize -m ollama:phi3:mini
 ```
 
 ## Dotfiles
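For reference, the `~/.config/fabric/.env` mentioned under Configuration Files is a plain key=value file. A minimal sketch for the Gemini-as-primary setup the README describes; the key names follow the fallback template in the deleted `setup-no-ollama.sh` later in this diff, so treat them as illustrative rather than authoritative:

```bash
# Illustrative: write a minimal ~/.config/fabric/.env for Gemini as the primary provider.
# Key names taken from the fallback template in the deleted setup-no-ollama.sh below.
mkdir -p ~/.config/fabric
cat > ~/.config/fabric/.env << 'EOF'
GOOGLE_API_KEY=your_google_api_key_here
DEFAULT_MODEL=gemini-2.0-flash-exp
EOF
chmod 600 ~/.config/fabric/.env   # the file holds API keys, so keep it user-only
```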
diff --git a/TABLE_OF_CONTENTS.md b/TABLE_OF_CONTENTS.md
index 98864e8..0065466 100644
--- a/TABLE_OF_CONTENTS.md
+++ b/TABLE_OF_CONTENTS.md
@@ -9,7 +9,7 @@ This is a comprehensive index of all documentation available in this shell scrip
 
 ## 🚀 Setup & Configuration
 
-- [**Setup Complete Guide**](./setup/SETUP_COMPLETE.md) - Complete setup documentation for Ollama + Fabric integration
+- [**Setup Complete Guide**](./setup/SETUP_COMPLETE.md) - Complete setup documentation for Fabric integration
 
 ## 🛠️ Component Documentation
 
diff --git a/setup/packages.list b/setup/packages.list
index df8a2e8..c3854d5 100644
--- a/setup/packages.list
+++ b/setup/packages.list
@@ -17,9 +17,8 @@ nala // Modern apt frontend
 fd-find // Modern find alternative (available as 'fd' or 'fdfind')
 eza // Modern ls alternative
-// Note: lazygit, lazydocker, fabric, and ollama require special installation (GitHub releases/scripts)
+// Note: lazygit, lazydocker, and fabric require special installation (GitHub releases/scripts)
 // These are handled separately in the setup script
 // lazygit
 // lazydocker
-fabric
-ollama
\ No newline at end of file
+fabric
\ No newline at end of file
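Since `packages.list` mixes bare package names with `//` comments (both full-line and inline), it cannot be fed straight to a package manager. A minimal sketch of flattening it to plain names; the real parsing lives in `setup/setup.sh` and may differ:

```bash
# Illustrative only: strip // comments (full-line and inline), drop blanks,
# and collect the remaining package names into an array.
mapfile -t pkgs < <(sed -e 's|//.*||' -e 's/[[:space:]]*$//' setup/packages.list | awk 'NF')
printf 'would install: %s\n' "${pkgs[@]}"
```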
diff --git a/setup/setup-no-ollama.sh b/setup/setup-no-ollama.sh
deleted file mode 100755
index 6bfc80c..0000000
--- a/setup/setup-no-ollama.sh
+++ /dev/null
@@ -1,97 +0,0 @@
-#!/bin/bash
-
-# Convenience script to run setup without Ollama installation
-# This script sets SKIP_OLLAMA=true and runs the main setup script
-#
-# Usage: ./setup-no-ollama.sh [setup script options]
-# Author: acedanger
-# Description: Runs setup while skipping Ollama and configures Fabric for external AI providers
-
-set -e
-
-# Define colors for output
-GREEN='\033[0;32m'
-YELLOW='\033[0;33m'
-BLUE='\033[0;34m'
-RED='\033[0;31m'
-NC='\033[0m' # No Color
-
-echo -e "${GREEN}=== Shell Setup (Without Ollama) ===${NC}"
-echo -e "${YELLOW}This will install all packages and configurations except Ollama Docker setup${NC}"
-echo -e "${YELLOW}Fabric will be installed but configured for external AI providers${NC}"
-
-# Get the directory of this script
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-
-# Set SKIP_OLLAMA environment variable and run setup
-export SKIP_OLLAMA=true
-
-echo -e "\n${YELLOW}Running setup with SKIP_OLLAMA=true...${NC}"
-
-# Run the main setup script
-"$SCRIPT_DIR/setup.sh" "$@"
-
-# Configure Fabric after main setup completes
-echo -e "\n${BLUE}Configuring Fabric with external AI providers...${NC}"
-
-# Create Fabric config directory if it doesn't exist
-mkdir -p ~/.config/fabric
-
-# Download the pre-configured .env file from git repository
-echo -e "${YELLOW}Downloading Fabric .env configuration from git repository...${NC}"
-GIST_URL="https://git.ptrwd.com/peterwood/config/raw/branch/main/fabric/.env"
-
-if curl -s "$GIST_URL" -o ~/.config/fabric/.env; then
-    chmod 600 ~/.config/fabric/.env
-    echo -e "${GREEN}✓ Fabric .env file configured successfully${NC}"
-
-    # Verify the file was downloaded correctly
-    if [ -s ~/.config/fabric/.env ]; then
-        FILE_SIZE=$(stat -c%s ~/.config/fabric/.env 2>/dev/null || echo "unknown")
-        echo -e "${GREEN}✓ Configuration file downloaded: ${FILE_SIZE} bytes${NC}"
-    else
-        echo -e "${RED}⚠ Downloaded file appears to be empty${NC}"
-    fi
-else
-    echo -e "${RED}⚠ Could not download .env file from git repository. Creating basic template...${NC}"
-
-    # Create a basic .env template as fallback
-    cat > ~/.config/fabric/.env << 'EOF'
-# Fabric AI Provider Configuration
-# Add your API keys below and uncomment the lines you want to use
-
-# OpenAI Configuration
-#OPENAI_API_KEY=your_openai_api_key_here
-#OPENAI_API_BASE_URL=https://api.openai.com/v1
-
-# Anthropic Configuration
-#ANTHROPIC_API_KEY=your_anthropic_api_key_here
-
-# Google Gemini Configuration
-#GOOGLE_API_KEY=your_google_api_key_here
-
-# Groq Configuration
-#GROQ_API_KEY=your_groq_api_key_here
-
-# Set your preferred default model
-DEFAULT_MODEL=gpt-4o-mini
-
-# For complete provider list, see:
-# https://git.ptrwd.com/peterwood/config/raw/branch/main/fabric/.env
-EOF
-    chmod 600 ~/.config/fabric/.env
-    echo -e "${YELLOW}✓ Basic .env template created${NC}"
-fi
-
-echo -e "\n${GREEN}=== Setup completed without Ollama ===${NC}"
-echo -e "${BLUE}Next steps for Fabric configuration:${NC}"
-echo -e "${YELLOW}1. Edit ~/.config/fabric/.env and add your API keys${NC}"
-echo -e "${YELLOW}2. Uncomment your preferred AI provider section${NC}"
-echo -e "${YELLOW}3. Set DEFAULT_MODEL to your preferred model${NC}"
-echo -e "${YELLOW}4. Test configuration with: fabric --list-patterns${NC}"
-echo -e "\n${BLUE}Supported AI providers:${NC}"
-echo -e "${YELLOW}- OpenAI (GPT-4, GPT-4o, GPT-3.5-turbo)${NC}"
-echo -e "${YELLOW}- Anthropic (Claude-3.5-sonnet, Claude-3-haiku)${NC}"
-echo -e "${YELLOW}- Google (Gemini-pro, Gemini-1.5-pro)${NC}"
-echo -e "${YELLOW}- Groq (Fast inference with Llama, Mixtral)${NC}"
-echo -e "${YELLOW}- And many more providers...${NC}"
\ No newline at end of file
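One behavioral note on the download-with-fallback logic deleted above, in case the pattern is reused elsewhere: plain `curl -s URL -o file` exits 0 even when the server returns an error page, so the template fallback never runs on a 404. Adding `-f/--fail` makes curl's exit status track the HTTP status; a sketch under that assumption:

```bash
# -f/--fail makes curl return non-zero on HTTP errors (e.g. 404), so the
# fallback branch actually triggers; -sS stays quiet but still prints errors.
if curl -fsS "$GIST_URL" -o ~/.config/fabric/.env; then
    chmod 600 ~/.config/fabric/.env
else
    echo "Download failed; writing fallback template" >&2
fi
```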
diff --git a/setup/setup.sh b/setup/setup.sh
index c936567..e38f47c 100755
--- a/setup/setup.sh
+++ b/setup/setup.sh
@@ -185,11 +185,7 @@ for pkg in "${pkgs[@]}"; do
         continue
     fi
 
-    # Handle ollama Docker installation
-    if [ "$pkg" = "ollama" ]; then
-        special_installs+=("$pkg")
-        continue
-    fi
+
     # Handle lazygit - available in COPR for Fedora, special install for Debian/Ubuntu
     if [ "$pkg" = "lazygit" ] && [ "$OS_NAME" != "fedora" ]; then
@@ -289,66 +285,39 @@ for pkg in "${special_installs[@]}"; do
             # Download and install the latest Fabric binary for Linux AMD64
             curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-linux-amd64 -o /tmp/fabric
             chmod +x /tmp/fabric
             sudo mv /tmp/fabric /usr/local/bin/fabric
             echo -e "${GREEN}Fabric binary installed successfully!${NC}"
 
-                # Verify installation
-                if fabric --version; then
-                    echo -e "${GREEN}Fabric installation verified!${NC}"
-                    echo -e "${YELLOW}Running Fabric setup...${NC}"
+            # Verify installation
+            if fabric --version; then
+                echo -e "${GREEN}Fabric installation verified!${NC}"
+                echo -e "${YELLOW}Running Fabric setup...${NC}"
 
-                    # Create fabric config directory
-                    mkdir -p "$HOME/.config/fabric"
+                # Create fabric config directory
+                mkdir -p "$HOME/.config/fabric"
 
-                    # Run fabric setup with proper configuration
-                    echo -e "${YELLOW}Setting up Fabric patterns and configuration...${NC}"
+                # Run fabric setup with proper configuration
+                echo -e "${YELLOW}Setting up Fabric patterns and configuration...${NC}"
 
-                    # Initialize fabric with default patterns
-                    fabric --setup || echo -e "${YELLOW}Initial fabric setup completed${NC}"
+                # Initialize fabric with default patterns
+                fabric --setup || echo -e "${YELLOW}Initial fabric setup completed${NC}"
 
-                    # Update patterns to get the latest
-                    echo -e "${YELLOW}Updating Fabric patterns...${NC}"
-                    fabric --updatepatterns || echo -e "${YELLOW}Pattern update completed${NC}"
+                # Update patterns to get the latest
+                echo -e "${YELLOW}Updating Fabric patterns...${NC}"
+                fabric --updatepatterns || echo -e "${YELLOW}Pattern update completed${NC}"
 
-                    # Configure Ollama as the default model provider
-                    echo -e "${YELLOW}Configuring Fabric to use Ollama...${NC}"
-
-                    # Create or update fabric config to use Ollama
-                    cat > "$HOME/.config/fabric/.env" << 'FABRIC_EOF'
-# Fabric Configuration for Ollama
-DEFAULT_MODEL=phi3:mini
-OLLAMA_API_BASE=http://localhost:11434
-FABRIC_EOF
-
-                    echo -e "${GREEN}Fabric setup completed successfully!${NC}"
-                    echo -e "${YELLOW}Fabric is configured to use Ollama at http://localhost:11434${NC}"
-                    echo -e "${YELLOW}Default model: phi3:mini${NC}"
-                    echo -e "${YELLOW}You can test fabric with: fabric --list-patterns${NC}"
-                else
-                    echo -e "${RED}Fabric installation verification failed${NC}"
-                fi
+                echo -e "${GREEN}Fabric setup completed successfully!${NC}"
+                echo -e "${YELLOW}You can test fabric with: fabric --list-patterns${NC}"
+            else
+                echo -e "${RED}Fabric installation verification failed${NC}"
+            fi
         else
             echo -e "${GREEN}Fabric is already installed${NC}"
-            # Still try to update patterns and ensure Ollama configuration
+            # Still try to update patterns
            echo -e "${YELLOW}Updating Fabric patterns...${NC}"
            fabric --updatepatterns || echo -e "${YELLOW}Pattern update completed${NC}"
-
-            # Ensure Ollama configuration exists
-            mkdir -p "$HOME/.config/fabric"
-            if [ ! -f "$HOME/.config/fabric/.env" ]; then
-                echo -e "${YELLOW}Creating Ollama configuration for existing Fabric installation...${NC}"
-                cat > "$HOME/.config/fabric/.env" << 'FABRIC_EOF'
-# Fabric Configuration for Ollama
-DEFAULT_MODEL=phi3:mini
-OLLAMA_API_BASE=http://localhost:11434
-FABRIC_EOF
-            fi
         fi
         ;;
-    "ollama")
-        # Ollama installation is handled in the main Ollama Docker setup section below
-        echo -e "${YELLOW}Ollama Docker installation will be handled in dedicated section...${NC}"
-        ;;
     "lazygit")
         if ! command -v lazygit &> /dev/null; then
             echo -e "${YELLOW}Installing Lazygit from GitHub releases...${NC}"
@@ -369,159 +337,6 @@ FABRIC_EOF
 esac
 done
 
-# Setup Ollama with Docker for local AI (required for Fabric)
-if [ "${SKIP_OLLAMA:-false}" = "true" ]; then
-    echo -e "${YELLOW}Skipping Ollama installation (SKIP_OLLAMA=true)${NC}"
-else
-# Setup Ollama with Docker for local AI (required for Fabric)
-echo -e "${YELLOW}Setting up Ollama with Docker for local AI support...${NC}"
-
-# Check if user can run docker commands without sudo
-if docker ps >/dev/null 2>&1; then
-    DOCKER_CMD="docker"
-    echo -e "${GREEN}Docker access confirmed without sudo${NC}"
-else
-    echo -e "${YELLOW}Docker requires sudo access (group membership may need session refresh)${NC}"
-    DOCKER_CMD="sudo docker"
-fi
-
-# Check if Ollama Docker container is already running
-if ! $DOCKER_CMD ps | grep -q ollama; then
-    echo -e "${YELLOW}Setting up Ollama Docker container...${NC}"
-
-    # Pull the Ollama Docker image
-    $DOCKER_CMD pull ollama/ollama:latest
-
-    # Create a Docker volume for Ollama data
-    $DOCKER_CMD volume create ollama-data 2>/dev/null || true
-
-    # Remove any existing ollama container
-    $DOCKER_CMD rm -f ollama 2>/dev/null || true
-
-    # Start Ollama container with GPU support (if available) or CPU-only
-    if command -v nvidia-docker &> /dev/null || $DOCKER_CMD info 2>/dev/null | grep -q nvidia; then
-        echo -e "${YELLOW}Starting Ollama with GPU support...${NC}"
-        $DOCKER_CMD run -d \
-            --name ollama \
-            --restart unless-stopped \
-            --gpus all \
-            -v ollama-data:/root/.ollama \
-            -p 11434:11434 \
-            ollama/ollama
-    else
-        echo -e "${YELLOW}Starting Ollama in CPU-only mode...${NC}"
-        $DOCKER_CMD run -d \
-            --name ollama \
-            --restart unless-stopped \
-            -v ollama-data:/root/.ollama \
-            -p 11434:11434 \
-            ollama/ollama
-    fi
-
-    # Wait for the container to be ready
-    echo -e "${YELLOW}Waiting for Ollama to start...${NC}"
-    sleep 10
-
-    # Install a lightweight model for basic functionality
-    echo -e "${YELLOW}Installing a basic AI model (phi3:mini)...${NC}"
-    $DOCKER_CMD exec ollama ollama pull phi3:mini
-
-    echo -e "${GREEN}Ollama Docker setup completed with phi3:mini model!${NC}"
-    echo -e "${YELLOW}Ollama is accessible at http://localhost:11434${NC}"
-else
-    echo -e "${GREEN}Ollama Docker container is already running${NC}"
-fi
-
-# Add helper aliases for Ollama Docker management
-OLLAMA_ALIASES_FILE="$HOME/.oh-my-zsh/custom/ollama-aliases.zsh"
-echo -e "${YELLOW}Setting up Ollama Docker aliases...${NC}"
-cat > "$OLLAMA_ALIASES_FILE" << 'EOF'
-# Ollama Docker Management Aliases
-alias ollama-start='docker start ollama'
-alias ollama-stop='docker stop ollama'
-alias ollama-restart='docker restart ollama'
-alias ollama-logs='docker logs -f ollama'
-alias ollama-shell='docker exec -it ollama /bin/bash'
-alias ollama-pull='docker exec ollama ollama pull'
-alias ollama-list='docker exec ollama ollama list'
-alias ollama-run='docker exec ollama ollama run'
-alias ollama-status='docker ps | grep ollama'
-
-# Function to run ollama commands in Docker
-ollama() {
-    if [ "$1" = "serve" ]; then
-        echo "Ollama is running in Docker. Use 'ollama-start' to start the container."
-        return 0
-    fi
-
-    # Check if user can run docker without sudo
-    if docker ps >/dev/null 2>&1; then
-        docker exec ollama ollama "$@"
-    else
-        sudo docker exec ollama ollama "$@"
-    fi
-}
-EOF
-
-echo -e "${GREEN}Ollama Docker aliases created in $OLLAMA_ALIASES_FILE${NC}"
-echo -e "${YELLOW}You can install additional models with: ollama pull <model>${NC}"
-
-# Function to finalize Fabric configuration after Ollama is running
-configure_fabric_for_ollama() {
-    echo -e "${YELLOW}Finalizing Fabric configuration for Ollama...${NC}"
-
-    # Ensure Ollama is accessible before configuring Fabric
-    local max_attempts=30
-    local attempt=0
-
-    while [ $attempt -lt $max_attempts ]; do
-        if curl -s http://localhost:11434/api/tags >/dev/null 2>&1; then
-            echo -e "${GREEN}Ollama API is accessible, configuring Fabric...${NC}"
-            break
-        fi
-        echo -e "${YELLOW}Waiting for Ollama to be ready... (attempt $((attempt + 1))/$max_attempts)${NC}"
-        sleep 2
-        attempt=$((attempt + 1))
-    done
-
-    if [ $attempt -eq $max_attempts ]; then
-        echo -e "${YELLOW}Warning: Ollama API not accessible, Fabric configuration may need manual setup${NC}"
-        return
-    fi
-
-    # Create a comprehensive Fabric configuration
-    mkdir -p "$HOME/.config/fabric"
-
-    # Create the main configuration file
-    cat > "$HOME/.config/fabric/config.yaml" << 'FABRIC_CONFIG_EOF'
-# Fabric Configuration for Ollama Integration
-model:
-  default: "phi3:mini"
-
-providers:
-  ollama:
-    base_url: "http://localhost:11434"
-    api_key: "" # Ollama doesn't require an API key for local access
-
-# Default provider
-default_provider: "ollama"
-
-# Pattern settings
-patterns:
-  auto_update: true
-  directory: "~/.config/fabric/patterns"
-FABRIC_CONFIG_EOF
-
-    echo -e "${GREEN}Fabric configuration file created${NC}"
-}
-
-# Call the configuration function if Ollama container is running
-if docker ps | grep -q ollama; then
-    configure_fabric_for_ollama
-fi
-
-fi # End SKIP_OLLAMA check
-
 # Install Zsh if not already installed
 echo -e "${YELLOW}Installing Zsh...${NC}"
 if ! command -v zsh &> /dev/null; then
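Note that local-model support is dropped rather than made optional: nothing replaces the deleted block above. Users who still want Ollama alongside the Fabric-only setup can reproduce it by hand; a minimal sketch reusing the same container settings the deleted section used (CPU-only shown, and the `-m ollama:phi3:mini` flag comes from the old README, so verify it against your Fabric version):

```bash
# Manual Ollama setup (optional; no longer part of setup.sh).
# Same image, volume, port, and model as the deleted block above.
docker volume create ollama-data
docker run -d --name ollama --restart unless-stopped \
    -v ollama-data:/root/.ollama -p 11434:11434 ollama/ollama
docker exec ollama ollama pull phi3:mini

# Old README invocation for local models; Fabric may also need
# OLLAMA_API_BASE=http://localhost:11434 in ~/.config/fabric/.env.
echo "Your text here" | fabric -p summarize -m ollama:phi3:mini
```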
@@ -770,27 +585,6 @@ echo -e "${GREEN}OS: $OS_NAME $OS_VERSION${NC}"
 echo -e "${GREEN}Package Manager: $PKG_MANAGER${NC}"
 echo -e "${GREEN}Shell: $(basename "$SHELL") → zsh${NC}"
 
-# Test Ollama and Fabric integration
-echo -e "\n${GREEN}=== Testing Ollama and Fabric Integration ===${NC}"
-echo -e "${YELLOW}Testing Ollama Docker container...${NC}"
-if docker ps | grep -q ollama; then
-    echo -e "${GREEN}✓ Ollama Docker container is running${NC}"
-
-    # Test if Ollama API is responding
-    echo -e "${YELLOW}Testing Ollama API...${NC}"
-    if curl -s http://localhost:11434/api/tags >/dev/null 2>&1; then
-        echo -e "${GREEN}✓ Ollama API is responding${NC}"
-
-        # List available models
-        echo -e "${YELLOW}Available Ollama models:${NC}"
-        docker exec ollama ollama list || echo -e "${YELLOW}No models listed or command failed${NC}"
-    else
-        echo -e "${YELLOW}⚠ Ollama API not responding yet (may need more time to start)${NC}"
-    fi
-else
-    echo -e "${RED}✗ Ollama Docker container is not running${NC}"
-fi
-
 echo -e "\n${YELLOW}Testing Fabric installation...${NC}"
 if command -v fabric &> /dev/null; then
     echo -e "${GREEN}✓ Fabric is installed${NC}"
@@ -803,28 +597,16 @@ if command -v fabric &> /dev/null; then
     else
         echo -e "${YELLOW}⚠ Fabric patterns may need to be updated${NC}"
     fi
-
-    # Check fabric configuration
-    if [ -f "$HOME/.config/fabric/.env" ]; then
-        echo -e "${GREEN}✓ Fabric Ollama configuration found${NC}"
-    else
-        echo -e "${YELLOW}⚠ Fabric Ollama configuration not found${NC}"
-    fi
 else
     echo -e "${RED}✗ Fabric is not installed${NC}"
 fi
 
 echo -e "\n${GREEN}=== Post-Installation Instructions ===${NC}"
 echo -e "${YELLOW}1. Restart your shell or run: source ~/.zshrc${NC}"
-echo -e "${YELLOW}2. Test Ollama: ollama list${NC}"
-echo -e "${YELLOW}3. Test Fabric: fabric --list-patterns${NC}"
-echo -e "${YELLOW}4. Try a Fabric pattern: echo 'Hello world' | fabric --pattern summarize${NC}"
-echo -e "${YELLOW}5. Install more models: ollama pull llama2${NC}"
-echo -e "${YELLOW}6. Manage Ollama container: ollama-start, ollama-stop, ollama-logs${NC}"
+echo -e "${YELLOW}2. Test Fabric: fabric --list-patterns${NC}"
+echo -e "${YELLOW}3. Try a Fabric pattern: echo 'Hello world' | fabric --pattern summarize${NC}"
 
 echo -e "\n${GREEN}=== Useful Commands ===${NC}"
-echo -e "${YELLOW}• View running containers: docker ps${NC}"
-echo -e "${YELLOW}• Ollama logs: docker logs -f ollama${NC}"
 echo -e "${YELLOW}• Fabric help: fabric --help${NC}"
 echo -e "${YELLOW}• Update patterns: fabric --updatepatterns${NC}"
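With the Ollama checks gone, the remaining verification is Fabric-only. The post-installation steps above can be collected into a quick smoke test; a sketch using only commands that already appear in this script:

```bash
# Post-install smoke test for the Fabric-only setup (illustrative).
set -e
fabric --version                                   # binary installed and on PATH?
fabric --list-patterns | head -5                   # patterns present?
echo 'Hello world' | fabric --pattern summarize    # provider key in ~/.config/fabric/.env working?
```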
diff --git a/setup/skip_ollama.patch b/setup/skip_ollama.patch
deleted file mode 100644
index a26b360..0000000
--- a/setup/skip_ollama.patch
+++ /dev/null
@@ -1,148 +0,0 @@
-# SKIP_OLLAMA Feature Documentation
-
-## Overview
-The SKIP_OLLAMA feature allows users to run the shell setup script without installing Ollama Docker containers and related AI infrastructure. This is useful for environments where:
-
-- Docker is not available or desired
-- Local AI models are not needed
-- Users prefer external AI providers (OpenAI, Anthropic, Google, etc.)
-- Resource constraints make running local AI models impractical
-
-## Usage
-
-### Method 1: Environment Variable
-```bash
-export SKIP_OLLAMA=true
-./setup/setup.sh
-```
-
-### Method 2: Inline Variable
-```bash
-SKIP_OLLAMA=true ./setup/setup.sh
-```
-
-### Method 3: Convenience Script
-```bash
-./setup-no-ollama.sh
-```
-
-## What Gets Skipped
-
-When SKIP_OLLAMA=true, the following components are NOT installed:
-
-1. **Ollama Docker Container**: No Docker container setup for local AI models
-2. **Ollama Docker Aliases**: No shell aliases for Ollama container management
-3. **Local AI Models**: No phi3:mini or other local models downloaded
-4. **Ollama-specific Fabric Configuration**: Fabric is configured for external providers
-
-## What Still Gets Installed
-
-The following components are still installed normally:
-
-1. **Fabric CLI Tool**: AI pattern processing tool for text manipulation
-2. **All Other Packages**: Everything from setup/packages.list except Ollama
-3. **Shell Configuration**: Zsh, Oh My Zsh, plugins, and dotfiles
-4. **Development Tools**: Node.js, VS Code, Git, and other development utilities
-
-## Fabric Configuration
-
-When SKIP_OLLAMA=true, Fabric is installed but configured differently:
-
-### Standard Configuration (with Ollama):
-```env
-# Fabric Configuration for Ollama
-DEFAULT_MODEL=phi3:mini
-OLLAMA_API_BASE=http://localhost:11434
-```
-
-### Skip Ollama Configuration:
-```env
-# Fabric Configuration - Ollama installation skipped
-# Configure your preferred AI provider (OpenAI, Anthropic, Google, etc.)
-# DEFAULT_MODEL=your_model_here
-# OPENAI_API_KEY=your_key_here
-# For more configuration options, see: fabric --help
-```
-
-## Post-Installation Configuration
-
-After running setup with SKIP_OLLAMA=true, configure your preferred AI provider:
-
-1. **Edit the Fabric configuration**:
-   ```bash
-   nano ~/.config/fabric/.env
-   ```
-
-2. **Add your preferred provider**:
-   ```env
-   # Example for OpenAI
-   OPENAI_API_KEY=your_api_key_here
-   DEFAULT_MODEL=gpt-4
-
-   # Example for Anthropic
-   ANTHROPIC_API_KEY=your_api_key_here
-   DEFAULT_MODEL=claude-3-sonnet-20240229
-
-   # Example for Google
-   GOOGLE_API_KEY=your_api_key_here
-   DEFAULT_MODEL=gemini-pro
-   ```
-
-3. **Test the configuration**:
-   ```bash
-   fabric --list-patterns
-   echo "Test text" | fabric -p summarize
-   ```
-
-## Implementation Details
-
-The SKIP_OLLAMA feature is implemented through conditional blocks in setup.sh:
-
-- Lines 369-510: Main Ollama Docker setup wrapped in conditional
-- Lines 276-340: Fabric configuration adapts based on SKIP_OLLAMA
-- Lines 740+: Testing and summary sections adjust output accordingly
-
-## Benefits
-
-- **Faster Installation**: Skips Docker image downloads and container setup
-- **Lower Resource Usage**: No background AI containers consuming memory/CPU
-- **Flexibility**: Users can choose their preferred AI providers
-- **Compatibility**: Works in environments without Docker access
-
-## Migration
-
-Users can always add Ollama later by:
-
-1. Running the full setup script without SKIP_OLLAMA
-2. Manually setting up Ollama Docker containers
-3. Updating Fabric configuration to use local models
-
-Date: May 30, 2025
-Version: Compatible with shell repository setup.sh v2.x
-
-## Implementation Status
-
-✅ **COMPLETED** - The SKIP_OLLAMA feature is fully implemented and functional.
-
-### What's Implemented:
-
-1. **Core Logic**: Conditional wrapper around Ollama Docker installation (lines 373-523)
-2. **Fabric Configuration**: Conditional setup for external AI providers vs. Ollama
-3. **Testing Section**: Conditional testing based on SKIP_OLLAMA setting (lines 752-796)
-4. **Post-Installation Instructions**: Different guidance for external AI provider mode
-5. **Convenience Script**: `setup-no-ollama.sh` for easy access
-6. **Documentation**: This comprehensive guide
-
-### Files Modified:
-
-- `setup/setup.sh`: Main implementation with conditional logic
-- `setup-no-ollama.sh`: Convenience script (NEW)
-- `skip_ollama.patch`: Documentation (NEW)
-
-### Testing Status:
-
-- ✅ Syntax validation passed
-- ✅ Conditional logic implemented
-- ⏳ End-to-end testing recommended
-
-The implementation is complete and ready for use.
diff --git a/setup/test-integration.sh b/setup/test-integration.sh
deleted file mode 100755
index ace2637..0000000
--- a/setup/test-integration.sh
+++ /dev/null
@@ -1,80 +0,0 @@
-#!/bin/bash
-
-# Test script to verify Ollama + Fabric integration
-set -e
-
-GREEN='\033[0;32m'
-YELLOW='\033[0;33m'
-RED='\033[0;31m'
-NC='\033[0m' # No Color
-
-echo -e "${GREEN}=== Testing Ollama + Fabric Integration ===${NC}"
-
-echo -e "\n${YELLOW}1. Testing Ollama Docker container...${NC}"
-if sudo docker ps | grep -q ollama; then
-    echo -e "${GREEN}✓ Ollama Docker container is running${NC}"
-    echo -e "${YELLOW}Container ID: $(sudo docker ps | grep ollama | awk '{print $1}')${NC}"
-else
-    echo -e "${RED}✗ Ollama Docker container is not running${NC}"
-    exit 1
-fi
-
-echo -e "\n${YELLOW}2. Testing Ollama API...${NC}"
-if curl -s http://localhost:11434/api/tags >/dev/null 2>&1; then
-    echo -e "${GREEN}✓ Ollama API is responding${NC}"
-    echo -e "${YELLOW}Available models:${NC}"
-    curl -s http://localhost:11434/api/tags | jq -r '.models[].name' 2>/dev/null || echo "phi3:mini"
-else
-    echo -e "${RED}✗ Ollama API is not responding${NC}"
-    exit 1
-fi
-
-echo -e "\n${YELLOW}3. Testing Fabric installation...${NC}"
-if command -v fabric &> /dev/null; then
-    echo -e "${GREEN}✓ Fabric is installed${NC}"
-    echo -e "${YELLOW}Version: $(fabric --version)${NC}"
-else
-    echo -e "${RED}✗ Fabric is not installed${NC}"
-    exit 1
-fi
-
-echo -e "\n${YELLOW}4. Testing Fabric patterns...${NC}"
-pattern_count=$(fabric -l 2>/dev/null | wc -l)
-if [ "$pattern_count" -gt 0 ]; then
-    echo -e "${GREEN}✓ Fabric patterns are available${NC}"
-    echo -e "${YELLOW}Number of patterns: $pattern_count${NC}"
-else
-    echo -e "${RED}✗ No Fabric patterns found${NC}"
-    exit 1
-fi
-
-echo -e "\n${YELLOW}5. Testing Fabric + Ollama integration...${NC}"
-test_output=$(echo "Hello world" | fabric -p summarize 2>/dev/null)
-if [ $? -eq 0 ] && [ -n "$test_output" ]; then
-    echo -e "${GREEN}✓ Fabric + Ollama integration working${NC}"
-    echo -e "${YELLOW}Test output:${NC}"
-    echo "$test_output" | head -3
-else
-    echo -e "${RED}✗ Fabric + Ollama integration failed${NC}"
-    exit 1
-fi
-
-echo -e "\n${YELLOW}6. Testing Ollama aliases...${NC}"
-if type ollama-list &>/dev/null; then
-    echo -e "${GREEN}✓ Ollama aliases are loaded${NC}"
-    echo -e "${YELLOW}Testing ollama-list:${NC}"
-    ollama-list 2>/dev/null | head -3
-else
-    echo -e "${RED}✗ Ollama aliases not found${NC}"
-fi
-
-echo -e "\n${GREEN}=== All tests passed! ===${NC}"
-echo -e "${YELLOW}Your setup is ready for AI-assisted development with Fabric and Ollama.${NC}"
-
-echo -e "\n${GREEN}=== Quick Start Guide ===${NC}"
-echo -e "${YELLOW}• List patterns: fabric -l${NC}"
-echo -e "${YELLOW}• Use a pattern: echo 'text' | fabric -p <pattern>${NC}"
-echo -e "${YELLOW}• List models: fabric -L${NC}"
-echo -e "${YELLOW}• Manage Ollama: ollama-start, ollama-stop, ollama-logs${NC}"
-echo -e "${YELLOW}• Install new models: ollama pull <model>${NC}"
-echo -e "${YELLOW}• Update patterns: fabric -U${NC}"
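`test-integration.sh` is deleted outright rather than trimmed. If an equivalent check is still wanted for the Fabric-only setup, a minimal hypothetical replacement (same color conventions, Ollama steps dropped) might look like this:

```bash
#!/bin/bash
# Hypothetical Fabric-only replacement for the deleted test-integration.sh.
set -e

GREEN='\033[0;32m'
YELLOW='\033[0;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color

echo -e "\n${YELLOW}1. Testing Fabric installation...${NC}"
command -v fabric >/dev/null || { echo -e "${RED}✗ Fabric is not installed${NC}"; exit 1; }
echo -e "${GREEN}✓ Fabric $(fabric --version) is installed${NC}"

echo -e "\n${YELLOW}2. Testing Fabric patterns...${NC}"
pattern_count=$(fabric -l 2>/dev/null | wc -l)
[ "$pattern_count" -gt 0 ] || { echo -e "${RED}✗ No Fabric patterns found${NC}"; exit 1; }
echo -e "${GREEN}✓ ${pattern_count} patterns available${NC}"

echo -e "\n${YELLOW}3. Testing the configured provider (needs an API key in ~/.config/fabric/.env)...${NC}"
if echo "Hello world" | fabric -p summarize >/dev/null 2>&1; then
    echo -e "${GREEN}✓ Provider round-trip working${NC}"
else
    echo -e "${RED}✗ Provider call failed; check ~/.config/fabric/.env${NC}"
    exit 1
fi

echo -e "\n${GREEN}=== All tests passed! ===${NC}"
```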