mirror of
https://github.com/acedanger/shell.git
synced 2025-12-06 03:20:12 -08:00
feat: Add convenience script and documentation for SKIP_OLLAMA feature in setup
This commit is contained in:
148
setup/skip_ollama.patch
Normal file
148
setup/skip_ollama.patch
Normal file
@@ -0,0 +1,148 @@
|
||||
# SKIP_OLLAMA Feature Documentation
|
||||
|
||||
## Overview
|
||||
The SKIP_OLLAMA feature allows users to run the shell setup script without installing Ollama Docker containers and related AI infrastructure. This is useful for environments where:
|
||||
|
||||
- Docker is not available or desired
|
||||
- Local AI models are not needed
|
||||
- Users prefer external AI providers (OpenAI, Anthropic, Google, etc.)
|
||||
- Resource constraints make running local AI models impractical
|
||||
|
||||
## Usage
|
||||
|
||||
### Method 1: Environment Variable
|
||||
```bash
|
||||
export SKIP_OLLAMA=true
|
||||
./setup/setup.sh
|
||||
```
|
||||
|
||||
### Method 2: Inline Variable
|
||||
```bash
|
||||
SKIP_OLLAMA=true ./setup/setup.sh
|
||||
```
|
||||
|
||||
### Method 3: Convenience Script
|
||||
```bash
|
||||
./setup-no-ollama.sh
|
||||
```
|
||||
|
||||
## What Gets Skipped
|
||||
|
||||
When SKIP_OLLAMA=true, the following components are NOT installed:
|
||||
|
||||
1. **Ollama Docker Container**: No Docker container setup for local AI models
|
||||
2. **Ollama Docker Aliases**: No shell aliases for Ollama container management
|
||||
3. **Local AI Models**: No phi3:mini or other local models downloaded
|
||||
4. **Ollama-specific Fabric Configuration**: Fabric is configured for external providers
|
||||
|
||||
## What Still Gets Installed
|
||||
|
||||
The following components are still installed normally:
|
||||
|
||||
1. **Fabric CLI Tool**: AI pattern processing tool for text manipulation
|
||||
2. **All Other Packages**: Everything from setup/packages.list except Ollama
|
||||
3. **Shell Configuration**: Zsh, Oh My Zsh, plugins, and dotfiles
|
||||
4. **Development Tools**: Node.js, VS Code, Git, and other development utilities
|
||||
|
||||
## Fabric Configuration
|
||||
|
||||
When SKIP_OLLAMA=true, Fabric is installed but configured differently:
|
||||
|
||||
### Standard Configuration (with Ollama):
|
||||
```env
|
||||
# Fabric Configuration for Ollama
|
||||
DEFAULT_MODEL=phi3:mini
|
||||
OLLAMA_API_BASE=http://localhost:11434
|
||||
```
|
||||
|
||||
### Skip Ollama Configuration:
|
||||
```env
|
||||
# Fabric Configuration - Ollama installation skipped
|
||||
# Configure your preferred AI provider (OpenAI, Anthropic, Google, etc.)
|
||||
# DEFAULT_MODEL=your_model_here
|
||||
# OPENAI_API_KEY=your_key_here
|
||||
# For more configuration options, see: fabric --help
|
||||
```
|
||||
|
||||
## Post-Installation Configuration
|
||||
|
||||
After running setup with SKIP_OLLAMA=true, configure your preferred AI provider:
|
||||
|
||||
1. **Edit the Fabric configuration**:
|
||||
```bash
|
||||
nano ~/.config/fabric/.env
|
||||
```
|
||||
|
||||
2. **Add your preferred provider**:
|
||||
```env
|
||||
# Example for OpenAI
|
||||
OPENAI_API_KEY=your_api_key_here
|
||||
DEFAULT_MODEL=gpt-4
|
||||
|
||||
# Example for Anthropic
|
||||
ANTHROPIC_API_KEY=your_api_key_here
|
||||
DEFAULT_MODEL=claude-3-sonnet-20240229
|
||||
|
||||
# Example for Google
|
||||
GOOGLE_API_KEY=your_api_key_here
|
||||
DEFAULT_MODEL=gemini-pro
|
||||
```
|
||||
|
||||
3. **Test the configuration**:
|
||||
```bash
|
||||
fabric --list-patterns
|
||||
echo "Test text" | fabric -p summarize
|
||||
```
|
||||
|
||||
## Implementation Details
|
||||
|
||||
The SKIP_OLLAMA feature is implemented through conditional blocks in setup.sh:
|
||||
|
||||
- Lines 369-510: Main Ollama Docker setup wrapped in conditional
|
||||
- Lines 276-340: Fabric configuration adapts based on SKIP_OLLAMA
|
||||
- Lines 740+: Testing and summary sections adjust output accordingly
|
||||
|
||||
## Benefits
|
||||
|
||||
- **Faster Installation**: Skips Docker image downloads and container setup
|
||||
- **Lower Resource Usage**: No background AI containers consuming memory/CPU
|
||||
- **Flexibility**: Users can choose their preferred AI providers
|
||||
- **Compatibility**: Works in environments without Docker access
|
||||
|
||||
## Migration
|
||||
|
||||
Users can always add Ollama later by:
|
||||
|
||||
1. Running the full setup script without SKIP_OLLAMA
|
||||
2. Manually setting up Ollama Docker containers
|
||||
3. Updating Fabric configuration to use local models
|
||||
|
||||
Date: May 30, 2025
|
||||
Version: Compatible with shell repository setup.sh v2.x
|
||||
|
||||
## Implementation Status
|
||||
|
||||
✅ **COMPLETED** - The SKIP_OLLAMA feature is fully implemented and functional.
|
||||
|
||||
### What's Implemented:
|
||||
|
||||
1. **Core Logic**: Conditional wrapper around Ollama Docker installation (lines 373-523)
|
||||
2. **Fabric Configuration**: Conditional setup for external AI providers vs. Ollama
|
||||
3. **Testing Section**: Conditional testing based on SKIP_OLLAMA setting (lines 752-796)
|
||||
4. **Post-Installation Instructions**: Different guidance for external AI provider mode
|
||||
5. **Convenience Script**: `setup-no-ollama.sh` for easy access
|
||||
6. **Documentation**: This comprehensive guide
|
||||
|
||||
### Files Modified:
|
||||
|
||||
- `setup/setup.sh`: Main implementation with conditional logic
|
||||
- `setup-no-ollama.sh`: Convenience script (NEW)
|
||||
- `skip_ollama.patch`: Documentation (NEW)
|
||||
|
||||
### Testing Status:
|
||||
|
||||
- ✅ Syntax validation passed
|
||||
- ✅ Conditional logic implemented
|
||||
- ⏳ End-to-end testing recommended
|
||||
|
||||
The implementation is complete and ready for use.
|
||||
Reference in New Issue
Block a user