feat: Add CI/CD setup guide with Gitea Actions for trading analysis application

feat: Implement multi-user support with separate brokerage accounts and user authentication

feat: Configure SSO authentication setup using Google OAuth 2.0 for secure access

refactor: Update index page to reflect new Trading Analysis Dashboard features and descriptions

docs: Enhance quickstart guide for deploying Trading Analysis Dashboard with detailed steps

chore: Add runner configuration for Gitea Actions with logging and container settings
Peter Wood
2025-11-14 12:43:09 -05:00
parent 2f5e59b40f
commit c6eb26037b
24 changed files with 3594 additions and 169 deletions

api-reference/auth.mdx Normal file

@@ -0,0 +1,141 @@
---
title: 'Authentication'
description: 'OAuth 2.0 authentication endpoints and flow'
---
## Overview
The Trading Analysis Dashboard uses Google OAuth 2.0 for secure authentication. All API endpoints require an authenticated session.
## Authentication Flow
<Steps>
<Step title="User visits application">
Unauthenticated users are redirected to the login page
</Step>
<Step title="Click Sign in with Google">
User initiates OAuth flow by clicking the Google sign-in button
</Step>
<Step title="Google authorization">
User is redirected to Google to authorize the application
</Step>
<Step title="Callback">
Google redirects back to the application with an authorization code
</Step>
<Step title="Session created">
Application exchanges code for tokens and creates a secure session
</Step>
<Step title="Access granted">
User is redirected to the dashboard with an authenticated session
</Step>
</Steps>
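Steps 4–5 boil down to a standard authorization-code exchange. A minimal sketch of the request payload (generic OAuth 2.0; the endpoint URL is Google's documented token endpoint, and none of these names are taken from this application's source):

```python
# Google's documented token endpoint for the authorization-code grant
TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

def build_token_request(code, client_id, client_secret, redirect_uri):
    """Form fields POSTed to the token endpoint to exchange the
    authorization code received in the callback for tokens."""
    return {
        "code": code,                  # value of the callback's ?code= parameter
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,  # must match the registered callback URL
        "grant_type": "authorization_code",
    }
```

The application then stores the resulting identity in a server-side session rather than handing tokens to the browser.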
## Endpoints
### Login Page
```
GET /auth/login
```
Displays the login page for unauthenticated users.
### Initiate OAuth
```
GET /login
```
Redirects user to Google OAuth authorization page.
### OAuth Callback
```
GET /auth/callback
```
Handles the OAuth callback from Google and creates the user session.
**Query Parameters:**
- `code` (string): Authorization code from OAuth provider
- `state` (string): Opaque anti-CSRF value; must match the value sent when the flow was initiated
### Logout
```
GET /logout
```
Clears user session and logs out the user.
### User Profile
```
GET /auth/profile
```
Displays user profile information (requires authentication).
## Session Management
- Sessions are stored server-side using Flask sessions
- Session cookies are HTTP-only and secure (in production)
- Sessions expire after a period of inactivity
- Users must re-authenticate after session expiration
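These properties map onto Flask's standard session-cookie settings; a hedged sketch (the exact values and lifetime are assumptions, not taken from the application's source):

```python
from datetime import timedelta

# Session-cookie settings implied by the behavior above (values illustrative)
SESSION_CONFIG = {
    "SESSION_COOKIE_HTTPONLY": True,   # cookie not readable from JavaScript
    "SESSION_COOKIE_SECURE": True,     # sent over HTTPS only (production)
    "SESSION_COOKIE_SAMESITE": "Lax",  # basic CSRF mitigation
    "PERMANENT_SESSION_LIFETIME": timedelta(hours=1),  # inactivity expiry
}
# In a Flask app these would be applied with app.config.update(SESSION_CONFIG).
```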
## User Authorization
Access is controlled by the `AUTHORIZED_USERS` environment variable:
```env
AUTHORIZED_USERS=user1@example.com,user2@example.com,user3@example.com
```
Only users with email addresses in this list can access the application after authenticating with Google.
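A sketch of the check this implies (function names are illustrative, not the application's actual code; lower-casing makes the comparison case-insensitive):

```python
import os

def load_authorized_users(raw):
    """Parse the comma-separated AUTHORIZED_USERS value into a set of emails."""
    return {email.strip().lower() for email in raw.split(",") if email.strip()}

def is_authorized(email, authorized):
    """True only if the authenticated Google account's email is whitelisted."""
    return email.strip().lower() in authorized

authorized = load_authorized_users(
    os.environ.get("AUTHORIZED_USERS", "user1@example.com,user2@example.com")
)
```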
## Error Responses
### 401 Unauthorized
```json
{
"success": false,
"error": "Authentication required",
"redirect_to_login": true
}
```
### 403 Forbidden
```json
{
"success": false,
"error": "Access denied. User not authorized."
}
```
## Security Best Practices
<CardGroup cols={2}>
<Card title="HTTPS Only" icon="lock">
Always use HTTPS in production for OAuth callbacks
</Card>
<Card title="Secure Sessions" icon="shield-check">
Session cookies are HTTP-only and secure
</Card>
<Card title="User Whitelist" icon="users">
Only authorized email addresses can access the application
</Card>
<Card title="Token Security" icon="key">
OAuth tokens are never exposed to the client
</Card>
</CardGroup>
## Configuration
See the [SSO Setup Guide](/guides/setup/sso) for detailed configuration instructions.


@@ -0,0 +1,124 @@
---
title: 'Get Month Data'
api: 'GET /api/month/{month}'
description: 'Retrieves detailed trading data for a specific month'
---
## Endpoint
```
GET /api/month/{month}
```
## Path Parameters
<ParamField path="month" type="string" required>
Month in YYYY-MM format (e.g., "2024-08")
</ParamField>
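Client code can pre-validate the parameter with a simple pattern (a sketch; the server performs its own validation):

```python
import re

MONTH_RE = re.compile(r"^\d{4}-(0[1-9]|1[0-2])$")

def is_valid_month(value):
    """True for YYYY-MM strings whose month part is 01-12."""
    return bool(MONTH_RE.match(value))
```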
## Authentication
Requires OAuth 2.0 authentication via session cookies.
## Response
Returns detailed trading data including summary, trades, and dividends.
<ResponseField name="success" type="boolean" required>
Request success status
</ResponseField>
<ResponseField name="summary" type="object" required>
Monthly summary statistics
<Expandable title="Summary Fields">
<ResponseField name="month" type="string">
Month in YYYY-MM format
</ResponseField>
<ResponseField name="total_trades" type="number">
Total number of completed trades
</ResponseField>
<ResponseField name="winning_trades" type="number">
Number of profitable trades
</ResponseField>
<ResponseField name="win_rate" type="number">
Win rate percentage
</ResponseField>
<ResponseField name="trading_profit_loss" type="number">
Total profit/loss from trades
</ResponseField>
<ResponseField name="total_dividends" type="number">
Total dividend income
</ResponseField>
<ResponseField name="total_return_with_dividends" type="number">
Combined trading P&L and dividends
</ResponseField>
</Expandable>
</ResponseField>
<ResponseField name="trades" type="array" required>
List of completed trades
</ResponseField>
<ResponseField name="dividends" type="array" required>
List of dividend payments
</ResponseField>
## Example
<CodeGroup>
```bash cURL
curl -X GET https://your-domain.com/api/month/2024-08 \
-H "Cookie: session=your_session_cookie"
```
```javascript JavaScript
const month = '2024-08';
const response = await fetch(`/api/month/${month}`);
const data = await response.json();
if (data.success) {
console.log(`P/L: $${data.summary.trading_profit_loss}`);
console.log(`Trades: ${data.trades.length}`);
}
```
</CodeGroup>
## Response Example
```json
{
"success": true,
"summary": {
"month": "2024-08",
"total_trades": 15,
"winning_trades": 9,
"win_rate": 60.0,
"trading_profit_loss": 850.75,
"total_dividends": 125.50,
"total_return_with_dividends": 976.25
},
"trades": [
{
"symbol": "AAPL",
"buy_date": "2024-08-01",
"sell_date": "2024-08-15",
"buy_price": 195.50,
"sell_price": 198.75,
"volume": 100,
"total_profit_loss": 325.00,
"return_percentage": 1.66,
"trade_result": "Win"
}
],
"dividends": [
{
"transaction_date": "2024-08-15",
"symbol": "MSFT",
"action": "Cash Dividend",
"amount": 75.50
}
],
"data_source": "postgresql"
}
```

api-reference/months.mdx Normal file

@@ -0,0 +1,116 @@
---
title: 'Get Available Months'
api: 'GET /api/months'
description: 'Retrieves a list of all months that have trading data available'
---
## Endpoint
```
GET /api/months
```
## Authentication
Requires OAuth 2.0 authentication via session cookies.
## Parameters
None
## Response
<ResponseField name="success" type="boolean" required>
Indicates if the request was successful
</ResponseField>
<ResponseField name="months" type="array" required>
List of month objects
<Expandable title="Month Object">
<ResponseField name="month" type="string">
Month in YYYY-MM format
</ResponseField>
<ResponseField name="total_return_with_dividends" type="number">
Total return including dividends for that month
</ResponseField>
</Expandable>
</ResponseField>
<ResponseField name="data_source" type="string" required>
Database source (always "postgresql")
</ResponseField>
## Example
<CodeGroup>
```bash cURL
curl -X GET https://your-domain.com/api/months \
-H "Cookie: session=your_session_cookie"
```
```javascript JavaScript
const response = await fetch('/api/months');
const data = await response.json();
if (data.success) {
console.log('Available months:', data.months);
}
```
```python Python
import requests

# Authentication is required, so pass the session cookie explicitly
response = requests.get(
    'http://localhost:5000/api/months',
    cookies={'session': 'your_session_cookie'}
)
data = response.json()
if data['success']:
    for month in data['months']:
        print(f"{month['month']}: ${month['total_return_with_dividends']}")
```
</CodeGroup>
## Response Example
```json
{
"success": true,
"months": [
{
"month": "2024-08",
"total_return_with_dividends": 1250.75
},
{
"month": "2024-07",
"total_return_with_dividends": -320.50
},
{
"month": "2024-06",
"total_return_with_dividends": 890.25
}
],
"data_source": "postgresql"
}
```
## Error Responses
<ResponseExample>
```json Database Connection Failed
{
"success": false,
"error": "Database connection failed"
}
```
</ResponseExample>
<ResponseExample>
```json Unauthorized
{
"success": false,
"error": "Authentication required",
"redirect_to_login": true
}
```
</ResponseExample>


@@ -0,0 +1,136 @@
---
title: 'Portfolio Holdings'
api: 'GET /api/portfolio/holdings'
description: 'Get, add, update, or delete portfolio holdings'
---
## Get All Holdings
```
GET /api/portfolio/holdings
```
Returns all holdings for the current user with current prices and calculated metrics.
### Response Example
```json
{
"success": true,
"holdings": [
{
"id": 1,
"symbol": "AAPL",
"holding_type": "stock",
"shares": 100,
"average_cost": 150.50,
"current_price": 175.25,
"total_cost": 15050.00,
"current_value": 17525.00,
"gain_loss": 2475.00,
"return_percentage": 16.44,
"last_updated": "2024-11-14T10:30:00Z"
}
]
}
```
## Add a Holding
```
POST /api/portfolio/holdings
```
### Request Body
<ParamField body="symbol" type="string" required>
Stock ticker symbol (e.g., "AAPL")
</ParamField>
<ParamField body="holding_type" type="string" required>
Type: "stock", "etf", or "mutual_fund"
</ParamField>
<ParamField body="shares" type="number" required>
Number of shares owned
</ParamField>
<ParamField body="average_cost" type="number" required>
Average cost per share
</ParamField>
<ParamField body="notes" type="string">
Optional notes about the holding
</ParamField>
### Example
<CodeGroup>
```bash cURL
curl -X POST https://your-domain.com/api/portfolio/holdings \
-H "Content-Type: application/json" \
-d '{
"symbol": "AAPL",
"holding_type": "stock",
"shares": 100,
"average_cost": 150.50,
"notes": "Tech holding"
}'
```
```javascript JavaScript
const response = await fetch('/api/portfolio/holdings', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({
symbol: 'AAPL',
holding_type: 'stock',
shares: 100,
average_cost: 150.50,
notes: 'Tech holding'
})
});
const data = await response.json();
```
</CodeGroup>
## Update a Holding
```
PUT /api/portfolio/holdings/{id}
```
### Path Parameters
<ParamField path="id" type="integer" required>
Holding ID to update
</ParamField>
### Request Body
<ParamField body="shares" type="number">
Updated number of shares
</ParamField>
<ParamField body="average_cost" type="number">
Updated average cost per share
</ParamField>
<ParamField body="notes" type="string">
Updated notes
</ParamField>
## Delete a Holding
```
DELETE /api/portfolio/holdings/{id}
```
### Path Parameters
<ParamField path="id" type="integer" required>
Holding ID to delete
</ParamField>


@@ -0,0 +1,104 @@
---
title: 'Refresh Portfolio Prices'
api: 'POST /api/portfolio/refresh-prices'
description: 'Fetch current market prices for all portfolio holdings'
---
## Endpoint
```
POST /api/portfolio/refresh-prices
```
## Description
Fetches the latest market prices from Finnhub API for all holdings in the user's portfolio.
## Authentication
Requires OAuth 2.0 authentication via session cookies.
## Rate Limiting
The free Finnhub API tier allows 60 requests per minute. The application intelligently manages API requests to stay within these limits.
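One common way to implement this kind of throttling (a sketch, not the application's actual code) is a sliding-window limiter that sleeps when the per-minute budget is spent:

```python
import time

class RateLimiter:
    """Allow at most max_calls calls per period seconds; sleeps when the
    budget is exhausted (simple sliding-window throttle)."""
    def __init__(self, max_calls=60, period=60.0,
                 clock=time.monotonic, sleep=time.sleep):
        self.max_calls = max_calls
        self.period = period
        self.clock = clock   # injectable for testing
        self.sleep = sleep
        self.calls = []      # timestamps of recent calls

    def wait(self):
        """Block until one more call is allowed, then record it."""
        now = self.clock()
        # drop timestamps that have left the window
        self.calls = [t for t in self.calls if now - t < self.period]
        if len(self.calls) >= self.max_calls:
            self.sleep(self.period - (now - self.calls[0]))
        self.calls.append(self.clock())

limiter = RateLimiter()  # 60 requests/minute, matching the free tier
# call limiter.wait() before each Finnhub quote request
```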
## Response
<ResponseField name="success" type="boolean" required>
Indicates if the refresh was successful
</ResponseField>
<ResponseField name="updated_count" type="number">
Number of holdings successfully updated
</ResponseField>
<ResponseField name="errors" type="array">
List of any errors encountered during refresh
</ResponseField>
## Example
<CodeGroup>
```bash cURL
curl -X POST https://your-domain.com/api/portfolio/refresh-prices \
-H "Cookie: session=your_session_cookie"
```
```javascript JavaScript
const response = await fetch('/api/portfolio/refresh-prices', {
method: 'POST'
});
const data = await response.json();
if (data.success) {
console.log(`Updated ${data.updated_count} holdings`);
}
```
```python Python
import requests

# Authentication is required, so pass the session cookie explicitly
response = requests.post(
    'http://localhost:5000/api/portfolio/refresh-prices',
    cookies={'session': 'your_session_cookie'}
)
data = response.json()
if data['success']:
    print(f"Updated {data['updated_count']} holdings")
```
</CodeGroup>
## Response Example
```json Success
{
"success": true,
"updated_count": 5,
"message": "Successfully updated prices for 5 holdings"
}
```
```json Partial Success
{
"success": true,
"updated_count": 4,
"errors": [
{
"symbol": "INVALID",
"error": "Symbol not found"
}
]
}
```
```json Error
{
"success": false,
"error": "Finnhub API key not configured"
}
```
## Notes
- Prices are automatically refreshed when viewing the portfolio page if last update was >15 minutes ago
- Use this endpoint to manually force a refresh at any time
- Mutual fund prices may be delayed 15-30 minutes

api-reference/timeframe.mdx Normal file

@@ -0,0 +1,105 @@
---
title: 'Get Timeframe Data'
api: 'GET /api/timeframe-data'
description: 'Retrieves trading analysis data for a custom timeframe or all-time data'
---
## Endpoint
```
GET /api/timeframe-data
```
## Query Parameters
<ParamField query="start" type="string">
Start date in YYYY-MM-DD format (optional if using all=true)
</ParamField>
<ParamField query="end" type="string">
End date in YYYY-MM-DD format (optional if using all=true)
</ParamField>
<ParamField query="all" type="string">
Set to "true" for all-time data (ignores start/end dates)
</ParamField>
<ParamField query="symbols" type="string">
Comma-separated list of stock symbols to filter by (optional)
</ParamField>
## Response
Returns summary, weekly breakdown, monthly breakdown, and open positions.
## Example
<CodeGroup>
```bash Date Range
curl -X GET "https://your-domain.com/api/timeframe-data?start=2024-01-01&end=2024-08-31"
```
```bash All Time
curl -X GET "https://your-domain.com/api/timeframe-data?all=true"
```
```bash With Symbols
curl -X GET "https://your-domain.com/api/timeframe-data?start=2024-06-01&end=2024-08-31&symbols=AAPL,TSLA,MSFT"
```
```javascript JavaScript
// Get YTD data
const start = '2024-01-01';
const end = new Date().toISOString().split('T')[0];
const response = await fetch(`/api/timeframe-data?start=${start}&end=${end}`);
const data = await response.json();
console.log('Total P/L:', data.summary.trading_profit_loss);
```
</CodeGroup>
## Response Example
```json
{
"success": true,
"data": {
"summary": {
"trading_profit_loss": 2450.75,
"total_dividends": 380.50,
"total_trades": 45,
"winning_trades": 28,
"win_rate_percentage": 62.22
},
"weekly_summary": [
{
"week_start": "2024-08-26",
"period": "2024-08-26",
"trading_profit_loss": 150.25,
"total_dividends": 25.00,
"total_trades": 3,
"winning_trades": 2,
"win_rate_percentage": 66.67
}
],
"monthly_summary": [
{
"month_start": "2024-08-01",
"period": "2024-08",
"trading_profit_loss": 850.75,
"total_dividends": 125.50,
"total_trades": 15,
"winning_trades": 9,
"win_rate_percentage": 60.0
}
],
"open_positions": [
{
"symbol": "NVDA",
"shares": 150
}
]
},
"data_source": "postgresql"
}
```


@@ -0,0 +1,59 @@
---
title: 'Get Trade Details'
api: 'GET /api/trade-details/{month}'
description: 'Retrieves detailed trade information for a specific month'
---
## Endpoint
```
GET /api/trade-details/{month}
```
## Path Parameters
<ParamField path="month" type="string" required>
Month in YYYY-MM format (e.g., "2024-08")
</ParamField>
## Response
Returns detailed trade information with buy/sell prices, volumes, and profit/loss calculations.
## Example
<CodeGroup>
```bash cURL
curl -X GET https://your-domain.com/api/trade-details/2024-08
```
```javascript JavaScript
const response = await fetch('/api/trade-details/2024-08');
const data = await response.json();
console.log('Trades:', data.trades);
```
</CodeGroup>
## Response Example
```json
{
"success": true,
"trades": [
{
"symbol": "AAPL",
"buy_date": "2024-08-01",
"sell_date": "2024-08-15",
"buy_price": 195.50,
"sell_price": 198.75,
"volume": 100,
"profit_per_share": 3.25,
"total_profit_loss": 325.00,
"return_percentage": 1.66,
"trade_result": "Win"
}
],
"data_source": "postgresql"
}
```

docker-compose.yml Normal file

@@ -0,0 +1,84 @@
services:
server:
image: docker.gitea.com/gitea:latest
container_name: gitea
environment:
- USER_UID=${USER_UID}
- USER_GID=${USER_GID}
- GITEA__database__DB_TYPE=postgres
- GITEA__database__HOST=db:5432
- GITEA__database__NAME=${POSTGRES_DB}
- GITEA__database__USER=${POSTGRES_USER}
- GITEA__database__PASSWD=${POSTGRES_PASSWORD}
restart: always
networks:
- gitea
volumes:
- gitea:/data
- /etc/timezone:/etc/timezone:ro
- /etc/localtime:/etc/localtime:ro
ports:
- ${GITEA_HTTP_PORT:-3500}:3000
- ${GITEA_SSH_PORT:-2229}:22
depends_on:
- db
labels:
- diun.enable=true
healthcheck:
test:
- CMD
- curl
- -f
- http://localhost:3000/api/healthz
interval: 10s
retries: 3
start_period: 30s
timeout: 10s
db:
image: docker.io/library/postgres:14
restart: always
environment:
- POSTGRES_USER=${POSTGRES_USER}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
- POSTGRES_DB=${POSTGRES_DB}
networks:
- gitea
volumes:
- postgres:/var/lib/postgresql/data
runner:
image: gitea/act_runner:latest
container_name: gitea-runner
restart: always
networks:
- gitea
volumes:
- runner:/data
- /var/run/docker.sock:/var/run/docker.sock
- ./runner-config.yaml:/data/config.yaml:ro
environment:
- GITEA_INSTANCE_URL=http://server:3000
- GITEA_RUNNER_REGISTRATION_TOKEN=${GITEA_RUNNER_REGISTRATION_TOKEN}
- GITEA_RUNNER_NAME=docker-runner
- CONFIG_FILE=/data/config.yaml
command: >
sh -c "
if [ ! -f /data/.runner ]; then
act_runner register --no-interactive --instance http://server:3000 --token $${GITEA_RUNNER_REGISTRATION_TOKEN} --name docker-runner;
fi;
act_runner --config /data/config.yaml daemon
"
depends_on:
- server
labels:
- diun.enable=true
networks:
gitea:
external: false
volumes:
gitea:
postgres:
runner:


@@ -1,68 +1,83 @@
 {
   "$schema": "https://mintlify.com/docs.json",
   "theme": "mint",
-  "name": "Mint Starter Kit",
+  "name": "Trading Analysis Dashboard",
   "colors": {
-    "primary": "#16A34A",
-    "light": "#07C983",
-    "dark": "#15803D"
+    "primary": "#0066CC",
+    "light": "#3399FF",
+    "dark": "#003D7A"
   },
   "favicon": "/favicon.svg",
   "navigation": {
     "tabs": [
       {
-        "tab": "Guides",
+        "tab": "Documentation",
         "groups": [
           {
-            "group": "Getting started",
+            "group": "Getting Started",
             "pages": [
               "index",
-              "quickstart",
-              "development"
+              "quickstart"
             ]
           },
           {
-            "group": "Customization",
+            "group": "Setup & Configuration",
             "pages": [
-              "essentials/settings",
-              "essentials/navigation"
+              "guides/setup/cicd",
+              "guides/setup/sso",
+              "guides/setup/multi-user"
             ]
           },
           {
-            "group": "Writing content",
+            "group": "Deployment",
             "pages": [
-              "essentials/markdown",
-              "essentials/code",
-              "essentials/images",
-              "essentials/reusable-snippets"
+              "guides/deployment/docker",
+              "guides/deployment/caddy"
             ]
           },
           {
-            "group": "AI tools",
+            "group": "Features",
             "pages": [
-              "ai-tools/cursor",
-              "ai-tools/claude-code",
-              "ai-tools/windsurf"
+              "features/portfolio-management",
+              "features/portfolio-quickstart",
+              "features/trading-analysis",
+              "features/csv-upload",
+              "features/hybrid-matching",
+              "features/timeframe-analysis",
+              "features/trading-calendar"
             ]
           }
         ]
       },
       {
-        "tab": "API reference",
+        "tab": "API Reference",
         "groups": [
           {
-            "group": "API documentation",
+            "group": "Overview",
             "pages": [
               "api-reference/introduction"
             ]
           },
           {
-            "group": "Endpoint examples",
+            "group": "Trading Data",
             "pages": [
-              "api-reference/endpoint/get",
-              "api-reference/endpoint/create",
-              "api-reference/endpoint/delete",
-              "api-reference/endpoint/webhook"
+              "api-reference/months",
+              "api-reference/month-data",
+              "api-reference/trade-details",
+              "api-reference/timeframe"
             ]
+          },
+          {
+            "group": "Portfolio",
+            "pages": [
+              "api-reference/portfolio-holdings",
+              "api-reference/portfolio-refresh"
+            ]
+          },
+          {
+            "group": "Authentication",
+            "pages": [
+              "api-reference/auth"
+            ]
           }
         ]
@@ -71,14 +86,9 @@
   "global": {
     "anchors": [
       {
-        "anchor": "Documentation",
-        "href": "https://mintlify.com/docs",
-        "icon": "book-open-cover"
-      },
-      {
-        "anchor": "Blog",
-        "href": "https://mintlify.com/blog",
-        "icon": "newspaper"
+        "anchor": "GitHub",
+        "href": "https://github.com/acedanger",
+        "icon": "github"
       }
     ]
   }
@@ -87,19 +97,6 @@
     "light": "/logo/light.svg",
     "dark": "/logo/dark.svg"
   },
-  "navbar": {
-    "links": [
-      {
-        "label": "Support",
-        "href": "mailto:hi@mintlify.com"
-      }
-    ],
-    "primary": {
-      "type": "button",
-      "label": "Dashboard",
-      "href": "https://dashboard.mintlify.com"
-    }
-  },
   "contextual": {
     "options": [
       "copy",
@@ -114,9 +111,7 @@
   },
   "footer": {
     "socials": {
-      "x": "https://x.com/mintlify",
-      "github": "https://github.com/mintlify",
-      "linkedin": "https://linkedin.com/company/mintlify"
+      "github": "https://github.com/acedanger"
     }
   }
 }

features/csv-upload.mdx Normal file

@@ -0,0 +1,109 @@
---
title: 'CSV Upload'
description: 'Import trading data via CSV files with drag-and-drop support'
---
## Overview
The CSV Upload feature allows you to import trading transaction data through an intuitive web interface with drag-and-drop support and real-time processing feedback.
## Features
<CardGroup cols={2}>
<Card title="Drag & Drop" icon="hand-pointer">
Drag CSV files directly onto the upload area
</Card>
<Card title="Progress Tracking" icon="spinner">
Real-time progress bar during processing
</Card>
<Card title="Upload History" icon="clock-rotate-left">
View recent uploads and statistics
</Card>
<Card title="Validation" icon="check">
Automatic CSV format and size validation
</Card>
</CardGroup>
## CSV Format Requirements
Your CSV file must include these columns:
| Column | Format | Description |
|--------|--------|-------------|
| **Date** | MM/DD/YYYY | Transaction date |
| **Action** | Text | Buy, Sell, Cash Dividend, etc. |
| **Symbol** | Text | Stock ticker |
| **Description** | Text | Transaction description |
| **Quantity** | Number | Number of shares (can be empty for dividends) |
| **Price** | Number | Price per share (can be empty for dividends) |
| **Fees & Comm** | Number | Trading fees |
| **Amount** | Number | Total transaction amount |
## Example CSV
```csv
Date,Action,Symbol,Description,Quantity,Price,Fees & Comm,Amount
01/15/2024,Buy,AAPL,Apple Inc,100,150.50,6.95,-15056.95
01/30/2024,Sell,AAPL,Apple Inc,100,155.75,6.95,15568.05
02/15/2024,Cash Dividend,MSFT,Microsoft Corp,,,0.00,75.50
```
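Rows in this format can be read with the standard library alone; a sketch (the application's actual parser handles more Action types and stricter validation):

```python
import csv
import io

SAMPLE = """Date,Action,Symbol,Description,Quantity,Price,Fees & Comm,Amount
01/15/2024,Buy,AAPL,Apple Inc,100,150.50,6.95,-15056.95
02/15/2024,Cash Dividend,MSFT,Microsoft Corp,,,0.00,75.50
"""

def parse_rows(text):
    """Read transaction rows; Quantity and Price may be empty for dividends."""
    rows = []
    for r in csv.DictReader(io.StringIO(text)):
        rows.append({
            "date": r["Date"],
            "action": r["Action"],
            "symbol": r["Symbol"],
            "quantity": float(r["Quantity"]) if r["Quantity"] else None,
            "price": float(r["Price"]) if r["Price"] else None,
            "fees": float(r["Fees & Comm"]) if r["Fees & Comm"] else 0.0,
            "amount": float(r["Amount"]),
        })
    return rows

rows = parse_rows(SAMPLE)
```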
## Upload Process
<Steps>
<Step title="Navigate to Upload Page">
Go to `/upload` in your application
</Step>
<Step title="Select File">
Either drag and drop your CSV file or click to browse
</Step>
<Step title="Validation">
The system validates file type (CSV only) and size (50MB max)
</Step>
<Step title="Processing">
Watch real-time progress updates as the file is processed
</Step>
<Step title="Completion">
View the upload in your history and navigate to the dashboard
</Step>
</Steps>
## Processing Flow
1. File uploaded to Flask backend
2. Server validation (file type, size)
3. Trading analysis script processes CSV
4. Database synchronization
5. History updated and temp files cleaned up
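The server-side validation in steps 1–2 amounts to checks like these (a sketch with illustrative names; the 50MB and CSV-only limits match the validation step described above):

```python
import os

MAX_SIZE_BYTES = 50 * 1024 * 1024  # 50MB limit described above

def validate_upload(filename, size_bytes):
    """Return a list of validation errors; an empty list means accepted."""
    errors = []
    if os.path.splitext(filename)[1].lower() != ".csv":
        errors.append("Only .csv files are accepted")
    if size_bytes > MAX_SIZE_BYTES:
        errors.append("File exceeds the 50MB limit")
    return errors
```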
## Security
<CardGroup cols={2}>
<Card title="Authentication" icon="lock">
Login required for all uploads
</Card>
<Card title="Validation" icon="shield-check">
File type and size validation
</Card>
<Card title="Sanitization" icon="broom">
Secure filename handling
</Card>
<Card title="Cleanup" icon="trash">
Automatic temp file removal
</Card>
</CardGroup>
## Next Steps
<CardGroup cols={2}>
<Card title="Trading Analysis" icon="chart-line" href="/features/trading-analysis">
Analyze your uploaded data
</Card>
<Card title="Portfolio Management" icon="briefcase" href="/features/portfolio-management">
Track your current holdings
</Card>
</CardGroup>


@@ -0,0 +1,111 @@
---
title: 'Hybrid Matching Algorithm'
description: 'Broker-level accuracy for profit/loss and wash sale tracking'
---
## Overview
The hybrid matching algorithm combines two data sources to provide the most accurate profit/loss and wash sale tracking:
1. **Broker's Realized Gains/Losses CSV** - Pre-calculated lot matches with definitive P&L
2. **Transaction History CSV** - Complete record of all buy/sell transactions
## Key Benefits
<CardGroup cols={2}>
<Card title="Broker-Level Accuracy" icon="bullseye">
Uses broker's proprietary matching logic
</Card>
<Card title="Wash Sale Detection" icon="flag">
Accurate wash sale flags from broker
</Card>
<Card title="FIFO Fallback" icon="layer-group">
Estimates P/L when broker data unavailable
</Card>
<Card title="Complete Coverage" icon="check-double">
Handles all transaction types
</Card>
</CardGroup>
## How It Works
<Steps>
<Step title="Load Broker Lots">
System loads pre-calculated lot matches from broker's realized gains/losses CSV
</Step>
<Step title="Process Sells">
For each sell transaction, checks if corresponding broker lot exists
</Step>
<Step title="Use Broker Data">
If lot found, uses broker's P/L, wash sale flag, and cost basis
</Step>
<Step title="FIFO Estimate">
If no broker lot, applies FIFO (First In, First Out) matching logic
</Step>
</Steps>
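The FIFO fallback in step 4 can be sketched as follows (simplified; the application's actual implementation also carries fees, wash sale flags, and fractional-share tolerance):

```python
from collections import deque

def fifo_profit(buys, sell_qty, sell_price):
    """Match a sell against the oldest open buy lots (FIFO) and return
    the realized profit/loss. buys is a deque of (quantity, price)
    lots and is consumed in place."""
    remaining = sell_qty
    profit = 0.0
    while remaining > 0 and buys:
        qty, price = buys[0]
        matched = min(qty, remaining)
        profit += matched * (sell_price - price)
        remaining -= matched
        if matched == qty:
            buys.popleft()                    # lot fully consumed
        else:
            buys[0] = (qty - matched, price)  # partial lot remains open
    return profit

lots = deque([(100, 195.50), (50, 197.00)])
profit = fifo_profit(lots, 120, 198.75)  # 100 @ 195.50, then 20 @ 197.00
```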
## Data Sources
### Broker Realized Gains/Losses
Contains:
- Opened date (purchase date)
- Closed date (sale date)
- Quantity sold from specific lot
- Cost basis and proceeds
- Gain/loss amount (pre-calculated)
- Wash sale flag
- Disallowed loss amount
- Term (Short/Long)
### Transaction History
Contains:
- All buy/sell transactions
- Transaction dates
- Symbol, quantity, price
- Commissions and fees
## Matching Criteria
The system matches broker lots to transactions using:
- **Symbol**: Must match exactly
- **Date**: Must match the transaction date
- **Quantity**: With tolerance for fractional shares
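A sketch of the matching predicate these criteria imply (field names and the tolerance value are illustrative, not taken from the application's source):

```python
def lots_match(lot, txn, qty_tolerance=0.001):
    """True if a broker lot and a sell transaction describe the same event:
    identical symbol and date, quantity within a small tolerance."""
    return (
        lot["symbol"] == txn["symbol"]
        and lot["closed_date"] == txn["date"]
        and abs(lot["quantity"] - txn["quantity"]) <= qty_tolerance
    )
```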
## Data Source Indicators
<Tabs>
<Tab title="Broker-Verified">
✓ **Broker-verified badge**
All trades matched to broker lots with definitive P/L
</Tab>
<Tab title="FIFO Estimate">
⚠️ **FIFO estimate badge**
Trades matched using FIFO logic (no broker lot available)
</Tab>
<Tab title="Mixed Sources">
✓ **Broker-verified** + ⚠️ **FIFO estimate**
Month contains both broker-verified and FIFO-estimated trades
</Tab>
</Tabs>
## Next Steps
<CardGroup cols={2}>
<Card title="CSV Upload" icon="file-csv" href="/features/csv-upload">
Learn how to upload both CSV files
</Card>
<Card title="Trading Analysis" icon="chart-line" href="/features/trading-analysis">
View your matched trades
</Card>
</CardGroup>


@@ -0,0 +1,153 @@
---
title: 'Portfolio Management'
description: 'Track your stock, ETF, and mutual fund holdings with real-time price updates'
---
## Overview
The Portfolio Management feature allows you to track your current stock, ETF, and mutual fund holdings with real-time price updates from Finnhub.io. View comprehensive metrics, allocation charts, and performance analysis all in one place.
## Key Features
<CardGroup cols={2}>
<Card title="Real-time Pricing" icon="clock">
Automatic price updates from Finnhub API
</Card>
<Card title="Multi-Asset Support" icon="layer-group">
Track stocks, ETFs, and mutual funds
</Card>
<Card title="Visual Analytics" icon="chart-pie">
Interactive allocation and performance charts
</Card>
<Card title="CSV Import" icon="file-csv">
Bulk import holdings from CSV files
</Card>
</CardGroup>
## Quick Start
<Steps>
<Step title="Get Finnhub API Key">
Register at [Finnhub.io](https://finnhub.io/register) and copy your API key
</Step>
<Step title="Configure Environment">
Add your API key to `.env`:
```bash
FINNHUB_API_KEY=your_api_key_here
```
</Step>
<Step title="Add Holdings">
Use the web interface to add holdings manually or upload a CSV file
</Step>
<Step title="Refresh Prices">
Click "Refresh Prices" to fetch current market prices
</Step>
</Steps>
## Adding Holdings
### Manual Entry
<Steps>
<Step title="Click Add Holding">
Navigate to the Portfolio page and click the "Add Holding" button
</Step>
<Step title="Fill in Details">
- **Symbol**: Stock ticker (e.g., AAPL, MSFT)
- **Type**: Select stock, ETF, or mutual_fund
- **Shares**: Number of shares owned
- **Average Cost**: Your average cost per share
- **Notes**: Optional notes about the holding
</Step>
<Step title="Save">
Click "Save" to add the holding to your portfolio
</Step>
</Steps>
### CSV Upload
Upload a CSV file with the following format:
```csv
symbol,type,shares,average_cost,notes
AAPL,stock,100,150.50,Tech holding
VOO,etf,50,400.00,S&P 500 ETF
VTSAX,mutual_fund,500,120.25,Index fund
```
<Info>
See the [CSV Upload guide](/features/csv-upload) for detailed formatting instructions.
</Info>
## Portfolio Metrics
The dashboard displays four key summary cards:
| Metric | Description |
|--------|-------------|
| **Total Value** | Current market value of all holdings |
| **Total Cost** | Total amount invested (shares × average cost) |
| **Total Gain/Loss** | Dollar amount gained or lost |
| **Total Return** | Percentage return on investment |
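The four cards follow directly from per-holding fields; a sketch of the arithmetic (field names mirror the Portfolio Holdings API response):

```python
def portfolio_summary(holdings):
    """Aggregate per-holding figures into the four summary card values."""
    total_cost = sum(h["shares"] * h["average_cost"] for h in holdings)
    total_value = sum(h["shares"] * h["current_price"] for h in holdings)
    gain_loss = total_value - total_cost
    total_return = (gain_loss / total_cost * 100) if total_cost else 0.0
    return {
        "total_value": total_value,
        "total_cost": total_cost,
        "total_gain_loss": gain_loss,
        "total_return_pct": round(total_return, 2),
    }

summary = portfolio_summary([
    {"shares": 100, "average_cost": 150.50, "current_price": 175.25},
])
```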
## Charts
### Allocation Chart
Interactive doughnut chart showing:
- Percentage of portfolio in each holding
- Dollar amounts on hover
- Click legend to show/hide holdings
### Performance Chart
Bar chart displaying:
- Gain/loss for each holding
- Green bars for profitable holdings
- Red bars for losing holdings
## Managing Holdings
### Edit a Holding
1. Click the edit (✏️) button next to any holding
2. Update the fields you want to change
3. Click "Save"
<Warning>
You cannot change the symbol of an existing holding. To change a symbol, delete the holding and add a new one.
</Warning>
### Delete a Holding
1. Click the delete (🗑️) button next to any holding
2. Confirm the deletion
## Price Updates
Click the **"Refresh Prices"** button to fetch the latest market prices for all holdings. Prices are also automatically refreshed when viewing the page if the last update was more than 15 minutes ago.
### Rate Limiting
The free Finnhub API tier allows:
- **60 requests per minute**
- **Real-time US stock quotes**
- **Delayed mutual fund prices** (typically 15-30 minutes)
The application intelligently manages API requests to stay within these limits.
## Next Steps
<CardGroup cols={2}>
<Card title="Trading Analysis" icon="chart-line" href="/features/trading-analysis">
Analyze your historical trading performance
</Card>
<Card title="API Reference" icon="code" href="/api-reference/portfolio-holdings">
Integrate with the Portfolio API
</Card>
</CardGroup>


@@ -0,0 +1,181 @@
---
title: 'Portfolio Quick Start'
description: 'Get started with portfolio tracking in minutes'
---
## Quick Setup Guide
Get your portfolio up and running in just a few minutes.
## Step 1: Get Your Finnhub API Key
<Steps>
<Step title="Register">
Go to [Finnhub.io](https://finnhub.io/register) and sign up for a free account
</Step>
<Step title="Get API Key">
After logging in, copy your API key from the dashboard
</Step>
</Steps>
## Step 2: Configure the API Key
Add your Finnhub API key to the `.env.docker` file (or `.env` if running locally):
```bash .env.docker
FINNHUB_API_KEY=your_api_key_here
```
<Note>
The `.env.docker` file already has a placeholder for the API key.
</Note>
## Step 3: Deploy/Restart the Application
If you're already running the application, restart it to load the new environment variable:
```bash
docker compose down
docker compose up -d
```
For first-time deployment:
<Tabs>
<Tab title="Linux/Mac">
```bash
./deploy.sh
```
</Tab>
<Tab title="Windows">
```batch
deploy.bat
```
</Tab>
</Tabs>
## Step 4: Apply Database Schema
The portfolio holdings table needs to be created in your database:
```bash
# Access the database container
docker compose exec postgres psql -U trading_user -d mining_wood
# Run the schema file
\i /docker-entrypoint-initdb.d/portfolio_schema.sql
# Or run it directly from the host
docker compose exec -T postgres psql -U trading_user -d mining_wood < database_init/portfolio_schema.sql
```
## Step 5: Access the Portfolio Page
<Steps>
<Step title="Open your browser">
Navigate to `http://localhost:8080`
</Step>
<Step title="Go to Portfolio">
You should be redirected to the Portfolio Management page (now the default landing page)
</Step>
<Step title="Check for errors">
If you see an error, check the application logs:
```bash
docker compose logs -f trading_app
```
</Step>
</Steps>
## Step 6: Add Your First Holding
### Option A: Manual Entry
<Steps>
<Step title="Click Add Holding">
Click the **"Add Holding"** button
</Step>
<Step title="Fill in the form">
- **Symbol**: Enter a stock ticker (e.g., AAPL)
- **Type**: Select "stock", "etf", or "mutual_fund"
- **Shares**: Enter the number of shares you own
- **Average Cost**: Enter your average cost per share
- **Notes**: (Optional) Add any notes
</Step>
<Step title="Save">
Click **"Save"**
</Step>
</Steps>
### Option B: CSV Upload
<Steps>
<Step title="Prepare CSV file">
Create a CSV file with your holdings:
```csv
symbol,type,shares,average_cost,notes
AAPL,stock,100,150.50,Apple Inc
MSFT,stock,50,300.00,Microsoft
VOO,etf,25,400.00,S&P 500 ETF
```
</Step>
<Step title="Upload">
- Click the **"Upload CSV"** button
- Select your CSV file
- Click **"Upload"**
</Step>
</Steps>
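Before uploading, you can sanity-check the file against the expected layout. A minimal sketch (the column names match the example above; the validation helper itself is hypothetical, not part of the application):

```python
import csv
import io

REQUIRED_COLUMNS = ["symbol", "type", "shares", "average_cost", "notes"]

def validate_holdings_csv(text: str) -> list[dict]:
    """Parse a holdings CSV and check it has the expected columns."""
    reader = csv.DictReader(io.StringIO(text))
    if reader.fieldnames != REQUIRED_COLUMNS:
        raise ValueError(f"Expected columns {REQUIRED_COLUMNS}, got {reader.fieldnames}")
    rows = []
    for row in reader:
        # Numeric fields must parse; a bad value fails fast here
        row["shares"] = float(row["shares"])
        row["average_cost"] = float(row["average_cost"])
        rows.append(row)
    return rows

sample = """symbol,type,shares,average_cost,notes
AAPL,stock,100,150.50,Apple Inc
VOO,etf,25,400.00,S&P 500 ETF"""
holdings = validate_holdings_csv(sample)
```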
## Step 7: Refresh Prices
Click the **"Refresh Prices"** button to fetch current market prices from Finnhub for all your holdings.
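Once a current price is fetched, each holding's market value and unrealized gain reduce to simple arithmetic. This sketch is illustrative only, not the application's actual code:

```python
def unrealized_pl(shares: float, average_cost: float, current_price: float) -> tuple[float, float]:
    """Return (market_value, unrealized P/L) for a single holding."""
    market_value = shares * current_price
    pl = (current_price - average_cost) * shares
    return market_value, pl

# 100 shares bought at $150.50, currently trading at $155.00
mv, pl = unrealized_pl(shares=100, average_cost=150.50, current_price=155.00)
```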
## Troubleshooting
<AccordionGroup>
<Accordion title="Prices Not Updating">
**Solution**: Verify your API key is set correctly in `.env.docker` and you've restarted the containers.
```bash
# Check if environment variable is loaded
docker compose exec trading_app env | grep FINNHUB
```
</Accordion>
<Accordion title="Database Table Not Found">
**Solution**: Run the portfolio schema script:
```bash
docker compose exec -T postgres psql -U trading_user -d mining_wood < database_init/portfolio_schema.sql
```
</Accordion>
<Accordion title="CSV Upload Fails">
**Solution**: Make sure your CSV file has the correct format with columns: `symbol`, `type`, `shares`, `average_cost`, `notes`
</Accordion>
<Accordion title="Application Won't Start">
**Solution**: Check the logs for errors:
```bash
docker compose logs -f trading_app
```
</Accordion>
</AccordionGroup>
## Next Steps
<CardGroup cols={2}>
<Card title="Portfolio Management" icon="chart-line" href="/features/portfolio-management">
Learn more about portfolio features
</Card>
<Card title="Trading Analysis" icon="magnifying-glass-chart" href="/features/trading-analysis">
Analyze your trading performance
</Card>
</CardGroup>

---
title: 'Timeframe Analysis'
description: 'Analyze trading performance across custom time periods'
---
## Overview
The Timeframe Analysis feature allows you to analyze trading performance across different time periods with comprehensive P/L summaries including both trading profits/losses and dividend income.
## Time Period Selection
<CardGroup cols={2}>
<Card title="All Time" icon="infinity">
Analyzes all available trading data
</Card>
<Card title="Current Month" icon="calendar-day">
Current calendar month performance
</Card>
<Card title="Year to Date" icon="calendar-check">
Performance from January 1st to today
</Card>
<Card title="Past Year" icon="calendar-days">
Rolling 365-day performance
</Card>
<Card title="Custom Period" icon="calendar-range">
Select your own date range
</Card>
</CardGroup>
## Key Metrics
- **Period**: Exact date range for selected timeframe
- **Total P/L**: Combined trading profit/loss and dividends
- **Trading P/L**: Profit/loss from completed trades only
- **Dividends**: Total dividend income received
- **Closed Trades**: Number of completed buy/sell transactions
- **Win Rate**: Percentage of profitable trades
## Summary Breakdowns
### Weekly Summary
Groups data by calendar week showing:
- Trading P/L per week
- Dividends per week
- Total P/L per week
- Number of trades per week
### Monthly Summary
Groups data by calendar month showing:
- Trading P/L per month
- Dividends per month
- Total P/L per month
- Number of trades per month
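The grouping behind these weekly and monthly summaries can be sketched as follows (field names and data shapes are illustrative, not the application's actual schema):

```python
from collections import defaultdict
from datetime import date

def monthly_summary(trades, dividends):
    """Group trading P/L and dividend income by calendar month (YYYY-MM)."""
    summary = defaultdict(lambda: {"trading_pl": 0.0, "dividends": 0.0, "trades": 0})
    for sell_date, pl in trades:
        key = sell_date.strftime("%Y-%m")
        summary[key]["trading_pl"] += pl
        summary[key]["trades"] += 1
    for pay_date, amount in dividends:
        summary[pay_date.strftime("%Y-%m")]["dividends"] += amount
    for row in summary.values():
        row["total_pl"] = row["trading_pl"] + row["dividends"]
    return dict(summary)

result = monthly_summary(
    trades=[(date(2025, 1, 10), 120.0), (date(2025, 1, 28), -40.0), (date(2025, 2, 3), 75.0)],
    dividends=[(date(2025, 1, 15), 12.5)],
)
```

A weekly summary works the same way with an ISO-week key (e.g. `strftime("%G-W%V")`) instead of the month key.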
## Open Positions
Displays current open positions (stocks bought but not yet sold):
- Symbol and number of shares
- Notes that open positions are excluded from P/L calculations
## Usage
<Steps>
<Step title="Navigate to Summary">
Go to the Summary page and click "By Timeframe"
</Step>
<Step title="Select Period">
Choose a period from the dropdown; the date fields auto-populate
</Step>
<Step title="Custom Dates (Optional)">
Manually adjust start/end dates for fine-tuning
</Step>
<Step title="Apply">
Click "Apply Dates" for custom date ranges
</Step>
</Steps>
## Next Steps
<CardGroup cols={2}>
<Card title="Trading Analysis" icon="chart-line" href="/features/trading-analysis">
View monthly trading performance
</Card>
<Card title="API Reference" icon="code" href="/api-reference/timeframe">
Use the Timeframe API
</Card>
</CardGroup>

---
title: 'Trading Analysis'
description: 'Analyze monthly trading performance and track profit/loss'
---
## Overview
The Trading Analysis feature provides comprehensive monthly trading performance analysis with detailed profit/loss tracking, trade-by-trade breakdowns, and dividend income reporting.
## Features
<CardGroup cols={2}>
<Card title="Monthly Navigation" icon="calendar">
Navigate between months with trading data
</Card>
<Card title="P/L Display" icon="dollar-sign">
Color-coded profit/loss amounts
</Card>
<Card title="Trade Details" icon="list">
Detailed breakdown of individual trades
</Card>
<Card title="Dividend Tracking" icon="coins">
Track dividend income separately
</Card>
</CardGroup>
## Monthly Dashboard
The dashboard displays:
- **Total Trades**: Number of completed buy/sell transactions
- **Winning Trades**: Number of profitable trades
- **Win Rate**: Percentage of winning trades
- **Trading P/L**: Profit/loss from trades
- **Dividend Income**: Total dividends received
- **Total Return**: Combined trading P/L and dividends
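These headline metrics reduce to a few lines of arithmetic over the per-trade P/L values (a hypothetical helper, not the app's code):

```python
def dashboard_metrics(trade_pls: list[float], dividend_income: float) -> dict:
    """Compute the dashboard's headline metrics from per-trade P/L values."""
    wins = sum(1 for pl in trade_pls if pl > 0)
    total = len(trade_pls)
    trading_pl = sum(trade_pls)
    return {
        "total_trades": total,
        "winning_trades": wins,
        "win_rate": round(100 * wins / total, 1) if total else 0.0,
        "trading_pl": trading_pl,
        "dividend_income": dividend_income,
        "total_return": trading_pl + dividend_income,
    }

metrics = dashboard_metrics([120.0, -40.0, 75.0, 10.0], dividend_income=12.5)
```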
## Trade Details
Click on any profit/loss amount to see detailed trade information:
- **Symbol**: Stock ticker
- **Buy Date**: Purchase date
- **Sell Date**: Sale date
- **Buy Price**: Purchase price per share
- **Sell Price**: Sale price per share
- **Volume**: Number of shares traded
- **Profit/Loss**: Total profit or loss
- **Return %**: Percentage return
## Hybrid Matching Algorithm
The system uses a hybrid matching algorithm that combines:
1. **Broker Realized Gains/Losses** - Pre-calculated lot matches with definitive P/L
2. **Transaction History** - Complete record of all buy/sell transactions
This achieves **broker-level accuracy** for closed positions.
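A simplified sketch of such a hybrid approach is shown below. It only illustrates the precedence rule (broker-reported lots win; FIFO matching from raw transactions fills the gaps); the real algorithm is more involved:

```python
from collections import defaultdict, deque

def match_trades(broker_lots, transactions):
    """Prefer broker-reported realized P/L per symbol; fall back to FIFO
    matching of raw buy/sell transactions for uncovered symbols."""
    results = {sym: pl for sym, pl in broker_lots}  # definitive broker P/L
    open_lots = defaultdict(deque)
    for sym, side, qty, price in transactions:
        if sym in results:
            continue  # broker already gave us the answer for this symbol
        if side == "buy":
            open_lots[sym].append((qty, price))
        else:  # sell: consume the oldest lots first (FIFO)
            pl, remaining = 0.0, qty
            while remaining > 0 and open_lots[sym]:
                lot_qty, lot_price = open_lots[sym].popleft()
                used = min(remaining, lot_qty)
                pl += used * (price - lot_price)
                remaining -= used
                if lot_qty > used:  # put back the unused part of the lot
                    open_lots[sym].appendleft((lot_qty - used, lot_price))
            results[sym] = results.get(sym, 0.0) + pl

    return results

pl = match_trades(
    broker_lots=[("AAPL", 450.0)],
    transactions=[
        ("MSFT", "buy", 10, 300.0),
        ("MSFT", "sell", 10, 310.0),
        ("AAPL", "buy", 100, 150.5),  # ignored: broker lot takes precedence
    ],
)
```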
## Next Steps
<CardGroup cols={2}>
<Card title="CSV Upload" icon="file-csv" href="/features/csv-upload">
Import your trading history
</Card>
<Card title="API Reference" icon="code" href="/api-reference/months">
Integrate with the Trading API
</Card>
</CardGroup>

---
title: 'Trading Calendar'
description: 'Real-time market trading days and holiday information'
---
## Overview
The Trading Calendar API provides real-time metrics about NYSE (New York Stock Exchange) trading days and upcoming market holidays.
## Features
<CardGroup cols={2}>
<Card title="Remaining Days" icon="calendar-days">
Trading days left in month and year
</Card>
<Card title="Next Holiday" icon="flag">
Upcoming market closure information
</Card>
<Card title="Market Status" icon="clock">
Is the market open today/tomorrow?
</Card>
<Card title="Holiday List" icon="list">
View upcoming market holidays
</Card>
</CardGroup>
## API Endpoint
### GET /api/trading-calendar/metrics
Returns trading calendar metrics including remaining trading days and next market holiday.
**Authentication**: Required (OAuth 2.0)
**Response**:
```json
{
"success": true,
"remaining_trading_days_month": 14,
"remaining_trading_days_year": 36,
"next_market_holiday": {
"name": "Thanksgiving Day",
"date": "2025-11-27"
},
"days_until_next_market_holiday": 19,
"upcoming_holidays": [
{
"name": "Thanksgiving Day",
"date": "2025-11-27",
"days_until": 19
},
{
"name": "Christmas Day",
"date": "2025-12-25",
"days_until": 47
}
],
"is_market_open_today": false,
"is_market_open_tomorrow": false,
"timezone": "America/New_York"
}
```
## Response Fields
| Field | Type | Description |
|-------|------|-------------|
| `remaining_trading_days_month` | integer | Trading days left in current month |
| `remaining_trading_days_year` | integer | Trading days left in current year |
| `next_market_holiday` | object | Next market closure info |
| `days_until_next_market_holiday` | integer | Days until next closure |
| `upcoming_holidays` | array | List of upcoming holidays (up to 10) |
| `is_market_open_today` | boolean | Market open today? |
| `is_market_open_tomorrow` | boolean | Market open tomorrow? |
## Data Source
Uses the `pandas_market_calendars` library for accurate NYSE calendar data.
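Conceptually, the remaining-days metric counts future weekdays that are not market holidays. The simplified illustration below hardcodes two holidays as a placeholder; the actual implementation gets the full NYSE schedule from `pandas_market_calendars`:

```python
from datetime import date, timedelta

# Placeholder holiday set; real data comes from pandas_market_calendars
HOLIDAYS = {date(2025, 11, 27), date(2025, 12, 25)}

def remaining_trading_days(start: date, end: date) -> int:
    """Count weekdays in [start, end] that are not market holidays."""
    count, d = 0, start
    while d <= end:
        if d.weekday() < 5 and d not in HOLIDAYS:
            count += 1
        d += timedelta(days=1)
    return count

# Mon Dec 22 through Fri Dec 26, 2025; Christmas Day is a holiday
days = remaining_trading_days(date(2025, 12, 22), date(2025, 12, 26))
```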
## Next Steps
<CardGroup cols={2}>
<Card title="API Reference" icon="code" href="/api-reference/introduction">
View full API documentation
</Card>
<Card title="Trading Analysis" icon="chart-line" href="/features/trading-analysis">
Analyze trading performance
</Card>
</CardGroup>

guides/deployment/caddy.mdx
---
title: 'Caddy Configuration'
description: 'Configure Caddy reverse proxy for different deployment scenarios'
---
## Overview
Caddy is a powerful web server that automatically handles HTTPS with Let's Encrypt. This guide explains how to configure Caddy for different deployment scenarios.
## Local Development
The default `Caddyfile` is configured for local development:
```caddy Caddyfile
localhost {
reverse_proxy trading_app:5000
encode gzip
header {
X-Content-Type-Options nosniff
X-Frame-Options DENY
X-XSS-Protection "1; mode=block"
Referrer-Policy "strict-origin-when-cross-origin"
-Server
}
}
```
<Info>
Access your app at: `http://localhost`
</Info>
## Production Deployment
### Step 1: Domain Setup
<Steps>
<Step title="Configure DNS">
Point your domain's DNS A record to your server's IP
</Step>
<Step title="Copy Production Template">
```bash
cp Caddyfile.production Caddyfile
```
</Step>
<Step title="Edit Caddyfile">
Replace `your-domain.com` with your actual domain
</Step>
</Steps>
### Step 2: Environment Configuration
Update your `.env` file:
```env .env
DOMAIN=your-domain.com
FLASK_ENV=production
```
### Step 3: Deploy
```bash
docker compose up -d
```
<Check>
Caddy will automatically:
- Obtain SSL certificates from Let's Encrypt
- Handle HTTP to HTTPS redirects
- Renew certificates automatically
</Check>
## Configuration Options
### Basic Reverse Proxy
```caddy
your-domain.com {
reverse_proxy trading_app:5000
}
```
### With Compression and Security Headers
```caddy
your-domain.com {
reverse_proxy trading_app:5000
encode gzip
header {
X-Content-Type-Options nosniff
X-Frame-Options DENY
Strict-Transport-Security "max-age=31536000"
}
}
```
### Static File Caching
```caddy
your-domain.com {
reverse_proxy trading_app:5000
@static path /static/*
handle @static {
header Cache-Control "public, max-age=3600"
reverse_proxy trading_app:5000
}
}
```
### Rate Limiting
<Note>
Stock Caddy has no built-in `rate_limit` directive; this example assumes a build that includes a rate-limiting module such as `mholt/caddy-ratelimit` (compiled in with `xcaddy`).
</Note>
```caddy
your-domain.com {
    rate_limit {
        zone general {
            key {remote_host}
            events 10
            window 1s
        }
    }
    reverse_proxy trading_app:5000
}
```
### Basic Authentication
```caddy
admin.your-domain.com {
    basic_auth {
        # Generate the bcrypt hash with: caddy hash-password
        admin $2a$14$hashed_password_here
    }
    reverse_proxy trading_app:5000
}
```
## SSL/TLS Configuration
### Automatic HTTPS (Default)
Caddy automatically obtains certificates from Let's Encrypt:
```caddy
your-domain.com {
reverse_proxy trading_app:5000
}
```
<Note>
No additional configuration needed! Caddy handles everything automatically.
</Note>
### Custom Certificates
```caddy
your-domain.com {
tls /path/to/cert.pem /path/to/key.pem
reverse_proxy trading_app:5000
}
```
### Internal/Self-Signed Certificates
```caddy
your-domain.com {
tls internal
reverse_proxy trading_app:5000
}
```
## Monitoring and Logging
### Access Logs
```caddy
your-domain.com {
reverse_proxy trading_app:5000
log {
output file /var/log/caddy/access.log
format json
}
}
```
### Error Handling
```caddy
your-domain.com {
reverse_proxy trading_app:5000
handle_errors {
@404 expression {http.error.status_code} == 404
handle @404 {
rewrite * /404.html
reverse_proxy trading_app:5000
}
}
}
```
## Advanced Features
### Multiple Domains
```caddy
site1.com, site2.com {
reverse_proxy trading_app:5000
}
```
### Subdomain Routing
```caddy
api.your-domain.com {
    # reverse_proxy upstream addresses cannot carry a URI path,
    # so rewrite the request before proxying
    rewrite * /api{uri}
    reverse_proxy trading_app:5000
}

app.your-domain.com {
    reverse_proxy trading_app:5000
}
```
### Load Balancing
```caddy
your-domain.com {
    reverse_proxy trading_app1:5000 trading_app2:5000 {
        lb_policy round_robin
        health_uri /health
    }
}
```
## Troubleshooting
### Check Caddy Status
```bash
docker compose logs caddy
```
### Certificate Issues
```bash
# Inspect the certificate Caddy is serving
echo | openssl s_client -servername your-domain.com -connect your-domain.com:443 2>/dev/null | openssl x509 -noout -dates

# Reload the configuration (Caddy renews certificates automatically)
docker compose exec caddy caddy reload --config /etc/caddy/Caddyfile
```
### Configuration Validation
```bash
# Validate Caddyfile syntax
docker compose exec caddy caddy validate --config /etc/caddy/Caddyfile
```
### Common Issues
<AccordionGroup>
<Accordion title="Port 80/443 already in use">
```bash
# Check what's using the ports
netstat -tlnp | grep :80
netstat -tlnp | grep :443
```
Stop the conflicting service or change Caddy's ports in docker-compose.yml
</Accordion>
<Accordion title="DNS not pointing to server">
```bash
# Check DNS resolution
nslookup your-domain.com
```
Verify your domain's A record points to the correct IP address
</Accordion>
<Accordion title="Let's Encrypt rate limits">
Use staging environment for testing:
```caddy
your-domain.com {
tls {
ca https://acme-staging-v02.api.letsencrypt.org/directory
}
reverse_proxy trading_app:5000
}
```
</Accordion>
<Accordion title="Certificate validation fails">
- Ensure port 80 is accessible from the internet
- Verify DNS is propagated: `dig your-domain.com`
- Check firewall rules allow incoming connections
- Review Caddy logs for specific errors
</Accordion>
</AccordionGroup>
## Performance Tuning
### Enable HTTP/2 and HTTP/3
Caddy serves HTTP/1.1, HTTP/2, and HTTP/3 by default. To pin the protocol list explicitly, use the global `servers` option (it is not a site directive):
```caddy
{
    servers {
        protocols h1 h2 h3
    }
}

your-domain.com {
    reverse_proxy trading_app:5000
}
```
### Connection Limits
```caddy
your-domain.com {
reverse_proxy trading_app:5000 {
transport http {
max_conns_per_host 100
}
}
}
```
### Timeout Configuration
```caddy
your-domain.com {
reverse_proxy trading_app:5000 {
transport http {
read_timeout 30s
write_timeout 30s
}
}
}
```
## Security Best Practices
<CardGroup cols={2}>
<Card title="Strong TLS" icon="lock">
Use TLS 1.2+ with strong cipher suites (Caddy's default)
</Card>
<Card title="Security Headers" icon="shield-halved">
Add security headers like CSP, HSTS, X-Frame-Options
</Card>
<Card title="Rate Limiting" icon="gauge-high">
Implement rate limiting to prevent abuse
</Card>
<Card title="Access Control" icon="user-shield">
Use basic auth or OAuth for sensitive routes
</Card>
</CardGroup>
### Recommended Security Configuration
```caddy
your-domain.com {
reverse_proxy trading_app:5000
encode gzip
header {
# Security headers
Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
X-Content-Type-Options "nosniff"
X-Frame-Options "DENY"
X-XSS-Protection "1; mode=block"
Referrer-Policy "strict-origin-when-cross-origin"
Permissions-Policy "geolocation=(), microphone=(), camera=()"
# Hide server info
-Server
-X-Powered-By
}
}
```
## Additional Resources
<CardGroup cols={2}>
<Card title="Caddy Documentation" icon="book" href="https://caddyserver.com/docs/">
Official Caddy documentation
</Card>
<Card title="Caddyfile Syntax" icon="code" href="https://caddyserver.com/docs/caddyfile">
Learn Caddyfile syntax
</Card>
<Card title="Automatic HTTPS" icon="certificate" href="https://caddyserver.com/docs/automatic-https">
How Caddy handles HTTPS automatically
</Card>
<Card title="Docker Deployment" icon="docker" href="/guides/deployment/docker">
Back to Docker deployment guide
</Card>
</CardGroup>

---
title: 'Docker Deployment'
description: 'Deploy the Trading Analysis Dashboard using Docker containers'
---
## Quick Start
<Steps>
<Step title="Install Prerequisites">
Install [Docker Desktop](https://www.docker.com/products/docker-desktop/) (includes Docker Compose)
</Step>
<Step title="Run Deployment Script">
<Tabs>
<Tab title="Windows">
```batch
deploy.bat
```
</Tab>
<Tab title="Linux/macOS">
```bash
chmod +x deploy.sh
./deploy.sh
```
</Tab>
</Tabs>
</Step>
<Step title="Manual Deployment (Alternative)">
```bash
# Copy environment file
cp .env.docker .env
# Build and start services
docker compose up -d
# Check status
docker compose ps
```
</Step>
</Steps>
## Services Overview
The deployment includes these services:
| Service | Port | Description |
|---------|------|-------------|
| **trading_app** | 8080 | Main Flask application |
| **postgres** | 5432 | PostgreSQL database |
| **caddy** | 80, 443 | Reverse proxy with automatic HTTPS |
## Access URLs
<CardGroup cols={2}>
<Card title="Production" icon="globe">
https://performance.miningwood.com
</Card>
<Card title="Main Application" icon="laptop">
http://localhost:8080
</Card>
<Card title="Via Caddy" icon="server">
http://localhost
</Card>
<Card title="Database" icon="database">
localhost:5432
</Card>
</CardGroup>
## Docker Compose Configuration
The following `docker-compose.yml` defines the Gitea CI/CD stack (Gitea server, its PostgreSQL database, and an Actions runner) that backs the application's automated deployments:
```yaml docker-compose.yml
services:
server:
image: docker.gitea.com/gitea:latest
container_name: gitea
environment:
- USER_UID=${USER_UID}
- USER_GID=${USER_GID}
- GITEA__database__DB_TYPE=postgres
- GITEA__database__HOST=db:5432
- GITEA__database__NAME=${POSTGRES_USER}
- GITEA__database__USER=${POSTGRES_USER}
- GITEA__database__PASSWD=${POSTGRES_PASSWORD}
restart: always
networks:
- gitea
volumes:
- gitea:/data
- /etc/timezone:/etc/timezone:ro
- /etc/localtime:/etc/localtime:ro
ports:
- ${GITEA_HTTP_PORT:-3500}:3000
- ${GITEA_SSH_PORT:-2229}:22
depends_on:
- db
labels:
- diun.enable=true
healthcheck:
test:
- CMD
- curl
- -f
        - http://localhost:3000/api/healthz
interval: 10s
retries: 3
start_period: 30s
timeout: 10s
db:
image: docker.io/library/postgres:14
restart: always
environment:
- POSTGRES_USER=${POSTGRES_USER}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
- POSTGRES_DB=${POSTGRES_DB}
networks:
- gitea
volumes:
- postgres:/var/lib/postgresql/data
runner:
image: gitea/act_runner:latest
container_name: gitea-runner
restart: always
networks:
- gitea
volumes:
- runner:/data
- /var/run/docker.sock:/var/run/docker.sock
- ./runner-config.yaml:/data/config.yaml:ro
environment:
- GITEA_INSTANCE_URL=http://server:3000
- GITEA_RUNNER_REGISTRATION_TOKEN=${GITEA_RUNNER_REGISTRATION_TOKEN}
- GITEA_RUNNER_NAME=docker-runner
- CONFIG_FILE=/data/config.yaml
command: >
sh -c "
if [ ! -f /data/.runner ]; then
act_runner register --no-interactive --instance http://server:3000 --token $${GITEA_RUNNER_REGISTRATION_TOKEN} --name docker-runner;
fi;
act_runner --config /data/config.yaml daemon
"
depends_on:
- server
labels:
- diun.enable=true
networks:
gitea:
external: false
volumes:
gitea:
postgres:
runner:
```
## Configuration
### Environment Variables
Edit the `.env` file to customize your deployment:
```env .env
# Database Configuration
DB_HOST=postgres
DB_PORT=5432
DB_NAME=mining_wood
DB_USER=trading_user
DB_PASSWORD=your_secure_password
# Flask Configuration
FLASK_SECRET_KEY=your-super-secret-key-change-this
FLASK_ENV=production
# Gitea Configuration
USER_UID=1000
USER_GID=1000
POSTGRES_USER=gitea
POSTGRES_PASSWORD=gitea_password
POSTGRES_DB=gitea
GITEA_HTTP_PORT=3500
GITEA_SSH_PORT=2229
GITEA_RUNNER_REGISTRATION_TOKEN=your_token_here
```
<Warning>
Always change default passwords before deploying to production!
</Warning>
### SSL/HTTPS Setup with Caddy
Caddy provides automatic HTTPS with Let's Encrypt:
<Tabs>
<Tab title="Local Development">
No setup needed - uses HTTP by default
</Tab>
<Tab title="Production with Domain">
```bash
# Edit Caddyfile and replace localhost with your domain
cp Caddyfile.production Caddyfile
# Edit the domain in Caddyfile: your-domain.com
```
Caddy will automatically get and renew SSL certificates!
</Tab>
</Tabs>
## Database Setup
The PostgreSQL database is automatically initialized with:
- **Database**: `mining_wood`
- **Schema**: `trading_analysis`
- **User**: `trading_user`
### Import Your Trading Data
After deployment, import your trading data:
<Steps>
<Step title="Access the database">
```bash
docker compose exec postgres psql -U trading_user -d mining_wood
```
</Step>
<Step title="Import your data">
```bash
# Copy your CSV files to the container
docker cp your-data.csv trading_app:/app/data/
# Run your import script
docker compose exec trading_app python your_import_script.py
```
</Step>
</Steps>
## Management Commands
### View Logs
```bash
# All services
docker compose logs -f
# Specific service
docker compose logs -f trading_app
docker compose logs -f postgres
docker compose logs -f caddy
```
### Restart Services
```bash
# Restart all services
docker compose restart
# Restart specific service
docker compose restart trading_app
```
### Stop/Start
```bash
# Stop all services
docker compose down
# Start services
docker compose up -d
# Stop and remove volumes (⚠️ removes database data)
docker compose down -v
```
### Update Application
```bash
# Pull latest images and restart
docker compose pull
docker compose up -d
```
### Database Backup
```bash
# Backup database
docker compose exec postgres pg_dump -U trading_user mining_wood > backup.sql
# Restore database
docker compose exec -T postgres psql -U trading_user mining_wood < backup.sql
```
## Security Considerations
### For Production Deployment
<CardGroup cols={2}>
<Card title="Change Passwords" icon="key">
Update `POSTGRES_PASSWORD` and `FLASK_SECRET_KEY` in your `.env` file
</Card>
<Card title="Enable HTTPS" icon="lock">
Configure SSL certificates and enable HTTPS redirect
</Card>
<Card title="Firewall" icon="shield">
Only expose necessary ports (80, 443). Restrict database access (5432)
</Card>
<Card title="Regular Updates" icon="rotate">
Keep Docker images updated and monitor security advisories
</Card>
</CardGroup>
## Production Deployment
### Domain Setup
<Steps>
<Step title="DNS Configuration">
- Point your domain to your server's IP address
- For performance.miningwood.com: Create an A record pointing to your server IP
</Step>
<Step title="Automatic SSL">
```bash
# Caddy handles SSL automatically with Let's Encrypt
# The domain is already configured for performance.miningwood.com
# Just deploy and Caddy will handle the rest
docker compose up -d
```
</Step>
<Step title="Environment">
- Domain is already set to `performance.miningwood.com` in `.env.docker`
- Set `FLASK_ENV=production`
- Use strong passwords
</Step>
</Steps>
### Monitoring
Consider adding monitoring services:
```yaml docker-compose.yml
# Add to docker-compose.yml
prometheus:
image: prom/prometheus
ports:
- "9090:9090"
grafana:
image: grafana/grafana
ports:
- "3000:3000"
```
## Troubleshooting
<AccordionGroup>
<Accordion title="Application Won't Start">
```bash
# Check logs
docker compose logs trading_app
# Common issues:
# - Database connection failure
# - Missing environment variables
# - Port conflicts
```
</Accordion>
<Accordion title="Database Connection Issues">
```bash
# Check database status
docker compose exec postgres pg_isready -U trading_user
# Reset database
docker compose down -v
docker compose up -d
```
</Accordion>
<Accordion title="Performance Issues">
```bash
# Check resource usage
docker stats
# Scale services
docker compose up -d --scale trading_app=2
```
</Accordion>
<Accordion title="SSL Certificate Issues">
- Ensure DNS is pointing to correct server
- Wait a few minutes for certificate provisioning
- Check Caddy logs: `docker compose logs caddy`
</Accordion>
</AccordionGroup>
## Development Mode
To run in development mode:
```bash
# Use development override
docker compose -f docker-compose.yml -f docker-compose.dev.yml up -d
```
This enables:
- Live code reloading
- Debug mode
- Development tools
## Next Steps
<CardGroup cols={2}>
<Card title="Caddy Configuration" icon="server" href="/guides/deployment/caddy">
Learn more about Caddy reverse proxy setup
</Card>
<Card title="CI/CD Setup" icon="rocket" href="/guides/setup/cicd">
Automate deployments with CI/CD
</Card>
</CardGroup>

guides/setup/cicd.mdx
---
title: 'CI/CD Setup with Gitea'
description: 'Set up continuous integration and deployment using Gitea Actions'
---
## Overview
This guide will help you set up continuous integration and continuous deployment (CI/CD) for your trading analysis application using Gitea Actions.
## Prerequisites
Before starting, ensure you have:
<CardGroup cols={2}>
<Card title="Gitea Server" icon="server">
Running and accessible Gitea instance
</Card>
<Card title="Production Server" icon="cloud">
Docker, Docker Compose, SSH access, and Git installed
</Card>
<Card title="Domain Name" icon="globe">
Domain pointing to your production server
</Card>
<Card title="SSH Keys" icon="key">
SSH key pair for deployment access
</Card>
</CardGroup>
## Step 1: Repository Setup
Push your code to Gitea and enable Actions:
```bash
git remote add origin https://your-gitea-instance.com/your-username/stocks-trading-analysis.git
git push -u origin main
```
<Steps>
<Step title="Enable Gitea Actions">
Go to Repository Settings → Actions and enable Actions for this repository
</Step>
</Steps>
## Step 2: Configure Repository Secrets
Navigate to your repository → Settings → Secrets and add the following secrets:
### Required Secrets
| Secret Name | Description | Example |
|-------------|-------------|---------|
| `SSH_PRIVATE_KEY` | SSH private key for production server access | `-----BEGIN OPENSSH PRIVATE KEY-----\n...` |
| `PRODUCTION_HOST` | Production server IP or hostname | `203.0.113.1` or `server.example.com` |
| `PRODUCTION_USER` | SSH username for production server | `ubuntu`, `root`, or your username |
| `DOMAIN` | Your production domain | `performance.miningwood.com` |
### Application Secrets
| Secret Name | Description | Example |
|-------------|-------------|---------|
| `FLASK_SECRET_KEY` | Flask session secret key | `your-very-secure-secret-key-here` |
| `POSTGRES_PASSWORD` | Production database password | `secure-database-password` |
| `GOOGLE_CLIENT_ID` | Google OAuth client ID | `123456789.apps.googleusercontent.com` |
| `GOOGLE_CLIENT_SECRET` | Google OAuth client secret | `GOCSPX-your-client-secret` |
| `AUTHORIZED_USERS` | Comma-separated authorized emails | `admin@example.com,user@example.com` |
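`AUTHORIZED_USERS` is a plain comma-separated list. A sketch of how an application might parse and check it (illustrative only, not the application's actual code):

```python
def parse_authorized_users(value: str) -> set[str]:
    """Split a comma-separated email list, trimming whitespace and case."""
    return {email.strip().lower() for email in value.split(",") if email.strip()}

allowed = parse_authorized_users("admin@example.com, user@example.com")

def is_authorized(email: str) -> bool:
    # Case-insensitive membership check against the allow-list
    return email.strip().lower() in allowed
```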
### Optional Notification Secrets
| Secret Name | Description |
|-------------|-------------|
| `SLACK_WEBHOOK_URL` | Slack webhook for notifications |
| `DISCORD_WEBHOOK_URL` | Discord webhook for notifications |
## Step 3: Production Server Setup
### Create Application Directory
```bash
# SSH into your production server
ssh your-user@your-production-server
# Create application directory
sudo mkdir -p /opt/stocks-trading-analysis
sudo chown $USER:$USER /opt/stocks-trading-analysis
cd /opt/stocks-trading-analysis
# Clone the repository
git clone https://your-gitea-instance.com/your-username/stocks-trading-analysis.git .
```
### Configure Environment Variables
```bash
# Copy the production environment template
cp .gitea/deployment/production.env .env
# Edit the environment file with your actual values
nano .env
```
Update the following values in `.env`:
- `POSTGRES_PASSWORD`: Set a secure database password
- `FLASK_SECRET_KEY`: Generate a secure secret key
- `GOOGLE_CLIENT_ID` & `GOOGLE_CLIENT_SECRET`: Your OAuth credentials
- `AUTHORIZED_USERS`: List of authorized email addresses
- `DOMAIN`: Your production domain name
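A secure `FLASK_SECRET_KEY` can be generated with Python's standard library:

```python
import secrets

# 32 random bytes, hex-encoded: a 64-character key suitable for Flask sessions
secret_key = secrets.token_hex(32)
print(secret_key)
```

On the server this is handy as a one-liner: `python3 -c "import secrets; print(secrets.token_hex(32))"`.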
### Initial Deployment
```bash
# Make deployment script executable
chmod +x .gitea/deployment/deploy.sh
# Run initial deployment
./.gitea/deployment/deploy.sh
```
## Step 4: SSH Key Setup
### Generate SSH Key Pair (if needed)
```bash
# On your local machine or CI/CD runner
ssh-keygen -t ed25519 -C "gitea-actions-deployment" -f ~/.ssh/gitea_deploy_key
```
### Add Public Key to Production Server
```bash
# Copy public key to production server
ssh-copy-id -i ~/.ssh/gitea_deploy_key.pub your-user@your-production-server
# Or manually add to authorized_keys
cat ~/.ssh/gitea_deploy_key.pub | ssh your-user@your-production-server "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"
```
### Add Private Key to Gitea Secrets
```bash
# Copy private key content
cat ~/.ssh/gitea_deploy_key
# Add this content to the SSH_PRIVATE_KEY secret in Gitea
```
## Step 5: Test the CI/CD Pipeline
### Trigger First Pipeline
<Steps>
<Step title="Make a change">
Make a small change to your code
</Step>
<Step title="Commit and push">
```bash
git add .
git commit -m "Test CI/CD pipeline"
git push origin main
```
</Step>
<Step title="Monitor the pipeline">
Check the Actions tab in your Gitea repository to see the pipeline running
</Step>
</Steps>
### Verify Deployment
<Tabs>
<Tab title="Web Check">
Visit `https://your-domain.com` to verify the application is running
</Tab>
<Tab title="Logs">
SSH to server and run `docker compose logs -f`
</Tab>
<Tab title="Services">
Run `docker compose ps` to check service status
</Tab>
</Tabs>
## Workflow Overview
### Automatic Triggers
- **Push to main/master**: Triggers full CI/CD pipeline with production deployment
- **Push to develop**: Triggers CI/CD pipeline with staging deployment (if configured)
- **Pull requests**: Triggers testing and build validation only
- **Schedule**: Security scans run weekly, cleanup runs weekly
### Manual Triggers
Navigate to the Actions tab in your repository, click "Run workflow" on the desired workflow, then select a branch and run it.
## Monitoring and Maintenance
### Check Application Health
```bash
# SSH to production server
ssh your-user@your-production-server
# Check service status
docker compose ps
# View logs
docker compose logs -f trading_app
# Check resource usage
docker stats
```
### Database Backups
Backups are automatically created during deployments and stored in `/opt/backups/stocks-app/`.
```bash
# Manual backup
docker compose exec postgres pg_dump -U trading_user mining_wood | gzip > backup_$(date +%Y%m%d_%H%M%S).sql.gz
# Restore from backup
gunzip -c backup_file.sql.gz | docker compose exec -T postgres psql -U trading_user mining_wood
```
### SSL Certificate
Caddy automatically handles SSL certificates. Check certificate status:
```bash
# Check certificate
echo | openssl s_client -servername your-domain.com -connect your-domain.com:443 2>/dev/null | openssl x509 -noout -dates
```
## Troubleshooting
<AccordionGroup>
<Accordion title="Pipeline fails at SSH step">
- Verify SSH key is correctly formatted in secrets
- Check server SSH configuration
- Ensure server is accessible from internet
</Accordion>
<Accordion title="Docker build fails">
- Check Dockerfile syntax
- Verify all dependencies in requirements.txt
- Check for file permission issues
</Accordion>
<Accordion title="Application doesn't start">
- Check environment variables in .env
- Verify database is running: `docker compose logs postgres`
- Check application logs: `docker compose logs trading_app`
</Accordion>
<Accordion title="SSL certificate issues">
- Ensure DNS is pointing to correct server
- Wait a few minutes for certificate provisioning
- Check Caddy logs: `docker compose logs caddy`
</Accordion>
</AccordionGroup>
## Security Best Practices
<Warning>
Remember to regularly rotate secrets and monitor deployment logs for suspicious activity.
</Warning>
1. **Regularly rotate secrets** (SSH keys, database passwords)
2. **Monitor deployment logs** for suspicious activity
3. **Keep dependencies updated** (run security scans)
4. **Use strong passwords** for all services
5. **Backup regularly** and test restore procedures
6. **Monitor server resources** and set up alerts
## Customization
You can customize the CI/CD pipeline by modifying files in `.gitea/workflows/`:
- `main.yml`: Main CI/CD pipeline
- `security.yml`: Security scanning
- `cleanup.yml`: Resource cleanup and maintenance
<Note>
Remember to test changes in a staging environment before deploying to production!
</Note>

guides/setup/multi-user.mdx
---
title: 'Multi-User Support'
description: 'Configure multi-user support with separate brokerage accounts'
---
## Overview
The application supports multiple users, each with their own brokerage account numbers and transaction data. Users authenticate via Google OAuth and can set up their brokerage account number in their profile.
## Database Schema Changes
### New Tables
#### `trading_analysis.users`
Stores user information from OAuth:
| Column | Type | Description |
|--------|------|-------------|
| `id` | Primary Key | User identifier |
| `email` | Unique | User email address |
| `name` | String | User's full name |
| `google_sub` | String | Google OAuth subject ID |
| `picture_url` | String | Profile picture URL |
| `brokerage_account_number` | String | User's primary account |
| `is_active` | Boolean | Account active status |
| `created_at` | Timestamp | Creation date |
| `updated_at` | Timestamp | Last update date |
#### `trading_analysis.brokerage_accounts`
Cross-reference table for account numbers:
| Column | Type | Description |
|--------|------|-------------|
| `id` | Primary Key | Account identifier |
| `account_number` | Unique | Brokerage account number |
| `account_display_name` | String | Optional friendly name |
| `user_id` | Foreign Key | Links to users table |
| `is_primary` | Boolean | Primary account flag |
| `created_at` | Timestamp | Creation date |
| `updated_at` | Timestamp | Last update date |
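The one-to-many link between the two tables can be sketched as DDL. The sketch below uses SQLite types so it runs anywhere; the production schema is PostgreSQL, and any column details beyond the tables above are assumptions:

```python
import sqlite3

# In-memory database just to demonstrate the users <-> brokerage_accounts link
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    id INTEGER PRIMARY KEY,
    email TEXT UNIQUE NOT NULL,
    name TEXT,
    google_sub TEXT,
    picture_url TEXT,
    brokerage_account_number TEXT,
    is_active INTEGER DEFAULT 1,
    created_at TEXT,
    updated_at TEXT
);
CREATE TABLE brokerage_accounts (
    id INTEGER PRIMARY KEY,
    account_number TEXT UNIQUE NOT NULL,
    account_display_name TEXT,
    user_id INTEGER NOT NULL REFERENCES users(id),
    is_primary INTEGER DEFAULT 0,
    created_at TEXT,
    updated_at TEXT
);
""")
tables = {r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
print(tables)
```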
### Updated Tables
All existing tables have been updated with a `brokerage_account_id` foreign key:
- `raw_transactions`
- `matched_trades`
- `dividend_transactions`
- `monthly_trading_summary`
- `monthly_dividend_summary`
- `monthly_combined_summary`
- `processing_log`
## Migration Process
To migrate an existing database to support multiple users:
### Step 1: Run the Migration Script
```bash
python migrate_to_multiuser.py
```
### Step 2: Set Environment Variables (optional)
```bash
export DEFAULT_MIGRATION_EMAIL="your-admin@example.com"
export DEFAULT_MIGRATION_NAME="Admin User"
export DEFAULT_BROKERAGE_ACCOUNT="YOUR_ACCOUNT_NUMBER"
```
<Info>
The migration script will create default values if these environment variables are not set.
</Info>
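The fallback behavior can be pictured as a small helper; the exact default strings used by `migrate_to_multiuser.py` are assumptions for illustration:

```python
import os

def migration_defaults(env=None):
    """Resolve migration settings, falling back to illustrative defaults."""
    env = os.environ if env is None else env
    return {
        "email": env.get("DEFAULT_MIGRATION_EMAIL", "admin@example.com"),
        "name": env.get("DEFAULT_MIGRATION_NAME", "Default User"),
        "account": env.get("DEFAULT_BROKERAGE_ACCOUNT", "UNKNOWN"),
    }

# With no variables set, the defaults apply
print(migration_defaults({}))
```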
### What the Migration Does
<Steps>
<Step title="Create new tables">
Creates `users` and `brokerage_accounts` tables
</Step>
<Step title="Add foreign keys">
Adds `brokerage_account_id` columns to existing tables
</Step>
<Step title="Create default user">
Creates a default user and account for existing data
</Step>
<Step title="Update transactions">
Updates all existing transactions to reference the default account
</Step>
<Step title="Recreate views">
Recreates database views to work with the new schema
</Step>
</Steps>
## Application Changes
### User Profile Management
<CardGroup cols={2}>
<Card title="Profile Page" icon="user">
Users can now set their brokerage account number in their profile
</Card>
<Card title="Account Validation" icon="check">
CSV uploads require a valid brokerage account number
</Card>
<Card title="Multiple Accounts" icon="building-columns">
Users can have multiple brokerage accounts (future feature)
</Card>
<Card title="Data Isolation" icon="lock">
Users only see their own transaction data
</Card>
</CardGroup>
### Upload Process
1. **User Validation**: Checks that user has a brokerage account before allowing uploads
2. **Account Association**: All uploaded transactions are associated with the user's account
3. **Processing**: `trading_analysis.py` accepts an `--account-id` parameter so processed data is scoped to the user's account
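The validation step can be sketched as a pure function; the function and field names below are assumptions, not the application's actual API:

```python
def validate_upload(user):
    """Return (ok, detail): the account id to attach uploads to, or an error message."""
    if not user.get("is_active"):
        return False, "account disabled"
    account_id = user.get("brokerage_account_id")
    if account_id is None:
        return False, "set a brokerage account number in your profile first"
    return True, account_id

ok, detail = validate_upload({"is_active": True, "brokerage_account_id": 7})
print(ok, detail)
```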
### Authentication Flow
<Steps>
<Step title="Login">
User logs in via Google OAuth
</Step>
<Step title="User Creation">
User record is created/updated in the database
</Step>
<Step title="Set Account">
User sets their brokerage account number in profile
</Step>
<Step title="Link Account">
Brokerage account record is created and linked to user
</Step>
<Step title="Upload Data">
CSV uploads are associated with the user's account
</Step>
</Steps>
## Database Queries
### User-Specific Data
All queries now need to filter by `brokerage_account_id`:
```sql
-- Get user's transactions
SELECT * FROM trading_analysis.raw_transactions rt
JOIN trading_analysis.brokerage_accounts ba ON rt.brokerage_account_id = ba.id
WHERE ba.user_id = ?;
-- Get user's trading performance
SELECT * FROM trading_analysis.v_trading_performance
WHERE user_email = 'user@example.com';
```
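From Python, the `?` placeholder above would be whatever style your driver expects; psycopg2, for instance, uses `%s` with a separate parameters tuple, which also keeps account ids out of string formatting. A hedged sketch:

```python
def user_transactions_query(user_id):
    """Build a parameterized query scoped to one user's accounts (psycopg2 %s style)."""
    sql = (
        "SELECT rt.* FROM trading_analysis.raw_transactions rt "
        "JOIN trading_analysis.brokerage_accounts ba "
        "ON rt.brokerage_account_id = ba.id "
        "WHERE ba.user_id = %s"
    )
    return sql, (user_id,)

sql, params = user_transactions_query(42)
# with a psycopg2 cursor: cur.execute(sql, params)
print(sql)
```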
### Updated Views
Views now include user context:
- `v_current_positions` - Shows account and user information
- `v_trading_performance` - Includes user email and account number
## Configuration
### Environment Variables
```bash
# Migration Configuration
DEFAULT_MIGRATION_EMAIL=your-admin@example.com
DEFAULT_MIGRATION_NAME=Admin User
DEFAULT_BROKERAGE_ACCOUNT=YOUR_ACCOUNT_NUMBER
# OAuth Configuration (existing)
GOOGLE_CLIENT_ID=your-client-id
GOOGLE_CLIENT_SECRET=your-client-secret
AUTHORIZED_USERS=user1@example.com,user2@example.com
```
## Security Considerations
<Warning>
User data isolation is critical for multi-user environments. Always verify queries filter by the correct account ID.
</Warning>
1. **User Isolation**: Users can only see their own transaction data
2. **Account Validation**: Brokerage account numbers are validated before processing
3. **OAuth Integration**: User authentication is handled by Google OAuth
4. **Data Protection**: User data is isolated by account ID in all database operations
## Future Enhancements
<CardGroup cols={2}>
<Card title="Multiple Accounts" icon="layer-group">
Support for users with multiple brokerage accounts
</Card>
<Card title="Account Sharing" icon="share-nodes">
Allow users to share specific accounts with other users
</Card>
<Card title="Admin Interface" icon="user-shield">
Administrative interface for managing users and accounts
</Card>
<Card title="Data Export" icon="download">
User-specific data export functionality
</Card>
</CardGroup>
## Troubleshooting
<AccordionGroup>
<Accordion title="Migration Fails">
- Ensure database connection is working
- Verify you have proper permissions
- Check for existing foreign key constraints
- Review migration logs for specific errors
</Accordion>
<Accordion title="User Profile Issues">
- Check that OAuth is configured correctly
- Verify user email is in AUTHORIZED_USERS
- Check application logs for authentication errors
</Accordion>
<Accordion title="Upload Failures">
- Verify user has set brokerage account number in profile
- Check CSV format matches expected schema
- Review processing logs in `trading_analysis.log`
</Accordion>
<Accordion title="Data Not Showing">
- Ensure queries are filtering by correct account ID
- Verify user-account association is correct
- Check database views are updated
</Accordion>
</AccordionGroup>
### Database Verification
```sql
-- Check user-account associations
SELECT u.email, u.brokerage_account_number, ba.account_number, ba.is_primary
FROM trading_analysis.users u
LEFT JOIN trading_analysis.brokerage_accounts ba ON u.id = ba.user_id;
-- Check transaction associations
SELECT COUNT(*) as transaction_count, ba.account_number, u.email
FROM trading_analysis.raw_transactions rt
JOIN trading_analysis.brokerage_accounts ba ON rt.brokerage_account_id = ba.id
JOIN trading_analysis.users u ON ba.user_id = u.id
GROUP BY ba.account_number, u.email;
```
## Next Steps
<CardGroup cols={2}>
<Card title="Portfolio Management" icon="chart-line" href="/features/portfolio-management">
Set up portfolio tracking for your account
</Card>
<Card title="CSV Upload" icon="file-csv" href="/features/csv-upload">
Learn how to upload transaction data
</Card>
</CardGroup>

234
guides/setup/sso.mdx Normal file

@@ -0,0 +1,234 @@
---
title: 'SSO Authentication Setup'
description: 'Configure Google OAuth 2.0 authentication for your Trading Analysis Dashboard'
---
## Overview
This guide will help you configure Google OAuth 2.0 authentication for secure access to your Trading Analysis Dashboard.
## Step 1: Create Google OAuth Application
<Steps>
<Step title="Access Google Cloud Console">
Visit [Google Cloud Console](https://console.cloud.google.com/) and sign in with your Google account
</Step>
<Step title="Create a New Project">
- Click "Select a project" → "New Project"
- Name: "Trading Dashboard"
- Click "Create"
</Step>
  <Step title="Enable required APIs">
    - Go to "APIs & Services" → "Library"
    - Enable the "Google People API" (the legacy Google+ API has been shut down and is no longer required)
    - Configure the OAuth consent screen under "APIs & Services" → "OAuth consent screen"
  </Step>
<Step title="Create OAuth 2.0 Credentials">
- Go to "APIs & Services" → "Credentials"
    - Click "Create Credentials" → "OAuth client ID"
- Choose "Web application"
- Name: "Trading Dashboard Auth"
</Step>
<Step title="Configure Authorized URLs">
Add the following URLs:
**Authorized JavaScript origins:**
- `https://performance.miningwood.com`
- `http://localhost:8080` (for testing)
**Authorized redirect URIs:**
- `https://performance.miningwood.com/auth/callback`
- `http://localhost:8080/auth/callback` (for testing)
</Step>
<Step title="Copy Credentials">
Copy the "Client ID" and "Client Secret" for the next step
</Step>
</Steps>
## Step 2: Configure Environment Variables
Update your `.env.docker` file with the OAuth credentials:
```bash .env.docker
# OAuth Configuration
GOOGLE_CLIENT_ID=your-actual-client-id.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=your-actual-client-secret
# Authorized Users (your email addresses)
AUTHORIZED_USERS=your-email@gmail.com,admin@company.com
```
<Warning>
Never commit your `.env` files to version control. Keep them secure and out of your repository.
</Warning>
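Under the hood, the "Sign in with Google" redirect is an ordinary OAuth 2.0 authorization URL built from these values. A minimal sketch using only the standard library (the scopes shown are typical for OpenID Connect sign-in; the application's exact scopes are an assumption):

```python
from urllib.parse import urlencode

GOOGLE_AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

def authorization_url(client_id, redirect_uri, state):
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",   # authorization-code flow
        "scope": "openid email profile",
        "state": state,            # CSRF protection, checked on callback
    }
    return GOOGLE_AUTH_ENDPOINT + "?" + urlencode(params)

url = authorization_url(
    "your-actual-client-id.apps.googleusercontent.com",
    "https://performance.miningwood.com/auth/callback",
    state="random-per-session-token",
)
print(url)
```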
## Step 3: Update and Deploy
### Rebuild the application
```bash
docker compose build trading_app
docker compose restart trading_app
```
### Test the authentication
<Steps>
<Step title="Visit your application">
Navigate to `https://performance.miningwood.com`
</Step>
<Step title="Login">
You should be redirected to the login page. Click "Sign in with Google"
</Step>
<Step title="Authorize">
Authorize the application when prompted by Google
</Step>
<Step title="Access granted">
You should be redirected back and logged in successfully
</Step>
</Steps>
## Security Features
<CardGroup cols={2}>
<Card title="OAuth 2.0 with Google" icon="shield-check">
Industry standard authentication protocol
</Card>
<Card title="User Authorization" icon="users">
Only specific email addresses can access
</Card>
<Card title="Session Management" icon="clock">
Secure server-side sessions with expiration
</Card>
<Card title="HTTPS Enforcement" icon="lock">
All authentication over encrypted connections
</Card>
</CardGroup>
## User Management
### Add Users
Add email addresses to `AUTHORIZED_USERS` in `.env.docker`, separated by commas:
```bash
AUTHORIZED_USERS=user1@example.com,user2@example.com,user3@example.com
```
Then restart the application:
```bash
docker compose restart trading_app
```
### Remove Users
Remove email addresses from `AUTHORIZED_USERS` and restart the application.
<Note>
Leave `AUTHORIZED_USERS` empty to allow all users (not recommended for production)
</Note>
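Stray spaces and case mismatches in the list are a common cause of lockouts, so it helps to normalize when checking. A hedged sketch of such a check (the application's actual parsing may differ):

```python
def is_authorized(email, authorized_users_env):
    """Check an email against a comma-separated allow-list.

    An empty list means everyone is allowed (not recommended in production).
    """
    allowed = {e.strip().lower()
               for e in authorized_users_env.split(",") if e.strip()}
    if not allowed:
        return True
    return email.strip().lower() in allowed

print(is_authorized("User1@Example.com", " user1@example.com, user2@example.com "))
```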
## Troubleshooting
<AccordionGroup>
<Accordion title="Authentication failed">
- Check that Client ID and Secret are correct in `.env.docker`
- Verify redirect URLs match exactly in Google Cloud Console
    - Ensure the required Google APIs are enabled and the OAuth consent screen is configured
- Check application logs: `docker compose logs trading_app`
</Accordion>
<Accordion title="Access denied">
- Verify your email is in `AUTHORIZED_USERS`
- Ensure email case matches exactly
- Check for extra spaces in the email list
</Accordion>
<Accordion title="Login loop">
- Clear browser cookies for your domain
- Verify Flask secret key is set in `.env.docker`
- Check session configuration in application logs
</Accordion>
<Accordion title="Callback URL mismatch">
Ensure the redirect URIs in Google Cloud Console match your deployment:
- Use `https://` for production
- Include the exact domain and path
- No trailing slashes
</Accordion>
</AccordionGroup>
## Alternative OAuth Providers
You can also configure other OAuth providers:
<Tabs>
<Tab title="GitHub OAuth">
```bash .env.docker
GITHUB_CLIENT_ID=your-github-client-id
GITHUB_CLIENT_SECRET=your-github-client-secret
```
1. Create OAuth App at https://github.com/settings/developers
2. Set Authorization callback URL to `https://your-domain.com/auth/callback`
</Tab>
<Tab title="Microsoft OAuth">
```bash .env.docker
MICROSOFT_CLIENT_ID=your-microsoft-client-id
MICROSOFT_CLIENT_SECRET=your-microsoft-client-secret
```
1. Register app at https://portal.azure.com
2. Add redirect URI in Authentication settings
</Tab>
</Tabs>
<Info>
Contact your administrator if you need help configuring alternative providers.
</Info>
## Testing OAuth Configuration
To test your OAuth setup locally:
```bash
# Start the application locally
docker compose up -d
# Check logs for any OAuth errors
docker compose logs -f trading_app
# Visit localhost
open http://localhost:8080
```
## Security Checklist
- [ ] OAuth credentials are stored in `.env` files, not in code
- [ ] `.env` files are in `.gitignore`
- [ ] `AUTHORIZED_USERS` list is properly configured
- [ ] HTTPS is enabled in production
- [ ] Strong `FLASK_SECRET_KEY` is set
- [ ] Redirect URIs are exact matches in Google Cloud Console
- [ ] Required Google APIs are enabled and the OAuth consent screen is configured
## Next Steps
<CardGroup cols={2}>
<Card title="Multi-User Setup" icon="users" href="/guides/setup/multi-user">
Configure multi-user support with brokerage accounts
</Card>
<Card title="Deployment" icon="rocket" href="/guides/deployment/docker">
Deploy your application to production
</Card>
</CardGroup>

166
index.mdx

@@ -1,97 +1,115 @@
---
title: "Introduction"
description: "Welcome to the new home for your documentation"
title: "Trading Analysis Dashboard"
description: "A comprehensive platform for analyzing trading performance and managing your investment portfolio"
---
## Setting up
## Welcome to Trading Analysis Dashboard
Get your documentation site up and running in minutes.
A modern, interactive web application for tracking and analyzing your stock trading performance with real-time portfolio management and comprehensive reporting.
<Card
title="Start here"
icon="rocket"
href="/quickstart"
horizontal
>
Follow our three step quickstart guide.
</Card>
## Make it yours
Design a docs site that looks great and empowers your users.
<Columns cols={2}>
<CardGroup cols={2}>
<Card
title="Edit locally"
icon="pen-to-square"
href="/development"
title="Quick Start"
icon="rocket"
href="/quickstart"
>
Edit your docs locally and preview them in real time.
Get up and running in minutes with our quick start guide
</Card>
<Card
title="Customize your site"
icon="palette"
href="/essentials/settings"
title="Portfolio Management"
icon="chart-line"
href="/features/portfolio-management"
>
Customize the design and colors of your site to match your brand.
</Card>
<Card
title="Set up navigation"
icon="map"
href="/essentials/navigation"
>
Organize your docs to help users find what they need and succeed with your product.
Track your holdings with real-time price updates from Finnhub
</Card>
<Card
title="API documentation"
icon="terminal"
title="Trading Analysis"
icon="magnifying-glass-chart"
href="/features/trading-analysis"
>
Analyze monthly trading performance and track P&L
</Card>
<Card
title="API Reference"
icon="code"
href="/api-reference/introduction"
>
Auto-generate API documentation from OpenAPI specifications.
Integrate with our comprehensive RESTful API
</Card>
</Columns>
</CardGroup>
## Create beautiful pages
## Key Features
Everything you need to create world-class documentation.
<AccordionGroup>
<Accordion title="📊 Portfolio Tracking" icon="chart-pie">
Track your stock, ETF, and mutual fund holdings with real-time price updates from Finnhub. View allocation charts, performance metrics, and gain/loss analysis.
</Accordion>
<Accordion title="📈 Trading Analysis" icon="chart-line">
Analyze monthly trading performance with detailed P&L breakdowns, win/loss ratios, and trade-by-trade analysis. View comprehensive reports with dividend tracking.
</Accordion>
<Accordion title="🔐 Secure Authentication" icon="lock">
Google OAuth 2.0 integration with user-specific data isolation. Support for multiple users with separate brokerage accounts.
</Accordion>
<Accordion title="📁 CSV Upload" icon="file-csv">
Easy CSV import with drag-and-drop support. Real-time processing feedback and upload history tracking.
</Accordion>
<Accordion title="🚀 Docker Deployment" icon="docker">
Complete Docker Compose setup with PostgreSQL database, Caddy reverse proxy, and automatic HTTPS with Let's Encrypt.
</Accordion>
<Accordion title="⚙️ CI/CD Ready" icon="robot">
Full Gitea Actions workflow for automated testing, building, and deployment with security scanning and rollback capabilities.
</Accordion>
</AccordionGroup>
<Columns cols={2}>
<Card
title="Write with MDX"
icon="pen-fancy"
href="/essentials/markdown"
>
Use MDX to style your docs pages.
</Card>
<Card
title="Code samples"
icon="code"
href="/essentials/code"
>
Add sample code to demonstrate how to use your product.
</Card>
<Card
title="Images"
icon="image"
href="/essentials/images"
>
Display images and other media.
</Card>
<Card
title="Reusable snippets"
icon="recycle"
href="/essentials/reusable-snippets"
>
Write once and reuse across your docs.
</Card>
</Columns>
## Technology Stack
## Need inspiration?
<CardGroup cols={3}>
<Card title="Flask" icon="python">
Python web framework
</Card>
<Card title="PostgreSQL" icon="database">
Relational database
</Card>
<Card title="Docker" icon="docker">
Containerization
</Card>
<Card title="Caddy" icon="server">
Reverse proxy & HTTPS
</Card>
<Card title="Finnhub" icon="chart-candlestick">
Real-time market data
</Card>
<Card title="OAuth 2.0" icon="shield-check">
Secure authentication
</Card>
</CardGroup>
## Getting Started
<Steps>
<Step title="Prerequisites">
Install Docker and Docker Compose, and obtain a Finnhub API key
</Step>
<Step title="Clone & Configure">
Clone the repository and set up your environment variables
</Step>
<Step title="Deploy">
Run the deployment script to start all services
</Step>
<Step title="Upload Data">
Import your trading data via CSV upload
</Step>
</Steps>
<Card
title="See complete examples"
icon="stars"
href="https://mintlify.com/customers"
title="Ready to dive in?"
icon="book-open"
href="/quickstart"
>
Browse our showcase of exceptional documentation sites.
Check out our comprehensive quickstart guide to get your dashboard running in minutes
</Card>

View File

@@ -1,80 +1,205 @@
---
title: "Quickstart"
description: "Start building awesome documentation in minutes"
description: "Get your Trading Analysis Dashboard up and running in minutes"
---
## Get started in three steps
## Get Started in Four Steps
Get your documentation site running locally and make your first customization.
Deploy your trading analysis dashboard and start tracking your portfolio performance.
### Step 1: Set up your local environment
### Step 1: Prerequisites
<AccordionGroup>
<Accordion icon="copy" title="Clone your docs locally">
During the onboarding process, you created a GitHub repository with your docs content if you didn't already have one. You can find a link to this repository in your [dashboard](https://dashboard.mintlify.com).
To clone the repository locally so that you can make and preview changes to your docs, follow the [Cloning a repository](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository) guide in the GitHub docs.
<Accordion icon="docker" title="Install Docker">
Install [Docker Desktop](https://www.docker.com/products/docker-desktop/) which includes Docker Compose. This is required for running the application containers.
</Accordion>
<Accordion icon="rectangle-terminal" title="Start the preview server">
1. Install the Mintlify CLI: `npm i -g mint`
2. Navigate to your docs directory and run: `mint dev`
3. Open `http://localhost:3000` to see your docs live!
<Accordion icon="key" title="Get Finnhub API Key">
1. Register for a free account at [Finnhub.io](https://finnhub.io/register)
2. Navigate to your dashboard
3. Copy your API key - you'll need this for real-time price updates
<Tip>Your preview updates automatically as you edit files.</Tip>
<Tip>The free tier includes 60 API calls per minute and real-time US stock quotes</Tip>
</Accordion>
<Accordion icon="google" title="Set up Google OAuth (Optional)">
For secure authentication, create OAuth credentials:
1. Visit [Google Cloud Console](https://console.cloud.google.com/)
2. Create a new project
    3. Configure the OAuth consent screen
4. Create OAuth 2.0 credentials
5. Copy Client ID and Client Secret
See the [SSO Setup Guide](/guides/setup/sso) for detailed instructions.
</Accordion>
</AccordionGroup>
### Step 2: Deploy your changes
### Step 2: Configure Environment
<AccordionGroup>
<Accordion icon="github" title="Install our GitHub app">
Install the Mintlify GitHub app from your [dashboard](https://dashboard.mintlify.com/settings/organization/github-app).
<Steps>
<Step title="Clone the repository">
```bash
git clone https://your-repo-url/trading-analysis-dashboard.git
cd trading-analysis-dashboard
```
</Step>
<Step title="Copy environment file">
```bash
cp .env.docker .env
```
</Step>
<Step title="Edit .env file">
Update the following values:
Our GitHub app automatically deploys your changes to your docs site, so you don't need to manage deployments yourself.
</Accordion>
<Accordion icon="palette" title="Update your site name and colors">
For a first change, let's update the name and colors of your docs site.
```env .env
# Finnhub Configuration
FINNHUB_API_KEY=your_finnhub_api_key_here
# Database Configuration
POSTGRES_PASSWORD=choose_secure_password
# Flask Configuration
FLASK_SECRET_KEY=generate_random_secret_key
# OAuth Configuration (Optional)
GOOGLE_CLIENT_ID=your_google_client_id
GOOGLE_CLIENT_SECRET=your_google_client_secret
AUTHORIZED_USERS=your-email@example.com
```
<Warning>Never commit the `.env` file to version control!</Warning>
</Step>
</Steps>
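For `FLASK_SECRET_KEY`, any long random string works; Python's standard `secrets` module is a convenient generator:

```python
import secrets

# 32 random bytes, hex-encoded: a 64-character key suitable for FLASK_SECRET_KEY
print(secrets.token_hex(32))
```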
1. Open `docs.json` in your editor.
2. Change the `"name"` field to your project name.
3. Update the `"colors"` to match your brand.
4. Save and see your changes instantly at `http://localhost:3000`.
### Step 3: Deploy
<Tip>Try changing the primary color to see an immediate difference!</Tip>
</Accordion>
</AccordionGroup>
<Tabs>
<Tab title="Windows">
```batch
deploy.bat
```
</Tab>
<Tab title="Linux/macOS">
```bash
chmod +x deploy.sh
./deploy.sh
```
</Tab>
<Tab title="Manual">
```bash
docker compose up -d
docker compose ps
```
</Tab>
</Tabs>
### Step 3: Go live
<Check>
Wait for all containers to start. This may take a minute for first-time setup.
</Check>
<Accordion icon="rocket" title="Publish your docs">
1. Commit and push your changes.
2. Your docs will update and be live in moments!
</Accordion>
### Step 4: Access and Configure
## Next steps
<Steps>
<Step title="Open the application">
Navigate to `http://localhost:8080` in your browser
</Step>
<Step title="Login (if OAuth enabled)">
Click "Sign in with Google" and authorize the application
</Step>
<Step title="Set up your profile">
Add your brokerage account number in the profile page
</Step>
<Step title="Upload trading data">
Go to the Upload page and import your CSV transaction history
</Step>
</Steps>
Now that you have your docs running, explore these key features:
## Next Steps
Now that your dashboard is running, explore these features:
<CardGroup cols={2}>
<Card title="Write Content" icon="pen-to-square" href="/essentials/markdown">
Learn MDX syntax and start writing your documentation.
<Card title="Portfolio Management" icon="chart-line" href="/features/portfolio-management">
Add your holdings and track real-time performance
</Card>
<Card title="Customize style" icon="palette" href="/essentials/settings">
Make your docs match your brand perfectly.
<Card title="CSV Upload" icon="file-csv" href="/features/csv-upload">
Import your trading history from your broker
</Card>
<Card title="Add code examples" icon="square-code" href="/essentials/code">
Include syntax-highlighted code blocks.
<Card title="Trading Analysis" icon="magnifying-glass-chart" href="/features/trading-analysis">
Analyze monthly trading performance and P&L
</Card>
<Card title="API documentation" icon="code" href="/api-reference/introduction">
Auto-generate API docs from OpenAPI specs.
<Card title="API Reference" icon="code" href="/api-reference/introduction">
Integrate with the REST API
</Card>
</CardGroup>
## Common Tasks
### View Logs
```bash
# All services
docker compose logs -f
# Specific service
docker compose logs -f trading_app
```
### Restart Services
```bash
docker compose restart
```
### Update Application
```bash
docker compose pull
docker compose up -d
```
### Backup Database
```bash
docker compose exec postgres pg_dump -U trading_user mining_wood > backup.sql
```
## Troubleshooting
<AccordionGroup>
<Accordion title="Application won't start">
Check the logs for errors:
```bash
docker compose logs trading_app
```
Common issues:
- Missing environment variables
- Database connection failure
- Port 8080 already in use
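To check whether something is already bound to port 8080 before restarting, a quick stdlib-only probe works on any platform:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

print(port_in_use(8080))
```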
</Accordion>
<Accordion title="Can't login">
- Verify OAuth credentials are correct in `.env`
- Check that your email is in `AUTHORIZED_USERS`
- Clear browser cookies and try again
</Accordion>
<Accordion title="Prices not updating">
- Verify `FINNHUB_API_KEY` is set correctly
- Check API quota hasn't been exceeded
- Review application logs for API errors
</Accordion>
</AccordionGroup>
<Note>
**Need help?** See our [full documentation](https://mintlify.com/docs) or join our [community](https://mintlify.com/community).
**Need help?** Check our [deployment guide](/guides/deployment/docker) or [setup guides](/guides/setup/cicd) for more detailed instructions.
</Note>

15
runner-config.yaml Normal file

@@ -0,0 +1,15 @@
log:
  level: info

runner:
  capacity: 1
  timeout: 3h

container:
  # Use the gitea network so job containers can resolve the 'server' hostname
  network: gitea_gitea
  privileged: false
  options: ""
  workdir_parent: ""
  valid_volumes: []
  docker_host: ""