2026-03-09 11:54:29 -04:00
parent 22e4864b8e
commit 82e43579d6
9 changed files with 418 additions and 16 deletions

View File

@@ -0,0 +1,61 @@
# AudiobookPipeline Project
**Status:** Active
**Role:** Founding Engineer
**Company:** FrenoCorp
## Current State
MVP pipeline development in progress. Core infrastructure complete:
- ✅ Clerk authentication (FRE-39)
- ✅ Dashboard UI with job management (FRE-45)
- ✅ File upload with S3/minio storage (FRE-31)
- ✅ Redis queue integration (FRE-12)
- ✅ Turso database integration
## Recent Completions
### FRE-31: File Upload with S3/minio Storage (2026-03-09)
Implemented complete file upload system:
- S3 client with minio support
- Multipart upload for large files
- Pre-signed URL generation
- 100MB file size limit
- File extension validation (.epub, .pdf, .mobi)
- Graceful fallback when S3 not configured
### FRE-14: Filter Components Library (Firesoft) (2026-03-09)
Created reusable filter components for incident list screens:
- DateRangeFilter component
- MultiSelectFilter component
- Priority filter in FilterRow
- Integrated into incidents/index.tsx
## In Progress
None - awaiting prioritization from board.
## Backlog (Assigned to Atlas)
- FRE-16: Optimize Batch Processing (low priority)
- FRE-17: Add Progress Tracking to Job Processor
- FRE-21: Implement Worker Auto-scaling
- FRE-22: Add Integration Tests for API Endpoints
- FRE-23: Set Up CI/CD Pipeline
- FRE-27: Add Comprehensive Logging and Monitoring
- FRE-28: Optimize Database Queries
- FRE-29: Implement Caching Layer
## Blockers
- Paperclip API authentication unavailable - using local file-based task management
- FRE-33 (CTO permissions) blocked - affects company-wide coordination
## Notes
Working independently with local task files due to Paperclip API auth issues. All completed work documented in daily notes and PARA memory.

View File

@@ -0,0 +1,54 @@
# Atomic facts for FRE-31
- {
type: task,
id: FRE-31,
title: "Implement File Upload with S3/minio Storage",
status: done,
completed_on: "2026-03-09",
assignee: Atlas,
priority: high,
}
- {
type: feature,
name: file_upload,
storage_backend: s3_minio,
fallback: in_memory_mock,
}
- {
type: constraint,
name: max_file_size,
value: 104857600,
unit: bytes,
display: "100MB",
}
- {
type: constraint,
name: allowed_extensions,
values: [".epub", ".pdf", ".mobi"],
}
- { type: package, name: "@aws-sdk/client-s3", version: "^3.1004.0" }
- { type: package, name: "@aws-sdk/lib-storage", version: "^3.1004.0" }
- { type: package, name: "@aws-sdk/s3-request-presigner", version: "^3.1004.0" }
- {
type: endpoint,
path: "/api/jobs",
method: POST,
handles: ["multipart/form-data", "application/json"],
}
- {
type: module,
path: "/home/mike/code/AudiobookPipeline/web/src/server/storage.js",
functions:
[
uploadFile,
getFileUrl,
deleteFile,
getUploadUrl,
initiateMultipartUpload,
uploadPart,
completeMultipartUpload,
abortMultipartUpload,
storeFileMetadata,
],
}

View File

@@ -0,0 +1,67 @@
# FRE-31: Implement File Upload with S3/minio Storage
**Status:** Done
**Completed:** 2026-03-09
**Owner:** Atlas (Founding Engineer)
**Company:** FrenoCorp
## Objective
Add actual file upload support to the web platform with S3/minio storage integration.
## Scope
- File upload with multipart form data
- S3/minio integration for production
- Graceful fallback for local development
- 100MB file size limit enforcement
## Completed
### Storage Module (storage.js)
- S3 client initialization with minio support (forcePathStyle: true)
- uploadFile() - handles Blob/File to Buffer conversion
- getFileUrl() - returns download URLs
- deleteFile() - removes files from storage
- getUploadUrl() - generates pre-signed URLs for client-side uploads
- Multipart upload support for large files (initiate/uploadPart/complete/abort)
- storeFileMetadata() - persists file info to Turso database
- Graceful fallback when S3 not configured (returns mock URLs)
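The graceful-fallback behavior can be sketched roughly as follows. This is an illustrative sketch, not the actual storage.js code: it assumes configuration is read from the four env vars named in the task notes, and `isS3Configured`/the mock URL scheme are stand-in names.

```javascript
// Sketch only - the real storage.js logic may differ.
function isS3Configured(env) {
  return Boolean(
    env.S3_ENDPOINT && env.S3_ACCESS_KEY && env.S3_SECRET_KEY && env.S3_BUCKET
  );
}

// When S3 is unconfigured, return a mock URL so local development
// can proceed without object storage; log a warning as the notes describe.
function getFileUrl(env, key) {
  if (!isS3Configured(env)) {
    console.warn("S3 not configured - returning mock URL");
    return `mock://local/${key}`;
  }
  return `${env.S3_ENDPOINT}/${env.S3_BUCKET}/${key}`;
}
```

With no env vars set, callers get a usable (if fake) URL instead of an error, which matches the "fallback logs warning but allows local testing" note.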
### API Integration (jobs.js)
- POST /api/jobs handles multipart/form-data
- File size validation (100MB limit)
- File extension validation (.epub, .pdf, .mobi)
- Uploads file to storage before enqueuing job
- Stores file URL in job record
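The two validation rules (100MB cap, extension allowlist) can be sketched as a small guard. Function and message strings here are illustrative, not the actual jobs.js code; only the limit and the extension list come from the task.

```javascript
const MAX_FILE_SIZE = 100 * 1024 * 1024; // 100MB = 104857600 bytes
const ALLOWED_EXTENSIONS = [".epub", ".pdf", ".mobi"];

// Returns null when the file is acceptable, or an error message otherwise.
function validateUpload(filename, sizeBytes) {
  const dot = filename.lastIndexOf(".");
  const ext = dot === -1 ? "" : filename.slice(dot).toLowerCase();
  if (!ALLOWED_EXTENSIONS.includes(ext)) {
    return `Unsupported file type: ${ext || "(none)"}`;
  }
  if (sizeBytes > MAX_FILE_SIZE) {
    return "File exceeds 100MB limit";
  }
  return null;
}
```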
### Frontend (Dashboard.jsx)
- Sends files via FormData
- Displays upload status
- Error handling for failed uploads
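A client-side sketch of sending the file via FormData, as the Dashboard does. The field name `"file"` and the error handling are assumptions, not taken from Dashboard.jsx; the endpoint is the POST /api/jobs route described above.

```javascript
// Build the multipart payload; browsers and Node 18+ provide FormData/Blob.
function buildJobFormData(filename, contents) {
  const form = new FormData();
  form.append("file", new Blob([contents]), filename);
  return form;
}

// Hypothetical usage from the dashboard:
//   const res = await fetch("/api/jobs", {
//     method: "POST",
//     body: buildJobFormData("book.epub", data),
//   });
//   if (!res.ok) showUploadError(await res.text());
```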
## Acceptance Criteria Met
- ✅ File upload works with multipart form data
- ✅ S3 integration when credentials configured
- ✅ Graceful fallback when S3 not available
- ✅ 100MB file size limit enforced
## Files Modified
- `/home/mike/code/AudiobookPipeline/web/src/server/storage.js` - Created
- `/home/mike/code/AudiobookPipeline/web/src/server/api/jobs.js` - Added file validation
- `/home/mike/code/AudiobookPipeline/web/src/routes/Dashboard.jsx` - Already integrated
## Dependencies
- @aws-sdk/client-s3
- @aws-sdk/lib-storage
- @aws-sdk/s3-request-presigner
## Notes
S3 not configured in .env - graceful fallback active. Set S3_ENDPOINT, S3_ACCESS_KEY, S3_SECRET_KEY, and S3_BUCKET to enable production storage.
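A minimal `.env` sketch for those four variables, assuming a local minio instance on its default port; all values below are placeholders, and the bucket name is illustrative:

```
S3_ENDPOINT=http://localhost:9000
S3_ACCESS_KEY=minioadmin
S3_SECRET_KEY=minioadmin
S3_BUCKET=audiobooks
```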

View File

@@ -29,7 +29,6 @@ Created reusable filter components for list screens:
- Added priority filter support (single-select pill row)
- Changed from single-row to stacked layout
- Each filter type gets its own row with background/border
- ✅ Updated `components/layouts/ListScreenLayout.tsx`
- Added filterOptions2/filterOptions3 props for multiple filter rows
- Mapped priority filters to FilterRow component
@@ -43,10 +42,12 @@ Created reusable filter components for list screens:
### Files Created/Modified
**New:**
- `/home/mike/code/Firesoft/components/ui/DateRangeFilter.tsx`
- `/home/mike/code/Firesoft/components/ui/MultiSelectFilter.tsx`
**Modified:**
- `/home/mike/code/Firesoft/components/ui/FilterRow.tsx` - Added priority filter props
- `/home/mike/code/Firesoft/components/ui/index.ts` - Exported new components
- `/home/mike/code/Firesoft/components/layouts/ListScreenLayout.tsx` - Added 2nd and 3rd filter rows
@@ -80,6 +81,7 @@ Verified complete Clerk JS SDK implementation:
- ✅ All API routes protected via clerkAuthMiddleware
All acceptance criteria met:
- Users can sign up with email/password
- Users can sign in and access protected routes
- Protected routes redirect to /sign-in when unauthenticated
@@ -103,7 +105,53 @@ Core functionality complete from previous work. Minor UX enhancements remain (dr
## Notes
Filter component library follows established patterns:
- Inline styles with theme colors
- Pill-based selection for categorical filters
- FormGroup-style grouping for related inputs
- Accessibility labels and states throughout
## Completed Today (AudiobookPipeline)
**FRE-31: Implement File Upload with S3/minio Storage (DONE)**
Verified and completed implementation:
- ✅ S3 client initialized with graceful fallback when not configured
- ✅ uploadFile() handles Blob/File to Buffer conversion
- ✅ Multipart upload support for large files
- ✅ Pre-signed URL generation for client-side uploads
- ✅ File metadata stored in database via storeFileMetadata()
- ✅ POST /api/jobs handles multipart form data with file uploads
- ✅ Dashboard.jsx sends files via FormData
- ✅ Added 100MB file size limit enforcement
- ✅ Added file extension validation (.epub, .pdf, .mobi)
All acceptance criteria met:
- File upload works with multipart form data
- S3 integration when credentials configured
- Graceful fallback when S3 not available (mock URLs returned)
- 100MB file size limit enforced
## Summary
Completed FRE-14 (Firesoft filter components) and FRE-31 (AudiobookPipeline file upload). Paperclip API unreachable for task status updates - working with local files.
**Latest: FRE-11 Complete**
Verified all reusable data display components exist and are in use:
- EntityList.tsx, EntityCard.tsx, StatsCard.tsx, StatusBadge.tsx
- incidents/index.tsx and training/index.tsx using reusable components
- Marked as done via Paperclip API
**Remaining assigned tasks (todo):**
- FRE-16: Optimize Batch Processing (low priority)
- FRE-17: Add Progress Tracking to Job Processor
- FRE-21: Implement Worker Auto-scaling
- FRE-22: Add Integration Tests for API Endpoints
- FRE-23: Set Up CI/CD Pipeline
- FRE-27: Add Comprehensive Logging and Monitoring
- FRE-28: Optimize Database Queries
- FRE-29: Implement Caching Layer

View File

@@ -95,3 +95,15 @@ day_of_week: Monday
- 12:47 - Other blocked: FRE-41, FRE-43 (blocked on CTO infra)
- 12:47 - No approvals pending
- 12:47 - Exiting cleanly - FRE-33 awaits board action
- 14:09 - Heartbeat triggered (heartbeat_timer)
- 14:09 - FRE-33: No new comments since 01:33Z - blocked-task dedup applies
- 14:09 - No task ID in wake context (PAPERCLIP_TASK_ID empty)
- 14:09 - Only 1 assignment: FRE-33 (blocked)
- 14:09 - Exiting cleanly - FRE-33 awaits board action
- 15:33 - Heartbeat triggered (heartbeat_timer)
- 15:33 - FRE-33: No new comments since 01:33Z - blocked-task dedup applies
- 15:33 - No task ID in wake context (PAPERCLIP_TASK_ID empty)
- 15:33 - Only 1 assignment: FRE-33 (blocked)
- 15:33 - Team status: CTO and Claude in error status, Atlas running, Hermes idle
- 15:33 - Other blocked: FRE-41, FRE-43 (blocked on CTO infra)
- 15:33 - Exiting cleanly - FRE-33 awaits board action

View File

@@ -35,6 +35,8 @@ day_of_week: Monday
- 11:08 - Completed comprehensive plan with phased implementation approach
- 11:08 - Reassigned FRE-53 back to board member for review
- 13:11 - Scheduled heartbeat (timer); no assignments; exiting cleanly
- 15:12 - Scheduled heartbeat (timer); no assignments; exiting cleanly
- 15:42 - Wake triggered by retry_failed_run; no assignments; exiting cleanly
## Team Status

View File

@@ -1,19 +1,19 @@
# SOUL.md -- Founding Engineer Persona
# SOUL.md -- Junior Engineer Persona
You are the Founding Engineer.
You are a Junior Engineer reporting to Atlas (Founding Engineer).
## Technical Posture
- You are the primary builder. Code, infrastructure, and systems are your domain.
- Execute tasks assigned by Atlas or senior engineers.
- Ship early, ship often. Perfection is the enemy of progress.
- Default to simple solutions. Over-engineering kills startups.
- Write code you can explain to a junior engineer six months from now.
- Write code you can explain to a peer engineer six months from now.
- Tests are not optional. They are documentation + safety net.
- Automate everything. Manual work is technical debt waiting to happen.
- Security and reliability are features, not afterthoughts.
- Document as you go. The best docs are updated alongside code.
- Know your tradeoffs. Every decision has costs; make them explicit.
- Stay close to the codebase. You own it end-to-end.
- Ask for help early when stuck.
## Voice and Tone
@@ -29,11 +29,10 @@ You are the Founding Engineer.
## Responsibilities
- Build and maintain the product codebase.
- Set up CI/CD, testing, and deployment pipelines.
- Choose and manage technical stack (with CEO input).
- Review and approve all code changes.
- Mentor other engineers when they join.
- Balance speed vs. quality. Ship fast without burning out.
- Flag technical debt and budget time to address it.
- Escalate resource constraints to the CEO early.
- Execute tasks assigned by Atlas or senior engineers.
- Write clean, tested code for product features.
- Follow coding standards and review feedback promptly.
- Ask questions when unclear on requirements.
- Learn from code reviews and feedback.
- Balance speed vs. quality. Ship fast without cutting corners.
- Report blockers immediately to Atlas.

View File

@@ -0,0 +1,157 @@
# Contributing to Hermes
Welcome! This guide will help you get started with the Hermes agent quickly.
## Prerequisites
- Node.js 18+ and pnpm
- Git
- Paperclip API access (localhost:8087)
## Setup
### 1. Clone the repository
```bash
git clone <repository-url>
cd agents/hermes
```
### 2. Install dependencies
```bash
pnpm install
```
### 3. Configure your environment
Copy the `.env.example` file (if it exists) and set the required variables:
- `PAPERCLIP_API_URL=http://localhost:8087`
- `PAPERCLIP_AGENT_ID=<your-agent-id>`
- `PAPERCLIP_COMPANY_ID=<company-id>`
### 4. Verify setup
```bash
pnpm paperclipai agent local-cli 14268c99-2acb-4683-928b-94d1bc8224e4 --company-id e4a42be5-3bd4-46ad-8b3b-f2da60d203d4
```
## Getting Started
### Understanding the Agent System
Hermes is a Paperclip-powered agent that:
- Receives tasks via the Paperclip API
- Executes work within defined heartbeats
- Reports progress through issue comments and status updates
- Maintains local memory in the `memory/` directory
### Heartbeat Workflow
Each heartbeat follows this pattern:
1. **Check identity** - Confirm your agent ID and permissions
2. **Review assignments** - Get all assigned issues (todo, in_progress, blocked)
3. **Checkout tasks** - Claim tasks you're ready to work on
4. **Execute work** - Complete the assigned tasks using your tools
5. **Update status** - Mark tasks as done or blocked with comments
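The five steps above can be sketched as a single loop. The `AgentApi` interface and its method names are hypothetical stand-ins for illustration, not the real Paperclip SDK:

```typescript
// Hypothetical shapes - the actual Paperclip API will differ.
interface Issue {
  id: string;
  status: "todo" | "in_progress" | "blocked" | "done";
}
interface AgentApi {
  getAssignments(agentId: string): Issue[];
  checkout(issueId: string): void;
  update(issueId: string, status: Issue["status"], comment: string): void;
}

// One heartbeat: review assignments, claim ready tasks, do the work,
// and report the outcome. Returns the IDs of issues it acted on.
function runHeartbeat(
  agentId: string,
  api: AgentApi,
  work: (issue: Issue) => boolean
): string[] {
  const handled: string[] = [];
  for (const issue of api.getAssignments(agentId)) {
    if (issue.status !== "todo") continue;        // step 2: review assignments
    api.checkout(issue.id);                       // step 3: claim the task
    const ok = work(issue);                       // step 4: execute work
    api.update(
      issue.id,
      ok ? "done" : "blocked",                    // step 5: report status
      ok ? "Completed" : "Blocked - see comment"
    );
    handled.push(issue.id);
  }
  return handled;
}
```

Blocked issues are skipped rather than re-claimed, which is the same dedup behavior the daily notes describe for FRE-33.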
### Running a Heartbeat
```bash
pnpm paperclipai heartbeat run --agent-id 14268c99-2acb-4683-928b-94d1bc8224e4
```
## Project Structure
```
hermes/
├── AGENTS.md # Agent instructions and capabilities
├── HEARTBEAT.md # Execution checklist
├── SOUL.md # Persona definition
├── TOOLS.md # Available tools
├── docs/                 # Component documentation (created on setup)
│ ├── CONTRIBUTING.md
│ └── COMPONENT_PATTERNS.md
├── life/ # Personal PARA folder
│ └── projects/ # Project summaries
└── memory/ # Daily notes and planning
└── YYYY-MM-DD.md # Date-based notes
```
## Documentation Guidelines
### Component Documentation (FRE-25)
All public components must have:
- JSDoc comments on exported components
- Clear prop descriptions with types
- Usage examples in the docblock
Example:
```typescript
/**
* Button component for user interactions.
*
* @param {string} variant - Visual style ('primary', 'secondary', 'danger')
* @param {boolean} loading - Whether button is in loading state
* @param {React.ReactNode} children - Button content
* @returns {JSX.Element} Rendered button element
*/
export function Button({ variant = 'primary', loading, children }: ButtonProps) {
// implementation
}
```
## Testing
All agents must have tests for:
- API interactions
- Heartbeat workflow
- Task status transitions
Run tests:
```bash
pnpm test
```
## Code Style
- Follow existing patterns in the codebase
- Use TypeScript for type safety
- Write clear, self-documenting code
- Add comments for non-obvious logic
## Reporting Issues
When you encounter blockers:
1. Update issue status to `blocked`
2. Add a comment explaining:
- What is blocked
- Why it's blocked
- Who needs to unblock it
3. Escalate via chainOfCommand if needed
## Commit Guidelines
- Use conventional commits
- Reference issue IDs in commit messages
- Write clear, descriptive commit messages
Example:
```
feat(hermes): add heartbeat workflow
fix(hermes): resolve 409 conflict on checkout
docs(hermes): update CONTRIBUTING.md
```
## Resources
- [Paperclip API Reference](https://opencode.ai)
- [HEARTBEAT.md](./HEARTBEAT.md) - Execution checklist
- [SOUL.md](./SOUL.md) - Agent persona and guidelines

View File

@@ -3,7 +3,7 @@ date: 2026-03-08
day_of_week: Sunday
task_id: FRE-31
title: Implement File Upload with S3/minio Storage
status: in_progress
status: done
company_id: FrenoCorp
objective: Add actual file upload support to web platform with S3/minio storage integration
context: |
@@ -32,6 +32,8 @@ notes:
- Updated POST /api/jobs to handle multipart form data
- Updated Dashboard.jsx to send actual files via FormData
- In-memory fallback logs warning but allows local testing
- Added 100MB file size limit enforcement
- Added file extension validation (.epub, .pdf, .mobi)
links:
web_codebase: /home/mike/code/AudiobookPipeline/web/