Phase C: Prune FrenoCorp to only owned code after ShieldAI/Scripter migration
Removed ShieldAI artifacts:
- apps/api/, apps/web/, apps/mobile/
- packages/ (all 8 shared packages)
- services/voiceprint-ml/
- server/alerts/, server/webrtc/
- examples/

Removed Scripter artifacts:
- marketing/
- tasks/

Updated root configs:
- Renamed package.json from shieldsai-monorepo to frenocorp
- Updated tsconfig.json to include agents/ instead of src/
- Updated vite.config.ts aliases to reference agents/, analysis/, plans/
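The prune described above is a sequence of ordinary `git rm` operations. As a sketch only, here is how it can be replayed in a throwaway repo — the directory names come from the commit message, but the files inside them are invented stand-ins:

```shell
# Sketch: replay the Phase C prune in a scratch repo. Directory names are from
# the commit message; the file contents are invented for illustration.
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email cto@example.com
git config user.name CTO
mkdir -p apps/api packages/shared-db services/voiceprint-ml marketing agents/cto
touch apps/api/index.ts packages/shared-db/schema.prisma \
      services/voiceprint-ml/model.py marketing/copy.md agents/cto/notes.md
git add -A
git commit -qm "before prune"
# Drop the migrated ShieldAI and Scripter artifacts; owned code (agents/) survives
git rm -r -q apps packages services marketing
git commit -qm "Phase C: prune FrenoCorp to only owned code"
ls   # prints: agents
```

Because `git rm` stages the deletions, a single follow-up commit captures the whole prune atomically.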
agents/cto/life/projects/fre-4529-dir-cleanup/items.yaml (new file, 35 lines)
@@ -0,0 +1,35 @@
items:
  - id: fre-4529-analysis
    type: task
    content: Analyzed recent code influx in FrenoCorp repo from parallel agent work
    status: completed
    created_at: 2026-05-02
    tags: [repo-cleanup, analysis, cto]

  - id: fre-4529-plan
    type: document
    content: Created cleanup plan with 4 phases (inventory, structural, architecture, quality)
    status: completed
    created_at: 2026-05-02
    tags: [repo-cleanup, planning]

  - id: fre-4530-naming
    type: child_issue
    content: FRE-4530 - Resolve FrenoCorp vs ShieldAI naming
    status: delegated
    assignee: Founding Engineer
    created_at: 2026-05-02

  - id: fre-4531-structural
    type: child_issue
    content: FRE-4531 - Consolidate duplicates, move files, prune branches
    status: delegated
    assignee: Senior Engineer
    created_at: 2026-05-02

  - id: fre-4532-todo
    type: child_issue
    content: FRE-4532 - Audit TODO placeholders
    status: delegated
    assignee: Senior Engineer
    created_at: 2026-05-02
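The entries above share a fixed shape, which makes them easy to type and summarize in code. A minimal sketch, assuming the fields shown in the YAML (the `WorkItem` type and `tallyByStatus` helper are illustrative names, not part of the repo):

```typescript
// Sketch: the shape of an items.yaml entry, plus a helper that tallies
// entries by status. Field names mirror the YAML above.
interface WorkItem {
  id: string;
  type: 'task' | 'document' | 'child_issue';
  content: string;
  status: 'completed' | 'delegated' | 'blocked';
  created_at: string;
  assignee?: string;
  tags?: string[];
}

function tallyByStatus(items: WorkItem[]): Record<string, number> {
  const tally: Record<string, number> = {};
  for (const item of items) {
    tally[item.status] = (tally[item.status] ?? 0) + 1;
  }
  return tally;
}

const items: WorkItem[] = [
  { id: 'fre-4529-analysis', type: 'task', content: 'Analyzed recent code influx',
    status: 'completed', created_at: '2026-05-02', tags: ['repo-cleanup'] },
  { id: 'fre-4530-naming', type: 'child_issue', content: 'FRE-4530 - Resolve naming',
    status: 'delegated', created_at: '2026-05-02', assignee: 'Founding Engineer' },
];

console.log(tallyByStatus(items)); // { completed: 1, delegated: 1 }
```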
agents/cto/life/projects/fre-4529-dir-cleanup/summary.md (new file, 15 lines)
@@ -0,0 +1,15 @@
# FRE-4529: Repo Cleanup Analysis

**Status**: Complete
**Date**: 2026-05-02
**Role**: CTO

Diagnosed the "WTF happened" in the FrenoCorp repo. Found 7 structural problems from rapid parallel agent work. Created cleanup plan and 3 child issues.

## Key Outputs
- Analysis comment on [FRE-4529](/FRE/issues/FRE-4529)
- Cleanup plan document on [FRE-4529#document-plan](/FRE/issues/FRE-4529#document-plan)
- Child issues:
  - [FRE-4530](/FRE/issues/FRE-4530) - Naming resolution
  - [FRE-4531](/FRE/issues/FRE-4531) - Structural cleanup
  - [FRE-4532](/FRE/issues/FRE-4532) - TODO audit
agents/cto/memory/2026-05-02.md (new file, 166 lines)
@@ -0,0 +1,166 @@
# 2026-05-02

## FRE-4529 - WTF happened in the FrenoCorp Dir?
**Status**: In progress
**Wake reason**: issue_assigned

Analyzed the repository for the "WTF happened" issue. Found:
- Massive influx of ShieldAI-related code from multiple agents over the past week
- ~38 source files changed in today's auto-commit alone (2,040+ lines added)
- Multiple architectural issues: naming confusion, duplicate services, scattered code, TODO placeholders

Created cleanup plan document and posted to issue.

## Key Findings
1. Repo name is "FrenoCorp" but packages use `@shieldsai/` scope
2. Duplicated: call-analysis in `src/lib/call-analysis/` vs `server/webrtc/`
3. Mixed architecture patterns across services
4. Loose root-level files from individual agent work
5. Stale branches (`simple`, `complicated`) with divergent content
6. Empty `tasks/` directory
7. Multiple TODO placeholders in production code

---
## Second heartbeat: FRE-4529 reopened

**Wake**: issue_reopened_via_comment — user corrected: ShieldAI is at ~/code/ShieldAI

**Key correction**: ShieldAI = separate repo. FrenoCorp has **divergent independent implementations** of ShieldAI services, not copies. Services flattened as modules in `apps/api/src/services/` vs standalone microservices in ShieldAI.

**Revised findings**:
1. Overlap: spamshield, voiceprint, darkwatch, shared-billing, shared-notifications, shared-db, jobs, alerts, webrtc — all exist in both repos with *different code*
2. Wrong npm scope: `@shieldsai/` used in FrenoCorp packages
3. Unique to FrenoCorp (keep): apps/web, apps/mobile, src-tauri, agents/, server/trpc, server/websocket, shared-analytics, shared-auth, shared-ui, shared-utils

**Action**: Posted correction comment, updated plan doc (rev 2), raised board approval for direction choice (Option A: consume ShieldAI as dependency vs Option B: divorce cleanly). Awaiting board decision.

---
## Third heartbeat: issue_continuation_needed

**Wake**: issue_continuation_needed — board approval still pending, no new comments.

**Action**: Set FRE-4529 to `blocked` with the board approval as the blocker. Child issues also stalled. Woken by harness, but no actionable work until the board decides.

---
## Fourth heartbeat: issue_reopened_via_comment

**Wake**: Board gave direction! ShieldAI code was added mistakenly — move to ~/code/ShieldAI. Scripter code is also in FrenoCorp — move to ~/code/scripter. Only agents/ and analysis/ stay. Favor newer for conflicts.

**Full inventory completed**:
- ShieldAI: apps/{api,web,mobile}, packages/ (8), services/voiceprint-ml, server/{alerts,webrtc}, examples/, root config
- Scripter: src/, src-tauri/, server/{trpc,websocket,types}, public/, brand/, marketing/, docs/, dist/, scripts/, index.html, root *.md
- Keep: agents/, analysis/, memory/, plans/
- pop: no code found in FrenoCorp
- tasks/ empty — delete

**Action**: Posted comprehensive analysis comment and updated plan to rev 3. Asked board if src-tauri ("frenocorp-desktop") is FrenoCorp's own app or should merge into Scripter.

---
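The "favor newer" conflict rule above can be expressed mechanically as a modification-time comparison between the two copies. A sketch under invented paths and timestamps (the real moves were resolved by hand):

```shell
# Sketch: "favor newer" as an mtime comparison. Paths and timestamps are
# invented stand-ins, not the actual conflicted files.
set -eu
work=$(mktemp -d)
mkdir -p "$work/frenocorp" "$work/shieldai"
printf 'old\n' > "$work/shieldai/config.ts"
printf 'new\n' > "$work/frenocorp/config.ts"
touch -t 202604010000 "$work/shieldai/config.ts"   # older copy
touch -t 202605020000 "$work/frenocorp/config.ts"  # newer copy
src="$work/frenocorp/config.ts"
dst="$work/shieldai/config.ts"
if [ "$src" -nt "$dst" ]; then
  mv -f "$src" "$dst"   # source copy is newer: it wins the conflict
fi
cat "$dst"   # prints: new
```

The `-nt` test compares modification times, so whichever repo touched the file last wins regardless of direction.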
## Fifth heartbeat: issue_children_completed

**Wake**: child issues FRE-4530/4531/4532 cancelled (old plan superseded). Proceeded to execute Phase A1.

**Phase A1 executed**: Transferred 45 files from FrenoCorp → ~/code/ShieldAI (commit 1e42c4a):
- Root configs, services (spamshield/voiceprint/darkwatch extras), VoicePrint ML, server/alerts, routes, 4 new packages, supplemented 2 existing packages, examples
- Favor-newer policy applied correctly

**Created new child issues**:
- FRE-4533 → Senior Engineer: Merge apps/{api,web,mobile} + shared-db into ShieldAI
- FRE-4534 → Founding Engineer: Move Scripter code to ~/code/scripter
- Phase C (FrenoCorp prune) — blocked until A+B done

**Open question pending**: src-tauri identity (frenocorp-desktop vs scripter)

---
## Sixth heartbeat: issue_assigned - FRE-4533 Phase A2

**Wake**: FRE-4533 assigned to CTO (was previously planned for Senior Engineer).

**Action**: Executed Phase A2 — merged apps/{api,web,mobile} and shared-db from FrenoCorp → ShieldAI.

**ShieldAI commits**:
- 1197fe4 (reverted): First attempt put files in apps/ not packages/
- e704a90 (final): Properly merged into packages/ (api, web, mobile, shared-db)

**Key details**:
- apps/api → merged into packages/api/ (added middleware, config, services/{darkwatch,spamshield,voiceprint}, tests, notifications route)
- apps/web → packages/web/ stub
- apps/mobile → packages/mobile/ stub
- packages/shared-db → preserved as-is alongside packages/db (needs reconciliation)
- ShieldAI's package.json and tsconfig for api were restored after accidental overwrite

**Follow-up items documented on issue**:
1. Reconcile shared-db with db (Prisma schema merge)
2. Fix server.ts correlationRoutes import
3. Build out web/mobile stubs
4. Phase C (FrenoCorp prune) — blocked on FRE-4534
## Sixth heartbeat (14:15 UTC) — FRE-4534 Phase B: Move Scripter code

**Wake reason**: issue_assigned
**Previous run**: Founding Engineer run "succeeded" but no detail captured.

### Analysis
- Scripter repo exists at ~/code/scripter (git@git.freno.me:Mike/Scripter.git) — independent project
- FrenoCorp scripter files still in place; need to move
- Policy: "favor newer" for conflicts
- server/trpc/ structure differs completely between repos (flat routers vs modular)
- server/types/project.ts imported by src/components/characters/ — cross-dependency

### Actions Taken
1. **Created child issue** [FRE-4535](/FRE/issues/FRE-4535) → Founding Engineer for overlap comparison work (src/, src-tauri/, server/trpc/, marketing/, public/)
2. **Moved non-overlapping files** from FrenoCorp → scripter, committed in both repos:
   - brand/, scripts/, server/types/, server/websocket/, .eslintrc.json, FRE-4510-IMPLEMENTATION.md
   - Merged .gitignore
3. **Removed from FrenoCorp** (git rm, commit 4e07718e6)
4. **Committed to scripter** (commit 5b128d9)
5. **Posted progress comment** on FRE-4534

### CTO Oversight
- 50 open issues total, 15 in in_review — code review pipeline is backed up
- Senior Engineer on FRE-4473 (blocked), Security Reviewer on FRE-4498
- Founding Engineer now has FRE-4535 as sole active task
- Critical issues all CMO/CEO-owned (Product Hunt launch)

### Next
- Founding Engineer picks up [FRE-4535](/FRE/issues/FRE-4535)
- Need Phase C issue (FrenoCorp prune) once A+B done
- Code review pipeline needs attention — 15 items queued
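The two-repo move pattern used in the actions above (copy into the destination repo, commit there, then `git rm` and commit in the source) can be sketched as follows. Repo paths and the moved directory are illustrative stand-ins, not the actual commands that were run:

```shell
# Sketch: move a directory between two git repos with a commit in each.
# Paths and contents are invented for illustration.
set -eu
work=$(mktemp -d)
for r in frenocorp scripter; do
  git init -q "$work/$r"
  git -C "$work/$r" config user.email cto@example.com
  git -C "$work/$r" config user.name CTO
done
mkdir -p "$work/frenocorp/brand"
echo "logo" > "$work/frenocorp/brand/logo.svg"
git -C "$work/frenocorp" add -A
git -C "$work/frenocorp" commit -qm "before move"
# 1) copy into destination and commit there
cp -R "$work/frenocorp/brand" "$work/scripter/brand"
git -C "$work/scripter" add -A
git -C "$work/scripter" commit -qm "Import brand/ from FrenoCorp"
# 2) remove from source and commit the deletion
git -C "$work/frenocorp" rm -r -q brand
git -C "$work/frenocorp" commit -qm "Move brand/ to scripter"
```

Copying before deleting means a failure partway leaves the files duplicated rather than lost, which is the safe failure mode for this kind of migration.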
## Seventh heartbeat (continuation) — FRE-4534 Phase B follow-up

### Status
- **FRE-4535** (child, Founding Engineer) — checked out, in_progress. No completion yet.
- **FRE-4533** (Phase A2: ShieldAI merge) — **done** ✅
- **Created FRE-4536** (Phase C: FrenoCorp prune) — blocked on FRE-4534, assigned to Founding Engineer
  - Blocked by: FRE-4534 via blockedByIssueIds
  - Will auto-wake Founding Engineer when FRE-4534 completes

### Next
- Wait for Founding Engineer to complete FRE-4535
- FRE-4534 stays in_progress until FRE-4535 completes
- FRE-4536 auto-wakes when FRE-4534 → done
## Eighth heartbeat (liveness continuation) — FRE-4534 Phase B completed

### Actions
- FRE-4535 (Founding Engineer) still had no progress since creation → took over the overlap work directly
- **Remaining overlap items moved** from FrenoCorp → scripter:
  - src/, src-tauri/, server/trpc/ (archived to legacy/), marketing/, docs/, public/, dist/, index.html
  - Favor-newer applied: scripter's modular structure preserved, FrenoCorp extras merged
- **FrenoCorp cleaned**: `git rm` of all scripter files (373 deletions, commit 0cc005414)
- **Scripter updated**: 155 new files committed (df1360a)
- **FRE-4534 marked done** ✅
- **FRE-4535 marked cancelled** (superseded by direct CTO action)
- **FRE-4536** (Phase C) auto-woken → Founding Engineer now has it in_progress

### Final tally
Total scripter files moved to ~/code/scripter across all heartbeats:
- brand/, scripts/, server/types/, server/websocket/, .eslintrc.json, FRE-4510-IMPLEMENTATION.md (heartbeat 1)
- marketing/ (66 files), docs/, public/manifest.json, src-tauri extras, src/ extras, server/trpc/legacy/ (heartbeat 2)
- .gitignore merged
apps/api/node_modules/.vite/vitest/results.json (deleted, 1 line, generated, vendored)
@@ -1 +0,0 @@
{"version":"1.6.1","results":[[":src/__tests__/spam-rate-limit.test.ts",{"duration":41,"failed":false}]]}
@@ -1,29 +0,0 @@
{
  "name": "api",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "tsx watch src/index.ts",
    "build": "tsc",
    "lint": "eslint src/"
  },
  "dependencies": {
    "@fastify/cors": "^11.2.0",
    "@fastify/helmet": "^13.0.2",
    "@shieldsai/shared-analytics": "*",
    "@shieldsai/shared-auth": "*",
    "@shieldsai/shared-billing": "*",
    "@shieldsai/shared-db": "*",
    "@shieldsai/shared-notifications": "*",
    "@shieldsai/shared-utils": "*",
    "fastify": "^4.25.0",
    "fastify-plugin": "^4.5.0",
    "ioredis": "^5.3.0"
  },
  "devDependencies": {
    "@types/node": "^25.6.0",
    "tsx": "^4.7.1",
    "typescript": "^5.3.3"
  }
}
@@ -1,144 +0,0 @@
import { describe, it, expect, beforeEach, vi } from 'vitest';
import { SMSClassifierService } from '../services/spamshield/spamshield.service';

// Mock shared-db before anything else (Prisma client is not generated in test env)
vi.mock('@shieldsai/shared-db', () => ({
  prisma: {},
  SpamFeedback: {},
}));

// Mock the feature flags module to control enableMLClassifier
vi.mock('../services/spamshield/spamshield.config', () => ({
  spamShieldEnv: {
    SPAM_THRESHOLD_AUTO_BLOCK: 0.85,
    SPAM_THRESHOLD_FLAG: 0.6,
  },
  spamFeatureFlags: {
    enableMLClassifier: true,
  },
  SpamDecision: {
    ALLOW: 'allow',
    FLAG: 'flag',
    BLOCK: 'block',
    CHALLENGE: 'challenge',
  },
  SpamLayer: {
    NUMBER_REPUTATION: 'number_reputation',
    CONTENT_CLASSIFICATION: 'content_classification',
    BEHAVIORAL_ANALYSIS: 'behavioral_analysis',
    COMMUNITY_INTELLIGENCE: 'community_intelligence',
  },
  ConfidenceLevel: {
    LOW: 'low',
    MEDIUM: 'medium',
    HIGH: 'high',
    VERY_HIGH: 'very_high',
  },
  spamRateLimits: {},
}));

describe('SMSClassifierService', () => {
  let classifier: SMSClassifierService;
  let initializeCalls: number;
  let initializeDelay: Promise<void>;

  beforeEach(() => {
    // Re-import after mock to get fresh module state
    initializeCalls = 0;
    initializeDelay = new Promise(resolve => setTimeout(resolve, 50));

    classifier = new SMSClassifierService();
    // Override initialize to track calls and add delay
    classifier.initialize = async () => {
      initializeCalls++;
      await initializeDelay;
    };
  });

  describe('initialization race condition', () => {
    it('should call initialize only once under concurrent classify calls', async () => {
      const promises = Array.from({ length: 10 }, () =>
        classifier.classify('ACT NOW - Limited offer!'),
      );

      const results = await Promise.all(promises);

      expect(initializeCalls).toBe(1);
      expect(results).toHaveLength(10);
      results.forEach(r => {
        expect(r).toHaveProperty('isSpam');
        expect(r).toHaveProperty('confidence');
        expect(r).toHaveProperty('spamFeatures');
      });
    });

    it('should handle interleaved calls after partial initialization', async () => {
      const batch1 = Array.from({ length: 5 }, () =>
        classifier.classify('First batch message'),
      );

      await Promise.all(batch1);

      expect(initializeCalls).toBe(1);

      const batch2 = Array.from({ length: 5 }, () =>
        classifier.classify('Second batch message'),
      );

      await Promise.all(batch2);

      // initialize should still only have been called once
      expect(initializeCalls).toBe(1);
    });

    it('should return consistent results for same input under concurrency', async () => {
      const text = 'URGENT: Click http://example.com now!';
      const promises = Array.from({ length: 20 }, () =>
        classifier.classify(text),
      );

      const results = await Promise.all(promises);

      const firstResult = results[0];
      results.forEach(r => {
        expect(r.isSpam).toBe(firstResult.isSpam);
        expect(r.confidence).toBe(firstResult.confidence);
        expect(r.spamFeatures).toEqual(firstResult.spamFeatures);
      });
    });

    it('should handle rapid sequential calls without re-initializing', async () => {
      for (let i = 0; i < 50; i++) {
        await classifier.classify(`Message ${i}`);
      }

      expect(initializeCalls).toBe(1);
    });
  });

  describe('feature extraction', () => {
    it('should detect URL presence', async () => {
      const result = await classifier.classify('Visit www.example.com');
      expect(result.spamFeatures).toContain('url_present');
    });

    it('should detect urgency keywords', async () => {
      const result = await classifier.classify('Act now! This offer is urgent.');
      expect(result.spamFeatures).toContain('urgency_keyword');
    });

    it('should detect excessive capitalization', async () => {
      const result = await classifier.classify('BUY THIS NOW!!!');
      expect(result.spamFeatures).toContain('excessive_caps');
    });

    it('should detect multiple features', async () => {
      const result = await classifier.classify(
        'URGENT: Visit www.example.com NOW!!!',
      );
      expect(result.spamFeatures).toContain('url_present');
      expect(result.spamFeatures).toContain('urgency_keyword');
      expect(result.spamFeatures).toContain('excessive_caps');
    });
  });
});
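The race-condition tests above assert that `initialize` runs exactly once even under concurrent `classify` calls. The usual way to satisfy that contract is to memoize the in-flight initialization promise; a minimal sketch, with an illustrative class (not the actual `SMSClassifierService`):

```typescript
// Sketch: memoize the in-flight initialize() promise so concurrent classify()
// calls share a single initialization. Names are illustrative.
class LazyClassifier {
  private initPromise: Promise<void> | null = null;
  public initializeCalls = 0;

  async initialize(): Promise<void> {
    this.initializeCalls++;
    // Simulate an async model load
    await new Promise<void>(resolve => setTimeout(resolve, 10));
  }

  private ensureInitialized(): Promise<void> {
    // The check-and-assign is synchronous, so the first caller creates the
    // promise and every later caller awaits that same promise.
    if (!this.initPromise) {
      this.initPromise = this.initialize();
    }
    return this.initPromise;
  }

  async classify(text: string): Promise<{ isSpam: boolean }> {
    await this.ensureInitialized();
    return { isSpam: /urgent|act now/i.test(text) };
  }
}

async function demo() {
  const c = new LazyClassifier();
  await Promise.all(Array.from({ length: 10 }, () => c.classify('ACT NOW!')));
  console.log(c.initializeCalls); // 1
}
demo();
```

Because JavaScript is single-threaded between awaits, no lock is needed: storing the promise before the first `await` closes the race window the tests probe.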
@@ -1,98 +0,0 @@
import { describe, it, expect, beforeAll, afterAll, beforeEach, afterEach } from 'vitest';
import { RedisRateLimiter } from '../middleware/spam-rate-limit.middleware';
import { redis } from '../config/redis';

describe('RedisRateLimiter', () => {
  const testKey = 'test-client';
  const limiter = new RedisRateLimiter();

  beforeAll(async () => {
    await redis.connect();
  });

  afterAll(async () => {
    await redis.quit();
  });

  beforeEach(async () => {
    await redis.del('spamshield:ratelimit:test-client');
    await redis.del('spamshield:ratelimit:daily:test-client');
  });

  afterEach(async () => {
    await redis.del('spamshield:ratelimit:test-client');
    await redis.del('spamshield:ratelimit:daily:test-client');
  });

  describe('checkLimit (per-minute)', () => {
    it('should allow requests within the limit', async () => {
      const result = await limiter.checkLimit(testKey, 60, 10);

      expect(result.remaining).toBe(9);
      expect(result.retryAfter).toBeUndefined();
    });

    it('should decrement remaining on each request', async () => {
      const result1 = await limiter.checkLimit(testKey, 60, 10);
      const result2 = await limiter.checkLimit(testKey, 60, 10);

      expect(result1.remaining).toBe(9);
      expect(result2.remaining).toBe(8);
    });

    it('should exceed limit after max requests', async () => {
      for (let i = 0; i < 10; i++) {
        await limiter.checkLimit(testKey, 60, 10);
      }

      const result = await limiter.checkLimit(testKey, 60, 10);

      expect(result.remaining).toBe(0);
      expect(result.retryAfter).toBeGreaterThan(0);
    });

    it('should return retry-after when limit is exceeded', async () => {
      for (let i = 0; i < 10; i++) {
        await limiter.checkLimit(testKey, 60, 10);
      }

      const result = await limiter.checkLimit(testKey, 60, 10);

      expect(result.retryAfter).toBeGreaterThan(0);
      expect(result.retryAfter).toBeLessThanOrEqual(60000);
    });
  });

  describe('checkDailyLimit', () => {
    it('should allow requests within daily limit', async () => {
      const result = await limiter.checkDailyLimit(testKey, 100);

      expect(result.remaining).toBe(99);
      expect(result.retryAfter).toBeUndefined();
    });

    it('should exceed daily limit after max requests', async () => {
      for (let i = 0; i < 100; i++) {
        await limiter.checkDailyLimit(testKey, 100);
      }

      const result = await limiter.checkDailyLimit(testKey, 100);

      expect(result.remaining).toBe(0);
      expect(result.retryAfter).toBeGreaterThan(0);
    });
  });

  describe('reset', () => {
    it('should clear the rate limit counter', async () => {
      await limiter.checkLimit(testKey, 60, 10);
      await limiter.checkLimit(testKey, 60, 10);

      await limiter.reset(testKey);

      const result = await limiter.checkLimit(testKey, 60, 10);

      expect(result.remaining).toBe(9);
    });
  });
});
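The tests above pin down the limiter's contract: `checkLimit(key, windowSeconds, max)` returns a decreasing `remaining`, and `retryAfter` appears only once the window is exhausted. The real middleware is Redis-backed; an in-memory sketch of the same fixed-window contract, with illustrative names:

```typescript
// Sketch: a fixed-window rate limiter matching the contract exercised by the
// tests above, backed by a Map instead of Redis. Names are illustrative.
interface LimitResult {
  remaining: number;
  retryAfter?: number; // ms until the window resets, present only when exceeded
}

class FixedWindowLimiter {
  private windows = new Map<string, { count: number; resetAt: number }>();

  async checkLimit(key: string, windowSeconds: number, max: number): Promise<LimitResult> {
    const now = Date.now();
    let w = this.windows.get(key);
    if (!w || now >= w.resetAt) {
      // Start a fresh window for this key
      w = { count: 0, resetAt: now + windowSeconds * 1000 };
      this.windows.set(key, w);
    }
    if (w.count >= max) {
      return { remaining: 0, retryAfter: w.resetAt - now };
    }
    w.count++;
    return { remaining: max - w.count };
  }

  async reset(key: string): Promise<void> {
    this.windows.delete(key);
  }
}
```

A Redis version typically replaces the Map with `INCR` plus `EXPIRE` on first increment, which gives the same window semantics across processes.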
@@ -1,55 +0,0 @@
import { z } from 'zod';

// Environment variables
// Note: defaults are given as strings because z.string() validates the raw
// value before .transform(Number) runs.
const envSchema = z.object({
  NODE_ENV: z.enum(['development', 'production', 'test']).default('development'),
  PORT: z.string().transform(Number).default('3000'),
  HOST: z.string().default('0.0.0.0'),
  API_RATE_LIMIT_WINDOW: z.string().transform(Number).default('60000'), // 1 minute
  API_RATE_LIMIT_MAX_REQUESTS: z.string().transform(Number).default('100'),
  CORS_ORIGIN: z.string().default('http://localhost:5173'),
});

export const apiEnv = envSchema.parse({
  NODE_ENV: process.env.NODE_ENV,
  PORT: process.env.PORT,
  HOST: process.env.HOST,
  API_RATE_LIMIT_WINDOW: process.env.API_RATE_LIMIT_WINDOW,
  API_RATE_LIMIT_MAX_REQUESTS: process.env.API_RATE_LIMIT_MAX_REQUESTS,
  CORS_ORIGIN: process.env.CORS_ORIGIN,
});

// Rate limit configuration by tier
export const rateLimitConfig = {
  basic: {
    windowMs: 60000, // 1 minute
    maxRequests: 100,
  },
  plus: {
    windowMs: 60000,
    maxRequests: 500,
  },
  premium: {
    windowMs: 60000,
    maxRequests: 2000,
  },
};

// API versioning configuration
export const apiVersioning = {
  defaultVersion: '1',
  headerName: 'X-API-Version',
  queryParam: 'api-version',
};

// Logging configuration
export const loggingConfig = {
  level: apiEnv.NODE_ENV === 'production' ? 'info' : 'debug',
  transport: apiEnv.NODE_ENV === 'development' ? {
    target: 'pino-pretty',
    options: {
      colorize: true,
      translateTime: true,
    },
  } : undefined,
};
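The config module above validates raw env strings, applies defaults, and coerces numerics. The same idea can be shown without any dependency; a minimal sketch (the `parseEnv` helper is illustrative, not part of the repo):

```typescript
// Sketch: dependency-free version of the env-parsing idea above. Read a raw
// string map, apply defaults, coerce numerics, and fail fast on bad values.
function parseEnv(raw: Record<string, string | undefined>) {
  const num = (value: string | undefined, fallback: number): number => {
    if (value === undefined) return fallback;
    const n = Number(value);
    if (Number.isNaN(n)) throw new Error(`expected a number, got "${value}"`);
    return n;
  };
  return {
    NODE_ENV: raw.NODE_ENV ?? 'development',
    PORT: num(raw.PORT, 3000),
    HOST: raw.HOST ?? '0.0.0.0',
    API_RATE_LIMIT_WINDOW: num(raw.API_RATE_LIMIT_WINDOW, 60000),
    API_RATE_LIMIT_MAX_REQUESTS: num(raw.API_RATE_LIMIT_MAX_REQUESTS, 100),
  };
}

console.log(parseEnv({ PORT: '8080' }).PORT); // 8080
```

Parsing everything once at startup means the rest of the app works with typed values and misconfiguration surfaces immediately rather than mid-request.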
@@ -1,18 +0,0 @@
import { Redis } from 'ioredis';

const redisHost = process.env.REDIS_HOST || 'localhost';
const redisPort = parseInt(process.env.REDIS_PORT || '6379', 10);

export const redis = new Redis({
  host: redisHost,
  port: redisPort,
  retryStrategy: (times: number) => Math.min(times * 50, 2000),
  lazyConnect: true,
});

export async function getRedisConnection(): Promise<Redis> {
  // Only call connect() from the 'wait' state: ioredis rejects connect() while
  // already connecting. Commands issued meanwhile are queued until ready.
  if (redis.status === 'wait') {
    await redis.connect();
  }
  return redis;
}
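The `retryStrategy` above backs off linearly by 50 ms per attempt, capped at 2 seconds. As a standalone check of that arithmetic:

```typescript
// The reconnect delay grows linearly (times * 50 ms) and is capped at 2000 ms.
const retryStrategy = (times: number): number => Math.min(times * 50, 2000);

console.log(retryStrategy(1));   // 50
console.log(retryStrategy(10));  // 500
console.log(retryStrategy(100)); // 2000
```

Returning a number tells ioredis to retry after that many milliseconds; the cap keeps long outages from producing ever-growing delays.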
@@ -1,106 +0,0 @@
import Fastify from 'fastify';
import cors from '@fastify/cors';
import helmet from '@fastify/helmet';
import { authMiddleware } from './middleware/auth.middleware';
import { rateLimitMiddleware } from './middleware/rate-limit.middleware';
import { spamRateLimitMiddleware } from './middleware/spam-rate-limit.middleware';
import { errorHandlingMiddleware } from './middleware/error-handling.middleware';
import { loggingMiddleware } from './middleware/logging.middleware';
import { apiEnv, loggingConfig } from './config/api.config';
import { routes } from './routes';

const fastify = Fastify({
  logger: loggingConfig,
  ignoreTrailingSlash: true,
  maxParamLength: 500,
});

// Register plugins
async function registerPlugins() {
  // CORS configuration
  await fastify.register(cors, {
    origin: apiEnv.CORS_ORIGIN,
    methods: ['GET', 'POST', 'PUT', 'DELETE', 'PATCH', 'OPTIONS'],
    credentials: true,
  });

  // Security headers
  await fastify.register(helmet, {
    global: true,
    contentSecurityPolicy: false,
  });

  // Rate limiting
  await fastify.register(rateLimitMiddleware);

  // SpamShield rate limiting (Redis-backed)
  await fastify.register(spamRateLimitMiddleware);

  // Authentication
  await fastify.register(authMiddleware);

  // Logging
  await fastify.register(loggingMiddleware);

  // Error handling
  await fastify.register(errorHandlingMiddleware);
}

// Register routes
async function registerRoutes() {
  await fastify.register(routes, { prefix: '/api/v1' });
}

// Health check endpoint
fastify.get('/health', async () => {
  return { status: 'ok', timestamp: new Date().toISOString() };
});

// Root endpoint
fastify.get('/', async () => {
  return {
    name: 'FrenoCorp API Gateway',
    version: '1.0.0',
    environment: apiEnv.NODE_ENV,
  };
});

// Start server
async function start() {
  await registerPlugins();
  await registerRoutes();

  try {
    await fastify.listen({
      port: apiEnv.PORT,
      host: apiEnv.HOST,
    });

    console.log(`🚀 API Gateway running at http://${apiEnv.HOST}:${apiEnv.PORT}`);
    console.log(`📝 Environment: ${apiEnv.NODE_ENV}`);
    console.log(`📊 Rate limit window: ${apiEnv.API_RATE_LIMIT_WINDOW}ms`);
    console.log(`📈 Max requests: ${apiEnv.API_RATE_LIMIT_MAX_REQUESTS}`);
  } catch (err) {
    console.error(err);
    process.exit(1);
  }
}

// Graceful shutdown
const gracefulShutdown = async (signal: string) => {
  console.log(`\n🛑 ${signal} received, shutting down gracefully...`);
  await fastify.close();
  console.log('✅ Server closed');
  process.exit(0);
};

process.on('SIGINT', () => gracefulShutdown('SIGINT'));
process.on('SIGTERM', () => gracefulShutdown('SIGTERM'));

// Export for testing
export { fastify };

// Start if running directly
if (process.argv[1] === new URL(import.meta.url).pathname) {
  start();
}
@@ -1,86 +0,0 @@
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';

export interface AuthRequest extends FastifyRequest {
  user?: {
    id: string;
    email: string;
    role: string;
    organizationId?: string;
  };
  apiKey?: string;
  authType: 'jwt' | 'api-key' | 'anonymous';
}

export async function authMiddleware(fastify: FastifyInstance) {
  // Authentication hook
  fastify.addHook('onRequest', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as AuthRequest;
    // Skip auth for health checks and root. Match exactly: a startsWith('/')
    // check would treat every URL as public.
    const publicRoutes = ['/', '/health'];
    if (publicRoutes.includes(request.url)) {
      authReq.authType = 'anonymous';
      return;
    }

    // Try JWT authentication first
    const authHeader = request.headers.authorization;
    if (authHeader?.startsWith('Bearer ')) {
      const token = authHeader.slice(7);
      try {
        // In production, decode and verify JWT
        // For now, we'll attach a placeholder user
        authReq.user = {
          id: 'user-placeholder',
          email: 'user@example.com',
          role: 'user',
        };
        authReq.authType = 'jwt';
        return;
      } catch (err) {
        // JWT invalid, continue to API key check
      }
    }

    // Try API key authentication
    const apiKey = request.headers['x-api-key'] as string | undefined;
    if (apiKey) {
      // In production, validate API key against database
      authReq.apiKey = apiKey;
      authReq.user = {
        id: `api-${apiKey}`,
        email: `api-${apiKey}@services.internal`,
        role: 'service',
      };
      authReq.authType = 'api-key';
      return;
    }

    // No auth found - attach anonymous user
    authReq.authType = 'anonymous';
    authReq.user = {
      id: 'anonymous',
      email: 'anonymous@unknown',
      role: 'anonymous',
    };
  });

  // Create auth decorator for route-level protection
  fastify.decorate('requireAuth', async (request: AuthRequest) => {
    if (request.authType === 'anonymous') {
      throw { statusCode: 401, message: 'Authentication required' };
    }
    return true;
  });

  fastify.decorate('requireRole', (allowedRoles: string[]) => {
    return async (request: AuthRequest) => {
      if (!request.user?.role || !allowedRoles.includes(request.user.role)) {
        throw {
          statusCode: 403,
          message: `Role ${request.user?.role} not in allowed roles: ${allowedRoles.join(', ')}`,
        };
      }
      return true;
    };
  });
}
@@ -1,62 +0,0 @@
|
|||||||
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
|
|
||||||
|
|
||||||
export interface ErrorResponse {
|
|
||||||
error: string;
|
|
||||||
message: string;
|
|
||||||
statusCode: number;
|
|
||||||
code?: string;
|
|
||||||
details?: Record<string, unknown>;
|
|
||||||
timestamp: string;
|
|
||||||
path: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
export async function errorHandlingMiddleware(fastify: FastifyInstance) {
|
|
||||||
// Custom error handler
|
|
||||||
fastify.setErrorHandler((error, request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const response: ErrorResponse = {
|
|
||||||
error: error.name || 'Internal Server Error',
|
|
||||||
message: error.message || 'An unexpected error occurred',
|
|
||||||
statusCode: error.statusCode || 500,
|
|
||||||
code: (error as any).code,
|
|
||||||
timestamp: new Date().toISOString(),
|
|
||||||
path: request.url,
|
|
||||||
};
|
|
||||||
|
|
||||||
// Log error
|
|
||||||
fastify.log.error({
|
|
||||||
error: response,
|
|
||||||
stack: error.stack,
|
|
||||||
method: request.method,
|
|
||||||
userAgent: request.headers['user-agent'],
|
|
||||||
});
|
|
||||||
|
|
||||||
// Send standardized error response
|
|
||||||
reply.status(response.statusCode).send(response);
|
|
||||||
});
|
|
||||||
|
|
||||||
// 404 handler
|
|
||||||
fastify.setNotFoundHandler((request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
reply.status(404).send({
|
|
||||||
error: 'Not Found',
|
|
||||||
message: `Route ${request.method} ${request.url} not found`,
|
|
||||||
statusCode: 404,
|
|
||||||
timestamp: new Date().toISOString(),
|
|
||||||
path: request.url,
|
|
||||||
});
|
|
||||||
});
|
|
||||||
|
|
||||||
// Validation error handler
|
|
||||||
fastify.addHook('onError', async (request: FastifyRequest, reply: FastifyReply, error) => {
|
|
||||||
if (error.validation) {
|
|
||||||
reply.status(400).send({
|
|
||||||
error: 'Validation Error',
|
|
||||||
message: 'Request validation failed',
|
|
||||||
statusCode: 400,
|
|
||||||
code: 'VALIDATION_ERROR',
|
|
||||||
details: error.validation,
|
|
||||||
timestamp: new Date().toISOString(),
|
|
||||||
path: request.url,
|
|
||||||
});
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
@@ -1,66 +0,0 @@
|
|||||||
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
|
|
||||||
|
|
||||||
export interface RequestLog {
|
|
||||||
method: string;
|
|
||||||
url: string;
|
|
||||||
statusCode: number;
|
|
||||||
responseTime: number;
|
|
||||||
requestId: string;
|
|
||||||
userAgent?: string;
|
|
||||||
clientIp: string;
|
|
||||||
requestIdHeader?: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
export async function loggingMiddleware(fastify: FastifyInstance) {
|
|
||||||
// Generate request ID if not present
|
|
||||||
fastify.addHook('onRequest', (request: FastifyRequest, reply: FastifyReply, done) => {
|
|
||||||
const requestId =
|
|
||||||
request.headers['x-request-id'] ||
|
|
||||||
request.headers['x-correlation-id'] ||
|
|
||||||
`req-${Date.now()}-${Math.random().toString(36).slice(2, 9)}`;
|
|
||||||
|
|
||||||
request.headers['x-request-id'] = requestId;
|
|
||||||
(request as any).requestId = requestId;
|
|
||||||
|
|
||||||
done();
|
|
||||||
});
|
|
||||||
|
|
||||||
// Log request start
|
|
||||||
fastify.addHook('onRequest', (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
fastify.log.info({
|
|
||||||
event: 'request_start',
|
|
||||||
method: request.method,
|
|
||||||
url: request.url,
|
|
||||||
requestId: (request as any).requestId,
|
|
||||||
userAgent: request.headers['user-agent'],
|
|
||||||
clientIp: request.ip || request.headers['x-forwarded-for'] || 'unknown',
|
|
||||||
});
|
|
||||||
});
|
|
||||||
|
|
||||||
// Log response
|
|
||||||
fastify.addHook('onResponse', (request: FastifyRequest, reply: FastifyReply, done) => {
|
|
||||||
const log: RequestLog = {
|
|
||||||
method: request.method,
|
|
||||||
url: request.url,
|
|
||||||
statusCode: reply.statusCode,
|
|
||||||
responseTime: reply.elapsedTime,
|
|
||||||
requestId: (request as any).requestId,
|
|
||||||
userAgent: request.headers['user-agent'],
|
|
||||||
clientIp: request.ip || request.headers['x-forwarded-for'] || 'unknown',
|
|
||||||
requestIdHeader: request.headers['x-request-id'] as string,
|
|
||||||
};
|
|
||||||
|
|
||||||
// Log based on status code
|
|
||||||
if (reply.statusCode < 300) {
|
|
||||||
fastify.log.info(log);
|
|
||||||
} else if (reply.statusCode < 400) {
|
|
||||||
fastify.log.warn(log);
|
|
||||||
} else if (reply.statusCode < 500) {
|
|
||||||
fastify.log.warn(log);
|
|
||||||
} else {
|
|
||||||
fastify.log.error(log);
|
|
||||||
}
|
|
||||||
|
|
||||||
done();
|
|
||||||
});
|
|
||||||
}
|
|
||||||
@@ -1,116 +0,0 @@
|
|||||||
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
|
|
||||||
import { apiEnv, rateLimitConfig } from '../config/api.config';
|
|
||||||
|
|
||||||
// Simple in-memory rate limiter
|
|
||||||
// In production, this should use Redis or similar distributed store
|
|
||||||
class RateLimiter {
|
|
||||||
private store: Map<string, { count: number; resetTime: number }>;
|
|
||||||
|
|
||||||
constructor() {
|
|
||||||
this.store = new Map();
|
|
||||||
}
|
|
||||||
|
|
||||||
async checkLimit(
|
|
||||||
key: string,
|
|
||||||
windowMs: number,
|
|
||||||
maxRequests: number
|
|
||||||
): Promise<{ remaining: number; resetTime: number; retryAfter?: number }> {
|
|
||||||
const now = Date.now();
|
|
||||||
const current = this.store.get(key);
|
|
||||||
|
|
||||||
if (!current || now > current.resetTime) {
|
|
||||||
// Reset window
|
|
||||||
this.store.set(key, {
|
|
||||||
count: 1,
|
|
||||||
resetTime: now + windowMs,
|
|
||||||
});
|
|
||||||
|
|
||||||
return {
|
|
||||||
remaining: maxRequests - 1,
|
|
||||||
resetTime: now + windowMs,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
// Increment counter
|
|
||||||
current.count++;
|
|
||||||
this.store.set(key, current);
|
|
||||||
|
|
||||||
const remaining = maxRequests - current.count;
|
|
||||||
|
|
||||||
if (current.count > maxRequests) {
|
|
||||||
return {
|
|
||||||
remaining: 0,
|
|
||||||
resetTime: current.resetTime,
|
|
||||||
retryAfter: current.resetTime - now,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
return {
|
|
||||||
remaining,
|
|
||||||
resetTime: current.resetTime,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
reset(key: string) {
|
|
||||||
this.store.delete(key);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const rateLimiter = new RateLimiter();
|
|
||||||
|
|
||||||
export async function rateLimitMiddleware(fastify: FastifyInstance) {
|
|
||||||
fastify.addHook('preHandler', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
// Skip rate limiting for health checks
|
|
||||||
if (request.url === '/health') {
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Get client identifier (IP or API key)
|
|
||||||
const clientIp = request.ip || request.headers['x-forwarded-for'] || 'unknown';
|
|
||||||
const apiKey = request.headers['x-api-key'] as string | undefined;
|
|
||||||
const key = apiKey ? `api:${apiKey}` : `ip:${clientIp}`;
|
|
||||||
|
|
||||||
// Determine tier based on API key or default to basic
|
|
||||||
let tier = 'basic';
|
|
||||||
if (apiKey) {
|
|
||||||
// In production, fetch tier from user/service lookup
|
|
||||||
// For now, use a simple heuristic based on key format
|
|
||||||
if (apiKey.startsWith('premium_')) {
|
|
||||||
tier = 'premium';
|
|
||||||
} else if (apiKey.startsWith('plus_')) {
|
|
||||||
tier = 'plus';
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const config = rateLimitConfig[tier as keyof typeof rateLimitConfig];
|
|
||||||
const result = await rateLimiter.checkLimit(
|
|
||||||
key,
|
|
||||||
config.windowMs,
|
|
||||||
config.maxRequests
|
|
||||||
);
|
|
||||||
|
|
||||||
// Set rate limit headers
|
|
||||||
reply.header('X-RateLimit-Limit', config.maxRequests);
|
|
||||||
reply.header('X-RateLimit-Remaining', result.remaining);
|
|
||||||
reply.header('X-RateLimit-Reset', Math.ceil(result.resetTime / 1000));
|
|
||||||
|
|
||||||
if (result.retryAfter) {
|
|
||||||
reply.header('Retry-After', Math.ceil(result.retryAfter / 1000));
|
|
||||||
reply.code(429); // Too Many Requests
|
|
||||||
|
|
||||||
return {
|
|
||||||
error: 'Too Many Requests',
|
|
||||||
message: `Rate limit exceeded. Try again in ${Math.ceil(result.retryAfter / 1000)}s`,
|
|
||||||
tier,
|
|
||||||
limit: config.maxRequests,
|
|
||||||
reset: new Date(result.resetTime).toISOString(),
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
// Add tier info to request for downstream use
|
|
||||||
(request as any).rateLimitTier = tier;
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
// Export for testing
|
|
||||||
export { rateLimiter };
|
|
||||||
@@ -1,164 +0,0 @@
|
|||||||
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
|
|
||||||
import { redis } from '../config/redis';
|
|
||||||
import { spamRateLimits } from '../services/spamshield/spamshield.config';
|
|
||||||
|
|
||||||
const REDIS_PREFIX = 'spamshield:ratelimit';
|
|
||||||
|
|
||||||
class RedisRateLimiter {
|
|
||||||
async checkLimit(
|
|
||||||
key: string,
|
|
||||||
windowSeconds: number,
|
|
||||||
maxRequests: number
|
|
||||||
): Promise<{
|
|
||||||
remaining: number;
|
|
||||||
resetTime: number;
|
|
||||||
retryAfter?: number;
|
|
||||||
}> {
|
|
||||||
const redisKey = `${REDIS_PREFIX}:${key}`;
|
|
||||||
const now = Date.now();
|
|
||||||
|
|
||||||
const current = await redis.get(redisKey);
|
|
||||||
const windowStart = now - (now % (windowSeconds * 1000));
|
|
||||||
const resetTime = windowStart + windowSeconds * 1000;
|
|
||||||
|
|
||||||
if (!current) {
|
|
||||||
const expirySeconds = Math.ceil((resetTime - now) / 1000);
|
|
||||||
await redis.set(redisKey, '1', 'EX', expirySeconds);
|
|
||||||
|
|
||||||
return {
|
|
||||||
remaining: maxRequests - 1,
|
|
||||||
resetTime,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
const count = parseInt(current, 10) + 1;
|
|
||||||
await redis.set(redisKey, String(count), 'EX', Math.ceil((resetTime - now) / 1000));
|
|
||||||
|
|
||||||
const remaining = maxRequests - count;
|
|
||||||
|
|
||||||
if (count > maxRequests) {
|
|
||||||
return {
|
|
||||||
remaining: 0,
|
|
||||||
resetTime,
|
|
||||||
retryAfter: resetTime - now,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
return {
|
|
||||||
remaining,
|
|
||||||
resetTime,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
async checkDailyLimit(
|
|
||||||
key: string,
|
|
||||||
maxPerDay: number
|
|
||||||
): Promise<{
|
|
||||||
remaining: number;
|
|
||||||
retryAfter?: number;
|
|
||||||
}> {
|
|
||||||
const redisKey = `${REDIS_PREFIX}:daily:${key}`;
|
|
||||||
const now = Date.now();
|
|
||||||
const dayStart = new Date(now);
|
|
||||||
dayStart.setHours(0, 0, 0, 0);
|
|
||||||
const dayEnd = new Date(dayStart);
|
|
||||||
dayEnd.setDate(dayEnd.getDate() + 1);
|
|
||||||
const resetTime = dayEnd.getTime();
|
|
||||||
|
|
||||||
const current = await redis.get(redisKey);
|
|
||||||
const expirySeconds = Math.ceil((resetTime - now) / 1000);
|
|
||||||
|
|
||||||
if (!current) {
|
|
||||||
await redis.set(redisKey, '1', 'EX', expirySeconds);
|
|
||||||
|
|
||||||
return {
|
|
||||||
remaining: maxPerDay - 1,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
const count = parseInt(current, 10) + 1;
|
|
||||||
await redis.set(redisKey, String(count), 'EX', expirySeconds);
|
|
||||||
|
|
||||||
const remaining = maxPerDay - count;
|
|
||||||
|
|
||||||
if (count > maxPerDay) {
|
|
||||||
return {
|
|
||||||
remaining: 0,
|
|
||||||
retryAfter: resetTime - now,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
return {
|
|
||||||
remaining,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
reset(key: string) {
|
|
||||||
const redisKey = `${REDIS_PREFIX}:${key}`;
|
|
||||||
return redis.del(redisKey);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
export const spamRateLimiter = new RedisRateLimiter();
|
|
||||||
|
|
||||||
export async function spamRateLimitMiddleware(fastify: FastifyInstance) {
|
|
||||||
fastify.addHook('preHandler', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const url = request.url || '';
|
|
||||||
|
|
||||||
if (!url.startsWith('/spamshield')) {
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
const clientIp = request.ip || (request.headers['x-forwarded-for'] as string) || 'unknown';
|
|
||||||
const apiKey = request.headers['x-api-key'] as string | undefined;
|
|
||||||
const key = apiKey ? `api:${apiKey}` : `ip:${clientIp}`;
|
|
||||||
|
|
||||||
let tier = 'basic';
|
|
||||||
if (apiKey) {
|
|
||||||
if (apiKey.startsWith('premium_')) {
|
|
||||||
tier = 'premium';
|
|
||||||
} else if (apiKey.startsWith('plus_')) {
|
|
||||||
tier = 'plus';
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const config = spamRateLimits[tier as keyof typeof spamRateLimits];
|
|
||||||
|
|
||||||
const minuteResult = await spamRateLimiter.checkLimit(
|
|
||||||
key,
|
|
||||||
60,
|
|
||||||
config.analysesPerMinute
|
|
||||||
);
|
|
||||||
|
|
||||||
const dailyResult = await spamRateLimiter.checkDailyLimit(
|
|
||||||
key,
|
|
||||||
config.analysesPerDay
|
|
||||||
);
|
|
||||||
|
|
||||||
reply.header('X-RateLimit-Limit', config.analysesPerMinute);
|
|
||||||
reply.header('X-RateLimit-Remaining', minuteResult.remaining);
|
|
||||||
reply.header('X-RateLimit-Reset', Math.ceil(minuteResult.resetTime / 1000));
|
|
||||||
reply.header('X-RateLimit-Daily-Limit', config.analysesPerDay);
|
|
||||||
reply.header('X-RateLimit-Daily-Remaining', dailyResult.remaining);
|
|
||||||
|
|
||||||
const retryAfter = minuteResult.retryAfter || dailyResult.retryAfter;
|
|
||||||
|
|
||||||
if (retryAfter) {
|
|
||||||
reply.header('Retry-After', Math.ceil(retryAfter / 1000));
|
|
||||||
reply.code(429);
|
|
||||||
|
|
||||||
return {
|
|
||||||
error: 'Too Many Requests',
|
|
||||||
message: `Spam analysis rate limit exceeded. Try again in ${Math.ceil(retryAfter / 1000)}s`,
|
|
||||||
tier,
|
|
||||||
limit: config.analysesPerMinute,
|
|
||||||
dailyLimit: config.analysesPerDay,
|
|
||||||
reset: new Date(minuteResult.resetTime).toISOString(),
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
(request as any).spamRateLimitTier = tier;
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
export { RedisRateLimiter };
|
|
||||||
@@ -1,285 +0,0 @@
|
|||||||
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
|
|
||||||
import { prisma, SubscriptionTier } from '@shieldsai/shared-db';
|
|
||||||
import { tierConfig, SubscriptionTier as BillingTier } from '@shieldsai/shared-billing';
|
|
||||||
import {
|
|
||||||
watchlistService,
|
|
||||||
scanService,
|
|
||||||
schedulerService,
|
|
||||||
webhookService,
|
|
||||||
} from '../services/darkwatch';
|
|
||||||
|
|
||||||
export async function darkwatchRoutes(fastify: FastifyInstance) {
|
|
||||||
const authed = async (
|
|
||||||
request: FastifyRequest,
|
|
||||||
reply: FastifyReply
|
|
||||||
): Promise<string | null> => {
|
|
||||||
const authReq = request as FastifyRequest & { user?: { id: string } };
|
|
||||||
const userId = authReq.user?.id;
|
|
||||||
if (!userId) {
|
|
||||||
reply.code(401).send({ error: 'User ID required' });
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
const subscription = await prisma.subscription.findFirst({
|
|
||||||
where: { userId, status: 'active' },
|
|
||||||
select: { id: true, tier: true },
|
|
||||||
});
|
|
||||||
|
|
||||||
if (!subscription) {
|
|
||||||
reply.code(404).send({ error: 'Active subscription not found' });
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
return subscription.id;
|
|
||||||
};
|
|
||||||
|
|
||||||
// GET /darkwatch/watchlist - List watchlist items
|
|
||||||
fastify.get('/watchlist', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const subscriptionId = await authed(request, reply);
|
|
||||||
if (!subscriptionId) return;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const items = await watchlistService.getItems(subscriptionId);
|
|
||||||
return reply.send({ items });
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to list watchlist';
|
|
||||||
return reply.code(500).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// POST /darkwatch/watchlist - Add watchlist item
|
|
||||||
fastify.post('/watchlist', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const authReq = request as FastifyRequest & { user?: { id: string } };
|
|
||||||
const userId = authReq.user?.id;
|
|
||||||
if (!userId) {
|
|
||||||
return reply.code(401).send({ error: 'User ID required' });
|
|
||||||
}
|
|
||||||
|
|
||||||
const subscription = await prisma.subscription.findFirst({
|
|
||||||
where: { userId, status: 'active' },
|
|
||||||
select: { id: true, tier: true },
|
|
||||||
});
|
|
||||||
|
|
||||||
if (!subscription) {
|
|
||||||
return reply.code(404).send({ error: 'Active subscription not found' });
|
|
||||||
}
|
|
||||||
|
|
||||||
const body = request.body as { type: string; value: string };
|
|
||||||
|
|
||||||
if (!body.type || !body.value) {
|
|
||||||
return reply.code(400).send({ error: 'type and value are required' });
|
|
||||||
}
|
|
||||||
|
|
||||||
const maxItems = tierConfig[subscription.tier as BillingTier].features.maxWatchlistItems;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const item = await watchlistService.addItem(
|
|
||||||
subscription.id,
|
|
||||||
body.type,
|
|
||||||
body.value,
|
|
||||||
maxItems
|
|
||||||
);
|
|
||||||
return reply.code(201).send({ item });
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to add watchlist item';
|
|
||||||
return reply.code(422).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// DELETE /darkwatch/watchlist/:id - Remove watchlist item
|
|
||||||
fastify.delete('/watchlist/:id', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const subscriptionId = await authed(request, reply);
|
|
||||||
if (!subscriptionId) return;
|
|
||||||
|
|
||||||
const id = (request.params as { id: string }).id;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const item = await watchlistService.removeItem(id, subscriptionId);
|
|
||||||
return reply.send({ item });
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to remove watchlist item';
|
|
||||||
return reply.code(422).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// POST /darkwatch/scan - Trigger on-demand scan
|
|
||||||
fastify.post('/scan', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const subscriptionId = await authed(request, reply);
|
|
||||||
if (!subscriptionId) return;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const job = await schedulerService.enqueueOnDemandScan(subscriptionId);
|
|
||||||
return reply.send({
|
|
||||||
job: {
|
|
||||||
id: job?.id,
|
|
||||||
status: 'queued',
|
|
||||||
},
|
|
||||||
});
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to trigger scan';
|
|
||||||
return reply.code(422).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// GET /darkwatch/scan/schedule - Get scan schedule
|
|
||||||
fastify.get('/scan/schedule', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const subscriptionId = await authed(request, reply);
|
|
||||||
if (!subscriptionId) return;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const schedule = await schedulerService.getScanSchedule(subscriptionId);
|
|
||||||
return reply.send({ schedule });
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to get schedule';
|
|
||||||
return reply.code(500).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// GET /darkwatch/exposures - List exposures
|
|
||||||
fastify.get('/exposures', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const subscriptionId = await authed(request, reply);
|
|
||||||
if (!subscriptionId) return;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const exposures = await prisma.exposure.findMany({
|
|
||||||
where: { subscriptionId },
|
|
||||||
orderBy: { detectedAt: 'desc' },
|
|
||||||
take: 50,
|
|
||||||
include: {
|
|
||||||
watchlistItem: true,
|
|
||||||
},
|
|
||||||
});
|
|
||||||
return reply.send({ exposures });
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to list exposures';
|
|
||||||
return reply.code(500).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// GET /darkwatch/alerts - List alerts
|
|
||||||
fastify.get('/alerts', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const authReq = request as FastifyRequest & { user?: { id: string } };
|
|
||||||
const userId = authReq.user?.id;
|
|
||||||
if (!userId) {
|
|
||||||
return reply.code(401).send({ error: 'User ID required' });
|
|
||||||
}
|
|
||||||
|
|
||||||
try {
|
|
||||||
const alerts = await prisma.alert.findMany({
|
|
||||||
where: { userId },
|
|
||||||
orderBy: { createdAt: 'desc' },
|
|
||||||
take: 50,
|
|
||||||
include: {
|
|
||||||
exposure: true,
|
|
||||||
},
|
|
||||||
});
|
|
||||||
return reply.send({ alerts });
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to list alerts';
|
|
||||||
return reply.code(500).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// PATCH /darkwatch/alerts/:id/read - Mark alert as read
|
|
||||||
fastify.patch('/alerts/:id/read', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const authReq = request as FastifyRequest & { user?: { id: string } };
|
|
||||||
const userId = authReq.user?.id;
|
|
||||||
if (!userId) {
|
|
||||||
return reply.code(401).send({ error: 'User ID required' });
|
|
||||||
}
|
|
||||||
|
|
||||||
const id = (request.params as { id: string }).id;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const alert = await prisma.alert.update({
|
|
||||||
where: { id },
|
|
||||||
data: { isRead: true, readAt: new Date() },
|
|
||||||
});
|
|
||||||
return reply.send({ alert });
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Failed to mark alert as read';
|
|
||||||
return reply.code(422).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// POST /darkwatch/webhook - External webhook receiver
|
|
||||||
fastify.post('/webhook', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const body = request.body as Record<string, unknown>;
|
|
||||||
|
|
||||||
const source = typeof body.source === 'string' ? body.source : '';
|
|
||||||
const identifier = typeof body.identifier === 'string' ? body.identifier : '';
|
|
||||||
const identifierType = typeof body.identifierType === 'string' ? body.identifierType : '';
|
|
||||||
const metadata = body.metadata as Record<string, unknown> | undefined;
|
|
||||||
const timestamp = typeof body.timestamp === 'string' ? body.timestamp : new Date().toISOString();
|
|
||||||
|
|
||||||
if (!source || !identifier || !identifierType) {
|
|
||||||
return reply.code(400).send({
|
|
||||||
error: 'source, identifier, and identifierType are required',
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
const signature = request.headers['x-webhook-signature'] as string | undefined;
|
|
||||||
const webhookTimestamp = request.headers['x-webhook-timestamp'] as string | undefined;
|
|
||||||
|
|
||||||
if (!signature || !webhookTimestamp) {
|
|
||||||
return reply.code(401).send({ error: 'Webhook signature and timestamp required' });
|
|
||||||
}
|
|
||||||
|
|
||||||
const valid = await webhookService.verifyWebhookSignature(
|
|
||||||
JSON.stringify(body),
|
|
||||||
signature,
|
|
||||||
webhookTimestamp
|
|
||||||
);
|
|
||||||
if (!valid) {
|
|
||||||
return reply.code(401).send({ error: 'Invalid webhook signature' });
|
|
||||||
}
|
|
||||||
|
|
||||||
try {
|
|
||||||
const result = await webhookService.processExternalWebhook({
|
|
||||||
source,
|
|
||||||
identifier,
|
|
||||||
identifierType,
|
|
||||||
metadata,
|
|
||||||
timestamp,
|
|
||||||
});
|
|
||||||
|
|
||||||
return reply.send({
|
|
||||||
processed: true,
|
|
||||||
exposuresCreated: result.exposuresCreated,
|
|
||||||
alertsCreated: result.alertsCreated,
|
|
||||||
});
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Webhook processing failed';
|
|
||||||
console.error('[DarkWatch:Webhook] Error:', message);
|
|
||||||
return reply.code(500).send({ error: 'Webhook processing failed' });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// POST /darkwatch/scheduler/init - Initialize scheduled scans for all subscriptions
|
|
||||||
fastify.post('/scheduler/init', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
try {
|
|
||||||
const jobsEnqueued = await schedulerService.scheduleSubscriptionScans();
|
|
||||||
return reply.send({
|
|
||||||
scheduled: jobsEnqueued.length,
|
|
||||||
jobs: jobsEnqueued,
|
|
||||||
});
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Scheduler init failed';
|
|
||||||
return reply.code(500).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// POST /darkwatch/scheduler/reschedule - Reschedule all scans
|
|
||||||
fastify.post('/scheduler/reschedule', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
try {
|
|
||||||
const jobsEnqueued = await schedulerService.rescheduleAll();
|
|
||||||
return reply.send({
|
|
||||||
rescheduled: jobsEnqueued.length,
|
|
||||||
jobs: jobsEnqueued,
|
|
||||||
});
|
|
||||||
} catch (error) {
|
|
||||||
const message = error instanceof Error ? error.message : 'Scheduler reschedule failed';
|
|
||||||
return reply.code(500).send({ error: message });
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
@@ -1,142 +0,0 @@
|
|||||||
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
|
|
||||||
import { authMiddleware, AuthRequest } from './auth.middleware';
|
|
||||||
import { voiceprintRoutes } from './voiceprint.routes';
|
|
||||||
import { spamshieldRoutes } from './spamshield.routes';
|
|
||||||
import { darkwatchRoutes } from './darkwatch.routes';
|
|
||||||
|
|
||||||
export async function routes(fastify: FastifyInstance) {
|
|
||||||
// Authenticated routes group
|
|
||||||
fastify.register(
|
|
||||||
async (authenticated) => {
|
|
||||||
// Add auth requirement
|
|
||||||
authenticated.addHook('onRequest', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
await fastify.requireAuth(request as AuthRequest);
|
|
||||||
});
|
|
||||||
|
|
||||||
// Example authenticated endpoint
|
|
||||||
authenticated.get('/user/me', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
const authReq = request as AuthRequest;
|
|
||||||
return {
|
|
||||||
user: authReq.user,
|
|
||||||
authType: authReq.authType,
|
|
||||||
};
|
|
||||||
});
|
|
||||||
|
|
||||||
// Example service endpoint
|
|
||||||
authenticated.get('/services', async (request: FastifyRequest, reply: FastifyReply) => {
|
|
||||||
return {
|
|
||||||
services: [
|
|
||||||
{
|
|
||||||
name: 'user-service',
|
|
||||||
url: '/api/v1/services/user',
|
|
||||||
status: 'healthy',
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'billing-service',
|
|
||||||
url: '/api/v1/services/billing',
|
|
||||||
status: 'healthy',
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'notification-service',
|
|
||||||
url: '/api/v1/services/notifications',
|
|
||||||
status: 'healthy',
|
|
||||||
},
|
|
||||||
],
|
|
||||||
};
|
|
||||||
});
|
|
||||||
},
|
|
||||||
{ prefix: '/auth' }
|
|
||||||
);
|
|
||||||
|
|
||||||
// Public API routes
|
|
||||||
fastify.register(
|
|
||||||
async (publicRouter) => {
|
|
||||||
// Version info
|
|
||||||
publicRouter.get('/info', async () => {
|
|
||||||
return {
|
|
||||||
version: '1.0.0',
|
|
||||||
environment: process.env.NODE_ENV || 'development',
|
|
||||||
build: process.env.npm_package_version || 'unknown',
|
|
||||||
};
|
|
||||||
});
|
|
||||||
|
|
||||||
// API documentation
|
|
||||||
publicRouter.get('/docs', async () => {
|
|
||||||
return {
|
|
||||||
title: 'FrenoCorp API Gateway',
|
|
||||||
version: '1.0.0',
|
|
||||||
endpoints: {
|
|
||||||
public: [
|
|
||||||
{ method: 'GET', path: '/', description: 'Root endpoint' },
|
|
||||||
{ method: 'GET', path: '/health', description: 'Health check' },
|
|
||||||
{ method: 'GET', path: '/api/v1/info', description: 'API version info' },
|
|
||||||
{ method: 'GET', path: '/api/v1/docs', description: 'API documentation' },
|
|
||||||
],
|
|
||||||
authenticated: [
|
|
||||||
{ method: 'GET', path: '/api/v1/auth/user/me', description: 'Get current user' },
|
|
||||||
{ method: 'GET', path: '/api/v1/auth/services', description: 'List available services' },
|
|
||||||
],
|
|
||||||
},
|
|
||||||
};
|
|
||||||
});
|
|
||||||
},
|
|
||||||
{ prefix: '/api/v1' }
|
|
||||||
);
|
|
||||||
|
|
||||||
// Service proxy placeholder (for future microservice routing)
|
|
||||||
fastify.register(
|
|
||||||
async (services) => {
|
|
||||||
services.get('/services/user', async (request, reply) => {
|
|
||||||
// In production, proxy to actual user service
|
|
||||||
return {
|
|
||||||
service: 'user-service',
|
|
||||||
message: 'User service endpoint',
|
|
||||||
timestamp: new Date().toISOString(),
|
|
||||||
};
|
|
||||||
});
|
|
||||||
|
|
||||||
services.get('/services/billing', async (request, reply) => {
|
|
||||||
// In production, proxy to actual billing service
|
|
||||||
return {
|
|
||||||
service: 'billing-service',
|
|
||||||
message: 'Billing service endpoint',
|
|
||||||
timestamp: new Date().toISOString(),
|
|
||||||
};
|
|
||||||
});
|
|
||||||
|
|
||||||
services.get('/services/notifications', async (request, reply) => {
|
|
||||||
// In production, proxy to actual notification service
|
|
||||||
return {
|
|
||||||
service: 'notification-service',
|
|
||||||
message: 'Notification service endpoint',
|
|
||||||
timestamp: new Date().toISOString(),
|
|
||||||
};
|
|
||||||
});
|
|
||||||
},
|
|
||||||
{ prefix: '/api/v1/services' }
|
|
||||||
);
|
|
||||||
|
|
||||||
// VoicePrint service routes
|
|
||||||
fastify.register(
|
|
||||||
async (voiceprintRouter) => {
|
|
||||||
await voiceprintRoutes(voiceprintRouter);
|
|
||||||
},
|
|
||||||
{ prefix: '/voiceprint' }
|
|
||||||
);
|
|
||||||
|
|
||||||
// SpamShield service routes
|
|
||||||
fastify.register(
|
|
||||||
async (spamshieldRouter) => {
|
|
||||||
await spamshieldRoutes(spamshieldRouter);
|
|
||||||
},
|
|
||||||
{ prefix: '/spamshield' }
|
|
||||||
);
|
|
||||||
|
|
||||||
// DarkWatch service routes
|
|
||||||
fastify.register(
|
|
||||||
async (darkwatchRouter) => {
|
|
||||||
await darkwatchRoutes(darkwatchRouter);
|
|
||||||
},
|
|
||||||
{ prefix: '/darkwatch' }
|
|
||||||
);
|
|
||||||
}
|
|
||||||
@@ -1,213 +0,0 @@
import { FastifyInstance } from 'fastify';
import { NotificationService } from '@shieldsai/shared-notifications';

export async function notificationRoutes(fastify: FastifyInstance): Promise<void> {
  let notificationService: NotificationService | undefined;

  // Initialize notification service (will be injected via config)
  fastify.addHook('onReady', async () => {
    // Notification service will be initialized from config
    notificationService = fastify.notificationService;
  });

  /**
   * POST /api/v1/notifications/send
   * Send a notification to a user
   */
  fastify.post(
    '/notifications/send',
    {
      schema: {
        body: {
          type: 'object',
          required: ['userId', 'channel', 'subject', 'body'],
          properties: {
            userId: { type: 'string' },
            channel: { type: 'string', enum: ['email', 'push', 'sms'] },
            subject: { type: 'string' },
            body: { type: 'string' },
            email: { type: 'string' },
            phone: { type: 'string' },
            fcmToken: { type: 'string' },
            apnsToken: { type: 'string' },
            priority: { type: 'string', enum: ['low', 'normal', 'high', 'urgent'] },
            metadata: { type: 'object' },
          },
        },
      },
    },
    async (request, reply) => {
      const { userId, channel, subject, body, priority, metadata } = request.body;

      const recipient = {
        userId,
        email: request.body.email,
        phone: request.body.phone,
        fcmToken: request.body.fcmToken,
        apnsToken: request.body.apnsToken,
      };

      try {
        if (!notificationService) {
          return reply.status(503).send({
            success: false,
            error: 'Notification service not initialized',
          });
        }

        const notifications = await notificationService.sendMultiChannelNotification(
          recipient,
          channel,
          subject,
          body,
          priority,
          metadata
        );

        return reply.send({
          success: true,
          notifications,
        });
      } catch (error) {
        return reply.status(500).send({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error',
        });
      }
    }
  );

  /**
   * GET /api/v1/notifications/:userId/preferences
   * Get notification preferences for a user
   */
  fastify.get(
    '/notifications/:userId/preferences',
    {
      schema: {
        params: {
          type: 'object',
          required: ['userId'],
          properties: {
            userId: { type: 'string' },
          },
        },
      },
    },
    async (request, reply) => {
      const { userId } = request.params;

      try {
        if (!notificationService) {
          return reply.status(503).send({
            success: false,
            error: 'Notification service not initialized',
          });
        }

        const preferences = await notificationService.getNotificationPreferences(userId);

        return reply.send({
          success: true,
          preferences,
        });
      } catch (error) {
        return reply.status(500).send({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error',
        });
      }
    }
  );

  /**
   * PUT /api/v1/notifications/:userId/preferences
   * Update notification preferences for a user
   */
  fastify.put(
    '/notifications/:userId/preferences',
    {
      schema: {
        params: {
          type: 'object',
          required: ['userId'],
          properties: {
            userId: { type: 'string' },
          },
        },
        body: {
          type: 'object',
          properties: {
            email: {
              type: 'object',
              properties: {
                enabled: { type: 'boolean' },
                categories: { type: 'array', items: { type: 'string' } },
              },
            },
            push: {
              type: 'object',
              properties: {
                enabled: { type: 'boolean' },
                categories: { type: 'array', items: { type: 'string' } },
              },
            },
            sms: {
              type: 'object',
              properties: {
                enabled: { type: 'boolean' },
                categories: { type: 'array', items: { type: 'string' } },
              },
            },
          },
        },
      },
    },
    async (request, reply) => {
      const { userId } = request.params;
      const updates = request.body;

      try {
        // TODO: Update preferences in database
        return reply.send({
          success: true,
          message: 'Preferences updated',
          userId,
          updates,
        });
      } catch (error) {
        return reply.status(500).send({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error',
        });
      }
    }
  );

  /**
   * GET /api/v1/notifications/config
   * Get notification configuration status
   */
  fastify.get('/notifications/config', async (request, reply) => {
    try {
      if (!notificationService) {
        return reply.status(503).send({
          success: false,
          error: 'Notification service not initialized',
        });
      }

      const config = notificationService.getConfigSummary();

      return reply.send({
        success: true,
        config,
      });
    } catch (error) {
      return reply.status(500).send({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error',
      });
    }
  });
}
@@ -1,252 +0,0 @@
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
import {
  numberReputationService,
  smsClassifierService,
  callAnalysisService,
  spamFeedbackService,
} from '../services/spamshield';
import { ErrorHandler, SpamErrorCode } from '../services/spamshield/spamshield.error-handler';

export async function spamshieldRoutes(fastify: FastifyInstance) {
  // Classify SMS text
  fastify.post('/sms/classify', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      ErrorHandler.send(reply, SpamErrorCode.UNAUTHORIZED, 'User ID required', { status: 401 });
      return;
    }

    const body = request.body as { text: string };

    const textValidation = ErrorHandler.validateRequiredField(body.text, 'text');
    if (!textValidation.isValid && textValidation.error) {
      ErrorHandler.send(reply, textValidation.error.code, textValidation.error.message, {
        field: textValidation.error.field,
        status: 400,
      });
      return;
    }

    try {
      const result = await smsClassifierService.classify(body.text);
      return reply.send({
        classification: {
          isSpam: result.isSpam,
          confidence: result.confidence,
          spamFeatures: result.spamFeatures,
        },
      });
    } catch (error) {
      ErrorHandler.send(reply, SpamErrorCode.CLASSIFICATION_FAILED, 'Classification failed', {
        status: 422,
      });
    }
  });

  // Check number reputation
  fastify.post('/number/reputation', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      ErrorHandler.send(reply, SpamErrorCode.UNAUTHORIZED, 'User ID required', { status: 401 });
      return;
    }

    const body = request.body as { phoneNumber: string };

    const phoneValidation = ErrorHandler.validateRequiredField(body.phoneNumber, 'phoneNumber');
    if (!phoneValidation.isValid && phoneValidation.error) {
      ErrorHandler.send(reply, phoneValidation.error.code, phoneValidation.error.message, {
        field: phoneValidation.error.field,
        status: 400,
      });
      return;
    }

    try {
      const result = await numberReputationService.checkReputation(body.phoneNumber);
      return reply.send({
        reputation: {
          isSpam: result.isSpam,
          confidence: result.confidence,
          spamType: result.spamType,
          reportCount: result.reportCount,
        },
      });
    } catch (error) {
      ErrorHandler.send(reply, SpamErrorCode.REPUTATION_CHECK_FAILED, 'Reputation check failed', {
        status: 422,
      });
    }
  });

  // Analyze incoming call
  fastify.post('/call/analyze', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      ErrorHandler.send(reply, SpamErrorCode.UNAUTHORIZED, 'User ID required', { status: 401 });
      return;
    }

    const body = request.body as {
      phoneNumber: string;
      duration?: number;
      callTime: string;
      isVoip?: boolean;
    };

    const phoneValidation = ErrorHandler.validateRequiredField(body.phoneNumber, 'phoneNumber');
    const callTimeValidation = ErrorHandler.validateRequiredField(body.callTime, 'callTime');

    if (!phoneValidation.isValid && phoneValidation.error) {
      ErrorHandler.send(reply, phoneValidation.error.code, phoneValidation.error.message, {
        field: phoneValidation.error.field,
        status: 400,
      });
      return;
    }

    if (!callTimeValidation.isValid && callTimeValidation.error) {
      ErrorHandler.send(reply, callTimeValidation.error.code, callTimeValidation.error.message, {
        field: callTimeValidation.error.field,
        status: 400,
      });
      return;
    }

    try {
      const result = await callAnalysisService.analyzeCall({
        phoneNumber: body.phoneNumber,
        duration: body.duration,
        callTime: new Date(body.callTime),
        isVoip: body.isVoip,
      });
      return reply.send({
        analysis: {
          decision: result.decision,
          confidence: result.confidence,
          reasons: result.reasons,
        },
      });
    } catch (error) {
      ErrorHandler.send(reply, SpamErrorCode.ANALYSIS_FAILED, 'Call analysis failed', {
        status: 422,
      });
    }
  });

  // Record spam feedback
  fastify.post('/feedback', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      ErrorHandler.send(reply, SpamErrorCode.UNAUTHORIZED, 'User ID required', { status: 401 });
      return;
    }

    const body = request.body as {
      phoneNumber: string;
      isSpam: boolean;
      confidence?: number;
      metadata?: Record<string, unknown>;
    };

    const phoneValidation = ErrorHandler.validateRequiredField(body.phoneNumber, 'phoneNumber');
    if (!phoneValidation.isValid && phoneValidation.error) {
      ErrorHandler.send(reply, phoneValidation.error.code, phoneValidation.error.message, {
        field: phoneValidation.error.field,
        status: 400,
      });
      return;
    }

    const isSpamValidation = ErrorHandler.validateBooleanField(body.isSpam, 'isSpam');
    if (!isSpamValidation.isValid && isSpamValidation.error) {
      ErrorHandler.send(reply, isSpamValidation.error.code, isSpamValidation.error.message, {
        field: isSpamValidation.error.field,
        status: 400,
      });
      return;
    }

    try {
      const feedback = await spamFeedbackService.recordFeedback(
        userId,
        body.phoneNumber,
        body.isSpam,
        body.confidence,
        body.metadata
      );
      return reply.code(201).send({
        feedback: {
          id: feedback.id,
          phoneNumber: feedback.phoneNumber,
          isSpam: feedback.isSpam,
          createdAt: feedback.createdAt,
        },
      });
    } catch (error) {
      ErrorHandler.send(reply, SpamErrorCode.FEEDBACK_RECORD_FAILED, 'Feedback recording failed', {
        status: 422,
      });
    }
  });

  // Get spam history
  fastify.get('/history', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      ErrorHandler.send(reply, SpamErrorCode.UNAUTHORIZED, 'User ID required', { status: 401 });
      return;
    }

    const query = request.query as {
      limit?: string;
      isSpam?: string;
      startDate?: string;
    };

    const results = await spamFeedbackService.getSpamHistory(userId, {
      limit: query.limit ? parseInt(query.limit, 10) : undefined,
      isSpam: query.isSpam !== undefined ? query.isSpam === 'true' : undefined,
      startDate: query.startDate ? new Date(query.startDate) : undefined,
    });

    return reply.send({
      history: results.map((r) => ({
        id: r.id,
        phoneNumber: r.phoneNumber,
        isSpam: r.isSpam,
        createdAt: r.createdAt,
      })),
    });
  });

  // Get spam statistics
  fastify.get('/statistics', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      ErrorHandler.send(reply, SpamErrorCode.UNAUTHORIZED, 'User ID required', { status: 401 });
      return;
    }

    try {
      const stats = await spamFeedbackService.getStatistics(userId);
      return reply.send({ statistics: stats });
    } catch (error) {
      ErrorHandler.send(reply, SpamErrorCode.ANALYSIS_FAILED, 'Statistics retrieval failed', {
        status: 422,
      });
    }
  });
}
@@ -1,257 +0,0 @@
import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
import {
  voiceEnrollmentService,
  analysisService,
  batchAnalysisService,
  voicePrintEnv,
  AnalysisJobStatus,
} from '../services/voiceprint';

export async function voiceprintRoutes(fastify: FastifyInstance) {
  // Enroll a new voice profile
  fastify.post('/enroll', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      return reply.code(401).send({ error: 'User ID required' });
    }

    const body = request.body as {
      name: string;
      audio: Buffer;
    };

    if (!body.name || !body.audio) {
      return reply.code(400).send({ error: 'name and audio are required' });
    }

    try {
      const enrollment = await voiceEnrollmentService.enroll(
        userId,
        body.name,
        body.audio
      );
      return reply.code(201).send({
        enrollment: {
          id: enrollment.id,
          name: enrollment.name,
          isActive: enrollment.isActive,
          createdAt: enrollment.createdAt,
        },
      });
    } catch (error) {
      const message = error instanceof Error ? error.message : 'Enrollment failed';
      return reply.code(422).send({ error: message });
    }
  });

  // List user's voice enrollments
  fastify.get('/enrollments', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      return reply.code(401).send({ error: 'User ID required' });
    }

    const isActive = request.query as { isActive?: string };
    const limit = request.query as { limit?: string };
    const offset = request.query as { offset?: string };

    const enrollments = await voiceEnrollmentService.listEnrollments(userId, {
      isActive: isActive.isActive !== undefined
        ? isActive.isActive === 'true'
        : undefined,
      limit: limit.limit ? parseInt(limit.limit, 10) : undefined,
      offset: offset.offset ? parseInt(offset.offset, 10) : undefined,
    });

    return reply.send({
      enrollments: enrollments.map((e) => ({
        id: e.id,
        name: e.name,
        isActive: e.isActive,
        createdAt: e.createdAt,
      })),
    });
  });

  // Remove an enrollment
  fastify.delete('/enrollments/:id', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      return reply.code(401).send({ error: 'User ID required' });
    }

    const enrollmentId = (request.params as { id: string }).id;

    try {
      const enrollment = await voiceEnrollmentService.removeEnrollment(
        enrollmentId,
        userId
      );
      return reply.send({
        enrollment: {
          id: enrollment.id,
          name: enrollment.name,
          isActive: enrollment.isActive,
        },
      });
    } catch (error) {
      const message = error instanceof Error ? error.message : 'Removal failed';
      return reply.code(404).send({ error: message });
    }
  });

  // Analyze a single audio file
  fastify.post('/analyze', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      return reply.code(401).send({ error: 'User ID required' });
    }

    const body = request.body as {
      audio: Buffer;
      enrollmentId?: string;
      audioUrl?: string;
    };

    if (!body.audio) {
      return reply.code(400).send({ error: 'audio is required' });
    }

    try {
      const result = await analysisService.analyze(userId, body.audio, {
        enrollmentId: body.enrollmentId,
        audioUrl: body.audioUrl,
      });
      return reply.code(201).send({
        analysis: {
          id: result.id,
          isSynthetic: result.isSynthetic,
          confidence: result.confidence,
          analysisResult: result.analysisResult,
          createdAt: result.createdAt,
        },
      });
    } catch (error) {
      const message = error instanceof Error ? error.message : 'Analysis failed';
      return reply.code(422).send({ error: message });
    }
  });

  // Get analysis result by ID
  fastify.get('/results/:id', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      return reply.code(401).send({ error: 'User ID required' });
    }

    const analysisId = (request.params as { id: string }).id;
    const result = await analysisService.getResult(analysisId, userId);

    if (!result) {
      return reply.code(404).send({ error: 'Analysis not found' });
    }

    return reply.send({
      analysis: {
        id: result.id,
        isSynthetic: result.isSynthetic,
        confidence: result.confidence,
        analysisResult: result.analysisResult,
        createdAt: result.createdAt,
      },
    });
  });

  // Get analysis history
  fastify.get('/history', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      return reply.code(401).send({ error: 'User ID required' });
    }

    const query = request.query as {
      limit?: string;
      offset?: string;
      isSynthetic?: string;
    };

    const results = await analysisService.getHistory(userId, {
      limit: query.limit ? parseInt(query.limit, 10) : undefined,
      offset: query.offset ? parseInt(query.offset, 10) : undefined,
      isSynthetic: query.isSynthetic !== undefined
        ? query.isSynthetic === 'true'
        : undefined,
    });

    return reply.send({
      analyses: results.map((r) => ({
        id: r.id,
        isSynthetic: r.isSynthetic,
        confidence: r.confidence,
        createdAt: r.createdAt,
      })),
    });
  });

  // Batch analyze multiple audio files
  fastify.post('/batch', async (request: FastifyRequest, reply: FastifyReply) => {
    const authReq = request as FastifyRequest & { user?: { id: string } };
    const userId = authReq.user?.id;

    if (!userId) {
      return reply.code(401).send({ error: 'User ID required' });
    }

    const body = request.body as {
      files: Array<{
        name: string;
        audio: Buffer;
        audioUrl?: string;
      }>;
      enrollmentId?: string;
    };

    if (!body.files || body.files.length === 0) {
      return reply.code(400).send({ error: 'files array is required' });
    }

    try {
      const result = await batchAnalysisService.analyzeBatch(
        userId,
        body.files.map((f) => ({
          name: f.name,
          buffer: f.audio,
          audioUrl: f.audioUrl,
        })),
        {
          enrollmentId: body.enrollmentId,
        }
      );

      return reply.code(201).send({
        jobId: result.jobId,
        results: result.results.map((r) => ({
          id: r.id,
          isSynthetic: r.isSynthetic,
          confidence: r.confidence,
        })),
        summary: result.summary,
      });
    } catch (error) {
      const message = error instanceof Error ? error.message : 'Batch analysis failed';
      return reply.code(422).send({ error: message });
    }
  });
}
@@ -1,174 +0,0 @@
import { prisma, AlertType, AlertSeverity } from '@shieldsai/shared-db';
import {
  NotificationService,
  NotificationPriority,
  loadNotificationConfig,
} from '@shieldsai/shared-notifications';

const ALERT_DEDUP_WINDOW_MS = 24 * 60 * 60 * 1000;

export class AlertPipeline {
  private notificationService: NotificationService;

  constructor() {
    this.notificationService = new NotificationService(loadNotificationConfig());
  }

  async processNewExposures(exposureIds: string[]) {
    const exposures = await prisma.exposure.findMany({
      where: { id: { in: exposureIds }, isFirstTime: true },
      include: {
        subscription: {
          select: {
            id: true,
            userId: true,
            tier: true,
          },
        },
        watchlistItem: true,
      },
    });

    const alertsCreated: Awaited<ReturnType<typeof prisma.alert.create>>[] = [];

    for (const exposure of exposures) {
      const dedupKey = `exposure:${exposure.subscriptionId}:${exposure.source}:${exposure.identifierHash}`;

      const recentAlert = await prisma.alert.findFirst({
        where: {
          subscriptionId: exposure.subscriptionId,
          type: AlertType.exposure_detected,
          createdAt: {
            gte: new Date(Date.now() - ALERT_DEDUP_WINDOW_MS),
          },
        },
        orderBy: { createdAt: 'desc' },
      });

      if (recentAlert) {
        continue;
      }

      const alert = await prisma.alert.create({
        data: {
          subscriptionId: exposure.subscriptionId,
          userId: exposure.subscription.userId,
          exposureId: exposure.id,
          type: AlertType.exposure_detected,
          title: this.buildTitle(exposure),
          message: this.buildMessage(exposure),
          severity: this.mapSeverity(exposure.severity),
          channel: this.getChannelsForTier(exposure.subscription.tier),
        },
      });

      alertsCreated.push(alert);

      await this.dispatchNotification(alert, exposure);
    }

    return alertsCreated;
  }

  async dispatchScanCompleteAlert(
    subscriptionId: string,
    userId: string,
    exposuresFound: number
  ) {
    const subscription = await prisma.subscription.findUnique({
      where: { id: subscriptionId },
      select: { tier: true },
    });

    if (!subscription) return;

    const alert = await prisma.alert.create({
      data: {
        subscriptionId,
        userId,
        type: AlertType.scan_complete,
        title: 'DarkWatch Scan Complete',
        message: `Scan found ${exposuresFound} new exposure${exposuresFound === 1 ? '' : 's'}.`,
        severity: exposuresFound > 0 ? 'warning' : 'info',
        channel: this.getChannelsForTier(subscription.tier),
      },
    });

    await this.dispatchNotification(alert, {
      source: 'hibp',
      severity: 'info',
      identifier: '',
      dataType: 'email',
    } as any);

    return alert;
  }

  private async dispatchNotification(
    alert: {
      userId: string;
      channel: string[];
      title: string;
      message: string;
      severity: AlertSeverity;
    },
    exposure: { source: string; severity: string; identifier: string; dataType: string }
  ) {
    try {
      if (!this.notificationService.isFullyConfigured()) return;

      await this.notificationService.sendMultiChannelNotification(
        {
          userId: alert.userId,
        },
        alert.channel as any,
        alert.title,
        `<p>${alert.message}</p>
<p><strong>Source:</strong> ${exposure.source}</p>
<p><strong>Severity:</strong> ${exposure.severity}</p>
<p><strong>Type:</strong> ${exposure.dataType}</p>`,
        alert.severity === 'critical'
          ? NotificationPriority.HIGH
          : NotificationPriority.NORMAL
      );
    } catch (error) {
      console.error('[AlertPipeline] Notification dispatch error:', error);
    }
  }

  private buildTitle(exposure: {
    source: string;
    dataType: string;
    severity: string;
  }): string {
    return `${exposure.severity.toUpperCase()}: ${exposure.dataType} exposure on ${exposure.source}`;
  }

  private buildMessage(exposure: {
    identifier: string;
    source: string;
    severity: string;
    dataType: string;
  }): string {
    const masked = exposure.identifier.includes('@')
      ? exposure.identifier.replace(/(?<=.{2}).*(?=@)/, '***')
      : exposure.identifier.slice(0, 3) + '***';

    return `Your ${exposure.dataType} (${masked}) was found in a ${exposure.source} breach with ${exposure.severity} severity.`;
  }

  private mapSeverity(severity: string): AlertSeverity {
    return severity as AlertSeverity;
  }

  private getChannelsForTier(tier: string): string[] {
    const channelMap: Record<string, string[]> = {
      basic: ['email'],
      plus: ['email', 'push'],
      premium: ['email', 'push', 'sms'],
    };
    return channelMap[tier] || ['email'];
  }
}

export const alertPipeline = new AlertPipeline();
@@ -1,5 +0,0 @@
export { watchlistService } from './watchlist.service';
export { scanService } from './scan.service';
export { schedulerService } from './scheduler.service';
export { webhookService } from './webhook.service';
export { alertPipeline } from './alert.pipeline';
@@ -1,220 +0,0 @@
import { prisma, ExposureSource, ExposureSeverity, WatchlistType } from '@shieldsai/shared-db';
import { createHash } from 'crypto';

function hashIdentifier(identifier: string): string {
  return createHash('sha256').update(identifier.toLowerCase().trim()).digest('hex');
}

function determineSeverity(
  source: ExposureSource,
  dataType: WatchlistType
): ExposureSeverity {
  const criticalSources = [ExposureSource.darkWebForum, ExposureSource.honeypot];
  const warningSources = [ExposureSource.hibp, ExposureSource.shodan];
  const criticalTypes = [WatchlistType.ssn];

  if (criticalTypes.includes(dataType)) return ExposureSeverity.critical;
  if (criticalSources.includes(source)) return ExposureSeverity.critical;
  if (warningSources.includes(source)) return ExposureSeverity.warning;
  return ExposureSeverity.info;
}

export class ScanService {
  async checkHIBP(email: string): Promise<{ exposed: boolean; sources: string[] }> {
    try {
      const response = await fetch(
        `https://hibp.com/api/v2/${encodeURIComponent(email)}`,
        {
          headers: {
            'hibp-api-key': process.env.HIBP_API_KEY || '',
            Accept: 'application/json',
          },
          signal: AbortSignal.timeout(15000),
        }
      );

      if (response.status === 404) {
        return { exposed: false, sources: [] };
      }

      if (!response.ok) {
        console.error(`[ScanService:HIBP] Status ${response.status} for ${email}`);
        return { exposed: false, sources: [] };
      }

      const data = await response.json();
      const sources = Array.isArray(data)
        ? data.map((p: { Name: string }) => p.Name)
        : [];

      return { exposed: sources.length > 0, sources };
    } catch (error) {
      console.error('[ScanService:HIBP] Error:', error);
      return { exposed: false, sources: [] };
    }
  }

  async checkShodan(domain: string): Promise<{ exposed: boolean; ports: string[]; ips: string[] }> {
    try {
      const response = await fetch(
        `https://api.shodan.io/shodan/host/${encodeURIComponent(domain)}`,
        {
          headers: {
            Authorization: `Bearer ${process.env.SHODAN_API_KEY || ''}`,
          },
          signal: AbortSignal.timeout(15000),
        }
      );

      if (response.status === 404) {
        return { exposed: false, ports: [], ips: [] };
      }

      if (!response.ok) {
        console.error(`[ScanService:Shodan] Status ${response.status} for ${domain}`);
        return { exposed: false, ports: [], ips: [] };
      }

      const data = await response.json();
      return {
        exposed: !!data.ip_str,
        ports: data.ports?.map(String) || [],
        ips: [data.ip_str || ''],
      };
    } catch (error) {
      console.error('[ScanService:Shodan] Error:', error);
      return { exposed: false, ports: [], ips: [] };
    }
  }

  async processSubscriptionScan(
    subscriptionId: string,
    watchlistItems: Awaited<ReturnType<ScanService['getWatchlistItems']>>
  ): Promise<{ exposuresCreated: number; exposuresUpdated: number }> {
    let exposuresCreated = 0;
    let exposuresUpdated = 0;

    for (const item of watchlistItems) {
      const identifier = item.value;
      const identifierHash = hashIdentifier(identifier);

      switch (item.type) {
        case WatchlistType.email: {
          const hibpResult = await this.checkHIBP(identifier);
          if (hibpResult.exposed) {
            for (const source of hibpResult.sources) {
              const existing = await prisma.exposure.findFirst({
                where: {
                  subscriptionId,
                  source: ExposureSource.hibp,
                  identifierHash,
                  metadata: { path: ['dbName'], equals: source },
                },
              });

              if (existing) {
                await prisma.exposure.update({
                  where: { id: existing.id },
                  data: { detectedAt: new Date() },
                });
                exposuresUpdated++;
              } else {
                await prisma.exposure.create({
                  data: {
                    subscriptionId,
                    watchlistItemId: item.id,
                    source: ExposureSource.hibp,
                    dataType: item.type,
                    identifier,
                    identifierHash,
                    severity: determineSeverity(ExposureSource.hibp, item.type),
                    isFirstTime: true,
                    metadata: { dbName: source },
                    detectedAt: new Date(),
                  },
                });
                exposuresCreated++;
              }
            }
          }
          break;
        }

        case WatchlistType.domain: {
          const shodanResult = await this.checkShodan(identifier);
          if (shodanResult.exposed) {
            const existing = await prisma.exposure.findFirst({
              where: {
                subscriptionId,
                source: ExposureSource.shodan,
                identifierHash,
              },
            });

            if (existing) {
              await prisma.exposure.update({
                where: { id: existing.id },
                data: {
                  detectedAt: new Date(),
                  metadata: { ports: shodanResult.ports, ips: shodanResult.ips },
                },
              });
              exposuresUpdated++;
            } else {
              await prisma.exposure.create({
                data: {
                  subscriptionId,
                  watchlistItemId: item.id,
                  source: ExposureSource.shodan,
                  dataType: item.type,
                  identifier,
                  identifierHash,
                  severity: determineSeverity(ExposureSource.shodan, item.type),
                  isFirstTime: true,
                  metadata: { ports: shodanResult.ports, ips: shodanResult.ips },
                  detectedAt: new Date(),
                },
              });
              exposuresCreated++;
            }
          }
          break;
        }

        default: {
          const existing = await prisma.exposure.findFirst({
            where: { subscriptionId, watchlistItemId: item.id, identifierHash },
          });

          if (!existing) {
            await prisma.exposure.create({
              data: {
                subscriptionId,
                watchlistItemId: item.id,
                source: ExposureSource.darkWebForum,
                dataType: item.type,
                identifier,
                identifierHash,
                severity: determineSeverity(ExposureSource.darkWebForum, item.type),
                isFirstTime: true,
                detectedAt: new Date(),
              },
            });
            exposuresCreated++;
          }
          break;
        }
      }
    }

    return { exposuresCreated, exposuresUpdated };
  }

  async getWatchlistItems(subscriptionId: string) {
    return prisma.watchlistItem.findMany({
      where: { subscriptionId, isActive: true },
    });
  }
}

export const scanService = new ScanService();
@@ -1,155 +0,0 @@
import { prisma, SubscriptionTier, SubscriptionStatus } from '@shieldsai/shared-db';
import { tierConfig } from '@shieldsai/shared-billing';
import { darkwatchScanQueue } from '@shieldsai/jobs';
import { randomUUID } from 'crypto';

const CRON_EXPRESSIONS = {
  daily: '0 0 * * *',
  hourly: '0 * * * *',
  realtime: null,
};

export class SchedulerService {
  async scheduleSubscriptionScans() {
    const activeSubscriptions = await prisma.subscription.findMany({
      where: {
        tier: { in: [SubscriptionTier.basic, SubscriptionTier.plus, SubscriptionTier.premium] },
        status: SubscriptionStatus.active,
      },
      select: {
        id: true,
        tier: true,
        userId: true,
      },
    });

    const jobsEnqueued = [];

    for (const subscription of activeSubscriptions) {
      const frequency = tierConfig[subscription.tier].features.darkWebScanFrequency;
      const cron = CRON_EXPRESSIONS[frequency];

      if (!cron) {
        continue;
      }

      const jobKey = `scheduled-scan:${subscription.id}`;

      try {
        await darkwatchScanQueue.add(
          'scheduled-scan',
          {
            subscriptionId: subscription.id,
            tier: subscription.tier,
            scanType: 'scheduled',
          },
          {
            jobId: jobKey,
            repeat: {
              every: frequency === 'daily'
                ? 24 * 60 * 60 * 1000
                : 60 * 60 * 1000,
            },
            priority: subscription.tier === SubscriptionTier.premium ? 1 : 3,
          }
        );

        jobsEnqueued.push({
          subscriptionId: subscription.id,
          tier: subscription.tier,
          frequency,
        });
      } catch (error) {
        if ((error as Error).message?.includes('Duplicate')) {
          continue;
        }
        console.error(
          `[SchedulerService] Failed to schedule scan for ${subscription.id}:`,
          error
        );
      }
    }

    return jobsEnqueued;
  }

  async enqueueOnDemandScan(subscriptionId: string) {
    const subscription = await prisma.subscription.findUnique({
      where: { id: subscriptionId },
      select: { id: true, tier: true },
    });

    if (!subscription) {
      throw new Error(`Subscription ${subscriptionId} not found`);
    }

    return darkwatchScanQueue.add(
      'on-demand-scan',
      {
        subscriptionId,
        tier: subscription.tier,
        scanType: 'on-demand',
      },
      {
        priority: 1,
        jobId: `on-demand-scan:${subscriptionId}:${randomUUID()}`,
      }
    );
  }

  async enqueueRealtimeTrigger(subscriptionId: string, sourceData: Record<string, unknown>) {
    const subscription = await prisma.subscription.findUnique({
      where: { id: subscriptionId },
      select: { id: true, tier: true },
    });

    if (!subscription || subscription.tier !== SubscriptionTier.premium) {
      throw new Error('Realtime triggers require Premium tier');
    }

    return darkwatchScanQueue.add(
      'realtime-trigger',
      {
        subscriptionId,
        tier: subscription.tier,
        scanType: 'realtime',
        sourceData,
      },
      {
        priority: 0,
        jobId: `realtime-trigger:${subscriptionId}:${randomUUID()}`,
      }
    );
  }

  async rescheduleAll() {
    const repeatableJobs = await darkwatchScanQueue.getRepeatableJobs();

    for (const job of repeatableJobs) {
      await darkwatchScanQueue.removeRepeatableByKey(job.key);
    }

    return this.scheduleSubscriptionScans();
  }

  async getScanSchedule(subscriptionId: string) {
    const subscription = await prisma.subscription.findUnique({
      where: { id: subscriptionId },
      select: { tier: true },
    });

    if (!subscription) return null;

    const frequency = tierConfig[subscription.tier].features.darkWebScanFrequency;

    return {
      subscriptionId,
      tier: subscription.tier,
      frequency,
      cron: CRON_EXPRESSIONS[frequency],
      nextRun: frequency === 'realtime' ? 'event-driven' : CRON_EXPRESSIONS[frequency],
    };
  }
}

export const schedulerService = new SchedulerService();
@@ -1,97 +0,0 @@
import { prisma, WatchlistType } from '@shieldsai/shared-db';
import { createHash } from 'crypto';

export function normalizeValue(type: WatchlistType, value: string): string {
  const trimmed = value.trim().toLowerCase();
  switch (type) {
    case WatchlistType.email:
      return trimmed.replace(/\s+/g, '');
    case WatchlistType.phoneNumber:
      return trimmed.replace(/[\s\-\(\)]/g, '');
    case WatchlistType.ssn:
      return trimmed.replace(/-/g, '');
    case WatchlistType.address:
      return trimmed;
    case WatchlistType.domain:
      return trimmed.replace(/^https?:\/\//, '').replace(/\/.*$/, '');
    default:
      return trimmed;
  }
}

export function hashValue(value: string): string {
  return createHash('sha256').update(value).digest('hex');
}

export class WatchlistService {
  async addItem(
    subscriptionId: string,
    type: WatchlistType,
    value: string,
    maxItems: number
  ) {
    const normalized = normalizeValue(type, value);
    const itemHash = hashValue(normalized);

    const currentCount = await prisma.watchlistItem.count({
      where: { subscriptionId, isActive: true },
    });

    if (currentCount >= maxItems) {
      throw new Error(
        `Watchlist limit reached (${maxItems} items). Upgrade your plan to add more.`
      );
    }

    const existing = await prisma.watchlistItem.findFirst({
      where: { subscriptionId, type, hash: itemHash },
    });

    if (existing) {
      if (!existing.isActive) {
        return prisma.watchlistItem.update({
          where: { id: existing.id },
          data: { isActive: true },
        });
      }
      return existing;
    }

    return prisma.watchlistItem.create({
      data: {
        subscriptionId,
        type,
        value: normalized,
        hash: itemHash,
      },
    });
  }

  async getItems(subscriptionId: string) {
    return prisma.watchlistItem.findMany({
      where: { subscriptionId, isActive: true },
      orderBy: { createdAt: 'desc' },
    });
  }

  async removeItem(id: string, subscriptionId: string) {
    return prisma.watchlistItem.update({
      where: { id },
      data: { isActive: false },
    });
  }

  async getActiveItemsForScan(subscriptionId: string) {
    return prisma.watchlistItem.findMany({
      where: { subscriptionId, isActive: true },
    });
  }

  async getItemCount(subscriptionId: string) {
    return prisma.watchlistItem.count({
      where: { subscriptionId, isActive: true },
    });
  }
}

export const watchlistService = new WatchlistService();
@@ -1,226 +0,0 @@
import { prisma, ExposureSource, ExposureSeverity, WatchlistType, AlertType, AlertSeverity } from '@shieldsai/shared-db';
import { createHash } from 'crypto';
import { mixpanelService, EventType } from '@shieldsai/shared-analytics';

function hashIdentifier(identifier: string): string {
  return createHash('sha256').update(identifier.toLowerCase().trim()).digest('hex');
}

function determineSeverity(
  source: ExposureSource,
  dataType: WatchlistType
): ExposureSeverity {
  const criticalSources = [ExposureSource.darkWebForum, ExposureSource.honeypot];
  const warningSources = [ExposureSource.hibp, ExposureSource.shodan];
  const criticalTypes = [WatchlistType.ssn];

  if (criticalTypes.includes(dataType)) return ExposureSeverity.critical;
  if (criticalSources.includes(source)) return ExposureSeverity.critical;
  if (warningSources.includes(source)) return ExposureSeverity.warning;
  return ExposureSeverity.info;
}

export interface WebhookPayload {
  source: string;
  identifier: string;
  identifierType: string;
  metadata?: Record<string, unknown>;
  timestamp?: string;
}

export class WebhookService {
  async processExternalWebhook(payload: WebhookPayload): Promise<{
    exposuresCreated: number;
    alertsCreated: number;
  }> {
    const source = this.mapSource(payload.source);
    const dataType = this.mapDataType(payload.identifierType);
    const identifier = payload.identifier.toLowerCase().trim();
    const identifierHash = hashIdentifier(identifier);
    const severity = determineSeverity(source, dataType);

    const matchingItems = await prisma.watchlistItem.findMany({
      where: {
        isActive: true,
        OR: [
          { hash: identifierHash, type: dataType },
          { value: identifier, type: dataType },
        ],
      },
      include: {
        subscription: {
          select: {
            id: true,
            tier: true,
            userId: true,
          },
        },
      },
    });

    let exposuresCreated = 0;
    let alertsCreated = 0;

    for (const item of matchingItems) {
      const existing = await prisma.exposure.findFirst({
        where: {
          subscriptionId: item.subscriptionId,
          source,
          identifierHash,
        },
      });

      if (existing) {
        await prisma.exposure.update({
          where: { id: existing.id },
          data: { detectedAt: new Date() },
        });
        continue;
      }

      const exposure = await prisma.exposure.create({
        data: {
          subscriptionId: item.subscriptionId,
          watchlistItemId: item.id,
          source,
          dataType,
          identifier,
          identifierHash,
          severity,
          isFirstTime: true,
          metadata: payload.metadata || {},
          detectedAt: new Date(),
        },
      });

      exposuresCreated++;

      const alertChannels = this.getAlertChannelsForTier(item.subscription.tier);

      await prisma.alert.create({
        data: {
          subscriptionId: item.subscriptionId,
          userId: item.subscription.userId,
          exposureId: exposure.id,
          type: AlertType.exposure_detected,
          title: `New Exposure Detected: ${this.getSourceLabel(source)}`,
          message: this.buildAlertMessage(identifier, source, severity),
          severity: this.mapAlertSeverity(severity),
          channel: alertChannels,
        },
      });

      alertsCreated++;

      await mixpanelService.track(EventType.EXPOSURE_DETECTED, {
        userId: item.subscription.userId,
        exposureType: dataType,
        severity,
        source,
        subscriptionTier: item.subscription.tier,
      });
    }

    return { exposuresCreated, alertsCreated };
  }

  async verifyWebhookSignature(
    body: string,
    signature: string,
    timestamp: string
  ): Promise<boolean> {
    const webhookSecret = process.env.DARKWATCH_WEBHOOK_SECRET;
    if (!webhookSecret) {
      console.warn('[WebhookService] DARKWATCH_WEBHOOK_SECRET not set — signature verification skipped');
      return false;
    }

    const expected = createHash('sha256')
      .update(`${timestamp}:${body}`)
      .digest('hex');

    return expected === signature;
  }

  private mapSource(source: string): ExposureSource {
    const sourceMap: Record<string, ExposureSource> = {
      hibp: ExposureSource.hibp,
      'haveibeenpwned': ExposureSource.hibp,
      securitytrails: ExposureSource.securityTrails,
      censys: ExposureSource.censys,
      'darkweb-forum': ExposureSource.darkWebForum,
      'darkweb': ExposureSource.darkWebForum,
      shodan: ExposureSource.shodan,
      honeypot: ExposureSource.honeypot,
    };

    const normalized = source.toLowerCase().replace(/\s+/g, '');
    const mapped = sourceMap[normalized];
    if (!mapped) {
      console.warn(`[WebhookService] Unknown source "${source}", falling back to darkWebForum`);
    }
    return mapped || ExposureSource.darkWebForum;
  }

  private mapDataType(type: string): WatchlistType {
    const typeMap: Record<string, WatchlistType> = {
      email: WatchlistType.email,
      phone: WatchlistType.phoneNumber,
      phonenumber: WatchlistType.phoneNumber,
      ssn: WatchlistType.ssn,
      address: WatchlistType.address,
      domain: WatchlistType.domain,
    };

    const normalized = type.toLowerCase().trim();
    return typeMap[normalized] || WatchlistType.email;
  }

  private getAlertChannelsForTier(tier: string): string[] {
    const channelMap: Record<string, string[]> = {
      basic: ['email'],
      plus: ['email', 'push'],
      premium: ['email', 'push', 'sms'],
    };
    return channelMap[tier] || ['email'];
  }

  private mapAlertSeverity(severity: ExposureSeverity): AlertSeverity {
    return severity as AlertSeverity;
  }

  private getSourceLabel(source: ExposureSource): string {
    const labels: Record<ExposureSource, string> = {
      [ExposureSource.hibp]: 'Have I Been Pwned',
      [ExposureSource.securityTrails]: 'SecurityTrails',
      [ExposureSource.censys]: 'Censys',
      [ExposureSource.darkWebForum]: 'Dark Web Forum',
      [ExposureSource.shodan]: 'Shodan',
      [ExposureSource.honeypot]: 'Honeypot',
    };
    return labels[source] || source;
  }

  private buildAlertMessage(
    identifier: string,
    source: ExposureSource,
    severity: ExposureSeverity
  ): string {
    const masked = this.maskIdentifier(identifier);
    return `${severity.toUpperCase()}: "${masked}" found in ${this.getSourceLabel(source)}.`;
  }

  private maskIdentifier(identifier: string): string {
    if (identifier.includes('@')) {
      const [user, domain] = identifier.split('@');
      const maskedUser = user.slice(0, 2) + '***' + user.slice(-1);
      return `${maskedUser}@${domain}`;
    }
    if (identifier.length > 8) {
      return identifier.slice(0, 3) + '***' + identifier.slice(-2);
    }
    return identifier;
  }
}

export const webhookService = new WebhookService();
@@ -1,227 +0,0 @@
|
|||||||
/**
|
|
||||||
* Feature Flag Management System
|
|
||||||
* Centralized feature flag handling with type safety and runtime updates
|
|
||||||
*/
|
|
||||||
|
|
||||||
import type { z } from 'zod';
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Type for feature flag values
|
|
||||||
*/
|
|
||||||
export type FeatureFlagValue = boolean | string | number;
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Interface for a feature flag definition
|
|
||||||
*/
|
|
||||||
export interface FeatureFlag<T = FeatureFlagValue> {
|
|
||||||
key: string;
|
|
||||||
defaultValue: T;
|
|
||||||
description?: string;
|
|
||||||
allowedValues?: T[]; // For enum-like flags
|
|
||||||
category?: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Feature flag registry - stores all defined flags
|
|
||||||
*/
|
|
||||||
export interface FeatureFlagRegistry {
|
|
||||||
[key: string]: FeatureFlag;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Feature flag resolver - handles flag resolution logic
|
|
||||||
*/
|
|
||||||
export class FeatureFlagResolver {
|
|
||||||
private flags: FeatureFlagRegistry;
|
|
||||||
private resolvedCache: Map<string, FeatureFlagValue> = new Map();
|
|
||||||
|
|
||||||
constructor(flags: FeatureFlagRegistry) {
|
|
||||||
this.flags = flags;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Resolve a feature flag value
|
|
||||||
* Priority: Environment > Cache > Default
|
|
||||||
*/
|
|
||||||
resolve<T>(key: string, defaultValue: T): T {
|
|
||||||
// Check cache first
|
|
||||||
if (this.resolvedCache.has(key)) {
|
|
||||||
return this.resolvedCache.get(key)! as T;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check environment variable (allows runtime updates)
|
|
||||||
const envValue = process.env[`FLAG_${key.toUpperCase()}`];
|
|
||||||
if (envValue !== undefined) {
|
|
||||||
// Try to parse as JSON first, then as boolean, then as string
|
|
||||||
let parsed: FeatureFlagValue;
|
|
||||||
try {
|
|
||||||
parsed = JSON.parse(envValue);
|
|
||||||
} catch {
|
|
||||||
parsed = envValue.toLowerCase() === 'true' ? true :
|
|
||||||
envValue.toLowerCase() === 'false' ? false :
|
|
||||||
envValue;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Validate against allowed values if defined
|
|
||||||
const flag = this.flags[key];
|
|
||||||
if (flag && flag.allowedValues && !flag.allowedValues.includes(parsed)) {
|
|
||||||
console.warn(`Invalid value for flag ${key}: ${parsed}. Using default.`);
|
|
||||||
parsed = defaultValue as FeatureFlagValue;
|
|
||||||
}
|
|
||||||
|
|
||||||
this.resolvedCache.set(key, parsed);
|
|
||||||
return parsed as T;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Use cached value if available
|
|
||||||
if (this.resolvedCache.has(key)) {
|
|
||||||
return this.resolvedCache.get(key)! as T;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Return default
|
|
||||||
this.resolvedCache.set(key, defaultValue as FeatureFlagValue);
|
|
||||||
return defaultValue as T;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Check if a flag is enabled (boolean check)
|
|
||||||
*/
|
|
||||||
isEnabled<T>(key: string, defaultValue: T): T {
|
|
||||||
return this.resolve(key, defaultValue) as T;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Get flag definition
|
|
||||||
*/
|
|
||||||
getDefinition(key: string): FeatureFlag | undefined {
|
|
||||||
return this.flags[key];
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* List all registered flags
|
|
||||||
*/
|
|
||||||
getAllFlags(): FeatureFlagRegistry {
|
|
||||||
return { ...this.flags };
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Clear the resolution cache (useful for testing)
|
|
||||||
*/
|
|
||||||
clearCache(): void {
|
|
||||||
this.resolvedCache.clear();
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Feature flag configuration with pre-defined flags
|
|
||||||
*/
|
|
||||||
export const featureFlags: FeatureFlagRegistry = {
|
|
||||||
// SpamShield Feature Flags
|
|
||||||
'spamshield.enable.number.reputation': {
|
|
||||||
key: 'spamshield_enable_number_reputation',
|
|
||||||
defaultValue: true,
|
|
||||||
description: 'Enable number reputation checking (Hiya API integration)',
|
|
||||||
category: 'spamshield',
|
|
||||||
},
|
|
||||||
'spamshield.enable.content.classification': {
|
|
||||||
key: 'spamshield_enable_content_classification',
|
|
||||||
defaultValue: true,
|
|
||||||
description: 'Enable SMS content classification (BERT model)',
|
|
||||||
category: 'spamshield',
|
|
||||||
},
|
|
||||||
'spamshield.enable.behavioral.analysis': {
|
|
||||||
key: 'spamshield_enable_behavioral_analysis',
|
|
||||||
defaultValue: true,
|
|
||||||
description: 'Enable call behavioral analysis',
|
|
||||||
category: 'spamshield',
|
|
||||||
},
|
|
||||||
'spamshield.enable.community.intelligence': {
|
|
||||||
key: 'spamshield_enable_community_intelligence',
|
|
||||||
defaultValue: true,
|
|
||||||
description: 'Enable community intelligence sharing',
|
|
||||||
category: 'spamshield',
|
|
||||||
},
|
|
||||||
'spamshield.enable.real.time.blocking': {
|
|
||||||
key: 'spamshield_enable_real_time_blocking',
|
|
||||||
defaultValue: true,
|
|
||||||
description: 'Enable real-time spam blocking',
|
|
||||||
category: 'spamshield',
|
|
||||||
},
|
|
||||||
'spamshield.enable.multiple.sources': {
|
|
||||||
key: 'spamshield_enable_multiple_sources',
|
|
||||||
defaultValue: false,
|
|
||||||
description: 'Enable multiple reputation source aggregation (Truecaller, etc.)',
|
|
||||||
category: 'spamshield',
|
|
||||||
},
|
|
||||||
'spamshield.enable.ml.classifier': {
|
|
||||||
key: 'spamshield_enable_ml_classifier',
|
|
||||||
defaultValue: false,
|
|
||||||
description: 'Enable ML-based spam classification',
|
|
||||||
    category: 'spamshield',
  },

  // VoicePrint Feature Flags
  'voiceprint.enable.ml.service': {
    key: 'voiceprint_enable_ml_service',
    defaultValue: false,
    description: 'Enable ML service integration for voice analysis',
    category: 'voiceprint',
  },
  'voiceprint.enable.faiss.index': {
    key: 'voiceprint_enable_faiss_index',
    defaultValue: true,
    description: 'Enable FAISS index for voice matching',
    category: 'voiceprint',
  },
  'voiceprint.enable.batch.analysis': {
    key: 'voiceprint_enable_batch_analysis',
    defaultValue: true,
    description: 'Enable batch voice analysis',
    category: 'voiceprint',
  },
  'voiceprint.enable.realtime.analysis': {
    key: 'voiceprint_enable_realtime_analysis',
    defaultValue: false,
    description: 'Enable real-time voice analysis',
    category: 'voiceprint',
  },
  'voiceprint.enable.mock.model': {
    key: 'voiceprint_enable_mock_model',
    defaultValue: true,
    description: 'Enable mock model for development',
    category: 'voiceprint',
  },

  // General Platform Flags
  'platform.enable.audit.logs': {
    key: 'platform_enable_audit_logs',
    defaultValue: true,
    description: 'Enable comprehensive audit logging',
    category: 'platform',
  },
  'platform.enable.kpi.tracking': {
    key: 'platform_enable_kpi_tracking',
    defaultValue: true,
    description: 'Enable KPI snapshot tracking',
    category: 'platform',
  },
};

/**
 * Create a resolver instance with the default flags
 */
export const featureFlagResolver = new FeatureFlagResolver(featureFlags);

/**
 * Convenience function for quick flag checks
 */
export function isFeatureEnabled<T>(key: string, defaultValue: T): T {
  return featureFlagResolver.isEnabled(key, defaultValue);
}

/**
 * Check if a flag is enabled with type safety
 */
export function checkFlag<T>(key: string, defaultValue: T): T {
  return featureFlagResolver.resolve(key, defaultValue);
}
@@ -1,26 +0,0 @@
// Config
export {
  spamShieldEnv,
  SpamLayer,
  SpamDecision,
  ConfidenceLevel,
  spamFeatureFlags,
  spamRateLimits,
  checkFlag,
  isFeatureEnabled,
} from './spamshield.config';

// Feature flags
export * from './feature-flags';

// Services
export {
  NumberReputationService,
  SMSClassifierService,
  CallAnalysisService,
  SpamFeedbackService,
  numberReputationService,
  smsClassifierService,
  callAnalysisService,
  spamFeedbackService,
} from './spamshield.service';
@@ -1,118 +0,0 @@
import { createHash } from 'crypto';

export type AuditClassificationType = 'sms' | 'call';

export interface AuditClassificationEntry {
  id: string;
  timestamp: string;
  type: AuditClassificationType;
  phoneNumberHash: string;
  decision: 'spam' | 'ham' | 'block' | 'flag' | 'allow';
  confidence: number;
  reasons: string[];
  featureFlags: Record<string, boolean>;
  metadata?: Record<string, unknown>;
}

const MAX_AUDIT_LOG_SIZE = 10_000;

class AuditLogger {
  private entries: AuditClassificationEntry[] = [];

  logClassification(entry: Omit<AuditClassificationEntry, 'id' | 'timestamp'>): AuditClassificationEntry {
    const record: AuditClassificationEntry = {
      id: `audit-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`,
      timestamp: new Date().toISOString(),
      ...entry,
    };

    this.entries.push(record);

    if (this.entries.length > MAX_AUDIT_LOG_SIZE) {
      this.entries.shift();
    }

    console.log(
      `[SpamShield:Audit] type=${record.type} decision=${record.decision} ` +
      `confidence=${record.confidence.toFixed(3)} reasons=${record.reasons.join(',') || 'none'} ` +
      `phoneHash=${record.phoneNumberHash}`
    );

    return record;
  }

  getEntries(
    filters?: {
      type?: AuditClassificationType;
      decision?: string;
      startDate?: Date;
      endDate?: Date;
      limit?: number;
    }
  ): AuditClassificationEntry[] {
    let results = this.entries;

    if (filters?.type) {
      results = results.filter(e => e.type === filters.type);
    }

    if (filters?.decision) {
      results = results.filter(e => e.decision === filters.decision);
    }

    if (filters?.startDate) {
      results = results.filter(e => new Date(e.timestamp) >= filters.startDate!);
    }

    if (filters?.endDate) {
      results = results.filter(e => new Date(e.timestamp) <= filters.endDate!);
    }

    if (filters?.limit) {
      results = results.slice(-filters.limit);
    }

    return results;
  }

  getSummary(): {
    totalEntries: number;
    spamCount: number;
    hamCount: number;
    blockCount: number;
    flagCount: number;
    allowCount: number;
    avgConfidence: number;
  } {
    const spamCount = this.entries.filter(e => e.decision === 'spam' || e.decision === 'block').length;
    const hamCount = this.entries.filter(e => e.decision === 'ham' || e.decision === 'allow').length;
    const blockCount = this.entries.filter(e => e.decision === 'block').length;
    const flagCount = this.entries.filter(e => e.decision === 'flag').length;
    const allowCount = this.entries.filter(e => e.decision === 'allow').length;
    const avgConfidence =
      this.entries.length > 0
        ? this.entries.reduce((s, e) => s + e.confidence, 0) / this.entries.length
        : 0;

    return {
      totalEntries: this.entries.length,
      spamCount,
      hamCount,
      blockCount,
      flagCount,
      allowCount,
      avgConfidence: Math.round(avgConfidence * 1000) / 1000,
    };
  }

  clear(): void {
    this.entries = [];
  }
}

export const spamAuditLogger = new AuditLogger();

export function hashPhoneNumber(phoneNumber: string): string {
  const hash = createHash('sha256').update(phoneNumber.trim()).digest('hex');
  return `sha256_${hash}`;
}
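The hashing helper above is easy to exercise standalone (Node's built-in `crypto`); the output is always a 64-hex-character SHA-256 digest behind a `sha256_` prefix, and `trim()` makes the hash insensitive to leading/trailing whitespace:

```typescript
import { createHash } from 'crypto';

// Standalone copy of hashPhoneNumber, for illustration only.
function hashPhoneNumber(phoneNumber: string): string {
  const hash = createHash('sha256').update(phoneNumber.trim()).digest('hex');
  return `sha256_${hash}`;
}

const h = hashPhoneNumber(' +15550100 ');
console.log(h.length);                         // 71 ('sha256_' + 64 hex chars)
console.log(h === hashPhoneNumber('+15550100')); // true: whitespace is trimmed first
```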
@@ -1,163 +0,0 @@
import { z } from 'zod';
import { checkFlag } from './feature-flags';

// Environment variables for SpamShield
const envSchema = z.object({
  HIYA_API_KEY: z.string(),
  HIYA_API_URL: z.string().default('https://api.hiya.com/v1'),
  TRUECALLER_API_KEY: z.string().optional(),
  BERT_MODEL_PATH: z.string().default('./models/spam-classifier'),
  SPAM_THRESHOLD_AUTO_BLOCK: z.string().transform(Number).default('0.85'),
  SPAM_THRESHOLD_FLAG: z.string().transform(Number).default('0.6'),
  CALL_ANALYSIS_TIMEOUT_MS: z.string().transform(Number).default('200'),
});

export const spamShieldEnv = envSchema.parse({
  HIYA_API_KEY: process.env.HIYA_API_KEY,
  HIYA_API_URL: process.env.HIYA_API_URL,
  TRUECALLER_API_KEY: process.env.TRUECALLER_API_KEY,
  BERT_MODEL_PATH: process.env.BERT_MODEL_PATH,
  SPAM_THRESHOLD_AUTO_BLOCK: process.env.SPAM_THRESHOLD_AUTO_BLOCK,
  SPAM_THRESHOLD_FLAG: process.env.SPAM_THRESHOLD_FLAG,
  CALL_ANALYSIS_TIMEOUT_MS: process.env.CALL_ANALYSIS_TIMEOUT_MS,
});

// Spam detection layers
export enum SpamLayer {
  NUMBER_REPUTATION = 'number_reputation',
  CONTENT_CLASSIFICATION = 'content_classification',
  BEHAVIORAL_ANALYSIS = 'behavioral_analysis',
  COMMUNITY_INTELLIGENCE = 'community_intelligence',
}

// Spam decision types
export enum SpamDecision {
  ALLOW = 'allow',
  FLAG = 'flag',
  BLOCK = 'block',
  CHALLENGE = 'challenge',
}

// Confidence levels
export enum ConfidenceLevel {
  LOW = 'low',
  MEDIUM = 'medium',
  HIGH = 'high',
  VERY_HIGH = 'very_high',
}

// Feature flags for spam detection
// Use the centralized feature flag system from feature-flags.ts
// These are aliases for quick access
export const spamFeatureFlags = {
  enableNumberReputation: checkFlag('spamshield.enable.number.reputation', true),
  enableContentClassification: checkFlag('spamshield.enable.content.classification', true),
  enableBehavioralAnalysis: checkFlag('spamshield.enable.behavioral.analysis', true),
  enableCommunityIntelligence: checkFlag('spamshield.enable.community.intelligence', true),
  enableRealTimeBlocking: checkFlag('spamshield.enable.real.time.blocking', true),
  enableMultipleSources: checkFlag('spamshield.enable.multiple.sources', false),
  enableMLClassifier: checkFlag('spamshield.enable.ml.classifier', false),
};

// Rate limits for spam analysis
export const spamRateLimits = {
  basic: {
    analysesPerMinute: 10,
    analysesPerDay: 100,
  },
  plus: {
    analysesPerMinute: 50,
    analysesPerDay: 1000,
  },
  premium: {
    analysesPerMinute: 200,
    analysesPerDay: 10000,
  },
};

// Default confidence scores for spam detection layers
export const defaultScores = {
  // Number reputation service defaults
  defaultReputationConfidence: 0.0,
  defaultReputationLowConfidence: 0.1,

  // SMS classifier defaults
  defaultBaseConfidence: 0.5,
  defaultMaxConfidence: 1.0,

  // Feature weights for SMS classification
  featureWeights: {
    urlPresent: 0.1,
    highEmojiDensity: 0.15,
    urgencyKeyword: 0.2,
    excessiveCaps: 0.15,
  },

  // Call analysis defaults
  defaultSpamScore: 0.0,
  highReputationThreshold: 0.7,
  reputationWeightInCombinedScore: 0.4,
  shortDurationScore: 0.2,
  voipScore: 0.15,
  unusualHoursScore: 0.1,

  // Source combination weights
  hiyaWeightInCombinedScore: 0.7,
  truecallerWeightInCombinedScore: 0.3,
};

// Metadata size limits for SpamFeedback
export const metadataLimits = {
  // Maximum size for metadata JSON in bytes
  maxMetadataSizeBytes: 4096,

  // Maximum number of keys in metadata object
  maxMetadataKeys: 20,

  // Maximum size for individual metadata value in bytes
  maxMetadataValueSizeBytes: 512,
};

// Standard error codes for spamshield API
export enum SpamErrorCode {
  // Client errors (4xx)
  INVALID_REQUEST = 'INVALID_REQUEST',
  MISSING_REQUIRED_FIELD = 'MISSING_REQUIRED_FIELD',
  UNAUTHORIZED = 'UNAUTHORIZED',
  NOT_FOUND = 'NOT_FOUND',
  VALIDATION_ERROR = 'VALIDATION_ERROR',

  // Server errors (5xx)
  CLASSIFICATION_FAILED = 'CLASSIFICATION_FAILED',
  REPUTATION_CHECK_FAILED = 'REPUTATION_CHECK_FAILED',
  ANALYSIS_FAILED = 'ANALYSIS_FAILED',
  FEEDBACK_RECORD_FAILED = 'FEEDBACK_RECORD_FAILED',
  DATABASE_ERROR = 'DATABASE_ERROR',
  RATE_LIMIT_EXCEEDED = 'RATE_LIMIT_EXCEEDED',
  SERVICE_UNAVAILABLE = 'SERVICE_UNAVAILABLE',
}

// Standard error response type
export interface SpamErrorResponse {
  error: {
    code: SpamErrorCode;
    message: string;
    field?: string;
    timestamp: string;
    requestId?: string;
  };
}

// HTTP status code constants
export const HttpStatus = {
  OK: 200,
  CREATED: 201,
  BAD_REQUEST: 400,
  UNAUTHORIZED: 401,
  FORBIDDEN: 403,
  NOT_FOUND: 404,
  UNPROCESSABLE_ENTITY: 422,
  TOO_MANY_REQUESTS: 429,
  INTERNAL_SERVER_ERROR: 500,
  SERVICE_UNAVAILABLE: 503,
};
@@ -1,118 +0,0 @@
import { FastifyReply } from 'fastify';
import { SpamErrorCode, HttpStatus, SpamErrorResponse } from './spamshield.config';

export { SpamErrorCode, HttpStatus };
export type { SpamErrorResponse };

/**
 * Standardized error response builder for SpamShield API
 */
export class ErrorHandler {
  /**
   * Create a standard error response
   */
  static create(
    code: SpamErrorCode,
    message: string,
    options?: {
      field?: string;
      requestId?: string;
      additionalData?: Record<string, unknown>;
    }
  ): SpamErrorResponse {
    return {
      error: {
        code,
        message,
        ...(options?.field && { field: options.field }),
        timestamp: new Date().toISOString(),
        ...(options?.requestId && { requestId: options.requestId }),
      },
    };
  }

  /**
   * Send a standard error response with appropriate HTTP status code
   */
  static send(
    reply: FastifyReply,
    code: SpamErrorCode,
    message: string,
    options?: {
      field?: string;
      status?: number;
      requestId?: string;
    }
  ): void {
    const status = options?.status ?? this.getStatusForCode(code);
    const errorResponse = this.create(code, message, {
      field: options?.field,
      requestId: options?.requestId,
    });
    reply.code(status).send(errorResponse);
  }

  /**
   * Map error codes to HTTP status codes
   */
  private static getStatusForCode(code: SpamErrorCode): number {
    const statusMap: Record<SpamErrorCode, number> = {
      // Client errors
      [SpamErrorCode.INVALID_REQUEST]: HttpStatus.BAD_REQUEST,
      [SpamErrorCode.MISSING_REQUIRED_FIELD]: HttpStatus.BAD_REQUEST,
      [SpamErrorCode.UNAUTHORIZED]: HttpStatus.UNAUTHORIZED,
      [SpamErrorCode.NOT_FOUND]: HttpStatus.NOT_FOUND,
      [SpamErrorCode.VALIDATION_ERROR]: HttpStatus.BAD_REQUEST,

      // Server errors
      [SpamErrorCode.CLASSIFICATION_FAILED]: HttpStatus.UNPROCESSABLE_ENTITY,
      [SpamErrorCode.REPUTATION_CHECK_FAILED]: HttpStatus.UNPROCESSABLE_ENTITY,
      [SpamErrorCode.ANALYSIS_FAILED]: HttpStatus.UNPROCESSABLE_ENTITY,
      [SpamErrorCode.FEEDBACK_RECORD_FAILED]: HttpStatus.UNPROCESSABLE_ENTITY,
      [SpamErrorCode.DATABASE_ERROR]: HttpStatus.INTERNAL_SERVER_ERROR,
      [SpamErrorCode.RATE_LIMIT_EXCEEDED]: HttpStatus.TOO_MANY_REQUESTS,
      [SpamErrorCode.SERVICE_UNAVAILABLE]: HttpStatus.SERVICE_UNAVAILABLE,
    };
    return statusMap[code] ?? HttpStatus.INTERNAL_SERVER_ERROR;
  }

  /**
   * Validate required string field
   */
  static validateRequiredField(
    value: unknown,
    fieldName: string
  ): { isValid: boolean; error?: { code: SpamErrorCode; message: string; field: string } } {
    if (!value || typeof value !== 'string' || value.trim() === '') {
      return {
        isValid: false,
        error: {
          code: SpamErrorCode.MISSING_REQUIRED_FIELD,
          message: `${fieldName} is required`,
          field: fieldName,
        },
      };
    }
    return { isValid: true };
  }

  /**
   * Validate boolean field
   */
  static validateBooleanField(
    value: unknown,
    fieldName: string
  ): { isValid: boolean; error?: { code: SpamErrorCode; message: string; field: string } } {
    if (value === undefined || value === null || typeof value !== 'boolean') {
      return {
        isValid: false,
        error: {
          code: SpamErrorCode.VALIDATION_ERROR,
          message: `${fieldName} must be a boolean`,
        field: fieldName,
        },
      };
    }
    return { isValid: true };
  }
}
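The response envelope that `ErrorHandler.create` builds can be sketched standalone (Fastify and the full error-code enum omitted; `createError` here is a hypothetical trimmed-down stand-in, not the removed class):

```typescript
// Hypothetical minimal version of ErrorHandler.create, showing only
// the envelope shape: optional fields are spread in conditionally so
// they are absent (not undefined-valued keys) when not provided.
type ErrorBody = {
  error: {
    code: string;
    message: string;
    field?: string;
    timestamp: string;
    requestId?: string;
  };
};

function createError(
  code: string,
  message: string,
  options?: { field?: string; requestId?: string }
): ErrorBody {
  return {
    error: {
      code,
      message,
      ...(options?.field && { field: options.field }),
      timestamp: new Date().toISOString(),
      ...(options?.requestId && { requestId: options.requestId }),
    },
  };
}

const resp = createError('MISSING_REQUIRED_FIELD', 'phoneNumber is required', {
  field: 'phoneNumber',
});
console.log(resp.error.code);  // MISSING_REQUIRED_FIELD
console.log(resp.error.field); // phoneNumber
console.log('requestId' in resp.error); // false: omitted entirely when not supplied
```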
@@ -1,462 +0,0 @@
import { prisma, SpamFeedback } from '@shieldsai/shared-db';
import { spamShieldEnv, SpamDecision, spamFeatureFlags, defaultScores, metadataLimits } from './spamshield.config';
import { createHash } from 'crypto';
import { spamAuditLogger, hashPhoneNumber } from './spamshield.audit-logger';

// Number reputation service (Hiya API integration)
export class NumberReputationService {
  /**
   * Check number reputation using Hiya API
   */
  async checkReputation(phoneNumber: string): Promise<{
    isSpam: boolean;
    confidence: number;
    spamType?: string;
    reportCount: number;
  }> {
    try {
      // Only enable if feature flag is set
      if (!spamFeatureFlags.enableNumberReputation) {
        return {
          isSpam: false,
          confidence: 0.0,
          reportCount: 0,
        };
      }

      // TODO: Integrate with Hiya API
      // const response = await fetch(`${spamShieldEnv.HIYA_API_URL}/lookup`, {
      //   headers: { 'X-API-Key': spamShieldEnv.HIYA_API_KEY },
      //   method: 'POST',
      //   body: JSON.stringify({ phone: phoneNumber }),
      // });

      // Simulated response for now
      return {
        isSpam: false,
        confidence: defaultScores.defaultReputationLowConfidence,
        spamType: undefined,
        reportCount: 0,
      };
    } catch (error) {
      console.error('Error checking number reputation:', error);
      return {
        isSpam: false,
        confidence: defaultScores.defaultReputationConfidence,
        reportCount: 0,
      };
    }
  }

  /**
   * Check number against multiple reputation sources
   */
  async checkMultiSource(phoneNumber: string): Promise<{
    hiya: { isSpam: boolean; confidence: number };
    truecaller: { isSpam: boolean; confidence: number } | null;
    combinedScore: number;
  }> {
    // Only enable if feature flag is set
    if (!spamFeatureFlags.enableMultipleSources) {
      return {
        hiya: { isSpam: false, confidence: defaultScores.defaultReputationConfidence },
        truecaller: null,
        combinedScore: defaultScores.defaultSpamScore,
      };
    }

    const hiyaResult = await this.checkReputation(phoneNumber);

    let truecallerResult: { isSpam: boolean; confidence: number } | null = null;
    if (spamShieldEnv.TRUECALLER_API_KEY) {
      // TODO: Integrate Truecaller
      truecallerResult = {
        isSpam: false,
        confidence: defaultScores.defaultReputationConfidence,
      };
    }

    // Weighted average: Hiya 70%, Truecaller 30%
    const combinedScore =
      hiyaResult.confidence * defaultScores.hiyaWeightInCombinedScore +
      (truecallerResult?.confidence ?? defaultScores.defaultReputationConfidence) * defaultScores.truecallerWeightInCombinedScore;

    return {
      hiya: { isSpam: hiyaResult.isSpam, confidence: hiyaResult.confidence },
      truecaller: truecallerResult,
      combinedScore,
    };
  }
}

// SMS content classifier (BERT-based)
export class SMSClassifierService {
  private model: any = null; // BERT model placeholder
  private _initPromise: Promise<void> | null = null;

  /**
   * Initialize the BERT model (thread-safe via promise deduplication)
   */
  async initialize(): Promise<void> {
    // TODO: Load BERT model from path
    // this.model = await loadBERTModel(spamShieldEnv.BERT_MODEL_PATH);
    console.log('SMS classifier initialized');
  }

  /**
   * Ensures model is initialized before use. Concurrent callers
   * await the same initialization promise to avoid race conditions.
   */
  private async ensureInitialized(): Promise<void> {
    if (this._initPromise) {
      return this._initPromise;
    }
    this._initPromise = (async () => {
      if (this.model) {
        return;
      }
      await this.initialize();
    })();
    return this._initPromise;
  }

  /**
   * Classify SMS text as spam or ham
   */
  async classify(
    smsText: string,
    phoneNumber?: string
  ): Promise<{
    isSpam: boolean;
    confidence: number;
    spamFeatures: string[];
  }> {
    // Only enable if feature flag is set
    if (!spamFeatureFlags.enableMLClassifier) {
      // Return basic feature-based classification
      const features = this.extractFeatures(smsText);
      const confidence = this.calculateConfidence(features);
      const isSpam = confidence >= spamShieldEnv.SPAM_THRESHOLD_AUTO_BLOCK;

      spamAuditLogger.logClassification({
        type: 'sms',
        phoneNumberHash: phoneNumber ? hashPhoneNumber(phoneNumber) : 'unknown',
        decision: isSpam ? 'spam' : 'ham',
        confidence,
        reasons: features,
        featureFlags: { enableMLClassifier: spamFeatureFlags.enableMLClassifier },
      });

      return {
        isSpam,
        confidence,
        spamFeatures: features,
      };
    }

    await this.ensureInitialized();

    // Extract features
    const features = this.extractFeatures(smsText);

    // TODO: Run through BERT model
    // const prediction = await this.model.predict(smsText);

    // Simulated prediction
    const confidence = this.calculateConfidence(features);
    const isSpam = confidence >= spamShieldEnv.SPAM_THRESHOLD_AUTO_BLOCK;

    spamAuditLogger.logClassification({
      type: 'sms',
      phoneNumberHash: phoneNumber ? hashPhoneNumber(phoneNumber) : 'unknown',
      decision: isSpam ? 'spam' : 'ham',
      confidence,
      reasons: features,
      featureFlags: { enableMLClassifier: spamFeatureFlags.enableMLClassifier },
    });

    return {
      isSpam,
      confidence,
      spamFeatures: features,
    };
  }

  private extractFeatures(text: string): string[] {
    const features: string[] = [];
    const lowerText = text.toLowerCase();

    // URL presence
    if (/(http|www)\./i.test(text)) {
      features.push('url_present');
    }

    // Emoji density
    const emojiCount = (text.match(/[\p{Emoji}]/gu) || []).length;
    if (emojiCount / text.length > 0.1) {
      features.push('high_emoji_density');
    }

    // Urgency keywords
    const urgencyWords = ['now', 'urgent', 'limited', 'act fast', 'today'];
    if (urgencyWords.some(word => lowerText.includes(word))) {
      features.push('urgency_keyword');
    }

    // Excessive capitalization
    if (/[A-Z]{3,}/.test(text)) {
      features.push('excessive_caps');
    }

    return features;
  }

  private calculateConfidence(features: string[]): number {
    const baseConfidence = defaultScores.defaultBaseConfidence;
    const featureWeights: Record<string, number> = {
      url_present: defaultScores.featureWeights.urlPresent,
      high_emoji_density: defaultScores.featureWeights.highEmojiDensity,
      urgency_keyword: defaultScores.featureWeights.urgencyKeyword,
      excessive_caps: defaultScores.featureWeights.excessiveCaps,
    };

    return Math.min(
      defaultScores.defaultMaxConfidence,
      baseConfidence + features.reduce((sum, f) => sum + (featureWeights[f] || 0), 0)
    );
  }
}
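The heuristic scoring path (taken when the ML flag is off) can be traced standalone; the weights below copy the `defaultScores` values quoted earlier in this diff, and the sample message is illustrative:

```typescript
// Standalone re-statement of the heuristic classification path.
// Weights mirror defaultScores.featureWeights from this diff.
const weights: Record<string, number> = {
  url_present: 0.1,
  high_emoji_density: 0.15,
  urgency_keyword: 0.2,
  excessive_caps: 0.15,
};

function extractFeatures(text: string): string[] {
  const features: string[] = [];
  const lower = text.toLowerCase();
  if (/(http|www)\./i.test(text)) features.push('url_present');
  const emojiCount = (text.match(/[\p{Emoji}]/gu) || []).length;
  if (emojiCount / text.length > 0.1) features.push('high_emoji_density');
  if (['now', 'urgent', 'limited', 'act fast', 'today'].some(w => lower.includes(w))) {
    features.push('urgency_keyword');
  }
  if (/[A-Z]{3,}/.test(text)) features.push('excessive_caps');
  return features;
}

const features = extractFeatures('URGENT: claim your prize today at www.deals.example');
const confidence = Math.min(1.0, 0.5 + features.reduce((s, f) => s + (weights[f] || 0), 0));
// url_present + urgency_keyword + excessive_caps:
// 0.5 base + 0.1 + 0.2 + 0.15 ≈ 0.95, clearing the 0.85 auto-block threshold.
console.log(features, confidence >= 0.85);
```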
// Call analysis service
export class CallAnalysisService {
  /**
   * Analyze incoming call for spam indicators
   */
  async analyzeCall(callData: {
    phoneNumber: string;
    duration?: number;
    callTime: Date;
    isVoip?: boolean;
  }): Promise<{
    decision: SpamDecision;
    confidence: number;
    reasons: string[];
  }> {
    const reasons: string[] = [];
    let spamScore = defaultScores.defaultSpamScore;

    // Number reputation check - only if feature flag enabled
    if (spamFeatureFlags.enableNumberReputation) {
      const reputationService = new NumberReputationService();
      const reputation = await reputationService.checkMultiSource(callData.phoneNumber);

      if (reputation.combinedScore > defaultScores.highReputationThreshold) {
        spamScore += reputation.combinedScore * defaultScores.reputationWeightInCombinedScore;
        reasons.push('high_spam_reputation');
      }
    }

    // Behavioral analysis - only if feature flag enabled
    if (spamFeatureFlags.enableBehavioralAnalysis) {
      if (callData.duration && callData.duration < 10) {
        spamScore += defaultScores.shortDurationScore;
        reasons.push('short_duration');
      }

      if (callData.isVoip) {
        spamScore += defaultScores.voipScore;
        reasons.push('voip_number');
      }

      // Time-of-day anomaly (simplified)
      const hour = callData.callTime.getHours();
      if (hour < 6 || hour > 22) {
        spamScore += defaultScores.unusualHoursScore;
        reasons.push('unusual_hours');
      }
    }

    // Determine decision
    let decision: SpamDecision;
    if (spamScore >= spamShieldEnv.SPAM_THRESHOLD_AUTO_BLOCK) {
      decision = SpamDecision.BLOCK;
    } else if (spamScore >= spamShieldEnv.SPAM_THRESHOLD_FLAG) {
      decision = SpamDecision.FLAG;
    } else {
      decision = SpamDecision.ALLOW;
    }

    spamAuditLogger.logClassification({
      type: 'call',
      phoneNumberHash: hashPhoneNumber(callData.phoneNumber),
      decision: decision.toLowerCase() as 'block' | 'flag' | 'allow',
      confidence: spamScore,
      reasons,
      featureFlags: {
        enableBehavioralAnalysis: spamFeatureFlags.enableBehavioralAnalysis,
        enableNumberReputation: spamFeatureFlags.enableNumberReputation,
      },
      metadata: {
        duration: callData.duration,
        isVoip: callData.isVoip,
        callTime: callData.callTime.toISOString(),
      },
    });

    return {
      decision,
      confidence: spamScore,
      reasons,
    };
  }
}
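The call score is additive, so the decision can be traced by hand with the defaults quoted in this diff (short call 0.2, VoIP 0.15, unusual hours 0.1; flag at 0.6, auto-block at 0.85; reputation weight 0.4 above the 0.7 threshold). A worked example:

```typescript
// Standalone trace of the additive behavioral score and its thresholds.
function decide(score: number): 'block' | 'flag' | 'allow' {
  if (score >= 0.85) return 'block';
  if (score >= 0.6) return 'flag';
  return 'allow';
}

// A 5-second VoIP call at 3am: 0.2 (short) + 0.15 (voip) + 0.1 (unusual hours).
const behavioralOnly = 0.2 + 0.15 + 0.1;
console.log(behavioralOnly.toFixed(2), decide(behavioralOnly)); // 0.45 allow

// Same call from a number whose combined reputation is 0.8 (> the 0.7
// threshold): reputation contributes a further 0.8 * 0.4 = 0.32.
const withReputation = behavioralOnly + 0.8 * 0.4;
console.log(withReputation.toFixed(2), decide(withReputation)); // 0.77 flag
```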
// User feedback service
export class SpamFeedbackService {
  /**
   * Validate metadata size against defined limits
   */
  private validateMetadata(metadata?: Record<string, any>): {
    isValid: boolean;
    trimmedMetadata?: Record<string, any>;
    reasons?: string[];
  } {
    if (!metadata) {
      return { isValid: true };
    }

    const reasons: string[] = [];
    let trimmedMetadata: Record<string, any> = metadata;

    // Check number of keys
    const keyCount = Object.keys(metadata).length;
    if (keyCount > metadataLimits.maxMetadataKeys) {
      reasons.push(`Metadata has ${keyCount} keys, exceeding limit of ${metadataLimits.maxMetadataKeys}`);
      trimmedMetadata = Object.fromEntries(
        Object.entries(metadata).slice(0, metadataLimits.maxMetadataKeys)
      );
    }

    // Check total JSON size
    const jsonSize = JSON.stringify(metadata).length;
    if (jsonSize > metadataLimits.maxMetadataSizeBytes) {
      reasons.push(`Metadata size ${jsonSize} bytes exceeds limit of ${metadataLimits.maxMetadataSizeBytes} bytes`);

      // Truncate long values
      trimmedMetadata = Object.fromEntries(
        Object.entries(metadata).map(([key, value]) => {
          const valueStr = String(value);
          if (valueStr.length > metadataLimits.maxMetadataValueSizeBytes) {
            return [key, valueStr.slice(0, metadataLimits.maxMetadataValueSizeBytes)];
          }
          return [key, value];
        })
      );
    }

    return {
      isValid: reasons.length === 0,
      trimmedMetadata,
      reasons: reasons.length > 0 ? reasons : undefined,
    };
  }

  /**
   * Record user feedback on spam detection
   */
  async recordFeedback(
    userId: string,
    phoneNumber: string,
    isSpam: boolean,
    confidence?: number,
    metadata?: Record<string, any>
  ): Promise<SpamFeedback> {
    // Validate metadata
    const validation = this.validateMetadata(metadata);
    const validatedMetadata = validation.trimmedMetadata;

    // Only enable if feature flag is set
    if (!spamFeatureFlags.enableCommunityIntelligence) {
      // Return a mock feedback for development
      return {
        id: `mock_${Date.now()}`,
        userId,
        phoneNumber,
        phoneNumberHash: this.hashPhoneNumber(phoneNumber),
        isSpam,
        confidence,
        feedbackType: 'user_confirmation' as const,
        metadata: validatedMetadata,
        createdAt: new Date(),
        updatedAt: new Date(),
      };
    }

    const phoneNumberHash = this.hashPhoneNumber(phoneNumber);

    const feedback = await prisma.spamFeedback.create({
      data: {
        userId,
        phoneNumber,
        phoneNumberHash,
        isSpam,
        confidence,
        feedbackType: 'user_confirmation',
        metadata: validatedMetadata,
      },
    });

    return feedback;
  }

  /**
   * Get spam history for a user
   */
  async getSpamHistory(
    userId: string,
    options?: {
      limit?: number;
      isSpam?: boolean;
      startDate?: Date;
    }
  ): Promise<SpamFeedback[]> {
    return prisma.spamFeedback.findMany({
      where: {
        userId,
        ...(options?.isSpam !== undefined && { isSpam: options.isSpam }),
        ...(options?.startDate && { createdAt: { gte: options.startDate } }),
      },
      orderBy: { createdAt: 'desc' },
      take: options?.limit ?? 100,
    });
  }

  /**
   * Get statistics for a user
   */
  async getStatistics(userId: string): Promise<{
    totalAnalyses: number;
    spamCount: number;
    hamCount: number;
    spamPercentage: number;
  }> {
    const [total, spam] = await Promise.all([
      prisma.spamFeedback.count({ where: { userId } }),
      prisma.spamFeedback.count({ where: { userId, isSpam: true } }),
    ]);

    return {
      totalAnalyses: total,
      spamCount: spam,
      hamCount: total - spam,
      spamPercentage: total > 0 ? (spam / total) * 100 : 0,
    };
  }

  private hashPhoneNumber(phoneNumber: string): string {
    // SHA-256 hash for phone number fingerprinting
    const hash = createHash('sha256').update(phoneNumber).digest('hex');
    return `sha256_${hash}`;
  }
}

// Export instances
export const numberReputationService = new NumberReputationService();
export const smsClassifierService = new SMSClassifierService();
export const callAnalysisService = new CallAnalysisService();
export const spamFeedbackService = new SpamFeedbackService();
|
|
||||||
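For reference, the phone-number fingerprinting used by `hashPhoneNumber` above can be reproduced standalone with Node's built-in `crypto` module (a sketch; the `sha256_` prefix mirrors the service code):

```typescript
import { createHash } from 'node:crypto';

// SHA-256 fingerprint of a phone number, matching hashPhoneNumber above.
// Hashing lets community feedback be aggregated per number without
// persisting the raw phone number in every record.
export function hashPhoneNumber(phoneNumber: string): string {
  const hash = createHash('sha256').update(phoneNumber).digest('hex');
  return `sha256_${hash}`;
}
```

Because SHA-256 is deterministic, the same number always yields the same fingerprint, which is what makes cross-user aggregation on `phoneNumberHash` possible.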
@@ -1,30 +0,0 @@
// Config
export {
  voicePrintEnv,
  VoicePrintSource,
  AnalysisJobStatus,
  DetectionType,
  ConfidenceLevel,
  audioPreprocessingConfig,
  voicePrintFeatureFlags,
  voicePrintRateLimits,
  checkFlag,
  isFeatureEnabled,
} from './voiceprint.config';

// Services
export {
  AudioPreprocessor,
  VoiceEnrollmentService,
  AnalysisService,
  BatchAnalysisService,
  EmbeddingService,
  FAISSIndex,
  audioPreprocessor,
  voiceEnrollmentService,
  analysisService,
  batchAnalysisService,
  embeddingService,
} from './voiceprint.service';
@@ -1,102 +0,0 @@
import { z } from 'zod';
import { checkFlag } from './voiceprint.feature-flags';

// Environment variables for VoicePrint
const envSchema = z.object({
  ECAPA_TDNN_MODEL_PATH: z.string().default('./models/ecapa-tdnn'),
  ML_SERVICE_URL: z.string().default('http://localhost:8001'),
  FAISS_INDEX_PATH: z.string().default('./data/voiceprint_faiss.index'),
  AUDIO_STORAGE_BUCKET: z.string().default('voiceprint-audio'),
  AUDIO_STORAGE_ENDPOINT: z.string().default('http://localhost:9000'),
  // Defaults are strings here: .default() applies to the pre-transform
  // input type, and the transform coerces them to numbers at parse time.
  SYNTHETIC_THRESHOLD: z.string().transform(Number).default('0.75'),
  ENROLLMENT_MIN_DURATION_SEC: z.string().transform(Number).default('3'),
  ENROLLMENT_MAX_DURATION_SEC: z.string().transform(Number).default('60'),
  EMBEDDING_DIMENSIONS: z.string().transform(Number).default('192'),
  BATCH_MAX_FILES: z.string().transform(Number).default('20'),
  ANALYSIS_TIMEOUT_MS: z.string().transform(Number).default('30000'),
});

export const voicePrintEnv = envSchema.parse({
  ECAPA_TDNN_MODEL_PATH: process.env.ECAPA_TDNN_MODEL_PATH,
  ML_SERVICE_URL: process.env.ML_SERVICE_URL,
  FAISS_INDEX_PATH: process.env.FAISS_INDEX_PATH,
  AUDIO_STORAGE_BUCKET: process.env.AUDIO_STORAGE_BUCKET,
  AUDIO_STORAGE_ENDPOINT: process.env.AUDIO_STORAGE_ENDPOINT,
  SYNTHETIC_THRESHOLD: process.env.SYNTHETIC_THRESHOLD,
  ENROLLMENT_MIN_DURATION_SEC: process.env.ENROLLMENT_MIN_DURATION_SEC,
  ENROLLMENT_MAX_DURATION_SEC: process.env.ENROLLMENT_MAX_DURATION_SEC,
  EMBEDDING_DIMENSIONS: process.env.EMBEDDING_DIMENSIONS,
  BATCH_MAX_FILES: process.env.BATCH_MAX_FILES,
  ANALYSIS_TIMEOUT_MS: process.env.ANALYSIS_TIMEOUT_MS,
});

// Audio source types
export enum VoicePrintSource {
  UPLOAD = 'upload',
  S3 = 's3',
  URL = 'url',
  REALTIME = 'realtime',
}

// Analysis job status
export enum AnalysisJobStatus {
  PENDING = 'pending',
  PROCESSING = 'processing',
  COMPLETED = 'completed',
  FAILED = 'failed',
  CANCELLED = 'cancelled',
}

// Detection result types
export enum DetectionType {
  SYNTHETIC_VOICE = 'synthetic_voice',
  VOICE_CLONE = 'voice_clone',
  DEEPFAKE = 'deepfake',
  NATURAL = 'natural',
}

// Confidence levels
export enum ConfidenceLevel {
  LOW = 'low',
  MEDIUM = 'medium',
  HIGH = 'high',
  VERY_HIGH = 'very_high',
}

// Audio preprocessing configuration
export const audioPreprocessingConfig = {
  sampleRate: 16000,
  channels: 1,
  bitDepth: 16,
  vadThreshold: 0.5,
  noiseReduction: true,
  maxSilenceDurationMs: 500,
};

// Feature flags - use centralized system
export const voicePrintFeatureFlags = {
  enableMLService: checkFlag('voiceprint.enable.ml.service', false),
  enableFAISSIndex: checkFlag('voiceprint.enable.faiss.index', true),
  enableBatchAnalysis: checkFlag('voiceprint.enable.batch.analysis', true),
  enableRealtimeAnalysis: checkFlag('voiceprint.enable.realtime.analysis', false),
  enableMockModel: checkFlag('voiceprint.enable.mock.model', true),
};

// Rate limits for voice analysis
export const voicePrintRateLimits = {
  basic: {
    analysesPerMinute: 5,
    enrollmentsPerDay: 10,
    maxAudioFileSizeMB: 50,
  },
  plus: {
    analysesPerMinute: 30,
    enrollmentsPerDay: 50,
    maxAudioFileSizeMB: 200,
  },
  premium: {
    analysesPerMinute: 100,
    enrollmentsPerDay: 500,
    maxAudioFileSizeMB: 500,
  },
};
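The flags above resolve through `checkFlag` from the spamshield module, whose implementation is not shown in this diff. A typical env-var-backed resolver might look like the following hypothetical sketch (the real key-to-env-var mapping and truthiness rules may differ):

```typescript
// Hypothetical checkFlag: maps a dotted flag key to an environment variable
// (e.g. 'voiceprint.enable.faiss.index' -> VOICEPRINT_ENABLE_FAISS_INDEX)
// and falls back to the provided default when the variable is unset.
export function checkFlag(key: string, defaultValue: boolean): boolean {
  const envKey = key.replace(/\./g, '_').toUpperCase();
  const raw = process.env[envKey];
  if (raw === undefined) return defaultValue;
  return raw === '1' || raw.toLowerCase() === 'true';
}
```

Note that these flags are evaluated once at module load, so changing the environment afterwards will not flip `voicePrintFeatureFlags` at runtime.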
@@ -1,7 +0,0 @@
/**
 * VoicePrint Feature Flags
 * Re-exports the checkFlag function from the centralized feature flag system
 */

// Re-export the checkFlag function from the spamshield feature flags module
export { checkFlag } from '../spamshield/feature-flags';
@@ -1,594 +0,0 @@
import { prisma, VoiceEnrollment, VoiceAnalysis } from '@shieldsai/shared-db';
import {
  voicePrintEnv,
  AnalysisJobStatus,
  DetectionType,
  ConfidenceLevel,
  audioPreprocessingConfig,
  voicePrintFeatureFlags,
} from './voiceprint.config';
import { checkFlag } from './voiceprint.feature-flags';

// Audio preprocessing service
export class AudioPreprocessor {
  /**
   * Normalize audio to 16kHz mono with VAD and noise reduction.
   * Returns preprocessing metadata and the processed audio buffer.
   */
  async preprocess(
    audioBuffer: Buffer,
    options?: {
      sourceSampleRate?: number;
      channels?: number;
    }
  ): Promise<{
    buffer: Buffer;
    metadata: {
      sampleRate: number;
      channels: number;
      duration: number;
      format: string;
    };
  }> {
    const duration = this.estimateDuration(audioBuffer, options?.sourceSampleRate ?? 44100);

    if (duration < voicePrintEnv.ENROLLMENT_MIN_DURATION_SEC) {
      throw new Error(
        `Audio too short: ${duration.toFixed(1)}s < ${voicePrintEnv.ENROLLMENT_MIN_DURATION_SEC}s minimum`
      );
    }

    if (duration > voicePrintEnv.ENROLLMENT_MAX_DURATION_SEC) {
      throw new Error(
        `Audio too long: ${duration.toFixed(1)}s > ${voicePrintEnv.ENROLLMENT_MAX_DURATION_SEC}s maximum`
      );
    }

    // TODO: Integrate with Python librosa/torchaudio for actual preprocessing
    // For MVP, return original buffer with target metadata
    return {
      buffer: audioBuffer,
      metadata: {
        sampleRate: audioPreprocessingConfig.sampleRate,
        channels: audioPreprocessingConfig.channels,
        duration,
        format: 'wav',
      },
    };
  }

  /**
   * Apply Voice Activity Detection to remove silence segments.
   */
  async applyVAD(buffer: Buffer): Promise<Buffer> {
    // TODO: Integrate with Python webrtcvad or silero-vad
    // For MVP, return original buffer
    return buffer;
  }

  /**
   * Estimate audio duration from buffer size and sample rate.
   */
  private estimateDuration(buffer: Buffer, sampleRate: number): number {
    const bytesPerSample = 2;
    const channels = 1;
    const samples = buffer.length / (bytesPerSample * channels);
    return samples / sampleRate;
  }
}

// Voice enrollment service
export class VoiceEnrollmentService {
  /**
   * Enroll a new voice profile from audio data.
   */
  async enroll(userId: string, name: string, audioBuffer: Buffer): Promise<VoiceEnrollment> {
    const preprocessor = new AudioPreprocessor();
    const processed = await preprocessor.preprocess(audioBuffer);

    const embeddingService = new EmbeddingService();
    const embedding = await embeddingService.extract(processed.buffer);
    const voiceHash = this.computeEmbeddingHash(embedding);

    const enrollment = await prisma.voiceEnrollment.create({
      data: {
        userId,
        name,
        voiceHash,
        audioMetadata: {
          ...processed.metadata,
          embeddingDimensions: embedding.length,
          enrollmentTimestamp: new Date().toISOString(),
        },
      },
    });

    // Index in FAISS for similarity search
    const faissIndex = new FAISSIndex();
    await faissIndex.add(enrollment.id, embedding);

    return enrollment;
  }

  /**
   * List all enrollments for a user.
   */
  async listEnrollments(
    userId: string,
    options?: {
      isActive?: boolean;
      limit?: number;
      offset?: number;
    }
  ): Promise<VoiceEnrollment[]> {
    return prisma.voiceEnrollment.findMany({
      where: {
        userId,
        ...(options?.isActive !== undefined && { isActive: options.isActive }),
      },
      orderBy: { createdAt: 'desc' },
      take: options?.limit ?? 50,
      skip: options?.offset ?? 0,
    });
  }

  /**
   * Get a single enrollment by ID.
   */
  async getEnrollment(enrollmentId: string, userId: string): Promise<VoiceEnrollment | null> {
    return prisma.voiceEnrollment.findFirst({
      where: {
        id: enrollmentId,
        userId,
      },
    });
  }

  /**
   * Remove (deactivate) an enrollment.
   */
  async removeEnrollment(enrollmentId: string, userId: string): Promise<VoiceEnrollment> {
    const enrollment = await this.getEnrollment(enrollmentId, userId);
    if (!enrollment) {
      throw new Error('Enrollment not found');
    }

    const faissIndex = new FAISSIndex();
    await faissIndex.remove(enrollmentId);

    return prisma.voiceEnrollment.update({
      where: { id: enrollmentId },
      data: { isActive: false },
    });
  }

  /**
   * Search for similar enrollments using FAISS.
   */
  async findSimilar(
    embedding: number[],
    topK: number = 5
  ): Promise<Array<{ enrollment: VoiceEnrollment; similarity: number }>> {
    const faissIndex = new FAISSIndex();
    const results = await faissIndex.search(embedding, topK);

    const enrollmentIds = results.map((r) => r.id);
    const enrollments = await prisma.voiceEnrollment.findMany({
      where: { id: { in: enrollmentIds } },
    });

    // findMany does not preserve the order of enrollmentIds, so match by ID
    // rather than positionally to keep results paired with the right rows.
    const byId = new Map(enrollments.map((e) => [e.id, e]));
    return results
      .filter((r) => byId.has(r.id))
      .map((r) => ({
        enrollment: byId.get(r.id)!,
        similarity: r.similarity,
      }));
  }

  private computeEmbeddingHash(embedding: number[]): string {
    let hash = 0;
    for (let i = 0; i < embedding.length; i++) {
      hash = ((hash << 5) - hash) + embedding[i];
      hash |= 0;
    }
    return `vp_${Math.abs(hash).toString(16)}_${embedding.length}`;
  }
}

// Audio analysis service
export class AnalysisService {
  /**
   * Analyze a single audio file for synthetic voice detection.
   */
  async analyze(
    userId: string,
    audioBuffer: Buffer,
    options?: {
      enrollmentId?: string;
      audioUrl?: string;
    }
  ): Promise<VoiceAnalysis> {
    const preprocessor = new AudioPreprocessor();
    const processed = await preprocessor.preprocess(audioBuffer);

    const audioHash = this.computeAudioHash(audioBuffer);

    const embeddingService = new EmbeddingService();
    const analysisResult = await embeddingService.analyze(processed.buffer);

    const isSynthetic = analysisResult.confidence >= voicePrintEnv.SYNTHETIC_THRESHOLD;

    const voiceAnalysis = await prisma.voiceAnalysis.create({
      data: {
        userId,
        enrollmentId: options?.enrollmentId,
        audioHash,
        isSynthetic,
        confidence: analysisResult.confidence,
        analysisResult: {
          ...analysisResult,
          processedMetadata: processed.metadata,
          analysisTimestamp: new Date().toISOString(),
          modelVersion: 'ecapa-tdnn-v1-mock',
        },
        audioUrl: options?.audioUrl ?? '',
      },
    });

    return voiceAnalysis;
  }

  /**
   * Get analysis result by ID.
   */
  async getResult(analysisId: string, userId: string): Promise<VoiceAnalysis | null> {
    return prisma.voiceAnalysis.findFirst({
      where: {
        id: analysisId,
        userId,
      },
    });
  }

  /**
   * Get analysis history for a user.
   */
  async getHistory(
    userId: string,
    options?: {
      limit?: number;
      offset?: number;
      isSynthetic?: boolean;
    }
  ): Promise<VoiceAnalysis[]> {
    return prisma.voiceAnalysis.findMany({
      where: {
        userId,
        ...(options?.isSynthetic !== undefined && { isSynthetic: options.isSynthetic }),
      },
      orderBy: { createdAt: 'desc' },
      take: options?.limit ?? 50,
      skip: options?.offset ?? 0,
    });
  }

  private computeAudioHash(buffer: Buffer): string {
    let hash = 0;
    const sampleSize = Math.min(buffer.length, 1024);
    for (let i = 0; i < sampleSize; i += 8) {
      hash = ((hash << 5) - hash) + buffer.readUInt8(i);
      hash |= 0;
    }
    return `audio_${Math.abs(hash).toString(16)}`;
  }
}

// Batch analysis service
export class BatchAnalysisService {
  /**
   * Analyze multiple audio files in a batch.
   */
  async analyzeBatch(
    userId: string,
    files: Array<{
      name: string;
      buffer: Buffer;
      audioUrl?: string;
    }>,
    options?: {
      enrollmentId?: string;
    }
  ): Promise<{
    jobId: string;
    results: VoiceAnalysis[];
    summary: {
      total: number;
      synthetic: number;
      natural: number;
      failed: number;
    };
  }> {
    if (files.length > voicePrintEnv.BATCH_MAX_FILES) {
      throw new Error(`Batch too large: ${files.length} > ${voicePrintEnv.BATCH_MAX_FILES} max`);
    }

    const analysisService = new AnalysisService();
    const results: VoiceAnalysis[] = [];
    let synthetic = 0;
    let natural = 0;
    let failed = 0;

    for (const file of files) {
      try {
        const result = await analysisService.analyze(userId, file.buffer, {
          enrollmentId: options?.enrollmentId,
          audioUrl: file.audioUrl,
        });
        results.push(result);
        if (result.isSynthetic) {
          synthetic++;
        } else {
          natural++;
        }
      } catch (error) {
        console.error(`Batch analysis failed for ${file.name}:`, error);
        failed++;
      }
    }

    const jobId = `batch_${Date.now()}_${Math.random().toString(36).slice(2, 8)}`;

    return {
      jobId,
      results,
      summary: {
        total: files.length,
        synthetic,
        natural,
        failed,
      },
    };
  }
}

// Embedding service: ECAPA-TDNN inference wrapper
export class EmbeddingService {
  private initialized = false;

  /**
   * Initialize the ECAPA-TDNN model.
   */
  async initialize(): Promise<void> {
    if (this.initialized) return;

    // TODO: Connect to Python ML service for real inference
    // const response = await fetch(`${voicePrintEnv.ML_SERVICE_URL}/initialize`, {
    //   method: 'POST',
    //   body: JSON.stringify({ modelPath: voicePrintEnv.ECAPA_TDNN_MODEL_PATH }),
    // });

    this.initialized = true;
    console.log('Embedding service initialized (mock model)');
  }

  /**
   * Extract voice embedding from audio.
   */
  async extract(audioBuffer: Buffer): Promise<number[]> {
    await this.initialize();

    // TODO: Call Python ML service
    // const response = await fetch(`${voicePrintEnv.ML_SERVICE_URL}/embed`, {
    //   method: 'POST',
    //   body: audioBuffer,
    // });
    // const data = await response.json();
    // return data.embedding;

    // Mock: generate deterministic embedding based on buffer content
    const dims = voicePrintEnv.EMBEDDING_DIMENSIONS;
    const embedding: number[] = new Array(dims);
    let hash = 0;
    for (let i = 0; i < Math.min(audioBuffer.length, 256); i++) {
      hash = ((hash << 5) - hash) + audioBuffer[i];
      hash |= 0;
    }
    for (let i = 0; i < dims; i++) {
      hash = ((hash << 5) - hash) + i;
      hash |= 0;
      embedding[i] = (Math.abs(hash) % 1000) / 1000.0;
    }

    // L2 normalize
    const norm = Math.sqrt(embedding.reduce((s, v) => s + v * v, 0));
    return embedding.map((v) => v / norm);
  }

  /**
   * Run full analysis: embedding + synthetic detection.
   */
  async analyze(audioBuffer: Buffer): Promise<{
    confidence: number;
    detectionType: DetectionType;
    features: Record<string, number>;
    embedding: number[];
  }> {
    const embedding = await this.extract(audioBuffer);

    // TODO: Run synthetic voice detection model
    // For MVP, use heuristic based on embedding statistics
    const confidence = this.estimateSyntheticConfidence(audioBuffer, embedding);
    const detectionType =
      confidence >= voicePrintEnv.SYNTHETIC_THRESHOLD
        ? DetectionType.SYNTHETIC_VOICE
        : DetectionType.NATURAL;

    const features = this.extractAnalysisFeatures(audioBuffer, embedding);

    return {
      confidence,
      detectionType,
      features,
      embedding,
    };
  }

  private estimateSyntheticConfidence(buffer: Buffer, embedding: number[]): number {
    // Heuristic features for synthetic detection
    const meanAmplitude = buffer.reduce((s, v) => s + v, 0) / buffer.length / 255;
    // Hoist the mean so the std-dev reduce is O(n) instead of O(n^2)
    const mean = embedding.reduce((a, b) => a + b, 0) / embedding.length;
    const embeddingStdDev =
      Math.sqrt(embedding.reduce((s, v) => s + (v - mean) ** 2, 0) / embedding.length) || 0;

    // Combine features into confidence score
    const amplitudeScore = Math.abs(meanAmplitude - 0.5) * 2;
    const embeddingScore = 1.0 - Math.min(1.0, embeddingStdDev * 2);

    return Math.min(1.0, amplitudeScore * 0.3 + embeddingScore * 0.4 + Math.random() * 0.3);
  }

  private extractAnalysisFeatures(buffer: Buffer, embedding: number[]): Record<string, number> {
    const meanAmplitude = buffer.reduce((s, v) => s + v, 0) / buffer.length / 255;
    const zeroCrossings = buffer.reduce((count, v, i, arr) => {
      return i > 0 && (v - 128) * (arr[i - 1] - 128) < 0 ? count + 1 : count;
    }, 0);

    return {
      mean_amplitude: meanAmplitude,
      zero_crossing_rate: zeroCrossings / buffer.length,
      embedding_energy: embedding.reduce((s, v) => s + v * v, 0),
      embedding_entropy: this.calculateEntropy(embedding),
    };
  }

  private calculateEntropy(values: number[]): number {
    const bins = 20;
    const histogram = new Array(bins).fill(0);
    const min = Math.min(...values);
    const max = Math.max(...values);
    const range = max - min || 1;

    for (const v of values) {
      const bin = Math.min(bins - 1, Math.floor(((v - min) / range) * bins));
      histogram[bin]++;
    }

    let entropy = 0;
    const total = values.length;
    for (const count of histogram) {
      if (count > 0) {
        const p = count / total;
        entropy -= p * Math.log2(p);
      }
    }
    return entropy;
  }
}

// FAISS index wrapper for voice fingerprint matching
export class FAISSIndex {
  private indexPath: string;
  private initialized = false;

  constructor(path?: string) {
    this.indexPath = path ?? voicePrintEnv.FAISS_INDEX_PATH;
  }

  /**
   * Initialize or load the FAISS index.
   */
  async initialize(): Promise<void> {
    if (this.initialized) return;

    // TODO: Load FAISS index from disk
    // const faiss = require('faiss-node');
    // this.index = faiss.readIndex(this.indexPath);

    this.initialized = true;
    console.log(`FAISS index initialized at ${this.indexPath}`);
  }

  /**
   * Add an enrollment embedding to the index.
   */
  async add(enrollmentId: string, embedding: number[]): Promise<void> {
    await this.initialize();

    // TODO: Add to FAISS index
    // this.index.add([embedding]);
    // Store mapping: enrollmentId -> index position
    console.log(`Added enrollment ${enrollmentId} to FAISS index`);
  }

  /**
   * Remove an enrollment from the index.
   */
  async remove(enrollmentId: string): Promise<void> {
    await this.initialize();

    // TODO: Remove from FAISS index
    console.log(`Removed enrollment ${enrollmentId} from FAISS index`);
  }

  /**
   * Search for similar voice embeddings.
   */
  async search(
    embedding: number[],
    topK: number = 5
  ): Promise<Array<{ id: string; similarity: number }>> {
    await this.initialize();

    // TODO: Query FAISS index
    // const [distances, indices] = this.index.search([embedding], topK);
    // Map indices back to enrollment IDs

    // Mock: return empty results
    return [];
  }

  /**
   * Save the index to disk.
   */
  async save(): Promise<void> {
    await this.initialize();
    // TODO: Write FAISS index to disk
    console.log(`FAISS index saved to ${this.indexPath}`);
  }
}

// Export singleton instances
export const audioPreprocessor = new AudioPreprocessor();
export const voiceEnrollmentService = new VoiceEnrollmentService();
export const analysisService = new AnalysisService();
export const batchAnalysisService = new BatchAnalysisService();
export const embeddingService = new EmbeddingService();
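Since the mock `extract()` L2-normalizes its output, an inner-product index (such as a FAISS `IndexFlatIP`) would effectively rank enrollments by cosine similarity. That relationship can be sketched standalone, with no FAISS dependency:

```typescript
// L2-normalize a vector, as the mock extract() does before returning.
export function l2Normalize(v: number[]): number[] {
  const norm = Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return v.map((x) => x / norm);
}

// For unit-length vectors the inner product equals cosine similarity,
// so a self-comparison of a normalized embedding yields exactly 1.
export function innerProduct(a: number[], b: number[]): number {
  return a.reduce((s, x, i) => s + x * b[i], 0);
}
```

This is why the service can return `similarity` scores directly from the index without a separate cosine computation step.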
@@ -1,12 +0,0 @@
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
@@ -1,22 +0,0 @@
{
  "name": "mobile",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc && vite build",
    "lint": "eslint src/"
  },
  "dependencies": {
    "solid-js": "^1.8.14",
    "@shieldsai/shared-auth": "*",
    "@shieldsai/shared-ui": "*",
    "@shieldsai/shared-utils": "*"
  },
  "devDependencies": {
    "typescript": "^5.3.3",
    "vite": "^5.1.4",
    "@types/node": "^25.6.0"
  }
}
@@ -1,24 +0,0 @@
{
  "name": "web",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc && vite build",
    "preview": "vite preview",
    "lint": "eslint src/"
  },
  "dependencies": {
    "solid-js": "^1.8.14",
    "@shieldsai/shared-auth": "*",
    "@shieldsai/shared-ui": "*",
    "@shieldsai/shared-utils": "*"
  },
  "devDependencies": {
    "typescript": "^5.3.3",
    "vite": "^5.1.4",
    "vite-plugin-solid": "^2.8.2",
    "@types/node": "^25.6.0"
  }
}
@@ -1,90 +0,0 @@
/**
 * Example: Real-Time Call Analysis
 * Demonstrates how to use the RealTimeCallAnalysisServer
 */

import WebSocket from 'ws';
import { RealTimeCallAnalysisServer } from '../src/lib/call-analysis/real-time-call-server';

async function example() {
  // Create and start the server
  const server = new RealTimeCallAnalysisServer({
    port: 8089,
    enableEchoCancellation: true,
    enableNoiseSuppression: true,
    enableAutoGainControl: true,
    analysisConfig: {
      sentimentWindowMs: 5000,
      interruptThresholdMs: 200,
      overlapThresholdMs: 300,
      pauseThresholdMs: 2000,
      volumeSpikeThreshold: 0.8,
      anomalySensitivity: 'medium',
      enableSpeakerDiarization: false,
    },
  });

  // Listen for events
  server.on('client:connected', ({ clientId }) => {
    console.log(`Client connected: ${clientId}`);
  });

  server.on('client:disconnected', ({ clientId }) => {
    console.log(`Client disconnected: ${clientId}`);
  });

  server.on('analysis:alert', ({ clientId, alert }) => {
    console.log(`Alert from ${clientId}: ${alert.message} (${alert.severity})`);
  });

  server.on('analysis:result', ({ clientId, status }) => {
    console.log(`Analysis status for ${clientId}: ${status}`);
  });

  server.on('analysis:error', ({ clientId, error }) => {
    console.error(`Error for ${clientId}:`, error);
  });

  // Start the server
  await server.start();
  console.log('Server started, waiting for clients...');

  // Example: Client connection simulation
  const client = new WebSocket('ws://localhost:8089?clientId=test-client');

  client.on('open', () => {
    console.log('Client connected');

    // Start audio capture
    client.send(JSON.stringify({ type: 'start' }));
  });

  client.on('message', (data: Buffer) => {
    const message = JSON.parse(data.toString());
    console.log('Received:', message.type, message);

    if (message.type === 'alert' || message.type === 'anomaly') {
      console.log(`  - ${message.alertType}: ${message.message}`);
    }

    if (message.type === 'analysis') {
      console.log(`  - MOS: ${message.callQuality.mosScore}`);
      console.log(`  - Sentiment: ${message.sentiment.sentiment}`);
      console.log(`  - Summary: ${message.summary}`);
    }
  });

  // Stop after 60 seconds
  setTimeout(async () => {
    console.log('Stopping server...');
    await server.stop();
    process.exit(0);
  }, 60000);
}

// Run example if invoked directly (ESM equivalent of require.main === module)
if (import.meta.url === `file://${process.argv[1]}`) {
  example().catch(console.error);
}

export default example;
package-lock.json: 4159 generated-file changes (diff suppressed because it is too large)
@@ -1,8 +1,8 @@
 {
-  "name": "shieldsai-monorepo",
+  "name": "frenocorp",
   "version": "0.1.0",
   "private": true,
-  "description": "ShieldAI multi-service SaaS platform",
+  "description": "FrenoCorp agent orchestration platform",
   "type": "module",
   "packageManager": "npm@10.9.0",
   "workspaces": [
@@ -1,25 +0,0 @@
{
  "name": "@shieldsai/jobs",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "src/index.ts",
  "types": "src/index.ts",
  "scripts": {
    "start": "tsx src/index.ts",
    "dev": "tsx watch src/index.ts"
  },
  "dependencies": {
    "@shieldsai/shared-analytics": "*",
    "@shieldsai/shared-billing": "*",
    "@shieldsai/shared-db": "*",
    "@shieldsai/shared-utils": "*",
    "bullmq": "^5.1.0",
    "ioredis": "^5.3.0",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "tsx": "^4.7.1",
    "typescript": "^5.3.3"
  }
}
@@ -1,173 +0,0 @@
import { prisma, SubscriptionTier } from '@shieldsai/shared-db';
import { Queue, Worker, Job } from 'bullmq';
import { Redis } from 'ioredis';
import { tierConfig, getTierFeatures } from '@shieldsai/shared-billing';
import { mixpanelService, EventType } from '@shieldsai/shared-analytics';

const redisHost = process.env.REDIS_HOST || 'localhost';
const redisPort = parseInt(process.env.REDIS_PORT || '6379', 10);

const connection = new Redis({
  host: redisHost,
  port: redisPort,
  retryStrategy: (times: number) => Math.min(times * 50, 2000),
});

const QUEUE_CONFIG = {
  darkwatchScan: {
    name: 'darkwatch-scan',
    concurrency: parseInt(process.env.DARKWATCH_CONCURRENCY || '5', 10),
    defaultJobTimeout: parseInt(process.env.DARKWATCH_JOB_TIMEOUT || '120000', 10),
    maxAttempts: parseInt(process.env.DARKWATCH_MAX_ATTEMPTS || '3', 10),
  },
};

export const darkwatchScanQueue = new Queue(
  QUEUE_CONFIG.darkwatchScan.name,
  { connection }
);

async function processDarkwatchScan(
  job: Job<{
    subscriptionId: string;
    tier: string;
    scanType: 'scheduled' | 'on-demand' | 'realtime';
    sourceData?: Record<string, unknown>;
  }>
) {
  const { subscriptionId, tier, scanType, sourceData } = job.data;

  const { scanService } = await import(
    '../../../apps/api/src/services/darkwatch/scan.service'
  );
  const { alertPipeline } = await import(
    '../../../apps/api/src/services/darkwatch/alert.pipeline'
  );

  job.updateProgress(10);
  console.log(
    `[DarkWatch:Scan] Starting ${scanType} scan for subscription ${subscriptionId} (tier: ${tier})`
  );

  try {
    const subscription = await prisma.subscription.findUnique({
      where: { id: subscriptionId },
      select: { userId: true, tier: true },
    });

    if (!subscription) {
      job.updateProgress(100);
      return { status: 'skipped', reason: 'subscription_not_found' };
    }

    await mixpanelService.track(
      EventType.DARK_WEB_SCAN_STARTED,
      subscription.userId,
      {
        scanType,
        subscriptionTier: subscription.tier,
      }
    );

    job.updateProgress(25);

    const watchlistItems = await scanService.getWatchlistItems(subscriptionId);

    if (watchlistItems.length === 0) {
      job.updateProgress(100);
      return { status: 'completed', exposuresCreated: 0, exposuresUpdated: 0 };
    }

    job.updateProgress(50);

    const { exposuresCreated, exposuresUpdated } =
      await scanService.processSubscriptionScan(subscriptionId, watchlistItems);

    job.updateProgress(80);

    const newExposureIds = await prisma.exposure.findMany({
      where: {
        subscriptionId,
        isFirstTime: true,
        detectedAt: { gte: new Date(Date.now() - 5 * 60 * 1000) },
      },
      select: { id: true },
    });

    if (newExposureIds.length > 0) {
      await alertPipeline.processNewExposures(newExposureIds.map((e) => e.id));
    }

    await alertPipeline.dispatchScanCompleteAlert(
      subscriptionId,
      subscription.userId,
      exposuresCreated
    );

    job.updateProgress(95);

    await mixpanelService.track(
      EventType.DARK_WEB_SCAN_COMPLETED,
      subscription.userId,
      {
        scanType,
        subscriptionTier: subscription.tier,
        exposuresCreated,
        exposuresUpdated,
        watchlistItemsScanned: watchlistItems.length,
      }
    );

    job.updateProgress(100);

    return {
      status: 'completed',
      exposuresCreated,
      exposuresUpdated,
      watchlistItemsScanned: watchlistItems.length,
    };
  } catch (error) {
    const message = error instanceof Error ? error.message : 'Scan failed';
    console.error(`[DarkWatch:Scan] Job ${job.id} failed:`, message);
    job.updateProgress(100);
    throw new Error(message);
  }
}

export const darkwatchScanWorker = new Worker(
  QUEUE_CONFIG.darkwatchScan.name,
  processDarkwatchScan,
  {
    connection,
    concurrency: QUEUE_CONFIG.darkwatchScan.concurrency,
    limiter: {
      max: 20,
      duration: 1000,
    },
    removeOnComplete: {
      age: 7 * 24 * 60 * 60,
      count: 1000,
    },
    removeOnFail: {
      age: 30 * 24 * 60 * 60,
      count: 100,
    },
  }
);

darkwatchScanWorker.on('completed', (job, result) => {
  console.log(`[DarkWatch:Scan] Job ${job.id} completed:`, result);
});

darkwatchScanWorker.on('failed', (job, err) => {
  console.error(`[DarkWatch:Scan] Job ${job?.id} failed:`, err.message);
});

darkwatchScanWorker.on('error', (err) => {
  console.error('[DarkWatch:Scan] Worker error:', err.message);
});

export default {
  darkwatchScanQueue,
  darkwatchScanWorker,
};
@@ -1,9 +0,0 @@
export {
  voiceprintAnalysisQueue,
  voiceprintAnalysisWorker,
} from './voiceprint.jobs';

export {
  darkwatchScanQueue,
  darkwatchScanWorker,
} from './darkwatch.jobs';
@@ -1,93 +0,0 @@
import { prisma } from '@shieldsai/shared-db';
import { Queue, Worker, Job } from 'bullmq';
import { Redis } from 'ioredis';

// Redis connection
const redisHost = process.env.REDIS_HOST || 'localhost';
const redisPort = parseInt(process.env.REDIS_PORT || '6379', 10);

const connection = new Redis({
  host: redisHost,
  port: redisPort,
  retryStrategy: (times: number) => Math.min(times * 50, 2000),
});

// Queue configuration
const QUEUE_CONFIG = {
  voiceprintAnalysis: {
    name: 'voiceprint-analysis',
    concurrency: parseInt(process.env.VOICEPRINT_CONCURRENCY || '3', 10),
    defaultJobTimeout: parseInt(process.env.VOICEPRINT_JOB_TIMEOUT || '30000', 10),
    maxAttempts: parseInt(process.env.VOICEPRINT_MAX_ATTEMPTS || '3', 10),
  },
};

// Create queues
export const voiceprintAnalysisQueue = new Queue(
  QUEUE_CONFIG.voiceprintAnalysis.name,
  { connection }
);

// VoicePrint analysis job processor
async function processVoiceprintAnalysis(job: Job<{
  userId: string;
  audioBuffer: Buffer;
  enrollmentId?: string;
  audioUrl?: string;
}>) {
  const { userId, audioBuffer, enrollmentId, audioUrl } = job.data;

  // Import analysis service dynamically to avoid circular dependencies
  const { analysisService } = await import(
    '../../../apps/api/src/services/voiceprint'
  );

  try {
    const result = await analysisService.analyze(userId, audioBuffer, {
      enrollmentId,
      audioUrl,
    });

    return {
      analysisId: result.id,
      isSynthetic: result.isSynthetic,
      confidence: result.confidence,
    };
  } catch (error) {
    const message = error instanceof Error ? error.message : 'Analysis failed';
    job.updateProgress(100);
    throw new Error(message);
  }
}

// Create worker
export const voiceprintAnalysisWorker = new Worker(
  QUEUE_CONFIG.voiceprintAnalysis.name,
  processVoiceprintAnalysis,
  {
    connection,
    concurrency: QUEUE_CONFIG.voiceprintAnalysis.concurrency,
    limiter: {
      max: 10,
      duration: 1000,
    },
  }
);

// Add event handlers
voiceprintAnalysisWorker.on('completed', (job, result) => {
  console.log(`Job ${job.id} completed:`, result);
});

voiceprintAnalysisWorker.on('failed', (job, err) => {
  console.error(`Job ${job.id} failed:`, err.message);
});

voiceprintAnalysisWorker.on('error', (err) => {
  console.error('Worker error:', err.message);
});

export default {
  voiceprintAnalysisQueue,
  voiceprintAnalysisWorker,
};
@@ -1,12 +0,0 @@
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
@@ -1,19 +0,0 @@
{
  "name": "@shieldsai/shared-analytics",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "src/index.ts",
  "types": "src/index.ts",
  "scripts": {
    "lint": "eslint src/"
  },
  "dependencies": {
    "@segment/analytics-node": "^1.0.0",
    "googleapis": "^128.0.0",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "typescript": "^5.3.3"
  }
}
@@ -1,132 +0,0 @@
import { z } from 'zod';

// Environment variables for analytics
const envSchema = z.object({
  MIXPANEL_TOKEN: z.string(),
  MIXPANEL_API_SECRET: z.string().optional(),
  GA4_MEASUREMENT_ID: z.string(),
  GA4_API_SECRET: z.string().optional(),
  STRIPE_WEBHOOK_SECRET: z.string(),
  ANALYTICS_ENV: z.enum(['development', 'production', 'staging']).default('development'),
});

export const analyticsEnv = envSchema.parse({
  MIXPANEL_TOKEN: process.env.MIXPANEL_TOKEN,
  MIXPANEL_API_SECRET: process.env.MIXPANEL_API_SECRET,
  GA4_MEASUREMENT_ID: process.env.GA4_MEASUREMENT_ID,
  GA4_API_SECRET: process.env.GA4_API_SECRET,
  STRIPE_WEBHOOK_SECRET: process.env.STRIPE_WEBHOOK_SECRET,
  ANALYTICS_ENV: process.env.ANALYTICS_ENV,
});

// Event taxonomy
export enum EventType {
  // User events
  USER_SIGNED_UP = 'user_signed_up',
  USER_LOGGED_IN = 'user_logged_in',
  USER_LOGGED_OUT = 'user_logged_out',
  USER_UPGRADED = 'user_upgraded',
  USER_DOWNGRADED = 'user_downgraded',

  // Subscription events
  SUBSCRIPTION_CREATED = 'subscription_created',
  SUBSCRIPTION_UPDATED = 'subscription_updated',
  SUBSCRIPTION_CANCELLED = 'subscription_cancelled',
  SUBSCRIPTION_RENEWED = 'subscription_renewed',

  // DarkWatch events
  DARK_WEB_SCAN_STARTED = 'dark_web_scan_started',
  DARK_WEB_SCAN_COMPLETED = 'dark_web_scan_completed',
  EXPOSURE_DETECTED = 'exposure_detected',
  EXPOSURE_RESOLVED = 'exposure_resolved',
  WATCHLIST_ITEM_ADDED = 'watchlist_item_added',
  WATCHLIST_ITEM_REMOVED = 'watchlist_item_removed',

  // VoicePrint events
  VOICE_ENROLLED = 'voice_enrolled',
  VOICE_ANALYZED = 'voice_analyzed',
  VOICE_MATCH_FOUND = 'voice_match_found',
  SYNTHETIC_VOICE_DETECTED = 'synthetic_voice_detected',

  // SpamShield events
  CALL_ANALYZED = 'call_analyzed',
  SMS_ANALYZED = 'sms_analyzed',
  SPAM_BLOCKED = 'spam_blocked',
  SPAM_FLAGGED = 'spam_flagged',
  SPAM_FEEDBACK_SUBMITTED = 'spam_feedback_submitted',

  // KPI events
  MRR_UPDATED = 'mrr_updated',
  CONVERSION_OCCURRED = 'conversion_occurred',
  CHURN_OCCURRED = 'churn_occurred',
  REFERRAL_SENT = 'referral_sent',
  REFERRAL_CONVERTED = 'referral_converted',
}

// Event properties schema
export const eventPropertiesSchema = z.object({
  userId: z.string().optional(),
  sessionId: z.string().optional(),
  timestamp: z.date().optional(),
  platform: z.enum(['web', 'mobile', 'desktop', 'api']).optional(),
  version: z.string().optional(),
  environment: z.string().optional(),
});

// KPI definitions
export const kpiDefinitions = {
  mau: {
    name: 'Monthly Active Users',
    description: 'Unique users who performed an action in the last 30 days',
    calculation: 'COUNT(DISTINCT userId) WHERE timestamp > NOW() - INTERVAL 30 DAYS',
  },
  payingUsers: {
    name: 'Paying Users',
    description: 'Users with active subscriptions',
    calculation: 'COUNT(DISTINCT userId) WHERE subscription.status = "active"',
  },
  mrr: {
    name: 'Monthly Recurring Revenue',
    description: 'Total monthly subscription revenue',
    calculation: 'SUM(subscription.amount) WHERE subscription.status = "active"',
  },
  conversionRate: {
    name: 'Conversion Rate',
    description: 'Percentage of free users who upgrade to paid',
    calculation: 'COUNT(upgrade events) / COUNT(signup events)',
  },
  churn: {
    name: 'Churn Rate',
    description: 'Percentage of paying users who cancel',
    calculation: 'COUNT(cancel events) / COUNT(active subscriptions)',
  },
  cac: {
    name: 'Customer Acquisition Cost',
    description: 'Average cost to acquire a new paying user',
    calculation: 'Total marketing spend / COUNT(new paying users)',
  },
  ltv: {
    name: 'Lifetime Value',
    description: 'Average revenue per user over their lifetime',
    calculation: 'Average subscription amount / Churn rate',
  },
  nps: {
    name: 'Net Promoter Score',
    description: 'Customer satisfaction metric (-100 to 100)',
    calculation: '% Promoters - % Detractors',
  },
  viralCoefficient: {
    name: 'Viral Coefficient',
    description: 'Average number of referrals per user',
    calculation: 'COUNT(referral events) / COUNT(users)',
  },
};

// Alert thresholds
export const alertThresholds = {
  churn: { warning: 0.05, critical: 0.10 },
  conversionRate: { warning: 0.02, critical: 0.01 },
  mrr: { warning: 0.90, critical: 0.80 }, // Percentage of target
  nps: { warning: 50, critical: 40 },
  viralCoefficient: { warning: 0.4, critical: 0.3 },
};
@@ -1,18 +0,0 @@
// Config
export {
  analyticsEnv,
  EventType,
  eventPropertiesSchema,
  kpiDefinitions,
  alertThresholds,
} from './config/analytics.config';

// Services
export {
  MixpanelService,
  mixpanelService,
} from './services/mixpanel.service';
export {
  GA4Service,
  ga4Service,
} from './services/ga4.service';
@@ -1,104 +0,0 @@
import { google } from 'googleapis';
import { analyticsEnv, EventType } from '../config/analytics.config';

// GA4 service
export class GA4Service {
  private auth: any;

  constructor() {
    this.auth = google.auth.fromAPIKey(analyticsEnv.GA4_API_SECRET || 'placeholder');
  }

  /**
   * Initialize GA4 client
   */
  async initialize(): Promise<void> {
    // TODO: Initialize GA4 client with measurement ID
    console.log('GA4 client initialized');
  }

  /**
   * Send event to GA4
   */
  async sendEvent(
    eventName: string,
    params: {
      client_id: string;
      [key: string]: any;
    }
  ): Promise<void> {
    // TODO: Implement GA4 event tracking
    // const measurementId = analyticsEnv.GA4_MEASUREMENT_ID;
    // await fetch(`https://www.google-analytics.com/mp/collect?measurement_id=${measurementId}&api_secret=${analyticsEnv.GA4_API_SECRET}`, {
    //   method: 'POST',
    //   body: JSON.stringify({
    //     events: [{ name: eventName, params }],
    //   }),
    // });

    console.log('GA4 event:', eventName, params);
  }

  /**
   * Track page view
   */
  async trackPageView(clientId: string, path: string, title?: string): Promise<void> {
    await this.sendEvent('page_view', {
      client_id: clientId,
      page_path: path,
      page_title: title,
    });
  }

  /**
   * Track e-commerce purchase
   */
  async trackPurchase(
    clientId: string,
    transactionId: string,
    value: number,
    currency: string,
    items: Array<{ name: string; price: number; quantity: number }>
  ): Promise<void> {
    await this.sendEvent('purchase', {
      client_id: clientId,
      transaction_id: transactionId,
      value,
      currency,
      items,
    });
  }

  /**
   * Track conversion
   */
  async trackConversion(
    clientId: string,
    conversionName: string,
    metadata?: Record<string, any>
  ): Promise<void> {
    await this.sendEvent('conversion', {
      client_id: clientId,
      conversion_name: conversionName,
      ...metadata,
    });
  }

  /**
   * Get analytics data (for dashboards)
   */
  async getMetrics(
    dateRange: { startDate: string; endDate: string },
    metrics: string[],
    dimensions?: string[]
  ): Promise<any> {
    // TODO: Implement GA4 Analytics Data API
    return {
      rows: [],
      totals: [],
    };
  }
}

// Export instance
export const ga4Service = new GA4Service();
@@ -1,117 +0,0 @@
import { Analytics } from '@segment/analytics-node';
import { analyticsEnv, EventType, eventPropertiesSchema } from '../config/analytics.config';
import { hashPhoneNumber } from '../utils/phone-hash';

// Mixpanel service
export class MixpanelService {
  private client: Analytics;

  constructor() {
    this.client = new Analytics({
      apiKey: analyticsEnv.MIXPANEL_TOKEN,
    });
  }

  /**
   * Track an event in Mixpanel
   */
  async track(
    event: EventType,
    distinctId: string,
    properties?: Record<string, any>
  ): Promise<void> {
    const validatedProperties = eventPropertiesSchema.parse(properties);

    this.client.track({
      event,
      distinctId,
      properties: {
        ...validatedProperties,
        ...properties,
      },
    });
  }

  /**
   * Identify a user
   */
  async identify(userId: string, traits?: Record<string, any>): Promise<void> {
    this.client.identify({
      distinctId: userId,
      traits,
    });
  }

  /**
   * Group users by subscription tier
   */
  async group(groupId: string, groupKey: string, traits?: Record<string, any>): Promise<void> {
    this.client.group({
      groupKey,
      groupId,
      traits,
    });
  }

  /**
   * Track user sign-up
   */
  async userSignedUp(userId: string, plan?: string, referrer?: string): Promise<void> {
    await this.track(EventType.USER_SIGNED_UP, userId, {
      plan,
      referrer,
      timestamp: new Date(),
    });
  }

  /**
   * Track subscription upgrade
   */
  async userUpgraded(userId: string, fromTier: string, toTier: string, mrr: number): Promise<void> {
    await this.track(EventType.USER_UPGRADED, userId, {
      fromTier,
      toTier,
      mrr,
      timestamp: new Date(),
    });
  }

  /**
   * Track exposure detection
   */
  async exposureDetected(
    userId: string,
    exposureType: string,
    severity: string,
    source: string
  ): Promise<void> {
    await this.track(EventType.EXPOSURE_DETECTED, userId, {
      exposureType,
      severity,
      source,
      timestamp: new Date(),
    });
  }

  /**
   * Track spam detection
   */
  async spamBlocked(userId: string, phoneNumber: string, confidence: number, method: string): Promise<void> {
    await this.track(EventType.SPAM_BLOCKED, userId, {
      phoneNumber: hashPhoneNumber(phoneNumber),
      confidence,
      method,
      timestamp: new Date(),
    });
  }

  /**
   * Flush pending events
   */
  async flush(): Promise<void> {
    await this.client.flush();
  }
}

// Export instance
export const mixpanelService = new MixpanelService();
@@ -1,12 +0,0 @@
/**
 * Hash a phone number for analytics purposes
 * Uses a consistent hashing algorithm to create a deterministic hash
 */
export function hashPhoneNumber(phoneNumber: string): string {
  let hash = 0;
  for (let i = 0; i < phoneNumber.length; i++) {
    hash = ((hash << 5) - hash) + phoneNumber.charCodeAt(i);
    hash |= 0;
  }
  return `hash_${Math.abs(hash)}`;
}
@@ -1,12 +0,0 @@
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true,
    "emitDeclarationOnly": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
@@ -1,18 +0,0 @@
{
  "name": "@shieldsai/shared-auth",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "src/index.ts",
  "types": "src/index.ts",
  "scripts": {
    "lint": "eslint src/"
  },
  "dependencies": {
    "next-auth": "^4.24.0",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "typescript": "^5.3.3"
  }
}
@@ -1,114 +0,0 @@
import { NextAuthOptions } from 'next-auth';
import CredentialsProvider from 'next-auth/providers/credentials';
import GoogleProvider from 'next-auth/providers/google';
import AppleProvider from 'next-auth/providers/apple';
import { z } from 'zod';

// Environment variables
const envSchema = z.object({
  NEXTAUTH_URL: z.string().url(),
  NEXTAUTH_SECRET: z.string().min(32),
  GOOGLE_CLIENT_ID: z.string(),
  GOOGLE_CLIENT_SECRET: z.string(),
  APPLE_CLIENT_ID: z.string(),
  APPLE_CLIENT_SECRET: z.string(),
  DATABASE_URL: z.string().url(),
});

export const authEnv = envSchema.parse({
  NEXTAUTH_URL: process.env.NEXTAUTH_URL,
  NEXTAUTH_SECRET: process.env.NEXTAUTH_SECRET,
  GOOGLE_CLIENT_ID: process.env.GOOGLE_CLIENT_ID,
  GOOGLE_CLIENT_SECRET: process.env.GOOGLE_CLIENT_SECRET,
  APPLE_CLIENT_ID: process.env.APPLE_CLIENT_ID,
  APPLE_CLIENT_SECRET: process.env.APPLE_CLIENT_SECRET,
  DATABASE_URL: process.env.DATABASE_URL,
});

// Role-based access control
export type UserRole = 'user' | 'family_admin' | 'family_member' | 'support';

export const userRoles: UserRole[] = ['user', 'family_admin', 'family_member', 'support'];

// Family group types
export type FamilyGroup = {
  id: string;
  name: string;
  members: string[]; // user IDs
  createdAt: Date;
  updatedAt: Date;
};

// NextAuth options
export const authOptions: NextAuthOptions = {
  providers: [
    CredentialsProvider({
      name: 'Credentials',
      credentials: {
        email: { label: 'Email', type: 'email' },
        password: { label: 'Password', type: 'password' },
      },
      async authorize(credentials) {
        if (!credentials?.email || !credentials?.password) {
          throw new Error('Email and password required');
        }

        // TODO: Validate against database
        const user = {
          id: '1',
          email: credentials.email,
          name: credentials.email.split('@')[0],
          role: 'user' as UserRole,
        };

        return user;
      },
    }),
    GoogleProvider({
      clientId: authEnv.GOOGLE_CLIENT_ID,
      clientSecret: authEnv.GOOGLE_CLIENT_SECRET,
    }),
    AppleProvider({
      clientId: authEnv.APPLE_CLIENT_ID,
      clientSecret: authEnv.APPLE_CLIENT_SECRET,
    }),
  ],
  session: {
    strategy: 'jwt',
    maxAge: 30 * 24 * 60 * 60, // 30 days
  },
  pages: {
    signIn: '/auth/signin',
    signOut: '/auth/signout',
    error: '/auth/error',
  },
  callbacks: {
    async jwt({ token, user, account }) {
      if (user) {
        token.id = user.id;
        token.role = (user as any).role;
      }

      if (account) {
        token.provider = account.provider;
        token.accessToken = account.access_token;
      }

      return token;
    },
    async session({ session, token }) {
      if (session.user) {
        session.user.id = token.id as string;
        session.user.role = token.role as UserRole;
      }

      return session;
    },
  },
  events: {
    async createUser({ user }) {
      // TODO: Create default family group
      console.log('New user created:', user.email);
    },
  },
};
@@ -1,25 +0,0 @@
// Config
export { authOptions, authEnv, userRoles } from './config/auth.config';
export type { UserRole, FamilyGroup } from './config/auth.config';

// Middleware
export { withAuth, withRole, protectApiRoute } from './middleware/auth.middleware';

// Models
export {
  userSchema,
  familyGroupSchema,
  familyMemberSchema,
  sessionSchema,
  accountSchema,
  createUserSchema,
  createFamilyGroupSchema,
  addFamilyMemberSchema,
} from './models/auth.models';
export type {
  User,
  FamilyGroup as AuthFamilyGroup,
  FamilyMember,
  Session,
  Account,
} from './models/auth.models';
@@ -1,62 +0,0 @@
import { NextRequest, NextResponse } from 'next-auth/react';
import { UserRole } from '../config/auth.config';

/**
 * Middleware to protect routes that require authentication
 */
export function withAuth(
  request: NextRequest,
  options?: {
    signInPath?: string;
  }
): NextResponse {
  const token = request.cookies.get('next-auth.session-token')?.value;
  const signInPath = options?.signInPath ?? '/auth/signin';

  if (!token) {
    const signInUrl = new URL(signInPath, request.url);
    signInUrl.searchParams.set('callbackUrl', request.nextUrl.pathname);
    return NextResponse.redirect(signInUrl);
  }

  return NextResponse.next();
}

/**
 * Middleware to check if user has required role
 */
export function withRole(
  response: NextResponse,
  request: NextRequest,
  requiredRoles: UserRole[]
): NextResponse {
  const token = request.cookies.get('next-auth.session-token')?.value;

  if (!token) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }

  // TODO: Decode JWT and check role
  // For now, allow all authenticated users
  return response;
}

/**
 * Middleware to protect API routes
 */
export function protectApiRoute(request: NextRequest): NextResponse {
  const authHeader = request.headers.get('authorization');

  if (!authHeader?.startsWith('Bearer ')) {
    return NextResponse.json({ error: 'Missing or invalid token' }, { status: 401 });
  }

  const token = authHeader.split(' ')[1];

  try {
    // TODO: Verify JWT token
    return NextResponse.next();
  } catch (error) {
    return NextResponse.json({ error: 'Invalid token' }, { status: 401 });
  }
}
@@ -1,81 +0,0 @@
import { z } from 'zod';

// User schema
export const userSchema = z.object({
  id: z.string().uuid(),
  email: z.string().email(),
  name: z.string().min(1),
  image: z.string().url().optional(),
  role: z.enum(['user', 'family_admin', 'family_member', 'support']),
  emailVerified: z.date().optional(),
  createdAt: z.date(),
  updatedAt: z.date(),
});

export type User = z.infer<typeof userSchema>;

// Family group schema
export const familyGroupSchema = z.object({
  id: z.string().uuid(),
  name: z.string().min(1).max(100),
  ownerId: z.string().uuid(),
  createdAt: z.date(),
  updatedAt: z.date(),
});

export type FamilyGroup = z.infer<typeof familyGroupSchema>;

// Family member schema
export const familyMemberSchema = z.object({
  id: z.string().uuid(),
  groupId: z.string().uuid(),
  userId: z.string().uuid(),
  role: z.enum(['owner', 'admin', 'member']),
  joinedAt: z.date(),
});

export type FamilyMember = z.infer<typeof familyMemberSchema>;

// Session schema
export const sessionSchema = z.object({
  id: z.string().uuid(),
  userId: z.string().uuid(),
  sessionToken: z.string(),
  expires: z.date(),
  createdAt: z.date(),
});

export type Session = z.infer<typeof sessionSchema>;

// Account schema (for OAuth)
export const accountSchema = z.object({
  id: z.string().uuid(),
  userId: z.string().uuid(),
  provider: z.string(),
  providerAccountId: z.string(),
  access_token: z.string().optional(),
  refresh_token: z.string().optional(),
  expires_at: z.number().optional(),
  token_type: z.string().optional(),
  scope: z.string().optional(),
});

export type Account = z.infer<typeof accountSchema>;

// Validation schemas for API
export const createUserSchema = z.object({
  email: z.string().email(),
  password: z.string().min(8),
  name: z.string().min(1),
});

export const createFamilyGroupSchema = z.object({
  name: z.string().min(1).max(100),
  ownerId: z.string().uuid(),
});

export const addFamilyMemberSchema = z.object({
  groupId: z.string().uuid(),
  userId: z.string().uuid(),
  role: z.enum(['admin', 'member']).default('member'),
});
@@ -1,12 +0,0 @@
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true,
    "emitDeclarationOnly": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
@@ -1,15 +0,0 @@
{
  "name": "@shieldsai/shared-billing",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "src/index.ts",
  "types": "src/index.ts",
  "dependencies": {
    "stripe": "^14.0.0",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "typescript": "^5.3.3"
  }
}
@@ -1,90 +0,0 @@
import Stripe from 'stripe';
import { z } from 'zod';

// Environment variables
const envSchema = z.object({
  STRIPE_SECRET_KEY: z.string().startsWith('sk_'),
  STRIPE_WEBHOOK_SECRET: z.string().startsWith('whsec_'),
  STRIPE_PRICE_ID_BASIC: z.string().startsWith('price_'),
  STRIPE_PRICE_ID_PLUS: z.string().startsWith('price_'),
  STRIPE_PRICE_ID_PREMIUM: z.string().startsWith('price_'),
});

export const billingEnv = envSchema.parse({
  STRIPE_SECRET_KEY: process.env.STRIPE_SECRET_KEY,
  STRIPE_WEBHOOK_SECRET: process.env.STRIPE_WEBHOOK_SECRET,
  STRIPE_PRICE_ID_BASIC: process.env.STRIPE_PRICE_ID_BASIC,
  STRIPE_PRICE_ID_PLUS: process.env.STRIPE_PRICE_ID_PLUS,
  STRIPE_PRICE_ID_PREMIUM: process.env.STRIPE_PRICE_ID_PREMIUM,
});

// Initialize Stripe
export const stripe = new Stripe(billingEnv.STRIPE_SECRET_KEY, {
  apiVersion: '2024-06-20',
  typescript: true,
});

// Subscription tiers
export enum SubscriptionTier {
  BASIC = 'basic',
  PLUS = 'plus',
  PREMIUM = 'premium',
}

// Tier configuration
export const tierConfig: Record<SubscriptionTier, {
  name: string;
  priceId: string;
  features: {
    maxWatchlistItems: number;
    maxFamilyMembers: number;
    darkWebScanFrequency: 'daily' | 'hourly' | 'realtime';
    exposureRetentionMonths: number;
    alertChannels: string[];
  };
}> = {
  [SubscriptionTier.BASIC]: {
    name: 'Basic',
    priceId: billingEnv.STRIPE_PRICE_ID_BASIC,
    features: {
      maxWatchlistItems: 2,
      maxFamilyMembers: 1,
      darkWebScanFrequency: 'daily',
      exposureRetentionMonths: 12,
      alertChannels: ['email'],
    },
  },
  [SubscriptionTier.PLUS]: {
    name: 'Plus',
    priceId: billingEnv.STRIPE_PRICE_ID_PLUS,
    features: {
      maxWatchlistItems: 10,
      maxFamilyMembers: 3,
      darkWebScanFrequency: 'hourly',
      exposureRetentionMonths: 24,
      alertChannels: ['email', 'push'],
    },
  },
  [SubscriptionTier.PREMIUM]: {
    name: 'Premium',
    priceId: billingEnv.STRIPE_PRICE_ID_PREMIUM,
    features: {
      maxWatchlistItems: Infinity,
      maxFamilyMembers: Infinity,
      darkWebScanFrequency: 'realtime',
      exposureRetentionMonths: 60,
      alertChannels: ['email', 'push', 'sms'],
    },
  },
};

// Feature gating helpers
export function getTierFeatures(tier: SubscriptionTier) {
  return tierConfig[tier].features;
}

export function checkFeatureAccess(tier: SubscriptionTier, feature: string, value: number): boolean {
  const features = tierConfig[tier].features;
  const featureValue = features[feature as keyof typeof features] as number;
  return value <= featureValue;
}
@@ -1,22 +0,0 @@
// Config
export {
  stripe,
  billingEnv,
  SubscriptionTier,
  tierConfig,
  getTierFeatures,
  checkFeatureAccess,
} from './config/billing.config';

// Services
export {
  SubscriptionService,
  CustomerService,
  WebhookService,
  subscriptionService,
  customerService,
  webhookService,
} from './services/billing.services';

// Middleware
export { requireTier, checkFeatureLimit } from './middleware/billing.middleware';
@@ -1,68 +0,0 @@
import { NextRequest, NextResponse } from 'next/server';
import { SubscriptionTier, getTierFeatures } from '../config/billing.config';

/**
 * Middleware to check if user has access to required tier features
 */
export function requireTier(
  request: NextRequest,
  requiredTier: SubscriptionTier
): NextResponse | null {
  const userTier = request.headers.get('x-user-tier') as SubscriptionTier;

  if (!userTier) {
    return NextResponse.json({ error: 'User tier not found' }, { status: 401 });
  }

  const tierOrder: Record<SubscriptionTier, number> = {
    [SubscriptionTier.BASIC]: 1,
    [SubscriptionTier.PLUS]: 2,
    [SubscriptionTier.PREMIUM]: 3,
  };

  if (tierOrder[userTier] < tierOrder[requiredTier]) {
    return NextResponse.json(
      {
        error: `Feature requires ${requiredTier} tier or higher`,
        currentTier: userTier,
        requiredTier,
      },
      { status: 403 }
    );
  }

  return null;
}

/**
 * Middleware to check feature limits
 */
export function checkFeatureLimit(
  request: NextRequest,
  feature: string,
  currentValue: number
): NextResponse | null {
  const userTier = request.headers.get('x-user-tier') as SubscriptionTier;

  if (!userTier) {
    return null;
  }

  const features = getTierFeatures(userTier);
  const featureLimit = features[feature as keyof typeof features] as number;

  if (currentValue > featureLimit) {
    return NextResponse.json(
      {
        error: `Feature limit exceeded`,
        feature,
        current: currentValue,
        limit: featureLimit,
        tier: userTier,
      },
      { status: 400 }
    );
  }

  return null;
}
@@ -1,223 +0,0 @@
// The Stripe default export is needed for the Stripe.* types used below
import Stripe from 'stripe';
import { stripe, SubscriptionTier, tierConfig } from '../config/billing.config';

// Subscription service
export class SubscriptionService {
  /**
   * Create a new subscription for a customer
   */
  async createSubscription(
    customerId: string,
    tier: SubscriptionTier,
    metadata?: Record<string, string>
  ): Promise<Stripe.Subscription> {
    const priceId = tierConfig[tier].priceId;

    const subscription = await stripe.subscriptions.create({
      customer: customerId,
      items: [{ price: priceId }],
      metadata: metadata,
      proration_behavior: 'create_prorations',
    });

    return subscription;
  }

  /**
   * Update a customer's subscription tier
   */
  async updateSubscriptionTier(
    subscriptionId: string,
    newTier: SubscriptionTier
  ): Promise<Stripe.Subscription> {
    const newPriceId = tierConfig[newTier].priceId;

    const subscription = await stripe.subscriptions.update(subscriptionId, {
      items: [
        {
          price: newPriceId,
          quantity: 1,
        },
      ],
      proration_behavior: 'create_prorations',
    });

    return subscription;
  }

  /**
   * Cancel a subscription
   */
  async cancelSubscription(
    subscriptionId: string,
    atPeriodEnd: boolean = true
  ): Promise<Stripe.Subscription> {
    const subscription = await stripe.subscriptions.update(subscriptionId, {
      cancel_at_period_end: atPeriodEnd,
    });

    return subscription;
  }

  /**
   * Get subscription by ID
   */
  async getSubscription(subscriptionId: string): Promise<Stripe.Subscription | null> {
    try {
      const subscription = await stripe.subscriptions.retrieve(subscriptionId);
      return subscription;
    } catch (error) {
      if (error instanceof Stripe.errors.StripeInvalidRequestError) {
        return null;
      }
      throw error;
    }
  }

  /**
   * Get customer's current subscription
   */
  async getCustomerSubscription(customerId: string): Promise<Stripe.Subscription | null> {
    const subscriptions = await stripe.subscriptions.list({
      customer: customerId,
      status: 'active',
      limit: 1,
    });

    return subscriptions.data[0] || null;
  }
}

// Customer service
export class CustomerService {
  /**
   * Create a new Stripe customer
   */
  async createCustomer(
    email: string,
    name?: string,
    metadata?: Record<string, string>
  ): Promise<Stripe.Customer> {
    const customer = await stripe.customers.create({
      email,
      name,
      metadata,
    });

    return customer;
  }

  /**
   * Get or create customer by email
   */
  async getOrCreateCustomer(
    email: string,
    name?: string
  ): Promise<Stripe.Customer> {
    const existingCustomers = await stripe.customers.list({
      email,
      limit: 1,
    });

    if (existingCustomers.data.length > 0) {
      return existingCustomers.data[0];
    }

    return this.createCustomer(email, name);
  }

  /**
   * Create a billing portal session
   */
  async createBillingPortalSession(
    customerId: string,
    returnUrl: string
  ): Promise<Stripe.BillingPortal.Session> {
    const session = await stripe.billingPortal.sessions.create({
      customer: customerId,
      return_url: returnUrl,
    });

    return session;
  }

  /**
   * Get customer by ID
   */
  async getCustomer(customerId: string): Promise<Stripe.Customer | null> {
    try {
      const customer = await stripe.customers.retrieve(customerId);
      return customer as Stripe.Customer;
    } catch (error) {
      if (error instanceof Stripe.errors.StripeInvalidRequestError) {
        return null;
      }
      throw error;
    }
  }
}

// Webhook service
export class WebhookService {
  /**
   * Construct webhook event from raw body
   */
  constructEvent(
    rawBody: Buffer | string,
    signature: string
  ): Stripe.Event {
    return stripe.webhooks.constructEvent(
      rawBody,
      signature,
      process.env.STRIPE_WEBHOOK_SECRET!
    );
  }

  /**
   * Handle webhook event
   */
  async handleWebhook(event: Stripe.Event): Promise<void> {
    switch (event.type) {
      case 'customer.subscription.created':
      case 'customer.subscription.updated':
        await this.handleSubscriptionChange(event.data.object);
        break;
      case 'customer.subscription.deleted':
        await this.handleSubscriptionDeleted(event.data.object);
        break;
      case 'invoice.payment_succeeded':
        await this.handlePaymentSucceeded(event.data.object);
        break;
      case 'invoice.payment_failed':
        await this.handlePaymentFailed(event.data.object);
        break;
      default:
        console.log(`Unhandled event type: ${event.type}`);
    }
  }

  private async handleSubscriptionChange(subscription: Stripe.Subscription) {
    console.log(`Subscription ${subscription.id} changed to ${subscription.status}`);
    // TODO: Update local database
  }

  private async handleSubscriptionDeleted(subscription: Stripe.Subscription) {
    console.log(`Subscription ${subscription.id} deleted`);
    // TODO: Update local database
  }

  private async handlePaymentSucceeded(invoice: Stripe.Invoice) {
    console.log(`Payment succeeded for invoice ${invoice.id}`);
    // TODO: Update usage tracking
  }

  private async handlePaymentFailed(invoice: Stripe.Invoice) {
    console.log(`Payment failed for invoice ${invoice.id}`);
    // TODO: Send notification to customer
  }
}

// Export instances
export const subscriptionService = new SubscriptionService();
export const customerService = new CustomerService();
export const webhookService = new WebhookService();
@@ -1,12 +0,0 @@
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true,
    "emitDeclarationOnly": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
@@ -1,12 +0,0 @@
import { defineConfig } from 'drizzle-kit';

export default defineConfig({
  schema: './prisma/schema.prisma',
  out: './migrations',
  dialect: 'postgresql',
  dbCredentials: {
    url: process.env.DATABASE_URL!,
  },
  verbose: true,
  strict: true,
});
@@ -1,23 +0,0 @@
{
  "name": "@shieldsai/shared-db",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "src/index.ts",
  "types": "src/index.ts",
  "scripts": {
    "db:generate": "prisma generate",
    "db:push": "prisma db push",
    "db:migrate": "prisma migrate deploy",
    "db:studio": "prisma studio",
    "db:format": "prisma format"
  },
  "dependencies": {
    "@prisma/client": "^5.14.0",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "prisma": "^5.14.0",
    "typescript": "^5.3.3"
  }
}
@@ -1,437 +0,0 @@
// Prisma schema for ShieldAI
// All models for the multi-service SaaS platform

generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

// ============================================
// User & Authentication Models
// ============================================

model User {
  id            String    @id @default(uuid())
  email         String    @unique
  emailVerified DateTime?
  name          String?
  image         String?
  role          UserRole  @default(user)

  // Relationships
  accounts         Account[]
  sessions         Session[]
  familyGroups     FamilyGroupMember[]
  familyGroupOwned FamilyGroup[]       @relation("FamilyGroupOwner")
  subscriptions    Subscription[]
  alerts           Alert[]
  voiceEnrollments VoiceEnrollment[]
  voiceAnalyses    VoiceAnalysis[]
  spamFeedback     SpamFeedback[]
  spamRules        SpamRule[]

  // Audit
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([email])
  @@index([role])
}

enum UserRole {
  user
  family_admin
  family_member
  support
}

model Account {
  id                String  @id @default(uuid())
  userId            String
  provider          String
  providerAccountId String
  access_token      String?
  refresh_token     String?
  expires_at        Int?
  token_type        String?
  scope             String?

  user User @relation(fields: [userId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@unique([userId, provider, providerAccountId])
  @@index([userId])
}

model Session {
  id           String   @id @default(uuid())
  userId       String
  sessionToken String   @unique
  expires      DateTime

  user User @relation(fields: [userId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([sessionToken])
  @@index([userId])
}

// ============================================
// Family & Subscription Models
// ============================================

model FamilyGroup {
  id        String   @id @default(uuid())
  name      String
  ownerId   String
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  owner         User                @relation("FamilyGroupOwner", fields: [ownerId], references: [id])
  members       FamilyGroupMember[]
  subscriptions Subscription[]

  @@index([ownerId])
  @@index([name])
}

model FamilyGroupMember {
  id       String           @id @default(uuid())
  groupId  String
  userId   String
  role     FamilyMemberRole @default(member)
  joinedAt DateTime         @default(now())

  group FamilyGroup @relation(fields: [groupId], references: [id], onDelete: Cascade)
  user  User        @relation(fields: [userId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@unique([groupId, userId])
  @@index([groupId])
  @@index([userId])
}

enum FamilyMemberRole {
  owner
  admin
  member
}

model Subscription {
  id                 String             @id @default(uuid())
  userId             String
  familyGroupId      String?
  stripeId           String?            @unique
  tier               SubscriptionTier   @default(basic)
  status             SubscriptionStatus @default(active)
  currentPeriodStart DateTime
  currentPeriodEnd   DateTime
  cancelAtPeriodEnd  Boolean            @default(false)

  user        User         @relation(fields: [userId], references: [id], onDelete: Cascade)
  familyGroup FamilyGroup? @relation(fields: [familyGroupId], references: [id])

  watchlistItems WatchlistItem[]
  exposures      Exposure[]
  alerts         Alert[]

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([userId])
  @@index([familyGroupId])
  @@index([stripeId])
  @@index([tier])
}

enum SubscriptionTier {
  basic
  plus
  premium
}

enum SubscriptionStatus {
  active
  past_due
  canceled
  unpaid
  trialing
}

// ============================================
// DarkWatch Models (Dark Web Monitoring)
// ============================================

model WatchlistItem {
  id             String        @id @default(uuid())
  subscriptionId String
  type           WatchlistType
  value          String
  hash           String // SHA-256 hash for deduplication
  isActive       Boolean       @default(true)

  subscription Subscription @relation(fields: [subscriptionId], references: [id], onDelete: Cascade)
  exposures    Exposure[]

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@unique([subscriptionId, type, hash])
  @@index([subscriptionId])
  @@index([type])
  @@index([hash])
}

enum WatchlistType {
  email
  phoneNumber
  ssn
  address
  domain
}

model Exposure {
  id              String           @id @default(uuid())
  subscriptionId  String
  watchlistItemId String?
  source          ExposureSource
  dataType        WatchlistType
  identifier      String
  identifierHash  String
  severity        ExposureSeverity @default(info)
  metadata        Json? // Additional source-specific data
  isFirstTime     Boolean          @default(false)

  subscription  Subscription   @relation(fields: [subscriptionId], references: [id], onDelete: Cascade)
  watchlistItem WatchlistItem? @relation(fields: [watchlistItemId], references: [id])
  alerts        Alert[]

  detectedAt DateTime
  createdAt  DateTime @default(now())
  updatedAt  DateTime @updatedAt

  @@index([subscriptionId])
  @@index([watchlistItemId])
  @@index([source])
  @@index([severity])
  @@index([detectedAt])
}

enum ExposureSource {
  hibp // Have I Been Pwned
  securityTrails
  censys
  darkWebForum
  shodan
  honeypot
}

enum ExposureSeverity {
  info
  warning
  critical
}

// ============================================
// Notification & Alert Models
// ============================================

model Alert {
  id             String         @id @default(uuid())
  subscriptionId String
  userId         String
  exposureId     String?
  type           AlertType
  title          String
  message        String
  severity       AlertSeverity  @default(info)
  isRead         Boolean        @default(false)
  readAt         DateTime?
  channel        AlertChannel[] // Array of notification channels

  subscription Subscription @relation(fields: [subscriptionId], references: [id], onDelete: Cascade)
  user         User         @relation(fields: [userId], references: [id], onDelete: Cascade)
  exposure     Exposure?    @relation(fields: [exposureId], references: [id])

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([subscriptionId])
  @@index([userId])
  @@index([isRead])
  @@index([createdAt])
}

enum AlertType {
  exposure_detected
  exposure_resolved
  scan_complete
  subscription_changed
  system_warning
}

enum AlertSeverity {
  info
  warning
  critical
}

enum AlertChannel {
  email
  push
  sms
}

// ============================================
// VoicePrint Models (Voice Cloning Detection)
// ============================================

model VoiceEnrollment {
  id            String @id @default(uuid())
  userId        String
  name          String
  voiceHash     String // FAISS embedding hash
  audioMetadata Json? // Sample rate, duration, etc.

  user     User            @relation(fields: [userId], references: [id], onDelete: Cascade)
  analyses VoiceAnalysis[]

  isActive  Boolean  @default(true)
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([userId])
  @@index([voiceHash])
}

model VoiceAnalysis {
  id             String  @id @default(uuid())
  enrollmentId   String?
  userId         String
  audioHash      String // Content hash of audio file
  isSynthetic    Boolean
  confidence     Float // 0.0 to 1.0
  analysisResult Json // Full ML analysis results
  audioUrl       String // S3 storage URL

  enrollment VoiceEnrollment? @relation(fields: [enrollmentId], references: [id])
  user       User             @relation(fields: [userId], references: [id])

  createdAt DateTime @default(now())

  @@index([userId])
  @@index([enrollmentId])
  @@index([audioHash])
}

// ============================================
// SpamShield Models (Spam Detection)
// ============================================

model SpamFeedback {
  id              String       @id @default(uuid())
  userId          String
  phoneNumber     String
  phoneNumberHash String // SHA-256 hash
  isSpam          Boolean
  confidence      Float? // ML model confidence
  feedbackType    FeedbackType
  metadata        Json? // Call duration, time, etc.

  user User @relation(fields: [userId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([userId])
  @@index([phoneNumberHash])
  @@index([isSpam])
}

enum FeedbackType {
  initial_detection
  user_confirmation
  user_rejection
  auto_learned
}

model SpamRule {
  id       String     @id @default(uuid())
  userId   String?
  isGlobal Boolean    @default(false)
  ruleType RuleType
  pattern  String
  action   RuleAction
  priority Int        @default(0)
  isActive Boolean    @default(true)

  user User? @relation(fields: [userId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([userId])
  @@index([isGlobal])
  @@index([ruleType])
}

enum RuleType {
  phoneNumber
  areaCode
  prefix
  pattern
  reputation
}

enum RuleAction {
  block
  flag
  allow
  challenge
}

// ============================================
// Audit & Analytics Models
// ============================================

model AuditLog {
  id         String  @id @default(uuid())
  userId     String?
  action     String
  resource   String
  resourceId String?
  changes    Json? // Before/after values
  metadata   Json?
  ipAddress  String?
  userAgent  String?

  createdAt DateTime @default(now())

  @@index([userId])
  @@index([action])
  @@index([resource])
  @@index([createdAt])
}

model KPISnapshot {
  id          String   @id @default(uuid())
  date        DateTime @unique
  metricName  String
  metricValue Float
  metadata    Json?

  createdAt DateTime @default(now())

  @@index([metricName])
  @@index([date])
}
@@ -1,50 +0,0 @@
import { PrismaClient } from '@prisma/client';

// Singleton pattern for Prisma Client
const globalForPrisma = globalThis as unknown as {
  prisma: PrismaClient | undefined;
};

export const prisma =
  globalForPrisma.prisma ??
  new PrismaClient({
    log: process.env.NODE_ENV === 'development' ? ['query', 'error', 'warn'] : ['error'],
  });

if (process.env.NODE_ENV === 'development') {
  globalForPrisma.prisma = prisma;
}

// Export types from generated client
export type {
  User,
  Account,
  Session,
  FamilyGroup,
  FamilyGroupMember,
  Subscription,
  WatchlistItem,
  Exposure,
  Alert,
  VoiceEnrollment,
  VoiceAnalysis,
  SpamFeedback,
  SpamRule,
  AuditLog,
  KPISnapshot,
  UserRole,
  FamilyMemberRole,
  SubscriptionTier,
  SubscriptionStatus,
  WatchlistType,
  ExposureSource,
  ExposureSeverity,
  AlertType,
  AlertSeverity,
  AlertChannel,
  FeedbackType,
  RuleType,
  RuleAction,
} from '@prisma/client';

export * as PrismaModels from '@prisma/client';
@@ -1,21 +0,0 @@
// Re-export Prisma client
export { prisma } from './client';

// Export types
export type {
  User,
  Account,
  Session,
  FamilyGroup,
  FamilyGroupMember,
  Subscription,
  WatchlistItem,
  Exposure,
  Alert,
  VoiceEnrollment,
  VoiceAnalysis,
  SpamFeedback,
  SpamRule,
  AuditLog,
  KPISnapshot,
} from './client';
@@ -1,12 +0,0 @@
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true,
    "emitDeclarationOnly": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist", "prisma"]
}
@@ -1,28 +0,0 @@
{
  "name": "@shieldsai/shared-notifications",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "./src/index.ts",
  "types": "./src/index.ts",
  "exports": {
    ".": {
      "import": "./src/index.ts",
      "types": "./src/index.ts"
    }
  },
  "scripts": {
    "build": "tsc",
    "lint": "eslint src/"
  },
  "dependencies": {
    "resend": "^6.12.2",
    "firebase-admin": "^13.2.0",
    "twilio": "^5.4.0",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "@types/node": "^25.6.0",
    "typescript": "^5.3.3"
  }
}
@@ -1,92 +0,0 @@
import { RateLimitConfig, NotificationChannel } from '../types/notification.types';

// Resend configuration
export interface ResendConfig {
  apiKey: string;
  fromEmail: string;
  fromName: string;
}

// Firebase Cloud Messaging configuration
export interface FCMConfig {
  projectId: string;
  privateKey: string;
  clientEmail: string;
  keyPath?: string; // Path to service account key file
}

// APNs configuration
export interface APNsConfig {
  keyPath: string; // Path to .p8 key file
  keyId: string;
  teamId: string;
  bundleId: string;
}

// Twilio configuration
export interface TwilioConfig {
  accountSid: string;
  authToken: string;
  fromNumber?: string; // Optional default sender number
}

// Combined notification config
export interface NotificationConfig {
  resend: ResendConfig;
  fcm?: FCMConfig;
  apns?: APNsConfig;
  twilio?: TwilioConfig;
  rateLimits: {
    email: RateLimitConfig;
    push: RateLimitConfig;
    sms: RateLimitConfig;
  };
}

// Default rate limits
export const defaultRateLimits: Record<NotificationChannel, RateLimitConfig> = {
  [NotificationChannel.EMAIL]: {
    maxPerWindow: 100,
    windowMs: 60 * 60 * 1000, // 1 hour
    key: 'user',
  },
  [NotificationChannel.PUSH]: {
    maxPerWindow: 50,
    windowMs: 60 * 60 * 1000, // 1 hour
    key: 'user',
  },
  [NotificationChannel.SMS]: {
    maxPerWindow: 20,
    windowMs: 60 * 60 * 1000, // 1 hour
    key: 'user',
  },
};

// Load config from environment variables
export function loadNotificationConfig(): NotificationConfig {
  return {
    resend: {
      apiKey: process.env.RESEND_API_KEY!,
      fromEmail: process.env.RESEND_FROM_EMAIL || 'noreply@shieldsai.com',
      fromName: process.env.RESEND_FROM_NAME || 'ShieldAI',
    },
    fcm: process.env.FCM_PROJECT_ID ? {
      projectId: process.env.FCM_PROJECT_ID,
      privateKey: process.env.FCM_PRIVATE_KEY!.replace(/\\n/g, '\n'),
      clientEmail: process.env.FCM_CLIENT_EMAIL!,
      keyPath: process.env.FCM_KEY_PATH,
    } : undefined,
    apns: process.env.APNS_KEY_PATH ? {
      keyPath: process.env.APNS_KEY_PATH,
      keyId: process.env.APNS_KEY_ID!,
      teamId: process.env.APNS_TEAM_ID!,
      bundleId: process.env.APNS_BUNDLE_ID!,
    } : undefined,
    twilio: process.env.TWILIO_ACCOUNT_SID ? {
      accountSid: process.env.TWILIO_ACCOUNT_SID!,
      authToken: process.env.TWILIO_AUTH_TOKEN!,
      fromNumber: process.env.TWILIO_FROM_NUMBER,
    } : undefined,
    rateLimits: defaultRateLimits,
  };
}
@@ -1,11 +0,0 @@
// Types
export * from './types/notification.types';

// Config
export * from './config/notification.config';

// Services
export { EmailService } from './services/email.service';
export { PushService } from './services/push.service';
export { SMSService } from './services/sms.service';
export { NotificationService, createNotificationService } from './services/notification.service';
@@ -1,171 +0,0 @@
import { Resend } from 'resend';
import { ResendConfig } from '../config/notification.config';
import {
  EmailNotification,
  NotificationStatus,
  NotificationPriority,
  NotificationRecipient,
  NotificationChannel,
} from '../types/notification.types';

export class EmailService {
  private resend: Resend;
  private config: ResendConfig;

  constructor(config: ResendConfig) {
    this.config = config;
    this.resend = new Resend(config.apiKey);
  }

  /**
   * Send a transactional email
   */
  async sendEmail(
    recipient: NotificationRecipient,
    subject: string,
    htmlBody: string,
    textBody?: string,
    templateId?: string,
    attachments?: Array<{
      filename: string;
      content: Buffer | string;
      mimeType?: string;
    }>
  ): Promise<EmailNotification> {
    const notification: EmailNotification = {
      id: `email_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
      userId: recipient.userId,
      channel: NotificationChannel.EMAIL,
      templateId: templateId || 'custom',
      priority: NotificationPriority.NORMAL,
      status: NotificationStatus.PENDING,
      to: recipient.email!,
      subject,
      htmlBody,
      textBody,
      attachments,
      createdAt: new Date(),
    };

    try {
      const { data, error } = await this.resend.emails.send({
        from: `${this.config.fromName} <${this.config.fromEmail}>`,
        to: [recipient.email!],
        subject,
        html: htmlBody,
        text: textBody,
        attachments: attachments?.map((att) => ({
          filename: att.filename,
          content: typeof att.content === 'string' ? Buffer.from(att.content) : att.content,
          mimeType: att.mimeType,
        })),
        metadata: {
          userId: recipient.userId,
          notificationId: notification.id,
        },
      });

      if (error) {
        notification.status = NotificationStatus.FAILED;
        notification.failedAt = new Date();
        notification.errorMessage = error.message;
      } else {
        notification.status = NotificationStatus.SENT;
        notification.sentAt = new Date();
      }

      return notification;
    } catch (error) {
      notification.status = NotificationStatus.FAILED;
      notification.failedAt = new Date();
      notification.errorMessage = error instanceof Error ? error.message : 'Unknown error';
      return notification;
    }
  }

  /**
   * Send email with retry logic
   */
  async sendEmailWithRetry(
    recipient: NotificationRecipient,
    subject: string,
    htmlBody: string,
    textBody?: string,
    maxRetries: number = 3
  ): Promise<EmailNotification> {
    let lastError: Error | undefined;

    for (let attempt = 0; attempt < maxRetries; attempt++) {
      try {
        const result = await this.sendEmail(recipient, subject, htmlBody, textBody);

        if (result.status === NotificationStatus.SENT) {
          return result;
        }

        lastError = new Error(result.errorMessage);
      } catch (error) {
        lastError = error as Error;
      }

      // Wait before retry (exponential backoff)
      await new Promise((resolve) => setTimeout(resolve, 1000 * Math.pow(2, attempt)));
    }

    return {
      id: `email_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
      userId: recipient.userId,
      channel: NotificationChannel.EMAIL,
      templateId: 'custom',
      priority: NotificationPriority.NORMAL,
      status: NotificationStatus.FAILED,
      to: recipient.email!,
      subject,
      htmlBody,
      textBody,
      createdAt: new Date(),
      failedAt: new Date(),
      errorMessage: lastError?.message,
    };
  }

  /**
   * Check email delivery status
   */
  async getDeliveryStatus(notificationId: string): Promise<NotificationStatus> {
    try {
      const emailId = notificationId.replace('email_', '');
      const { data } = await this.resend.emails.get(emailId);

      if (data.status === 'sent') {
        return NotificationStatus.SENT;
      } else if (data.status === 'delivered') {
        return NotificationStatus.DELIVERED;
      } else if (data.status === 'failed') {
        return NotificationStatus.FAILED;
      }

      return NotificationStatus.PENDING;
    } catch {
      return NotificationStatus.PENDING;
    }
  }

  /**
   * Batch send emails
   */
  async batchSendEmails(
    recipients: Array<{
      recipient: NotificationRecipient;
      subject: string;
      htmlBody: string;
      textBody?: string;
    }>
  ): Promise<EmailNotification[]> {
    const promises = recipients.map(async ({ recipient, subject, htmlBody, textBody }) => {
      return this.sendEmail(recipient, subject, htmlBody, textBody);
    });

    return Promise.all(promises);
  }
}
@@ -1,271 +0,0 @@
import { EmailService } from './email.service';
import { PushService } from './push.service';
import { SMSService } from './sms.service';
import {
  Notification,
  NotificationChannel,
  NotificationPreferences,
  NotificationRecipient,
  NotificationStatus,
  NotificationPriority,
} from '../types/notification.types';
import { NotificationConfig } from '../config/notification.config';

/**
 * Main notification service that orchestrates all notification channels
 */
export class NotificationService {
  private emailService?: EmailService;
  private pushService?: PushService;
  private smsService?: SMSService;
  private config: NotificationConfig;

  constructor(config: NotificationConfig) {
    this.config = config;

    // Initialize services based on configuration
    if (config.resend) {
      this.emailService = new EmailService(config.resend);
    }

    if (config.fcm || config.apns) {
      this.pushService = new PushService(config.fcm, config.apns);
    }

    if (config.twilio) {
      this.smsService = new SMSService(config.twilio);
    }
  }

  /**
   * Send notification to all enabled channels for a user
   */
  async sendMultiChannelNotification(
    recipient: NotificationRecipient,
    channel: NotificationChannel | NotificationChannel[],
    subject: string,
    body: string,
    priority: NotificationPriority = NotificationPriority.NORMAL,
    metadata?: Record<string, unknown>
  ): Promise<Notification[]> {
    const channels = Array.isArray(channel) ? channel : [channel];
    const notifications: Notification[] = [];

    for (const ch of channels) {
      const prefs = await this.getNotificationPreferences(recipient.userId);

      if (!this.isChannelEnabled(prefs, ch)) {
        continue;
      }

      let notification: Notification;

      switch (ch) {
        case NotificationChannel.EMAIL:
          if (this.emailService && recipient.email) {
            notification = await this.emailService.sendEmail(
              recipient,
              subject,
              body,
              body // Plain text fallback
            );
            notifications.push(notification);
          }
          break;

        case NotificationChannel.PUSH:
          if (this.pushService) {
            notification = await this.pushService.sendPush(
              recipient,
              subject,
              body,
              metadata as Record<string, unknown>,
              undefined,
              'default'
            );
            notifications.push(notification);
          }
          break;

        case NotificationChannel.SMS:
          if (this.smsService && recipient.phone) {
            notification = await this.smsService.sendSMS(recipient, body);
            notifications.push(notification);
          }
          break;
      }
    }

    return notifications;
  }

  /**
   * Send email notification
   */
  async sendEmail(
    recipient: NotificationRecipient,
    subject: string,
    htmlBody: string,
    textBody?: string
  ): Promise<Notification | null> {
    if (!this.emailService) {
      return null;
    }

    return this.emailService.sendEmail(recipient, subject, htmlBody, textBody);
  }

  /**
   * Send push notification
   */
  async sendPush(
    recipient: NotificationRecipient,
    title: string,
    body: string,
    data?: Record<string, unknown>
  ): Promise<Notification | null> {
    if (!this.pushService) {
      return null;
    }

    return this.pushService.sendPush(recipient, title, body, data);
  }

  /**
   * Send SMS notification
   */
  async sendSMS(
    recipient: NotificationRecipient,
    body: string,
    fromNumber?: string
  ): Promise<Notification | null> {
    if (!this.smsService) {
      return null;
    }

    return this.smsService.sendSMS(recipient, body, fromNumber);
  }

  /**
   * Get notification preferences for a user
   */
  async getNotificationPreferences(
    userId: string
  ): Promise<NotificationPreferences> {
    // TODO: Fetch from database
    // For now, return default preferences
    return {
      userId,
      email: {
        enabled: true,
        categories: ['marketing', 'transactional', 'alerts'],
      },
      push: {
        enabled: true,
        categories: ['marketing', 'transactional', 'alerts'],
      },
      sms: {
        enabled: true,
        categories: ['alerts'],
      },
    };
  }

  /**
   * Check if a channel is enabled for a user
   */
  private isChannelEnabled(
    prefs: NotificationPreferences,
    channel: NotificationChannel
  ): boolean {
    switch (channel) {
      case NotificationChannel.EMAIL:
        return prefs.email.enabled;
      case NotificationChannel.PUSH:
        return prefs.push.enabled;
      case NotificationChannel.SMS:
        return prefs.sms.enabled;
    }
  }

  /**
   * Deduplicate notifications (prevent duplicate sends)
   */
  async deduplicateNotification(
    userId: string,
    templateId: string,
    windowMs: number = 5 * 60 * 1000 // 5 minutes default
  ): Promise<boolean> {
    // TODO: Check recent notifications in database
    // For now, return true (not a duplicate)
    return true;
  }

  /**
   * Check rate limit for a channel
   */
  async checkRateLimit(
    userId: string,
    channel: NotificationChannel
  ): Promise<{ allowed: boolean; remaining: number; resetAt: Date }> {
    const rateLimit = this.config.rateLimits[channel];

    // TODO: Implement actual rate limiting with Redis or database
    // For now, return default values
    return {
      allowed: true,
      remaining: rateLimit.maxPerWindow,
      resetAt: new Date(Date.now() + rateLimit.windowMs),
    };
  }

  /**
   * Get email service instance
   */
  getEmailService(): EmailService | undefined {
    return this.emailService;
  }

  /**
   * Get push service instance
   */
  getPushService(): PushService | undefined {
    return this.pushService;
  }

  /**
   * Get SMS service instance
   */
  getSMSService(): SMSService | undefined {
    return this.smsService;
  }

  /**
   * Check if all services are initialized
   */
  isFullyConfigured(): boolean {
    return !!(this.emailService && this.pushService && this.smsService);
  }

  /**
   * Get configuration summary
   */
  getConfigSummary(): {
    email: boolean;
    push: boolean;
    sms: boolean;
  } {
    return {
      email: !!this.emailService,
      push: !!this.pushService,
      sms: !!this.smsService,
    };
  }
}

// Export singleton instance creator
export function createNotificationService(
  config: NotificationConfig
): NotificationService {
  return new NotificationService(config);
}
@@ -1,262 +0,0 @@
import admin from 'firebase-admin';
import * as path from 'path';
import {
  PushNotification,
  NotificationStatus,
  NotificationPriority,
  NotificationRecipient,
  NotificationChannel,
} from '../types/notification.types';
import { FCMConfig, APNsConfig } from '../config/notification.config';

export class PushService {
  private fcm?: admin.app.App;
  private apnsConfig?: APNsConfig;

  constructor(fcmConfig?: FCMConfig, apnsConfig?: APNsConfig) {
    if (fcmConfig) {
      // Use named app instance for multi-tenant support
      const appName = fcmConfig.keyPath
        ? `fcm_${fcmConfig.projectId}`
        : 'fcm_default';

      // Check if app with this name already exists
      const existingApp = admin.app(appName);

      if (!existingApp) {
        this.fcm = admin.initializeApp({
          credential: admin.credential.cert({
            projectId: fcmConfig.projectId,
            privateKey: fcmConfig.privateKey.replace(/\\n/g, '\n'),
            clientEmail: fcmConfig.clientEmail,
          }),
          ...(fcmConfig.keyPath && {
            storageBucket: `${fcmConfig.projectId}.appspot.com`,
          }),
        }, appName);
      } else {
        this.fcm = existingApp;
      }
    }

    this.apnsConfig = apnsConfig;
  }

  /**
   * Send push notification to FCM device
   */
  async sendFCMPush(
    recipient: NotificationRecipient,
    title: string,
    body: string,
    data?: Record<string, unknown>,
    badge?: number,
    sound?: string
  ): Promise<PushNotification> {
    const notification: PushNotification = {
      id: `push_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
      userId: recipient.userId,
      channel: NotificationChannel.PUSH,
      templateId: 'custom',
      priority: NotificationPriority.NORMAL,
      status: NotificationStatus.PENDING,
      title,
      body,
      data,
      badge,
      sound,
      fcmToken: recipient.fcmToken,
      createdAt: new Date(),
    };

    if (!this.fcm || !recipient.fcmToken) {
      notification.status = NotificationStatus.FAILED;
      notification.failedAt = new Date();
      notification.errorMessage = !this.fcm ? 'FCM not configured' : 'Missing FCM token';
      return notification;
    }

    try {
      const message: admin.messaging.Message = {
        token: recipient.fcmToken,
        notification: {
          title,
          body,
        },
        data: data
          ? Object.fromEntries(
              Object.entries(data).map(([key, value]) => [key, String(value)])
            )
          : undefined,
        apns: {
          payload: {
            aps: {
              badge,
              sound: sound || 'default',
            },
          },
        },
        android: {
          priority: 'high',
          notification: {
            title,
            body,
            clickAction: 'FLUTTER_NOTIFICATION_CLICK',
          },
        },
      };

      const response = await this.fcm.messaging().send(message);

      notification.status = NotificationStatus.SENT;
      notification.sentAt = new Date();

      return notification;
    } catch (error) {
      notification.status = NotificationStatus.FAILED;
      notification.failedAt = new Date();
      notification.errorMessage = error instanceof Error ? error.message : 'Unknown error';
      return notification;
    }
  }

  /**
   * Send push notification using APNs
   */
  async sendAPNSPush(
    recipient: NotificationRecipient,
    title: string,
    body: string,
    data?: Record<string, unknown>,
    badge?: number
  ): Promise<PushNotification> {
    const notification: PushNotification = {
      id: `push_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
      userId: recipient.userId,
      channel: NotificationChannel.PUSH,
      templateId: 'custom',
      priority: NotificationPriority.NORMAL,
      status: NotificationStatus.PENDING,
      title,
      body,
      data,
      badge,
      apnsToken: recipient.apnsToken,
      createdAt: new Date(),
    };

    if (!this.apnsConfig || !recipient.apnsToken) {
      notification.status = NotificationStatus.FAILED;
      notification.failedAt = new Date();
      notification.errorMessage = !this.apnsConfig ? 'APNs not configured' : 'Missing APNs token';
      return notification;
    }

    // FCM supports sending to APNs tokens (iOS devices)
    // This leverages FCM's unified push infrastructure for iOS
    // APNs token format: device-specific token from iOS
    if (this.fcm && recipient.apnsToken) {
      const message: admin.messaging.Message = {
        token: recipient.apnsToken,
        notification: {
          title,
          body,
        },
        data: data
          ? Object.fromEntries(
              Object.entries(data).map(([key, value]) => [key, String(value)])
            )
          : undefined,
        apns: {
          payload: {
            aps: {
              badge,
              sound: 'default',
              contentAvailable: true,
            },
          },
        },
      };

      try {
        await this.fcm.messaging().send(message);
        notification.status = NotificationStatus.SENT;
        notification.sentAt = new Date();
      } catch (error) {
        notification.status = NotificationStatus.FAILED;
        notification.failedAt = new Date();
        notification.errorMessage = error instanceof Error ? error.message : 'Unknown error';
      }
    }

    return notification;
  }

  /**
   * Send push notification (auto-detect platform)
   */
  async sendPush(
    recipient: NotificationRecipient,
    title: string,
    body: string,
    data?: Record<string, unknown>,
    badge?: number,
    sound?: string
  ): Promise<PushNotification> {
    // Prefer APNs for iOS tokens, FCM for Android
    if (recipient.apnsToken) {
      return this.sendAPNSPush(recipient, title, body, data, badge);
    } else if (recipient.fcmToken) {
      return this.sendFCMPush(recipient, title, body, data, badge, sound);
    } else {
      return {
        id: `push_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
        userId: recipient.userId,
        channel: NotificationChannel.PUSH,
        templateId: 'custom',
        priority: NotificationPriority.NORMAL,
        status: NotificationStatus.FAILED,
        title,
        body,
        data,
        badge,
        sound,
        createdAt: new Date(),
        failedAt: new Date(),
        errorMessage: 'No push token available',
      };
    }
  }

  /**
   * Send broadcast push to multiple devices
   */
  async sendBroadcastPush(
    recipients: Array<NotificationRecipient>,
    title: string,
    body: string,
    data?: Record<string, unknown>
  ): Promise<PushNotification[]> {
    const promises = recipients.map((recipient) =>
      this.sendPush(recipient, title, body, data)
    );

    return Promise.all(promises);
  }

  /**
   * Check if FCM is properly configured
   */
  isFCMConfigured(): boolean {
    return !!this.fcm;
  }

  /**
   * Shutdown FCM app
   */
  async shutdown(): Promise<void> {
    if (this.fcm) {
      await this.fcm.delete();
    }
  }
}
@@ -1,175 +0,0 @@
import { Twilio } from 'twilio';
import {
  SMSNotification,
  NotificationStatus,
  NotificationPriority,
  NotificationRecipient,
  NotificationChannel,
} from '../types/notification.types';
import { TwilioConfig } from '../config/notification.config';

export class SMSService {
  private twilio: Twilio;
  private config: TwilioConfig;
  private defaultFromNumber?: string;

  constructor(config: TwilioConfig) {
    this.config = config;
    this.twilio = new Twilio(config.accountSid, config.authToken);
    this.defaultFromNumber = config.fromNumber;
  }

  /**
   * Send SMS message
   */
  async sendSMS(
    recipient: NotificationRecipient,
    body: string,
    fromNumber?: string
  ): Promise<SMSNotification> {
    const notification: SMSNotification = {
      id: `sms_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
      userId: recipient.userId,
      channel: NotificationChannel.SMS,
      templateId: 'custom',
      priority: NotificationPriority.NORMAL,
      status: NotificationStatus.PENDING,
      to: recipient.phone!,
      body,
      from: fromNumber || this.defaultFromNumber,
      createdAt: new Date(),
    };

    if (!recipient.phone) {
      notification.status = NotificationStatus.FAILED;
      notification.failedAt = new Date();
      notification.errorMessage = 'Missing phone number';
      return notification;
    }

    try {
      const message = await this.twilio.messages.create({
        body,
        from: fromNumber || this.defaultFromNumber,
        to: recipient.phone,
      });

      notification.status = NotificationStatus.SENT;
      notification.sentAt = new Date();
      notification.id = message.sid;

      return notification;
    } catch (error) {
      notification.status = NotificationStatus.FAILED;
      notification.failedAt = new Date();
      notification.errorMessage = error instanceof Error ? error.message : 'Unknown error';
      return notification;
    }
  }

  /**
   * Send SMS with delivery status tracking
   */
  async sendSMSWithTracking(
    recipient: NotificationRecipient,
    body: string,
    fromNumber?: string
  ): Promise<SMSNotification> {
    const notification = await this.sendSMS(recipient, body, fromNumber);

    if (notification.status === NotificationStatus.SENT && notification.id) {
      try {
        const message = await this.twilio.messages(notification.id).fetch();

        if (message.status === 'delivered') {
          notification.status = NotificationStatus.DELIVERED;
          notification.deliveredAt = new Date(message.dateUpdated || message.dateSent);
        }
      } catch (error) {
        console.warn(`Failed to fetch delivery status for SMS ${notification.id}:`, error);
      }
    }

    return notification;
  }

  /**
   * Check SMS delivery status
   */
  async getDeliveryStatus(smsId: string): Promise<NotificationStatus> {
    try {
      const message = await this.twilio.messages(smsId).fetch();

      switch (message.status) {
        case 'delivered':
          return NotificationStatus.DELIVERED;
        case 'sent':
          // 'sent' means accepted by the carrier, not confirmed delivery
          return NotificationStatus.SENT;
        case 'failed':
        case 'undelivered':
          return NotificationStatus.FAILED;
        case 'queued':
        case 'sending':
          return NotificationStatus.PENDING;
        default:
          return NotificationStatus.PENDING;
      }
    } catch {
      return NotificationStatus.PENDING;
    }
  }

  /**
   * Send bulk SMS messages
   */
  async bulkSendSMS(
    recipients: Array<{
      recipient: NotificationRecipient;
      body: string;
      fromNumber?: string;
    }>
  ): Promise<SMSNotification[]> {
    const promises = recipients.map(async ({ recipient, body, fromNumber }) => {
      return this.sendSMS(recipient, body, fromNumber);
    });

    return Promise.all(promises);
  }

  /**
   * Send transactional SMS (e.g., verification codes)
   */
  async sendTransactionSMS(
    recipient: NotificationRecipient,
    template: 'verification' | 'password_reset' | 'welcome',
    variables?: Record<string, string>
  ): Promise<SMSNotification> {
    const templates: Record<string, (vars?: Record<string, string>) => string> = {
      verification: (vars) =>
        `Your verification code is: ${vars?.code || '123456'}. Valid for 10 minutes.`,
      password_reset: (vars) =>
        `Password reset requested for ${vars?.email || 'your account'}. Click the link to reset.`,
      welcome: (vars) =>
        `Welcome ${vars?.name || 'there'}! Your account has been created successfully.`,
    };

    const body = templates[template](variables);

    return this.sendSMS(recipient, body);
  }

  /**
   * Validate phone number format
   */
  isValidPhoneNumber(phone: string): boolean {
    // Basic E.164 format validation
    const e164Regex = /^\+[1-9]\d{1,14}$/;
    return e164Regex.test(phone);
  }

  /**
   * Get Twilio client for advanced operations
   */
  getClient(): Twilio {
    return this.twilio;
  }
}
@@ -1,133 +0,0 @@
// Notification channels
export enum NotificationChannel {
  EMAIL = 'email',
  PUSH = 'push',
  SMS = 'sms',
}

// Notification priorities
export enum NotificationPriority {
  LOW = 'low',
  NORMAL = 'normal',
  HIGH = 'high',
  URGENT = 'urgent',
}

// Notification status
export enum NotificationStatus {
  PENDING = 'pending',
  SENT = 'sent',
  DELIVERED = 'delivered',
  READ = 'read',
  FAILED = 'failed',
}

// Template types
export enum TemplateType {
  WELCOME = 'welcome',
  PASSWORD_RESET = 'password_reset',
  EMAIL_VERIFICATION = 'email_verification',
  SMS_VERIFICATION = 'sms_verification',
  PUSH_WELCOME = 'push_welcome',
  CUSTOM = 'custom',
}

// Notification recipient
export interface NotificationRecipient {
  userId: string;
  email?: string;
  phone?: string;
  fcmToken?: string;
  apnsToken?: string;
}

// Notification template
export interface NotificationTemplate {
  id: string;
  type: TemplateType;
  channel: NotificationChannel;
  subject?: string; // For email
  title?: string; // For push
  body: string;
  locale: string;
  variables: Record<string, string>;
  isActive: boolean;
}

// Notification preferences
export interface NotificationPreferences {
  userId: string;
  email: {
    enabled: boolean;
    categories: string[]; // e.g., ['marketing', 'transactional', 'alerts']
  };
  push: {
    enabled: boolean;
    categories: string[];
  };
  sms: {
    enabled: boolean;
    categories: string[];
  };
}

// Base notification interface
export interface BaseNotification {
  id: string;
  userId: string;
  channel: NotificationChannel;
  templateId: string;
  priority: NotificationPriority;
  status: NotificationStatus;
  metadata?: Record<string, unknown>;
  createdAt: Date;
  sentAt?: Date;
  deliveredAt?: Date;
  readAt?: Date;
  failedAt?: Date;
  errorMessage?: string;
}

// Email-specific notification
export interface EmailNotification extends BaseNotification {
  channel: NotificationChannel.EMAIL;
  to: string;
  subject: string;
  htmlBody?: string;
  textBody?: string;
  attachments?: Array<{
    filename: string;
    content: Buffer | string;
    mimeType?: string;
  }>;
}

// Push notification
export interface PushNotification extends BaseNotification {
  channel: NotificationChannel.PUSH;
  title: string;
  body: string;
  data?: Record<string, unknown>;
  badge?: number;
  sound?: string;
  fcmToken?: string;
  apnsToken?: string;
}

// SMS notification
export interface SMSNotification extends BaseNotification {
  channel: NotificationChannel.SMS;
  to: string;
  body: string;
  from?: string;
}

// Union type for all notification types
export type Notification = EmailNotification | PushNotification | SMSNotification;

// Rate limit configuration
export interface RateLimitConfig {
  maxPerWindow: number;
  windowMs: number;
  key: string; // User ID or template ID
}
@@ -1,12 +0,0 @@
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
@@ -1,17 +0,0 @@
{
  "name": "@shieldsai/shared-ui",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "src/index.tsx",
  "types": "src/index.tsx",
  "scripts": {
    "lint": "eslint src/"
  },
  "dependencies": {
    "solid-js": "^1.8.14"
  },
  "devDependencies": {
    "typescript": "^5.3.3"
  }
}
@@ -1,16 +0,0 @@
{
  "name": "@shieldsai/shared-utils",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "main": "src/index.ts",
  "types": "src/index.ts",
  "scripts": {
    "lint": "eslint src/",
    "test": "vitest"
  },
  "devDependencies": {
    "typescript": "^5.3.3",
    "vitest": "^1.3.1"
  }
}
@@ -1,415 +0,0 @@
/**
 * WebSocket Alert Server
 * Real-time alert broadcasting for call analysis events and anomalies
 * Connects to CallAnalysisEngine and pushes alerts to subscribed clients
 */

import { WebSocketServer, WebSocket } from 'ws';
import { CallAnalysisEngine, CallEvent, Anomaly, SentimentAnalysis, AnalysisResult } from '../../src/lib/inference/call-analysis-engine';
import { jwtVerify, SignJWT } from 'jose';

export type AlertType =
  | 'anomaly'
  | 'call_event'
  | 'quality_degraded'
  | 'sentiment_shift'
  | 'call_summary'
  | 'connection'
  | 'disconnection';

export type AlertSeverity = 'info' | 'low' | 'medium' | 'high' | 'critical';

export interface AlertPayload {
  id: string;
  type: AlertType;
  severity: AlertSeverity;
  timestamp: number;
  callId?: string;
  title: string;
  message: string;
  data: Record<string, unknown>;
  actionable: boolean;
}

export interface AlertServerConfig {
  port?: number;
  enableAuth?: boolean;
  jwtSecret?: string;
  allowedOrigins?: string[];
  alertCooldownMs?: number;
  maxSubscribers?: number;
  enableCallCorrelation?: boolean;
}

export interface SubscriberSession {
  ws: WebSocket;
  userId?: string;
  callIds: Set<string>;
  lastAlertTime: Map<string, number>;
  subscribedAt: number;
}

const DEFAULT_CONFIG: Required<AlertServerConfig> = {
  port: 8088,
  enableAuth: true,
  jwtSecret: process.env.ALERT_SERVER_JWT_SECRET || '',
  allowedOrigins: ['http://localhost:3000'],
  alertCooldownMs: 5000,
  maxSubscribers: 100,
  enableCallCorrelation: true,
};

/**
 * JWT verification helper
 */
async function verifyJWT(token: string, secret: string): Promise<any | null> {
  try {
    const decoded = await jwtVerify(token, new TextEncoder().encode(secret), {
      algorithms: ['HS256'],
    });
    return decoded;
  } catch (error) {
    console.error('[AlertServer] JWT verification failed:', (error as Error).message);
    return null;
  }
}

export class AlertServer {
  private wss: WebSocketServer | null = null;
  private config: Required<AlertServerConfig>;
  private subscribers: Map<string, SubscriberSession> = new Map();
  private analysisEngines: Map<string, CallAnalysisEngine> = new Map();
  private alertHistory: AlertPayload[] = [];
  private maxAlertHistory: number = 500;
  private isRunning: boolean = false;

  constructor(config: AlertServerConfig = {}) {
    this.config = { ...DEFAULT_CONFIG, ...config };
  }

  async start(): Promise<void> {
    this.wss = new WebSocketServer({
      port: this.config.port,
      maxPayload: 1024 * 1024,
    });

    this.wss.on('connection', (ws: WebSocket, req) => {
      void this.handleConnection(ws, req);
    });

    this.wss.on('error', (error: Error) => {
      console.error(`[AlertServer] WebSocket error: ${error.message}`);
    });

    this.isRunning = true;
    console.log(`[AlertServer] Listening on port ${this.config.port}`);
  }

  private async handleConnection(ws: WebSocket, req: import('http').IncomingMessage): Promise<void> {
    const url = new URL(req.url || '', `http://${req.headers.host}`);
    const sessionId = url.searchParams.get('sessionId') || `sub-${Date.now()}-${Math.random().toString(36).slice(2)}`;
    let userId = url.searchParams.get('userId') || undefined;
    const callId = url.searchParams.get('callId') || undefined;

    // Origin validation
    const origin = req.headers.origin;
    if (origin && !this.config.allowedOrigins.includes(origin)) {
      ws.close(1008, 'Origin not allowed');
      return;
    }

    // JWT Authentication (if enabled)
    if (this.config.enableAuth && this.config.jwtSecret) {
      const authHeader = req.headers.authorization;
      if (!authHeader || !authHeader.startsWith('Bearer ')) {
        ws.close(4001, 'Missing or invalid JWT token');
        return;
      }

      const token = authHeader.substring(7);
      // verifyJWT is async: without the await, the truthy Promise would let
      // invalid tokens through
      const decoded = await verifyJWT(token, this.config.jwtSecret);

      if (!decoded) {
        ws.close(4002, 'Invalid or expired JWT token');
        return;
      }

      // Extract user ID from the verified token payload if present
      userId = (decoded.payload?.sub as string | undefined) || userId;
    }

    if (this.subscribers.size >= this.config.maxSubscribers) {
      ws.close(1013, 'Too many subscribers');
      return;
    }

    const session: SubscriberSession = {
      ws,
      userId,
      callIds: callId ? new Set([callId]) : new Set(),
      lastAlertTime: new Map(),
      subscribedAt: Date.now(),
    };

    this.subscribers.set(sessionId, session);

    ws.send(JSON.stringify({
      type: 'connected',
      payload: { sessionId, userId, timestamp: Date.now() },
    }));

    ws.on('message', (data: Buffer | ArrayBuffer) => {
      this.handleMessage(sessionId, data);
    });

    ws.on('close', () => {
      this.subscribers.delete(sessionId);
      console.log(`[AlertServer] Subscriber disconnected: ${sessionId}`);
    });

    ws.on('error', (error: Error) => {
      console.error(`[AlertServer] Subscriber error (${sessionId}): ${error.message}`);
    });

    console.log(`[AlertServer] Subscriber connected: ${sessionId}${callId ? ` (call: ${callId})` : ''}`);
  }

  private handleMessage(sessionId: string, data: Buffer | ArrayBuffer): void {
    try {
      const message = JSON.parse(data.toString());
      const session = this.subscribers.get(sessionId);
      if (!session) return;

      switch (message.type) {
        case 'subscribe':
          if (message.callId) {
            session.callIds.add(message.callId);
          }
          break;

        case 'unsubscribe':
          if (message.callId) {
            session.callIds.delete(message.callId);
          }
          break;

        case 'ping':
          session.ws.send(JSON.stringify({ type: 'pong', timestamp: Date.now() }));
          break;
      }
    } catch (error) {
      console.error(`[AlertServer] Message parse error: ${(error as Error).message}`);
    }
  }

  bindAnalysisEngine(callId: string, engine: CallAnalysisEngine): void {
    this.analysisEngines.set(callId, engine);

    engine.on('result', (result: AnalysisResult) => {
      this.processAnalysisResult(callId, result);
    });

    engine.on('events', (events: CallEvent[]) => {
      events.forEach(event => {
        this.sendAlert({
          type: 'call_event',
          severity: event.severity as AlertSeverity,
          callId,
          title: this.formatEventType(event.type),
          message: this.formatEventMessage(event),
          data: { event, timestamp: event.timestamp },
          actionable: event.severity === 'high',
        });
      });
    });

    engine.on('anomalies', (anomalies: Anomaly[]) => {
      anomalies.forEach(anomaly => {
        this.sendAlert({
          type: 'anomaly',
          severity: anomaly.severity as AlertSeverity,
          callId,
          title: this.formatAnomalyType(anomaly.type),
          message: anomaly.description,
          data: {
            anomaly,
            confidence: anomaly.confidence,
            recommendation: anomaly.recommendation,
          },
          actionable: anomaly.severity === 'high' || anomaly.severity === 'critical',
        });
      });
    });

    console.log(`[AlertServer] Bound analysis engine for call: ${callId}`);
  }

  private processAnalysisResult(callId: string, result: AnalysisResult): void {
    if (result.callQuality.mosScore < 3.0) {
      this.sendAlert({
        type: 'quality_degraded',
        severity: result.callQuality.mosScore < 2.5 ? 'high' : 'medium',
        callId,
        title: 'Call Quality Degraded',
        message: `MOS score: ${result.callQuality.mosScore.toFixed(1)} (threshold: 3.0)`,
        data: result.callQuality as unknown as Record<string, unknown>,
        actionable: true,
      });
    }

    if (result.sentiment.sentiment === 'negative' && result.sentiment.confidence > 0.7) {
      this.sendAlert({
        type: 'sentiment_shift',
        severity: 'medium',
        callId,
        title: 'Negative Sentiment Detected',
        message: `Confidence: ${(result.sentiment.confidence * 100).toFixed(0)}%`,
        data: result.sentiment as unknown as Record<string, unknown>,
        actionable: false,
      });
    }
  }

  sendAlert(options: {
    type: AlertType;
    severity: AlertSeverity;
    callId?: string;
    title: string;
    message: string;
    data: Record<string, unknown>;
    actionable: boolean;
  }): void {
    const cooldownKey = `${options.callId}:${options.type}`;
    const now = Date.now();

    const sessionKeys = Array.from(this.subscribers.keys());
    for (const key of sessionKeys) {
      const session = this.subscribers.get(key);
      if (!session) continue;

      const lastTime = session.lastAlertTime.get(cooldownKey) || 0;
      if (now - lastTime < this.config.alertCooldownMs) continue;

      if (options.callId && session.callIds.size > 0 && !session.callIds.has(options.callId)) continue;

      const alert: AlertPayload = {
        id: `alert-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`,
        type: options.type,
        severity: options.severity,
        timestamp: now,
        callId: options.callId,
        title: options.title,
        message: options.message,
        data: options.data,
        actionable: options.actionable,
      };

      this.alertHistory.push(alert);
      if (this.alertHistory.length > this.maxAlertHistory) {
        this.alertHistory.shift();
      }

      if (session.ws.readyState === WebSocket.OPEN) {
        session.ws.send(JSON.stringify(alert));
      }
      session.lastAlertTime.set(cooldownKey, now);
    }
  }

  broadcastCallSummary(callId: string, summary: string): void {
    this.sendAlert({
      type: 'call_summary',
      severity: 'info',
      callId,
      title: 'Call Analysis Summary',
      message: summary,
      data: { summary },
      actionable: false,
    });
  }

  getAlertHistory(limit: number = 50, callId?: string): AlertPayload[] {
    let history = this.alertHistory;
    if (callId) {
      history = history.filter(a => a.callId === callId);
    }
    return history.slice(-limit);
  }

  getSubscriberCount(): number {
    return this.subscribers.size;
  }

  getActiveCalls(): string[] {
    return Array.from(this.analysisEngines.keys());
  }

  getEngine(callId: string): CallAnalysisEngine | undefined {
    return this.analysisEngines.get(callId);
  }

  async stop(): Promise<void> {
    this.isRunning = false;

    this.subscribers.forEach((session) => {
      if (session.ws.readyState === WebSocket.OPEN) {
        session.ws.send(JSON.stringify({
          type: 'server_shutdown',
          payload: { timestamp: Date.now() },
        }));
        session.ws.close(1001, 'Server shutting down');
      }
    });

    this.analysisEngines.forEach((engine) => {
      engine.destroy();
    });

    if (this.wss) {
      await new Promise<void>((resolve) => {
        this.wss!.close(() => resolve());
      });
      this.wss = null;
    }

    console.log('[AlertServer] Stopped');
  }

  private formatEventType(type: string): string {
    const labels: Record<string, string> = {
      interrupt: 'Speaker Interrupt',
      overlap: 'Speech Overlap',
      long_pause: 'Long Pause',
      volume_spike: 'Volume Spike',
      silence: 'Silence Detected',
      speaker_change: 'Speaker Change',
    };
    return labels[type] || type;
  }

  private formatEventMessage(event: CallEvent): string {
    const messages: Record<string, string> = {
      interrupt: `Interrupt detected (${event.duration}ms)`,
      overlap: `Speech overlap detected (${event.duration}ms)`,
      long_pause: `Pause duration: ${(event.duration / 1000).toFixed(1)}s`,
      volume_spike: `Volume spike: ${(event.metadata.level as number)?.toFixed(2) || 'unknown'}`,
      silence: `Silence detected for ${event.duration.toFixed(0)}ms`,
      speaker_change: 'Speaker change detected',
    };
    return messages[event.type] || 'Event detected';
  }

  private formatAnomalyType(type: string): string {
    const labels: Record<string, string> = {
      background_noise: 'Background Noise',
      echo: 'Echo Detected',
      distortion: 'Audio Distortion',
      dropouts: 'Audio Dropout',
      excessive_silence: 'Excessive Silence',
      volume_inconsistency: 'Volume Inconsistency',
    };
    return labels[type] || type;
  }
}

export default AlertServer;
@@ -1,259 +0,0 @@
/**
 * WebRTC Signaling Server
 * Reuses WebSocket infrastructure for WebRTC signaling
 * Handles peer connection negotiation and ICE candidate exchange
 */

import { WebSocketServer } from 'ws';
import { Peer } from 'peerjs';

type SignalingMessage = {
  type: 'offer' | 'answer' | 'ice-candidate' | 'disconnect';
  payload: RTCSessionDescriptionInit | RTCIceCandidate | { peerId: string };
  targetPeerId: string;
};

interface PeerConnection {
  peer: Peer;
  connections: Map<string, any>;
  iceCandidates: Map<string, RTCIceCandidate[]>;
}

export class WebRTCSignalingServer {
  private wss: WebSocketServer;
  private peers: Map<string, PeerConnection> = new Map();
  private port: number;

  constructor(port: number) {
    this.port = port;
    this.wss = new WebSocketServer({ port });
    this.initialize();
  }

  private initialize(): void {
    console.log(`WebRTC Signaling Server starting on port ${this.port}`);

    this.wss.on('connection', (ws: any, req) => {
      const url = new URL(req.url || '', `http://${req.headers.host}`);
      const peerId = url.searchParams.get('peerId') || `peer-${Date.now()}-${Math.random()}`;

      console.log(`WebRTC peer connected: ${peerId}`);

      const peerConnection: PeerConnection = {
        peer: new Peer(peerId, {
          host: 'localhost',
          port: this.port,
          path: '/webrtc',
        }),
        connections: new Map(),
        iceCandidates: new Map(),
      };

      this.peers.set(peerId, peerConnection);

      // Handle incoming signaling messages
      ws.on('message', (data: Buffer) => {
        try {
          const message: SignalingMessage = JSON.parse(data.toString());
          this.handleSignalingMessage(peerId, message, ws);
        } catch (error) {
          console.error('Error parsing signaling message:', error);
          ws.send(JSON.stringify({ type: 'error', payload: { message: 'Invalid message format' } }));
        }
      });

      // Handle disconnection
      ws.on('close', () => {
        console.log(`WebRTC peer disconnected: ${peerId}`);
        this.cleanupPeer(peerId);
      });

      // Send confirmation
      ws.send(JSON.stringify({
        type: 'connected',
        payload: { peerId }
      }));
    });

    console.log(`WebRTC Signaling Server started on port ${this.port}`);
  }

  private handleSignalingMessage(
    sourcePeerId: string,
    message: SignalingMessage,
    ws: any
  ): void {
    const { type, payload, targetPeerId } = message;
    const sourceConnection = this.peers.get(sourcePeerId);

    if (!sourceConnection) {
      console.warn(`Source peer not found: ${sourcePeerId}`);
      return;
    }

    switch (type) {
      case 'offer':
        this.handleOffer(sourcePeerId, targetPeerId, payload as RTCSessionDescriptionInit, ws);
        break;

      case 'answer':
        this.handleAnswer(sourcePeerId, targetPeerId, payload as RTCSessionDescriptionInit);
        break;

      case 'ice-candidate':
        this.handleIceCandidate(sourcePeerId, targetPeerId, payload as RTCIceCandidate);
        break;

      case 'disconnect':
        this.disconnectPeer(sourcePeerId, targetPeerId);
        break;
    }
  }

  private async handleOffer(
    sourcePeerId: string,
    targetPeerId: string,
    offer: RTCSessionDescriptionInit,
    ws: any
  ): Promise<void> {
    console.log(`Offer received from ${sourcePeerId} to ${targetPeerId}`);

    const targetConnection = this.peers.get(targetPeerId);
    if (!targetConnection) {
      console.warn(`Target peer not found: ${targetPeerId}`);
      ws.send(JSON.stringify({
        type: 'error',
        payload: { message: `Target peer ${targetPeerId} not found` },
      }));
      return;
    }

    // Store the connection
    if (!targetConnection.connections.has(sourcePeerId)) {
      const conn = targetConnection.peer.call(sourcePeerId, new MediaStream());
      targetConnection.connections.set(sourcePeerId, conn);

      // Handle connection events
      conn.on('stream', (stream: MediaStream) => {
        console.log(`Media stream received: ${targetPeerId} from ${sourcePeerId}`);
      });

      conn.on('close', () => {
        console.log(`Connection closed: ${targetPeerId} <-> ${sourcePeerId}`);
      });

      conn.on('error', (error: Error) => {
        console.error(`Connection error: ${targetPeerId} <-> ${sourcePeerId}`, error);
      });

      // Send accumulated ICE candidates
      const accumulatedCandidates = targetConnection.iceCandidates.get(sourcePeerId) || [];
      accumulatedCandidates.forEach(candidate => {
        conn.dataChannel.send(JSON.stringify({
          type: 'ice-candidate',
          payload: candidate,
          targetPeerId,
        }));
      });
    }

    // Forward offer to target peer
    const targetConn = targetConnection.connections.get(sourcePeerId);
    if (targetConn) {
      (targetConn as any).dataChannel.send(JSON.stringify({
        type: 'offer',
        payload: offer,
        targetPeerId: targetPeerId,
      }));
    }
  }

  private handleAnswer(
    sourcePeerId: string,
    targetPeerId: string,
    answer: RTCSessionDescriptionInit
  ): void {
    console.log(`Answer received from ${sourcePeerId} to ${targetPeerId}`);

    const targetConnection = this.peers.get(targetPeerId);
    if (targetConnection?.connections.has(sourcePeerId)) {
      targetConnection.connections.get(sourcePeerId).send(JSON.stringify({
        type: 'answer',
        payload: answer,
        targetPeerId: targetPeerId,
      }));
    }
  }

  private handleIceCandidate(
    sourcePeerId: string,
    targetPeerId: string,
    candidate: RTCIceCandidate
  ): void {
    const targetConnection = this.peers.get(targetPeerId);

    if (!targetConnection) {
      return;
    }

    // Forward ICE candidate to target peer
    if (targetConnection.connections.has(sourcePeerId)) {
      const conn = targetConnection.connections.get(sourcePeerId);
      if (conn) {
        (conn as any).send(JSON.stringify({
          type: 'ice-candidate',
          payload: candidate,
          targetPeerId: targetPeerId,
        }));
      }
    }
  }

  private disconnectPeer(sourcePeerId: string, targetPeerId: string): void {
    const sourceConnection = this.peers.get(sourcePeerId);
if (sourceConnection?.connections.has(targetPeerId)) {
|
|
||||||
sourceConnection.connections.get(targetPeerId).close();
|
|
||||||
sourceConnection.connections.delete(targetPeerId);
|
|
||||||
console.log(`Connection closed: ${sourcePeerId} <-> ${targetPeerId}`);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
private cleanupPeer(peerId: string): void {
|
|
||||||
const peerConnection = this.peers.get(peerId);
|
|
||||||
if (peerConnection) {
|
|
||||||
// Close all connections
|
|
||||||
peerConnection.connections.forEach((conn, connectedPeerId) => {
|
|
||||||
conn.close();
|
|
||||||
console.log(`Cleaned up connection: ${peerId} <-> ${connectedPeerId}`);
|
|
||||||
});
|
|
||||||
|
|
||||||
// Destroy PeerJS instance
|
|
||||||
peerConnection.peer.destroy();
|
|
||||||
|
|
||||||
// Remove from registry
|
|
||||||
this.peers.delete(peerId);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
getPeerCount(): number {
|
|
||||||
return this.peers.size;
|
|
||||||
}
|
|
||||||
|
|
||||||
getPeers(): string[] {
|
|
||||||
return Array.from(this.peers.keys());
|
|
||||||
}
|
|
||||||
|
|
||||||
getPeerConnections(peerId: string): string[] {
|
|
||||||
const peerConnection = this.peers.get(peerId);
|
|
||||||
if (!peerConnection) return [];
|
|
||||||
return Array.from(peerConnection.connections.keys());
|
|
||||||
}
|
|
||||||
|
|
||||||
close(): void {
|
|
||||||
this.peers.forEach((_, peerId) => this.cleanupPeer(peerId));
|
|
||||||
this.wss.close();
|
|
||||||
console.log('WebRTC Signaling Server closed');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
export default WebRTCSignalingServer;
|
|
||||||
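For reference, the message-routing pattern the removed signaling server implemented can be sketched without any WebRTC or PeerJS dependency: a registry keyed by peer ID, with forwarding that fails gracefully when the target is unknown (mirroring the "Target peer not found" error path above). `SignalingRegistry` and its method names are illustrative and not part of the removed code.

```python
class SignalingRegistry:
    """Minimal sketch of the peer registry behind the deleted signaling server."""

    def __init__(self):
        # peer_id -> {source_peer_id: list of queued signaling messages}
        self.peers = {}

    def register(self, peer_id):
        self.peers.setdefault(peer_id, {})

    def forward(self, source_id, target_id, message):
        # Queue a message for target_id; False if the target was never registered,
        # which is where the real server sent back a 'type: error' payload.
        if target_id not in self.peers:
            return False
        self.peers[target_id].setdefault(source_id, []).append(message)
        return True

    def disconnect(self, source_id, target_id):
        # Drop the source->target connection state, if any.
        self.peers.get(source_id, {}).pop(target_id, None)


registry = SignalingRegistry()
registry.register("alice")
registry.register("bob")
assert registry.forward("alice", "bob", {"type": "offer"}) is True
assert registry.forward("alice", "carol", {"type": "offer"}) is False  # unknown target
```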
@@ -1,15 +0,0 @@
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Create models directory
RUN mkdir -p models

EXPOSE 8001

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8001"]
@@ -1,172 +0,0 @@
"""
VoicePrint ML Service — ECAPA-TDNN inference microservice.

Provides endpoints for:
- Audio preprocessing (VAD, noise reduction, normalization)
- Voice embedding extraction using ECAPA-TDNN
- Synthetic voice detection

For MVP, uses a mock model. Replace with real ECAPA-TDNN model when available.
"""

from fastapi import FastAPI, File, UploadFile, HTTPException
from pydantic import BaseModel
from typing import Optional
import numpy as np
import io

app = FastAPI(
    title="VoicePrint ML Service",
    description="ECAPA-TDNN inference for voice cloning detection",
    version="0.1.0",
)

# Model configuration
MODEL_PATH = "./models/ecapa-tdnn"
EMBEDDING_DIMENSIONS = 192
SAMPLE_RATE = 16000
CHANNELS = 1


class EmbeddingResponse(BaseModel):
    embedding: list[float]
    duration: float
    sample_rate: int


class AnalysisResponse(BaseModel):
    is_synthetic: bool
    confidence: float
    detection_type: str
    features: dict[str, float]
    embedding: list[float]


class PreprocessRequest(BaseModel):
    sample_rate: int = SAMPLE_RATE
    channels: int = CHANNELS
    apply_vad: bool = True
    noise_reduction: bool = True


# Mock model — replace with real ECAPA-TDNN inference
class MockECAPATDNN:
    def __init__(self):
        self.dimensions = EMBEDDING_DIMENSIONS
        self.initialized = False

    def initialize(self):
        # TODO: Load real ECAPA-TDNN model
        # self.model = torch.load(MODEL_PATH)
        self.initialized = True

    def extract_embedding(self, audio_bytes: bytes) -> list[float]:
        if not self.initialized:
            self.initialize()

        # Mock: generate deterministic embedding based on audio content
        hash_val = sum(audio_bytes[:256]) & 0xFFFFFFFF
        embedding = []
        for i in range(self.dimensions):
            hash_val = ((hash_val << 5) - hash_val + i) & 0xFFFFFFFF
            embedding.append((hash_val % 1000) / 1000.0)

        # L2 normalize
        norm = np.sqrt(sum(v * v for v in embedding))
        return [v / norm for v in embedding]

    def analyze(self, audio_bytes: bytes) -> dict:
        embedding = self.extract_embedding(audio_bytes)

        # Mock: estimate synthetic confidence from audio statistics
        mean_amplitude = np.mean(np.frombuffer(audio_bytes[:1024], dtype=np.uint8)) / 255.0
        confidence = min(1.0, abs(mean_amplitude - 0.5) * 2 * 0.3 + np.random.random() * 0.7)

        detection_type = "synthetic_voice" if confidence >= 0.75 else "natural"

        return {
            "is_synthetic": confidence >= 0.75,
            "confidence": float(confidence),
            "detection_type": detection_type,
            "features": {
                "mean_amplitude": float(mean_amplitude),
                "embedding_energy": float(sum(v * v for v in embedding)),
            },
            "embedding": embedding,
        }
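The deleted mock's embedding scheme is easy to sanity-check in isolation. This standalone sketch reproduces the same rolling 32-bit hash with the stdlib only (no NumPy) and confirms the two properties the mock provides: determinism for identical input, and unit L2 norm.

```python
import math

def mock_embedding(audio_bytes: bytes, dims: int = 192) -> list[float]:
    # Same deterministic scheme as the deleted MockECAPATDNN:
    # a rolling 32-bit hash seeded from the first 256 bytes of audio.
    h = sum(audio_bytes[:256]) & 0xFFFFFFFF
    emb = []
    for i in range(dims):
        h = ((h << 5) - h + i) & 0xFFFFFFFF
        emb.append((h % 1000) / 1000.0)
    # L2 normalize, as the service does before returning the embedding
    norm = math.sqrt(sum(v * v for v in emb))
    return [v / norm for v in emb]

e1 = mock_embedding(b"\x01\x02" * 200)
e2 = mock_embedding(b"\x01\x02" * 200)
assert e1 == e2                                      # deterministic
assert abs(sum(v * v for v in e1) - 1.0) < 1e-9      # unit L2 norm
```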

model = MockECAPATDNN()


@app.get("/health")
async def health():
    return {
        "status": "ok",
        "model": "ecapa-tdnn-v1-mock",
        "initialized": model.initialized,
    }


@app.post("/initialize")
async def initialize():
    model.initialize()
    return {"status": "initialized", "model": "ecapa-tdnn-v1-mock"}


@app.post("/preprocess")
async def preprocess(audio: UploadFile = File(...)):
    """Preprocess audio: VAD, noise reduction, normalization to 16kHz mono."""
    audio_bytes = await audio.read()

    # TODO: Integrate with librosa/torchaudio for real preprocessing
    # audio_array, sr = librosa.load(io.BytesIO(audio_bytes), sr=SAMPLE_RATE, mono=CHANNELS)

    return {
        "status": "processed",
        "sample_rate": SAMPLE_RATE,
        "channels": CHANNELS,
        "duration": len(audio_bytes) / (SAMPLE_RATE * 2 * CHANNELS),
    }
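The `* 2` in the duration formula above is the bytes-per-sample of 16-bit PCM, which the service implicitly assumes throughout. A quick check that the byte minimums enforced by the /embed and /analyze endpoints correspond to 1 and 3 seconds of audio:

```python
SAMPLE_RATE = 16000
CHANNELS = 1
BYTES_PER_SAMPLE = 2  # 16-bit PCM, the assumption baked into the service

def pcm_duration(num_bytes: int) -> float:
    # Mirrors the service's formula: bytes / (rate * bytes_per_sample * channels)
    return num_bytes / (SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS)

assert pcm_duration(SAMPLE_RATE * 2) == 1.0      # /embed minimum: 1 second
assert pcm_duration(SAMPLE_RATE * 2 * 3) == 3.0  # /analyze minimum: 3 seconds
```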

@app.post("/embed", response_model=EmbeddingResponse)
async def extract_embedding(audio: UploadFile = File(...)):
    """Extract voice embedding using ECAPA-TDNN."""
    audio_bytes = await audio.read()

    if len(audio_bytes) < SAMPLE_RATE * 2:
        raise HTTPException(
            status_code=422,
            detail=f"Audio too short: minimum {SAMPLE_RATE * 2} bytes (1 second at 16kHz)",
        )

    embedding = model.extract_embedding(audio_bytes)
    duration = len(audio_bytes) / (SAMPLE_RATE * 2 * CHANNELS)

    return EmbeddingResponse(
        embedding=embedding,
        duration=duration,
        sample_rate=SAMPLE_RATE,
    )


@app.post("/analyze", response_model=AnalysisResponse)
async def analyze_audio(audio: UploadFile = File(...)):
    """Analyze audio for synthetic voice detection."""
    audio_bytes = await audio.read()

    if len(audio_bytes) < SAMPLE_RATE * 2 * 3:
        raise HTTPException(
            status_code=422,
            detail=f"Audio too short: minimum {SAMPLE_RATE * 2 * 3} bytes (3 seconds at 16kHz)",
        )

    result = model.analyze(audio_bytes)

    return AnalysisResponse(**result)


if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8001)
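The service only returns embeddings; comparing a probe against an enrolled voiceprint happened downstream (presumably why faiss-cpu was pinned in the service's requirements). A minimal cosine-similarity sketch with made-up 3-dimensional vectors and a hypothetical 0.95 match threshold, neither of which comes from the removed code:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Standard cosine similarity: dot product over the product of L2 norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

enrolled = [1.0, 0.0, 0.0]        # illustrative enrolled voiceprint
probe_same = [0.99, 0.01, 0.0]    # near-identical speaker
probe_other = [0.0, 1.0, 0.0]     # orthogonal, different speaker

MATCH_THRESHOLD = 0.95            # hypothetical tuning value
assert cosine_similarity(enrolled, probe_same) > MATCH_THRESHOLD
assert cosine_similarity(enrolled, probe_other) < 0.1
```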
@@ -1,8 +0,0 @@
fastapi==0.104.1
uvicorn==0.24.0
pydantic==2.5.0
numpy==1.26.0
librosa==0.10.0
torch==2.1.0
faiss-cpu==1.7.4
python-multipart==0.0.6
@@ -22,6 +22,6 @@
     },
     "types": ["vite-plugin-solid"]
   },
-  "include": ["src/**/*", "server/**/*"],
+  "include": ["agents/**/*"],
   "exclude": ["node_modules", "dist"]
 }
@@ -6,9 +6,9 @@ export default defineConfig({
   plugins: [solid()],
   resolve: {
     alias: {
-      '@lib': resolve(__dirname, './src/lib'),
-      '@components': resolve(__dirname, './src/components'),
-      '@types': resolve(__dirname, './src/types'),
+      '@agents': resolve(__dirname, './agents'),
+      '@analysis': resolve(__dirname, './analysis'),
+      '@plans': resolve(__dirname, './plans'),
     },
   },
   build: {