diff --git a/.claude/commands/setup-security-tools.md b/.claude/commands/setup-security-tools.md new file mode 100644 index 000000000..6462f04fa --- /dev/null +++ b/.claude/commands/setup-security-tools.md @@ -0,0 +1,38 @@ +Set up all Socket security tools for local development. + +## What this sets up + +1. **AgentShield** — scans Claude config for prompt injection and secrets +2. **Zizmor** — static analysis for GitHub Actions workflows +3. **SFW (Socket Firewall)** — intercepts package manager commands to scan for malware + +## Setup + +First, ask the user if they have a Socket API key for SFW enterprise features. + +If they do: +1. Ask them to provide it +2. Write it to `.env.local` as `SOCKET_API_KEY=` (create if needed) +3. Verify `.env.local` is in `.gitignore` — if not, add it and warn + +If they don't, proceed with SFW free mode. + +Then run: +```bash +node .claude/hooks/setup-security-tools/index.mts +``` + +After the script completes, add the SFW shim directory to PATH: +```bash +export PATH="$HOME/.socket/sfw/shims:$PATH" +``` + +## Notes + +- Safe to re-run (idempotent) +- AgentShield needs `pnpm install` (it's a devDep) +- Zizmor is cached at `~/.socket/zizmor/bin/` +- SFW binary is cached via dlx at `~/.socket/_dlx/` +- SFW shims are shared across repos at `~/.socket/sfw/shims/` +- `.env.local` must NEVER be committed +- `/update` will check for new versions of these tools via `node .claude/hooks/setup-security-tools/update.mts` diff --git a/.claude/hooks/check-new-deps/README.md b/.claude/hooks/check-new-deps/README.md new file mode 100644 index 000000000..5be7f3a68 --- /dev/null +++ b/.claude/hooks/check-new-deps/README.md @@ -0,0 +1,102 @@ +# check-new-deps Hook + +A Claude Code pre-tool hook that checks new dependencies against [Socket.dev](https://socket.dev) before they're added to the project. It runs automatically every time Claude tries to edit or create a dependency manifest file. 
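To make the diff-aware behavior concrete, here is a minimal TypeScript sketch of the core idea (illustrative only — it uses bare package names instead of the PURLs the real hook builds, and makes no Socket.dev API call):

```typescript
// Minimal sketch of the hook's diff-aware check: only dependencies
// that appear in the new content but NOT in the old content are
// candidates for a Socket.dev lookup.
function newlyAdded(oldDeps: string[], newDeps: string[]): string[] {
  const existing = new Set(oldDeps)
  // Pre-existing deps are filtered out; only genuinely new ones remain.
  return newDeps.filter(dep => !existing.has(dep))
}

// An edit to a manifest that already had "express" and adds "left-pad":
console.log(newlyAdded(['express'], ['express', 'left-pad']))
// → [ 'left-pad' ]  (only the new dep would be checked)
```

Because deps present in `old_string` are filtered out, an Edit that only touches unrelated parts of a manifest never re-checks dependencies that were already there.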
+
+## What it does
+
+When Claude edits a manifest file from any of 17+ supported ecosystems — `package.json`, `requirements.txt`, `Cargo.toml`, and so on — this hook:
+
+1. **Detects the file type** and extracts dependency names from the content
+2. **Diffs against the old content** (for edits) so only *newly added* deps are checked
+3. **Queries the Socket.dev API** to check for malware and critical security alerts
+4. **Blocks the edit** (exit code 2) if malware or critical alerts are found
+5. **Warns** (but allows) if a package has a low quality score
+6. **Allows** (exit code 0) if everything is clean or the file isn't a manifest
+
+## How it works
+
+```
+Claude wants to edit package.json
+        │
+        ▼
+Hook receives the edit via stdin (JSON)
+        │
+        ▼
+Extract new deps from new_string
+Diff against old_string (if Edit)
+        │
+        ▼
+Build Package URLs (PURLs) for each dep
+        │
+        ▼
+Call sdk.checkMalware(components)
+  - ≤5 deps: parallel firewall API (fast, full data)
+  - >5 deps: batch PURL API (efficient)
+        │
+        ├── Malware/critical alert → EXIT 2 (blocked)
+        ├── Low score → warn, EXIT 0 (allowed)
+        └── Clean → EXIT 0 (allowed)
+```
+
+## Supported ecosystems
+
+| File | Ecosystem | Example dep format |
+|------|-----------|-------------------|
+| `package.json` | npm | `"express": "^4.19"` |
+| `package-lock.json`, `pnpm-lock.yaml`, `yarn.lock` | npm | lockfile entries |
+| `requirements.txt`, `pyproject.toml`, `setup.py` | PyPI | `flask>=3.0` |
+| `Cargo.toml`, `Cargo.lock` | Cargo (Rust) | `serde = "1.0"` |
+| `go.mod`, `go.sum` | Go | `github.com/gin-gonic/gin v1.9` |
+| `Gemfile`, `Gemfile.lock` | RubyGems | `gem 'rails'` |
+| `composer.json`, `composer.lock` | Composer (PHP) | `"vendor/package": "^3.0"` |
+| `pom.xml`, `build.gradle` | Maven (Java) | `<artifactId>commons</artifactId>` |
+| `pubspec.yaml`, `pubspec.lock` | Pub (Dart) | `flutter_bloc: ^8.1` |
+| `.csproj` | NuGet (.NET) | `<PackageReference Include="..."/>` |
+| `mix.exs` | Hex (Elixir) | `{:phoenix, "~> 1.7"}` |
+| `Package.swift` | Swift PM | `.package(url: "...", from: 
"4.0")` | +| `*.tf` | Terraform | `source = "hashicorp/aws"` | +| `Brewfile` | Homebrew | `brew "git"` | +| `conanfile.*` | Conan (C/C++) | `boost/1.83.0` | +| `flake.nix` | Nix | `github:owner/repo` | +| `.github/workflows/*.yml` | GitHub Actions | `uses: owner/repo@ref` | + +## Configuration + +The hook is registered in `.claude/settings.json`: + +```json +{ + "hooks": { + "PreToolUse": [ + { + "matcher": "Edit|Write", + "hooks": [ + { + "type": "command", + "command": "node .claude/hooks/check-new-deps/index.mts" + } + ] + } + ] + } +} +``` + +## Dependencies + +All dependencies use `catalog:` references from the workspace root (`pnpm-workspace.yaml`): + +- `@socketsecurity/sdk` — Socket.dev SDK v4 with `checkMalware()` API +- `@socketsecurity/lib` — shared constants and path utilities +- `@socketregistry/packageurl-js` — Package URL (PURL) parsing and stringification + +## Caching + +API responses are cached in-memory for 5 minutes (max 500 entries) to avoid redundant network calls when Claude checks the same dependency multiple times in a session. + +## Exit codes + +| Code | Meaning | Claude behavior | +|------|---------|----------------| +| 0 | Allow | Edit/Write proceeds normally | +| 2 | Block | Edit/Write is rejected, Claude sees the error message | diff --git a/.claude/hooks/check-new-deps/index.mts b/.claude/hooks/check-new-deps/index.mts new file mode 100644 index 000000000..d84d95d45 --- /dev/null +++ b/.claude/hooks/check-new-deps/index.mts @@ -0,0 +1,705 @@ +#!/usr/bin/env node +// Claude Code PreToolUse hook — Socket.dev dependency firewall. +// +// Intercepts Edit/Write tool calls to dependency manifest files across +// 17+ package ecosystems. Extracts newly-added dependencies, builds +// Package URLs (PURLs), and checks them against the Socket.dev API +// using the SDK v4 checkMalware() method. +// +// Diff-aware: when old_string is present (Edit), only deps that +// appear in new_string but NOT in old_string are checked. 
+// +// Caching: API responses are cached in-process with a TTL to avoid +// redundant network calls when the same dep is checked repeatedly. +// The cache auto-evicts expired entries and caps at MAX_CACHE_SIZE. +// +// Exit codes: +// 0 = allow (no new deps, all clean, or non-dep file) +// 2 = block (malware or critical alert from Socket.dev) + +import { + parseNpmSpecifier, + stringify, +} from '@socketregistry/packageurl-js' +import type { PackageURL } from '@socketregistry/packageurl-js' +import { + SOCKET_PUBLIC_API_TOKEN, +} from '@socketsecurity/lib/constants/socket' +import { getDefaultLogger } from '@socketsecurity/lib/logger' +import { + normalizePath, +} from '@socketsecurity/lib/paths/normalize' +import { SocketSdk } from '@socketsecurity/sdk' +import type { MalwareCheckPackage } from '@socketsecurity/sdk' + +const logger = getDefaultLogger() + +// Per-request timeout (ms) to avoid blocking the hook on slow responses. +const API_TIMEOUT = 5_000 +// Deps scoring below this threshold trigger a warning (not a block). +const LOW_SCORE_THRESHOLD = 0.5 +// Max PURLs per batch request (API limit is 1024). +const MAX_BATCH_SIZE = 1024 +// How long (ms) to cache a successful API response (5 minutes). +const CACHE_TTL = 5 * 60 * 1_000 +// Maximum cache entries before forced eviction of oldest. +const MAX_CACHE_SIZE = 500 + +// SDK instance using the public API token (no user config needed). +const sdk = new SocketSdk(SOCKET_PUBLIC_API_TOKEN, { + timeout: API_TIMEOUT, +}) + +// --- types --- + +// Extracted dependency with ecosystem type, name, and optional scope. +interface Dep { + type: string + name: string + namespace?: string + version?: string +} + +// Shape of the JSON blob Claude Code pipes to the hook via stdin. +interface HookInput { + tool_name: string + tool_input?: { + file_path?: string + new_string?: string + old_string?: string + content?: string + } +} + +// Result of checking a single dep against the Socket.dev API. 
+interface CheckResult {
+  purl: string
+  blocked?: boolean
+  warned?: boolean
+  reason?: string
+  score?: number
+}
+
+// A cached API lookup result with expiration timestamp.
+interface CacheEntry {
+  result: CheckResult | undefined
+  expiresAt: number
+}
+
+// Function that extracts deps from file content.
+type Extractor = (content: string) => Dep[]
+
+// --- cache ---
+
+// Simple TTL + max-size cache for API responses.
+// Prevents redundant network calls when the same dep is checked
+// multiple times in a session. Evicts expired entries on every
+// get/set, and drops oldest entries if the cache exceeds MAX_CACHE_SIZE.
+const cache = new Map<string, CacheEntry>()
+
+function cacheGet(key: string): CacheEntry | undefined {
+  const entry = cache.get(key)
+  if (!entry) return
+  if (Date.now() > entry.expiresAt) {
+    cache.delete(key)
+    return
+  }
+  return entry
+}
+
+function cacheSet(
+  key: string,
+  result: CheckResult | undefined,
+): void {
+  // Evict expired entries before inserting.
+  if (cache.size >= MAX_CACHE_SIZE) {
+    const now = Date.now()
+    for (const [k, v] of cache) {
+      if (now > v.expiresAt) cache.delete(k)
+    }
+  }
+  // If still over capacity, drop the oldest entries (FIFO).
+  if (cache.size >= MAX_CACHE_SIZE) {
+    const excess = cache.size - MAX_CACHE_SIZE + 1
+    let dropped = 0
+    for (const k of cache.keys()) {
+      if (dropped >= excess) break
+      cache.delete(k)
+      dropped++
+    }
+  }
+  cache.set(key, {
+    result,
+    expiresAt: Date.now() + CACHE_TTL,
+  })
+}
+
+// Manifest file suffix → extractor function.
+// __proto__: null prevents prototype-pollution on lookups.
+const extractors: Record<string, Extractor> = {
+  __proto__: null as unknown as Extractor,
+  '.csproj': extract(
+    // .NET: <PackageReference Include="Package.Name" Version="1.0" />
+    /PackageReference\s+Include="([^"]+)"/g,
+    (m): Dep => ({ type: 'nuget', name: m[1] })
+  ),
+  '.tf': extractTerraform,
+  'brew': extractBrewfile,
+  'Brewfile': extractBrewfile,
+  'build.gradle': extractMaven,
+  'build.gradle.kts': extractMaven,
+  'Cargo.lock': extract(
+    // Rust lockfile: [[package]]\nname = "serde"\nversion = "1.0.0"
+    /name\s*=\s*"([\w][\w-]*)"/gm,
+    (m): Dep => ({ type: 'cargo', name: m[1] })
+  ),
+  'Cargo.toml': extract(
+    // Rust: serde = "1.0" or serde = { version = "1.0", features = [...] }
+    /^(\w[\w-]*)\s*=\s*(?:\{[^}]*version\s*=\s*"[^"]*"|\s*"[^"]*")/gm,
+    (m): Dep => ({ type: 'cargo', name: m[1] })
+  ),
+  'conanfile.py': extractConan,
+  'conanfile.txt': extractConan,
+  'composer.lock': extract(
+    // PHP lockfile: "name": "vendor/package"
+    /"name":\s*"([a-z][\w-]*)\/([a-z][\w-]*)"/g,
+    (m): Dep => ({
+      type: 'composer',
+      namespace: m[1],
+      name: m[2],
+    })
+  ),
+  'composer.json': extract(
+    // PHP: "vendor/package": "^3.0"
+    /"([a-z][\w-]*)\/([a-z][\w-]*)":\s*"/g,
+    (m): Dep => ({
+      type: 'composer',
+      namespace: m[1],
+      name: m[2],
+    })
+  ),
+  'flake.nix': extractNixFlake,
+  'Gemfile.lock': extract(
+    // Ruby lockfile: indented gem names under GEM > specs
+    /^\s{4}(\w[\w-]*)\s+\(/gm,
+    (m): Dep => ({ type: 'gem', name: m[1] })
+  ),
+  'Gemfile': extract(
+    // Ruby: gem 'rails', '~> 7.0'
+    /gem\s+['"]([^'"]+)['"]/g,
+    (m): Dep => ({ type: 'gem', name: m[1] })
+  ),
+  'go.sum': extract(
+    // Go checksum file: module/path v1.2.3 h1:hash=
+    /([\w./-]+)\s+v[\d.]+/gm,
+    (m): Dep => {
+      const parts = m[1].split('/')
+      return {
+        type: 'golang',
+        name: parts.pop()!,
+        namespace: parts.join('/') || undefined,
+      }
+    }
+  ),
+  'go.mod': extract(
+    // Go: github.com/gin-gonic/gin v1.9.1
+    /([\w./-]+)\s+v[\d.]+/gm,
+    (m): Dep => {
+      const parts = m[1].split('/')
+      return {
+        type: 'golang',
+        name: parts.pop()!,
+        namespace: parts.join('/') ||
undefined, + } + } + ), + 'mix.exs': extract( + // Elixir: {:phoenix, "~> 1.7"} + /\{:(\w+),/g, + (m): Dep => ({ type: 'hex', name: m[1] }) + ), + 'package-lock.json': extractNpmLockfile, + 'package.json': extractNpm, + 'Package.swift': extract( + // Swift: .package(url: "https://github.com/vapor/vapor", from: "4.0.0") + /\.package\s*\(\s*url:\s*"https:\/\/github\.com\/([^/]+)\/([^"]+)".*?from:\s*"([^"]+)"/gs, + (m): Dep => ({ + type: 'swift', + namespace: `github.com/${m[1]}`, + name: m[2], + version: m[3], + }) + ), + 'Pipfile.lock': extractPypi, + 'pnpm-lock.yaml': extractNpmLockfile, + 'poetry.lock': extract( + // Python poetry lockfile: [[package]]\nname = "flask" + /name\s*=\s*"([a-zA-Z][\w.-]*)"/gm, + (m): Dep => ({ type: 'pypi', name: m[1] }) + ), + 'pom.xml': extractMaven, + 'Project.toml': extract( + // Julia: JSON3 = "uuid-string" + /^(\w[\w.-]*)\s*=\s*"/gm, + (m): Dep => ({ type: 'julia', name: m[1] }) + ), + 'pubspec.lock': extract( + // Dart lockfile: top-level package names at column 2 + /^ (\w[\w_-]*):/gm, + (m): Dep => ({ type: 'pub', name: m[1] }) + ), + 'pubspec.yaml': extract( + // Dart: flutter_bloc: ^8.1.3 (2-space indented under dependencies:) + /^\s{2}(\w[\w_-]*):\s/gm, + (m): Dep => ({ type: 'pub', name: m[1] }) + ), + 'pyproject.toml': extractPypi, + 'requirements.txt': extractPypi, + 'setup.py': extractPypi, + 'yarn.lock': extractNpmLockfile, +} + +// --- main (only when executed directly, not imported) --- + +if (import.meta.filename === process.argv[1]) { + // Read the full JSON blob from stdin (piped by Claude Code). + let input = '' + for await (const chunk of process.stdin) input += chunk + const hook: HookInput = JSON.parse(input) + + if (hook.tool_name !== 'Edit' && hook.tool_name !== 'Write') { + process.exitCode = 0 + } else { + process.exitCode = await check(hook) + } +} + +// --- core --- + +// Orchestrates the full check: extract deps, diff against old, query API. 
+async function check(hook: HookInput): Promise<number> {
+  // Normalize backslashes and collapse segments for cross-platform paths.
+  const filePath = normalizePath(
+    hook.tool_input?.file_path || ''
+  )
+
+  // GitHub Actions workflows live under .github/workflows/*.yml
+  const isWorkflow =
+    /\.github\/workflows\/.*\.ya?ml$/.test(filePath)
+  const extractor = isWorkflow
+    ? extractGitHubActions
+    : findExtractor(filePath)
+  if (!extractor) return 0
+
+  // Edit provides new_string; Write provides content.
+  const newContent =
+    hook.tool_input?.new_string
+    || hook.tool_input?.content
+    || ''
+  const oldContent = hook.tool_input?.old_string || ''
+
+  const newDeps = extractor(newContent)
+  if (newDeps.length === 0) return 0
+
+  // Diff-aware: only check deps added in this edit, not pre-existing.
+  const deps = oldContent
+    ? diffDeps(newDeps, extractor(oldContent))
+    : newDeps
+  if (deps.length === 0) return 0
+
+  // Check all deps via SDK checkMalware().
+  const { blocked, warned } = await checkDepsBatch(deps)
+
+  if (warned.length > 0) {
+    logger.warn('Socket: low-scoring dependencies (not blocked):')
+    for (const w of warned) {
+      logger.warn(`  ${w.purl}: overall score ${w.score}`)
+    }
+  }
+  if (blocked.length > 0) {
+    logger.error(`Socket: blocked ${blocked.length} dep(s):`)
+    for (const b of blocked) {
+      logger.error(`  ${b.purl}: ${b.reason}`)
+    }
+    return 2
+  }
+  return 0
+}
+
+// Check deps against Socket.dev using SDK v4 checkMalware().
+// The SDK automatically routes small sets (<=5) to parallel firewall
+// requests and larger sets to the batch PURL API.
+// Deps already in cache are skipped; results are cached after lookup.
+async function checkDepsBatch(
+  deps: Dep[],
+): Promise<{ blocked: CheckResult[]; warned: CheckResult[] }> {
+  const blocked: CheckResult[] = []
+  const warned: CheckResult[] = []
+
+  // Partition deps into cached vs uncached.
+  const uncached: Array<{ dep: Dep; purl: string }> = []
+  for (const dep of deps) {
+    const purl = stringify(dep as unknown as PackageURL)
+    const cached = cacheGet(purl)
+    if (cached) {
+      if (cached.result?.blocked) blocked.push(cached.result)
+      else if (cached.result?.warned) warned.push(cached.result)
+      continue
+    }
+    uncached.push({ dep, purl })
+  }
+
+  if (!uncached.length) return { blocked, warned }
+
+  try {
+    // Process in chunks to respect API batch size limit.
+    for (let i = 0; i < uncached.length; i += MAX_BATCH_SIZE) {
+      const batch = uncached.slice(i, i + MAX_BATCH_SIZE)
+      const components = batch.map(({ purl }) => ({ purl }))
+
+      const result = await sdk.checkMalware(components)
+
+      if (!result.success) {
+        logger.warn(
+          `Socket: API returned ${result.status}, allowing all`
+        )
+        return { blocked, warned }
+      }
+
+      // Build lookup keyed by full PURL (includes namespace + version).
+      const purlByKey = new Map<string, string>()
+      for (const { dep, purl } of batch) {
+        const ns = dep.namespace ? `${dep.namespace}/` : ''
+        purlByKey.set(`${dep.type}:${ns}${dep.name}`, purl)
+      }
+
+      for (const pkg of result.data as MalwareCheckPackage[]) {
+        const ns = pkg.namespace ? `${pkg.namespace}/` : ''
+        const key = `${pkg.type}:${ns}${pkg.name}`
+        const purl = purlByKey.get(key)
+        if (!purl) continue
+
+        // Check for malware or critical-severity alerts.
+        const critical = pkg.alerts.find(
+          a => a.severity === 'critical' || a.type === 'malware'
+        )
+        if (critical) {
+          const cr: CheckResult = {
+            purl,
+            blocked: true,
+            reason: `${critical.type} — ${critical.severity ?? 'critical'}`,
+          }
+          cacheSet(purl, cr)
+          blocked.push(cr)
+          continue
+        }
+
+        // Warn on low quality score.
+        if (
+          pkg.score?.overall !== undefined
+          && pkg.score.overall < LOW_SCORE_THRESHOLD
+        ) {
+          const wr: CheckResult = {
+            purl,
+            warned: true,
+            score: pkg.score.overall,
+          }
+          cacheSet(purl, wr)
+          warned.push(wr)
+          continue
+        }
+
+        // No blocking alerts — clean dep.
+ cacheSet(purl, undefined) + } + } + } catch (e) { + // Network failure — log and allow all deps through. + logger.warn( + `Socket: network error` + + ` (${(e as Error).message}), allowing all` + ) + } + + return { blocked, warned } +} + +// Return deps in `newDeps` that don't appear in `oldDeps` (by PURL). +function diffDeps(newDeps: Dep[], oldDeps: Dep[]): Dep[] { + const old = new Set( + oldDeps.map(d => stringify(d as unknown as PackageURL)) + ) + return newDeps.filter( + d => !old.has(stringify(d as unknown as PackageURL)) + ) +} + +// Match file path suffix against the extractors map. +function findExtractor( + filePath: string, +): Extractor | undefined { + for (const [suffix, fn] of Object.entries(extractors)) { + if (filePath.endsWith(suffix)) return fn + } +} + +// --- extractor factory --- + +// Higher-order function: takes a regex and a match→Dep transform, +// returns an Extractor that applies matchAll and collects results. +function extract( + re: RegExp, + transform: (m: RegExpExecArray) => Dep | undefined, +): Extractor { + return (content: string): Dep[] => { + const deps: Dep[] = [] + for (const m of content.matchAll(re)) { + const dep = transform(m as RegExpExecArray) + if (dep) deps.push(dep) + } + return deps + } +} + +// --- ecosystem extractors (alphabetic) --- + +// Homebrew (Brewfile): brew "package" or tap "owner/repo". +function extractBrewfile(content: string): Dep[] { + const deps: Dep[] = [] + // brew "git", cask "firefox", tap "homebrew/cask" + for (const m of content.matchAll( + /(?:brew|cask)\s+['"]([^'"]+)['"]/g + )) { + deps.push({ type: 'brew', name: m[1] }) + } + return deps +} + +// Conan (C/C++): "boost/1.83.0" in conanfile.txt, +// or requires = "zlib/1.3.0" in conanfile.py. 
+function extractConan(content: string): Dep[] {
+  const deps: Dep[] = []
+  for (const m of content.matchAll(
+    /([a-z][\w.-]+)\/[\d.]+/gm
+  )) {
+    deps.push({ type: 'conan', name: m[1] })
+  }
+  return deps
+}
+
+// GitHub Actions: "uses: owner/repo@ref" in workflow YAML.
+// Handles subpaths like "org/repo/subpath@v1".
+function extractGitHubActions(content: string): Dep[] {
+  const deps: Dep[] = []
+  for (const m of content.matchAll(
+    /uses:\s*['"]?([^@\s'"]+)@([^\s'"]+)/g
+  )) {
+    const parts = m[1].split('/')
+    if (parts.length >= 2) {
+      deps.push({
+        type: 'github',
+        namespace: parts[0],
+        name: parts.slice(1).join('/'),
+      })
+    }
+  }
+  return deps
+}
+
+// Maven/Gradle (Java/Kotlin):
+//   pom.xml: <groupId>org.apache</groupId><artifactId>commons</artifactId>
+//   build.gradle(.kts): implementation 'group:artifact:version'
+function extractMaven(content: string): Dep[] {
+  const deps: Dep[] = []
+  // XML-style Maven POM declarations.
+  for (const m of content.matchAll(
+    /<groupId>([^<]+)<\/groupId>\s*<artifactId>([^<]+)<\/artifactId>/g
+  )) {
+    deps.push({
+      type: 'maven',
+      namespace: m[1],
+      name: m[2],
+    })
+  }
+  // Gradle shorthand: implementation/api/compile 'group:artifact:ver'
+  for (const m of content.matchAll(
+    /(?:implementation|api|compile)\s+['"]([^:'"]+):([^:'"]+)(?::[^'"]*)?['"]/g
+  )) {
+    deps.push({
+      type: 'maven',
+      namespace: m[1],
+      name: m[2],
+    })
+  }
+  return deps
+}
+
+// Convenience entry point for testing: route any file path
+// through the correct extractor and return all deps found.
+function extractNewDeps(
+  rawFilePath: string,
+  content: string,
+): Dep[] {
+  // Normalize backslashes and collapse segments for cross-platform.
+  const filePath = normalizePath(rawFilePath)
+  const isWorkflow =
+    /\.github\/workflows\/.*\.ya?ml$/.test(filePath)
+  const extractor = isWorkflow
+    ? extractGitHubActions
+    : findExtractor(filePath)
+  return extractor ?
extractor(content) : []
+}
+
+// Nix flakes (flake.nix): inputs.name.url = "github:owner/repo"
+// or inputs.name = { url = "github:owner/repo"; };
+function extractNixFlake(content: string): Dep[] {
+  const deps: Dep[] = []
+  // Match github:owner/repo patterns in flake inputs.
+  for (const m of content.matchAll(
+    /github:([^/\s"]+)\/([^/\s"]+)/g
+  )) {
+    deps.push({
+      type: 'github',
+      namespace: m[1],
+      name: m[2].replace(/\/.*$/, ''),
+    })
+  }
+  return deps
+}
+
+// npm lockfiles (package-lock.json, pnpm-lock.yaml, yarn.lock):
+// Each format references packages differently:
+//   package-lock.json: "node_modules/@scope/name" or "node_modules/name"
+//   pnpm-lock.yaml: /@scope/name@version or /name@version
+//   yarn.lock: "@scope/name@version" or name@version
+function extractNpmLockfile(content: string): Dep[] {
+  const deps: Dep[] = []
+  const seen = new Set<string>()
+
+  // package-lock.json: "node_modules/name" or "node_modules/@scope/name"
+  for (const m of content.matchAll(
+    /node_modules\/((?:@[\w.-]+\/)?[\w][\w.-]*)/g
+  )) {
+    addNpmDep(m[1], deps, seen)
+  }
+  // pnpm-lock.yaml: '/name@ver' or '/@scope/name@ver'
+  // yarn.lock: "name@ver" or "@scope/name@ver"
+  for (const m of content.matchAll(
+    /['"/]((?:@[\w.-]+\/)?[\w][\w.-]*)@/gm
+  )) {
+    addNpmDep(m[1], deps, seen)
+  }
+  return deps
+}
+
+// Deduplicated npm dep insertion using parseNpmSpecifier.
+function addNpmDep(
+  raw: string,
+  deps: Dep[],
+  seen: Set<string>,
+): void {
+  if (seen.has(raw)) return
+  seen.add(raw)
+  if (raw.startsWith('.') || raw.startsWith('/')) return
+  if (raw.startsWith('@') || /^[a-z]/.test(raw)) {
+    const { namespace, name } = parseNpmSpecifier(raw)
+    if (name) deps.push({ type: 'npm', namespace, name })
+  }
+}
+
+// npm (package.json): "name": "version" or "@scope/name": "ver".
+// Only matches entries where the value looks like a version/range/specifier,
+// not arbitrary string values like scripts or config.
+function extractNpm(content: string): Dep[] {
+  const deps: Dep[] = []
+  for (const m of content.matchAll(
+    /"(@?[^"]+)":\s*"([^"]*)"/g
+  )) {
+    const raw = m[1]
+    const val = m[2]
+    // Skip builtins, relative, and absolute paths.
+    if (
+      raw.startsWith('node:')
+      || raw.startsWith('.')
+      || raw.startsWith('/')
+    ) continue
+    // Value must look like a version specifier: semver, range, workspace:,
+    // catalog:, npm:, *, latest, or starts with ^~><=.
+    if (!/^[\^~><=*]|^\d|^workspace:|^catalog:|^npm:|^latest$/.test(val)) continue
+    // Only lowercase or scoped names are real deps.
+    if (raw.startsWith('@') || /^[a-z]/.test(raw)) {
+      const { namespace, name } = parseNpmSpecifier(raw)
+      if (name) deps.push({ type: 'npm', namespace, name })
+    }
+  }
+  return deps
+}
+
+// PyPI (requirements.txt, pyproject.toml, setup.py, Pipfile.lock):
+//   requirements.txt: package>=1.0 or package==1.0 at line start
+//   pyproject.toml: "package>=1.0" in dependencies arrays
+//   setup.py: "package>=1.0" in install_requires lists
+function extractPypi(content: string): Dep[] {
+  const deps: Dep[] = []
+  const seen = new Set<string>()
+  // requirements.txt style: package name at line start, followed by
+  // version specifier, extras bracket, or end of line.
+ for (const m of content.matchAll( + /^([a-zA-Z][\w.-]+)\s*(?:[>===18.20.4", + "pnpm": ">=10.25.0" + } + }, + "../../../node_modules/.pnpm/@socketsecurity+lib@5.15.0_typescript@5.9.3/node_modules/@socketsecurity/lib": { + "version": "5.15.0", + "license": "MIT", + "devDependencies": { + "@anthropic-ai/claude-code": "2.1.92", + "@babel/core": "7.28.4", + "@babel/parser": "7.28.4", + "@babel/traverse": "7.28.4", + "@babel/types": "7.28.4", + "@dotenvx/dotenvx": "1.49.0", + "@inquirer/checkbox": "4.3.1", + "@inquirer/confirm": "5.1.16", + "@inquirer/input": "4.2.2", + "@inquirer/password": "4.0.18", + "@inquirer/search": "3.1.1", + "@inquirer/select": "4.3.2", + "@npmcli/arborist": "9.1.4", + "@npmcli/package-json": "7.0.0", + "@npmcli/promise-spawn": "8.0.3", + "@socketregistry/is-unicode-supported": "1.0.5", + "@socketregistry/packageurl-js": "1.4.1", + "@socketregistry/yocto-spinner": "1.0.25", + "@socketsecurity/lib-stable": "npm:@socketsecurity/lib@5.14.0", + "@types/node": "24.9.2", + "@typescript/native-preview": "7.0.0-dev.20250920.1", + "@vitest/coverage-v8": "4.0.3", + "@vitest/ui": "4.0.3", + "@yarnpkg/core": "4.5.0", + "@yarnpkg/extensions": "2.0.6", + "adm-zip": "0.5.16", + "cacache": "20.0.1", + "debug": "4.4.3", + "del": "8.0.1", + "del-cli": "6.0.0", + "esbuild": "0.25.11", + "eslint-plugin-sort-destructure-keys": "2.0.0", + "fast-glob": "3.3.3", + "fast-sort": "3.4.1", + "get-east-asian-width": "1.3.0", + "globals": "16.4.0", + "has-flag": "5.0.1", + "husky": "9.1.7", + "libnpmexec": "10.2.3", + "libnpmpack": "9.0.9", + "lint-staged": "15.2.11", + "magic-string": "0.30.17", + "make-fetch-happen": "15.0.2", + "nock": "14.0.10", + "normalize-package-data": "8.0.0", + "npm-package-arg": "13.0.0", + "oxfmt": "^0.37.0", + "oxlint": "1.53.0", + "p-map": "7.0.4", + "pacote": "21.0.1", + "picomatch": "4.0.4", + "pony-cause": "2.1.11", + "semver": "7.7.2", + "signal-exit": "4.1.0", + "spdx-correct": "3.2.0", + "spdx-expression-parse": "4.0.0", + 
"streaming-iterables": "8.0.1", + "supports-color": "10.2.2", + "tar-fs": "3.1.2", + "tar-stream": "3.1.8", + "taze": "19.9.2", + "trash": "10.0.0", + "type-coverage": "2.29.7", + "typescript": "5.9.2", + "validate-npm-package-name": "6.0.2", + "vite-tsconfig-paths": "5.1.4", + "vitest": "4.0.3", + "which": "5.0.0", + "yargs-parser": "22.0.0", + "yoctocolors-cjs": "2.1.3", + "zod": "4.1.12" + }, + "engines": { + "node": ">=22", + "pnpm": ">=10.25.0" + }, + "peerDependencies": { + "typescript": ">=5.0.0" + }, + "peerDependenciesMeta": { + "typescript": { + "optional": true + } + } + }, + "../../../node_modules/.pnpm/@socketsecurity+sdk@4.0.0_typescript@5.9.3/node_modules/@socketsecurity/sdk": { + "version": "4.0.0", + "license": "MIT", + "dependencies": { + "@socketsecurity/lib": "5.15.0", + "form-data": "4.0.5" + }, + "devDependencies": { + "@anthropic-ai/claude-code": "2.1.92", + "@babel/generator": "7.28.5", + "@babel/parser": "7.26.3", + "@babel/traverse": "7.26.4", + "@babel/types": "7.26.3", + "@dotenvx/dotenvx": "1.54.1", + "@oxlint/migrate": "1.52.0", + "@sveltejs/acorn-typescript": "1.0.8", + "@types/babel__traverse": "7.28.0", + "@types/node": "24.9.2", + "@typescript/native-preview": "7.0.0-dev.20250926.1", + "@vitest/coverage-v8": "4.0.3", + "acorn": "8.15.0", + "del": "8.0.1", + "dev-null-cli": "2.0.0", + "ecc-agentshield": "1.4.0", + "esbuild": "0.25.11", + "fast-glob": "3.3.3", + "husky": "9.1.7", + "magic-string": "0.30.14", + "nock": "14.0.10", + "openapi-typescript": "6.7.6", + "oxfmt": "0.37.0", + "oxlint": "1.52.0", + "semver": "7.7.2", + "taze": "19.9.2", + "type-coverage": "2.29.7", + "vitest": "4.0.3" + }, + "engines": { + "node": ">=18.20.8", + "pnpm": ">=10.33.0" + } + }, + "../../../node_modules/.pnpm/@types+node@24.9.2/node_modules/@types/node": { + "version": "24.9.2", + "dev": true, + "license": "MIT", + "dependencies": { + "undici-types": "~7.16.0" + } + }, + "node_modules/@socketregistry/packageurl-js": { + "resolved": 
"../../../node_modules/.pnpm/@socketregistry+packageurl-js@1.4.1/node_modules/@socketregistry/packageurl-js", + "link": true + }, + "node_modules/@socketsecurity/lib": { + "resolved": "../../../node_modules/.pnpm/@socketsecurity+lib@5.15.0_typescript@5.9.3/node_modules/@socketsecurity/lib", + "link": true + }, + "node_modules/@socketsecurity/sdk": { + "resolved": "../../../node_modules/.pnpm/@socketsecurity+sdk@4.0.0_typescript@5.9.3/node_modules/@socketsecurity/sdk", + "link": true + }, + "node_modules/@types/node": { + "resolved": "../../../node_modules/.pnpm/@types+node@24.9.2/node_modules/@types/node", + "link": true + } + } +} diff --git a/.claude/hooks/check-new-deps/package.json b/.claude/hooks/check-new-deps/package.json new file mode 100644 index 000000000..cd736d1bf --- /dev/null +++ b/.claude/hooks/check-new-deps/package.json @@ -0,0 +1,20 @@ +{ + "name": "@socketsecurity/hook-check-new-deps", + "private": true, + "type": "module", + "main": "./index.mts", + "exports": { + ".": "./index.mts" + }, + "scripts": { + "test": "node --test test/*.test.mts" + }, + "dependencies": { + "@socketregistry/packageurl-js": "1.4.1", + "@socketsecurity/lib": "5.15.0", + "@socketsecurity/sdk": "4.0.0" + }, + "devDependencies": { + "@types/node": "24.9.2" + } +} diff --git a/.claude/hooks/check-new-deps/test/extract-deps.test.mts b/.claude/hooks/check-new-deps/test/extract-deps.test.mts new file mode 100644 index 000000000..c1f94db01 --- /dev/null +++ b/.claude/hooks/check-new-deps/test/extract-deps.test.mts @@ -0,0 +1,749 @@ +import { describe, it } from 'node:test' +import { strict as assert } from 'node:assert' +import { execFile } from 'node:child_process' + +import { + cache, + cacheGet, + cacheSet, + extractBrewfile, + extractNewDeps, + extractNixFlake, + extractNpmLockfile, + extractTerraform, + diffDeps, +} from '../index.mts' + +const hookScript = new URL('../index.mts', import.meta.url).pathname + +// Helper: run the full hook as a subprocess +function runHook( 
+  toolInput: Record<string, unknown>,
+  toolName = 'Edit',
+): Promise<{ code: number | null; stdout: string; stderr: string }> {
+  return new Promise((resolve) => {
+    const child = execFile(
+      'node',
+      [hookScript],
+      { timeout: 15000 },
+      (err, stdout, stderr) => {
+        resolve({
+          code: child.exitCode
+            ?? (err as NodeJS.ErrnoException)?.code as unknown as number
+            ?? 1,
+          stdout,
+          stderr,
+        })
+      },
+    )
+    child.stdin!.write(JSON.stringify({
+      tool_name: toolName,
+      tool_input: toolInput,
+    }))
+    child.stdin!.end()
+  })
+}
+
+// ============================================================================
+// Unit tests: extractNewDeps per ecosystem
+// ============================================================================
+
+describe('extractNewDeps', () => {
+  // npm
+  describe('npm', () => {
+    it('unscoped', () => {
+      const d = extractNewDeps(
+        'package.json',
+        '"lodash": "^4.17.21"',
+      )
+      assert.equal(d.length, 1)
+      assert.equal(d[0].type, 'npm')
+      assert.equal(d[0].name, 'lodash')
+      assert.equal(d[0].namespace, undefined)
+    })
+    it('scoped', () => {
+      const d = extractNewDeps(
+        'package.json',
+        '"@types/node": "^20.0.0"',
+      )
+      assert.equal(d[0].namespace, '@types')
+      assert.equal(d[0].name, 'node')
+    })
+    it('multiple', () => {
+      const d = extractNewDeps(
+        'package.json',
+        '"a": "1", "@b/c": "2", "d": "3"',
+      )
+      assert.equal(d.length, 3)
+    })
+    it('ignores node: builtins', () => {
+      assert.equal(
+        extractNewDeps('package.json', '"node:fs": "1"').length,
+        0,
+      )
+    })
+    it('ignores relative', () => {
+      assert.equal(
+        extractNewDeps('package.json', '"./foo": "1"').length,
+        0,
+      )
+    })
+    it('ignores absolute', () => {
+      assert.equal(
+        extractNewDeps('package.json', '"/foo": "1"').length,
+        0,
+      )
+    })
+    it('ignores capitalized keys', () => {
+      assert.equal(
+        extractNewDeps('package.json', '"Name": "my-project"').length,
+        0,
+      )
+    })
+    it('handles workspace protocol', () => {
+      const d = extractNewDeps(
+        'package.json',
+        '"my-lib": "workspace:*"',
+      )
+
assert.equal(d.length, 1) + }) + }) + + // cargo + describe('cargo', () => { + it('inline version', () => { + const d = extractNewDeps('Cargo.toml', 'serde = "1.0"') + assert.deepEqual(d[0], { type: 'cargo', name: 'serde' }) + }) + it('table version', () => { + const d = extractNewDeps( + 'Cargo.toml', + 'serde = { version = "1.0", features = ["derive"] }', + ) + assert.equal(d[0].name, 'serde') + }) + it('hyphenated name', () => { + assert.equal( + extractNewDeps('Cargo.toml', 'simd-json = "0.17"')[0].name, + 'simd-json', + ) + }) + it('multiple', () => { + assert.equal( + extractNewDeps('Cargo.toml', 'a = "1"\nb = { version = "2" }').length, + 2, + ) + }) + }) + + // golang + describe('golang', () => { + it('with namespace', () => { + const d = extractNewDeps( + 'go.mod', + 'github.com/gin-gonic/gin v1.9.1', + ) + assert.equal(d[0].namespace, 'github.com/gin-gonic') + assert.equal(d[0].name, 'gin') + }) + it('stdlib extension', () => { + const d = extractNewDeps( + 'go.mod', + 'golang.org/x/sync v0.7.0', + ) + assert.equal(d[0].namespace, 'golang.org/x') + assert.equal(d[0].name, 'sync') + }) + }) + + // pypi + describe('pypi', () => { + it('requirements.txt', () => { + const d = extractNewDeps( + 'requirements.txt', + 'flask>=2.0\nrequests==2.31', + ) + assert.ok(d.some(x => x.name === 'flask')) + assert.ok(d.some(x => x.name === 'requests')) + }) + it('pyproject.toml', () => { + assert.ok( + extractNewDeps('pyproject.toml', '"django>=4.2"') + .some(x => x.name === 'django'), + ) + }) + it('setup.py', () => { + assert.ok( + extractNewDeps('setup.py', '"numpy>=1.24"') + .some(x => x.name === 'numpy'), + ) + }) + }) + + // gem + describe('gem', () => { + it('single-quoted', () => { + assert.equal( + extractNewDeps('Gemfile', "gem 'rails'")[0].name, + 'rails', + ) + }) + it('double-quoted with version', () => { + assert.equal( + extractNewDeps('Gemfile', 'gem "sinatra", "~> 3.0"')[0].name, + 'sinatra', + ) + }) + }) + + // maven + describe('maven', () => { + 
+    it('pom.xml', () => {
+      const d = extractNewDeps(
+        'pom.xml',
+        '<groupId>org.apache</groupId><artifactId>commons-lang3</artifactId>',
+      )
+      assert.equal(d[0].namespace, 'org.apache')
+      assert.equal(d[0].name, 'commons-lang3')
+    })
+    it('build.gradle', () => {
+      const d = extractNewDeps(
+        'build.gradle',
+        "implementation 'com.google.guava:guava:32.1'",
+      )
+      assert.equal(d[0].namespace, 'com.google.guava')
+      assert.equal(d[0].name, 'guava')
+    })
+    it('build.gradle.kts', () => {
+      const d = extractNewDeps(
+        'build.gradle.kts',
+        "implementation 'org.jetbrains:annotations:24.0'",
+      )
+      assert.equal(d[0].name, 'annotations')
+    })
+  })
+
+  // swift
+  describe('swift', () => {
+    it('github package', () => {
+      const d = extractNewDeps(
+        'Package.swift',
+        '.package(url: "https://github.com/vapor/vapor", from: "4.0.0")',
+      )
+      assert.equal(d[0].type, 'swift')
+      assert.equal(d[0].name, 'vapor')
+    })
+  })
+
+  // pub
+  describe('pub', () => {
+    it('dart package', () => {
+      assert.equal(
+        extractNewDeps('pubspec.yaml', ' flutter_bloc: ^8.1')[0].name,
+        'flutter_bloc',
+      )
+    })
+  })
+
+  // hex
+  describe('hex', () => {
+    it('elixir dep', () => {
+      assert.equal(
+        extractNewDeps('mix.exs', '{:phoenix, "~> 1.7"}')[0].name,
+        'phoenix',
+      )
+    })
+  })
+
+  // composer
+  describe('composer', () => {
+    it('vendor/package', () => {
+      const d = extractNewDeps(
+        'composer.json',
+        '"monolog/monolog": "^3.0"',
+      )
+      assert.equal(d[0].namespace, 'monolog')
+      assert.equal(d[0].name, 'monolog')
+    })
+  })
+
+  // nuget
+  describe('nuget', () => {
+    it('.csproj PackageReference', () => {
+      assert.equal(
+        extractNewDeps(
+          'test.csproj',
+          '<PackageReference Include="Newtonsoft.Json" />',
+        )[0].name,
+        'Newtonsoft.Json',
+      )
+    })
+  })
+
+  // julia
+  describe('julia', () => {
+    it('Project.toml', () => {
+      assert.equal(
+        extractNewDeps('Project.toml', 'JSON3 = "0a1fb500"')[0].name,
+        'JSON3',
+      )
+    })
+  })
+
+  // conan
+  describe('conan', () => {
+    it('conanfile.txt', () => {
+      assert.equal(
+        extractNewDeps('conanfile.txt', 'boost/1.83.0')[0].name,
+        'boost',
+      )
+    })
it('conanfile.py', () => { + assert.equal( + extractNewDeps('conanfile.py', 'requires = "zlib/1.3.0"')[0].name, + 'zlib', + ) + }) + }) + + // github actions + describe('github actions', () => { + it('extracts action with version', () => { + const d = extractNewDeps( + '.github/workflows/ci.yml', + 'uses: actions/checkout@v4', + ) + assert.equal(d[0].type, 'github') + assert.equal(d[0].namespace, 'actions') + assert.equal(d[0].name, 'checkout') + }) + it('extracts action with SHA', () => { + const d = extractNewDeps( + '.github/workflows/ci.yml', + 'uses: actions/setup-node@abc123def', + ) + assert.equal(d[0].name, 'setup-node') + }) + it('extracts action with subpath', () => { + const d = extractNewDeps( + '.github/workflows/ci.yml', + 'uses: org/repo/subpath@v1', + ) + assert.equal(d[0].namespace, 'org') + assert.equal(d[0].name, 'repo/subpath') + }) + it('multiple actions', () => { + const d = extractNewDeps( + '.github/workflows/ci.yml', + 'uses: a/b@v1\n uses: c/d@v2', + ) + assert.equal(d.length, 2) + }) + }) + + // terraform + describe('terraform', () => { + it('registry module source', () => { + const d = extractTerraform( + 'source = "hashicorp/consul/aws"', + ) + assert.equal(d[0].type, 'terraform') + assert.equal(d[0].namespace, 'hashicorp') + assert.equal(d[0].name, 'consul') + }) + it('via extractNewDeps', () => { + const d = extractNewDeps( + 'main.tf', + 'source = "cloudflare/dns/cloudflare"', + ) + assert.equal(d.length, 1) + assert.equal(d[0].namespace, 'cloudflare') + }) + }) + + // nix flakes + describe('nix flakes', () => { + it('github input', () => { + const d = extractNixFlake( + 'inputs.nixpkgs.url = "github:NixOS/nixpkgs"', + ) + assert.equal(d[0].type, 'github') + assert.equal(d[0].namespace, 'NixOS') + assert.equal(d[0].name, 'nixpkgs') + }) + it('via extractNewDeps', () => { + const d = extractNewDeps( + 'flake.nix', + 'url = "github:nix-community/home-manager"', + ) + assert.equal(d.length, 1) + assert.equal(d[0].name, 'home-manager') + 
}) + }) + + // homebrew + describe('homebrew', () => { + it('brew formula', () => { + const d = extractBrewfile('brew "git"') + assert.equal(d[0].type, 'brew') + assert.equal(d[0].name, 'git') + }) + it('cask', () => { + const d = extractBrewfile('cask "firefox"') + assert.equal(d[0].name, 'firefox') + }) + it('via extractNewDeps', () => { + const d = extractNewDeps( + 'Brewfile', + 'brew "wget"\ncask "iterm2"', + ) + assert.equal(d.length, 2) + }) + }) + + // lockfiles + describe('lockfiles', () => { + it('package-lock.json', () => { + const d = extractNpmLockfile( + '"node_modules/lodash": { "version": "4.17.21" }', + ) + assert.ok(d.some(x => x.name === 'lodash')) + }) + it('pnpm-lock.yaml', () => { + const d = extractNewDeps( + 'pnpm-lock.yaml', + "'/lodash@4.17.21':\n resolution:", + ) + assert.ok(d.some(x => x.name === 'lodash')) + }) + it('yarn.lock', () => { + const d = extractNewDeps( + 'yarn.lock', + '"lodash@^4.17.21":\n version:', + ) + assert.ok(d.some(x => x.name === 'lodash')) + }) + it('Cargo.lock', () => { + const d = extractNewDeps( + 'Cargo.lock', + 'name = "serde"\nversion = "1.0.210"', + ) + assert.equal(d[0].type, 'cargo') + assert.equal(d[0].name, 'serde') + }) + it('go.sum', () => { + const d = extractNewDeps( + 'go.sum', + 'github.com/gin-gonic/gin v1.9.1 h1:abc=', + ) + assert.equal(d[0].type, 'golang') + assert.equal(d[0].name, 'gin') + }) + it('Gemfile.lock', () => { + const d = extractNewDeps( + 'Gemfile.lock', + ' rails (7.1.0)\n activerecord (7.1.0)', + ) + assert.ok(d.some(x => x.name === 'rails')) + }) + it('composer.lock', () => { + const d = extractNewDeps( + 'composer.lock', + '"name": "monolog/monolog"', + ) + assert.equal(d[0].namespace, 'monolog') + assert.equal(d[0].name, 'monolog') + }) + it('poetry.lock', () => { + const d = extractNewDeps( + 'poetry.lock', + 'name = "flask"\nversion = "3.0.0"', + ) + assert.ok(d.some(x => x.name === 'flask')) + }) + it('pubspec.lock', () => { + const d = extractNewDeps( + 'pubspec.lock', + 
' flutter_bloc:\n dependency: direct', + ) + assert.ok(d.some(x => x.name === 'flutter_bloc')) + }) + }) + + // windows paths + describe('windows paths', () => { + it('handles backslash in package.json path', () => { + const d = extractNewDeps( + 'C:\\Users\\foo\\project\\package.json', + '"lodash": "^4"', + ) + assert.equal(d.length, 1) + assert.equal(d[0].name, 'lodash') + }) + it('handles backslash in workflow path', () => { + const d = extractNewDeps( + '.github\\workflows\\ci.yml', + 'uses: actions/checkout@v4', + ) + assert.equal(d.length, 1) + assert.equal(d[0].name, 'checkout') + }) + it('handles backslash in Cargo.toml path', () => { + const d = extractNewDeps( + 'src\\parser\\Cargo.toml', + 'serde = "1.0"', + ) + assert.equal(d.length, 1) + }) + }) + + // pass-through + describe('unsupported files', () => { + it('returns empty for .rs', () => { + assert.equal( + extractNewDeps('main.rs', 'fn main(){}').length, + 0, + ) + }) + it('returns empty for .js', () => { + assert.equal( + extractNewDeps('index.js', 'x').length, + 0, + ) + }) + it('returns empty for .md', () => { + assert.equal( + extractNewDeps('README.md', '# hi').length, + 0, + ) + }) + }) +}) + +// ============================================================================ +// Unit tests: diffDeps +// ============================================================================ + +describe('diffDeps', () => { + it('returns only new deps', () => { + const newDeps = [ + { type: 'npm', name: 'a' }, + { type: 'npm', name: 'b' }, + ] + const oldDeps = [{ type: 'npm', name: 'a' }] + const result = diffDeps(newDeps, oldDeps) + assert.equal(result.length, 1) + assert.equal(result[0].name, 'b') + }) + it('returns empty when no new deps', () => { + const deps = [{ type: 'npm', name: 'a' }] + assert.equal(diffDeps(deps, deps).length, 0) + }) + it('returns all when old is empty', () => { + const deps = [ + { type: 'npm', name: 'a' }, + { type: 'npm', name: 'b' }, + ] + assert.equal(diffDeps(deps, 
[]).length, 2) + }) +}) + +// ============================================================================ +// Unit tests: cache +// ============================================================================ + +describe('cache', () => { + it('stores and retrieves entries', () => { + cache.clear() + cacheSet('pkg:npm/test', { purl: 'pkg:npm/test', blocked: true }) + const entry = cacheGet('pkg:npm/test') + assert.ok(entry) + assert.equal(entry!.result?.blocked, true) + }) + it('returns undefined for missing keys', () => { + cache.clear() + assert.equal(cacheGet('pkg:npm/missing'), undefined) + }) + it('evicts expired entries on get', () => { + cache.clear() + // Manually insert an expired entry. + cache.set('pkg:npm/expired', { + result: undefined, + expiresAt: Date.now() - 1000, + }) + assert.equal(cacheGet('pkg:npm/expired'), undefined) + assert.equal(cache.has('pkg:npm/expired'), false) + }) + it('caches undefined for clean deps', () => { + cache.clear() + cacheSet('pkg:npm/clean', undefined) + const entry = cacheGet('pkg:npm/clean') + assert.ok(entry) + assert.equal(entry!.result, undefined) + }) +}) + +// ============================================================================ +// Integration tests: full hook subprocess +// ============================================================================ + +describe('hook integration', () => { + // Blocking + it('blocks malware (npm)', async () => { + const r = await runHook({ + file_path: '/tmp/package.json', + new_string: '"bradleymeck": "^1.0.0"', + }) + assert.equal(r.code, 2) + assert.ok(r.stderr.includes('blocked')) + }) + + // Allowing + it('allows clean npm package', async () => { + const r = await runHook({ + file_path: '/tmp/package.json', + new_string: '"lodash": "^4.17.21"', + }) + assert.equal(r.code, 0) + }) + it('allows scoped npm package', async () => { + const r = await runHook({ + file_path: '/tmp/package.json', + new_string: '"@types/node": "^20"', + }) + assert.equal(r.code, 0) + }) + 
+  it('allows cargo crate', async () => {
+    const r = await runHook({
+      file_path: '/tmp/Cargo.toml',
+      new_string: 'serde = "1.0"',
+    })
+    assert.equal(r.code, 0)
+  })
+  it('allows go module', async () => {
+    const r = await runHook({
+      file_path: '/tmp/go.mod',
+      new_string: 'golang.org/x/sync v0.7.0',
+    })
+    assert.equal(r.code, 0)
+  })
+  it('allows pypi package', async () => {
+    const r = await runHook({
+      file_path: '/tmp/requirements.txt',
+      new_string: 'flask>=2.0',
+    })
+    assert.equal(r.code, 0)
+  })
+  it('allows ruby gem', async () => {
+    const r = await runHook({
+      file_path: '/tmp/Gemfile',
+      new_string: "gem 'rails'",
+    })
+    assert.equal(r.code, 0)
+  })
+  it('allows maven dep', async () => {
+    const r = await runHook({
+      file_path: '/tmp/build.gradle',
+      new_string: "implementation 'com.google.guava:guava:32.1'",
+    })
+    assert.equal(r.code, 0)
+  })
+  it('allows nuget package', async () => {
+    const r = await runHook({
+      file_path: '/tmp/test.csproj',
+      new_string: '<PackageReference Include="Newtonsoft.Json" />',
+    })
+    assert.equal(r.code, 0)
+  })
+  it('allows github action', async () => {
+    const r = await runHook({
+      file_path: '/tmp/.github/workflows/ci.yml',
+      new_string: 'uses: actions/checkout@v4',
+    })
+    assert.equal(r.code, 0)
+  })
+
+  // Pass-through
+  it('passes non-dep files', async () => {
+    const r = await runHook({
+      file_path: '/tmp/main.rs',
+      new_string: 'fn main(){}',
+    })
+    assert.equal(r.code, 0)
+  })
+  it('passes non-Edit tools', async () => {
+    const r = await runHook(
+      { file_path: '/tmp/package.json' },
+      'Read',
+    )
+    assert.equal(r.code, 0)
+  })
+
+  // Diff-aware
+  it('skips pre-existing deps in old_string', async () => {
+    const r = await runHook({
+      file_path: '/tmp/package.json',
+      old_string: '"lodash": "^4.17.21"',
+      new_string: '"lodash": "^4.17.21"',
+    })
+    assert.equal(r.code, 0)
+  })
+  it('checks only NEW deps when old_string present', async () => {
+    const r = await runHook({
+      file_path: '/tmp/package.json',
+      old_string: '"lodash": "^4.17.21"',
+      new_string:
'"lodash": "^4.17.21", "bradleymeck": "^1.0.0"', + }) + assert.equal(r.code, 2) + }) + + // Batch (multiple deps in one request) + it('checks multiple deps in batch (fast)', async () => { + const start = Date.now() + const r = await runHook({ + file_path: '/tmp/package.json', + new_string: '"express": "^4", "lodash": "^4", "debug": "^4"', + }) + assert.equal(r.code, 0) + assert.ok( + Date.now() - start < 5000, + 'batch should be fast', + ) + }) + + // Write tool + it('works with Write tool', async () => { + const r = await runHook( + { file_path: '/tmp/package.json', content: '"lodash": "^4"' }, + 'Write', + ) + assert.equal(r.code, 0) + }) + + // Empty content + it('handles empty content', async () => { + const r = await runHook({ + file_path: '/tmp/package.json', + new_string: '', + }) + assert.equal(r.code, 0) + }) + + // Lockfile monitoring + it('checks lockfile deps (Cargo.lock)', async () => { + const r = await runHook({ + file_path: '/tmp/Cargo.lock', + new_string: 'name = "serde"\nversion = "1.0.210"', + }) + assert.equal(r.code, 0) + }) + + // Terraform + it('checks terraform module', async () => { + const r = await runHook({ + file_path: '/tmp/main.tf', + new_string: 'source = "hashicorp/consul/aws"', + }) + assert.equal(r.code, 0) + }) +}) diff --git a/.claude/hooks/check-new-deps/tsconfig.json b/.claude/hooks/check-new-deps/tsconfig.json new file mode 100644 index 000000000..748e9587e --- /dev/null +++ b/.claude/hooks/check-new-deps/tsconfig.json @@ -0,0 +1,13 @@ +{ + "compilerOptions": { + "noEmit": true, + "target": "esnext", + "module": "nodenext", + "moduleResolution": "nodenext", + "rewriteRelativeImportExtensions": true, + "erasableSyntaxOnly": true, + "verbatimModuleSyntax": true, + "strict": true, + "skipLibCheck": true + } +} diff --git a/.claude/hooks/setup-security-tools/README.md b/.claude/hooks/setup-security-tools/README.md new file mode 100644 index 000000000..96c301596 --- /dev/null +++ b/.claude/hooks/setup-security-tools/README.md @@ 
-0,0 +1,73 @@ +# setup-security-tools Hook + +Sets up all three Socket security tools for local development in one command. + +## Tools + +### 1. AgentShield +Scans your Claude Code configuration (`.claude/` directory) for security issues like prompt injection, leaked secrets, and overly permissive tool permissions. + +**How it's installed**: Already a devDependency (`ecc-agentshield`). The setup script just verifies it's available — if not, run `pnpm install`. + +### 2. Zizmor +Static analysis tool for GitHub Actions workflows. Catches unpinned actions, secret exposure, template injection, and permission issues. + +**How it's installed**: Binary downloaded from [GitHub releases](https://github.com/woodruffw/zizmor/releases), SHA-256 verified, cached at `~/.socket/zizmor/bin/zizmor`. If you already have it via `brew install zizmor`, the download is skipped. + +### 3. SFW (Socket Firewall) +Intercepts package manager commands (`npm install`, `pnpm add`, etc.) and scans packages against Socket.dev's malware database before installation. + +**How it's installed**: Binary downloaded from GitHub, SHA-256 verified, cached via the dlx system at `~/.socket/_dlx/`. Small wrapper scripts ("shims") are created at `~/.socket/sfw/shims/` that transparently route commands through the firewall. + +**Free vs Enterprise**: If you have a `SOCKET_API_KEY` (in env, `.env`, or `.env.local`), enterprise mode is used with additional ecosystem support (gem, bundler, nuget, go). Otherwise, free mode covers npm, yarn, pnpm, pip, uv, and cargo. + +## How to use + +``` +/setup-security-tools +``` + +Claude will ask if you have an API key, then run the setup script. + +## What gets installed where + +| Tool | Location | Persists across repos? 
|
+|------|----------|----------------------|
+| AgentShield | `node_modules/.bin/agentshield` | No (per-repo devDep) |
+| Zizmor | `~/.socket/zizmor/bin/zizmor` | Yes |
+| SFW binary | `~/.socket/_dlx/<hash>/sfw` | Yes |
+| SFW shims | `~/.socket/sfw/shims/npm`, etc. | Yes |
+
+## Pre-push integration
+
+The `.git-hooks/pre-push` hook automatically runs:
+- **AgentShield scan** (blocks push on failure)
+- **Zizmor scan** (blocks push on failure)
+
+This means every push is checked — you don't have to remember to run `/security-scan`.
+
+## Re-running
+
+Safe to run multiple times:
+- AgentShield: just re-checks availability
+- Zizmor: skips download if cached binary matches expected version
+- SFW: skips download if cached, only rewrites shims if content changed
+
+## Copying to another repo
+
+Self-contained. To add to another Socket repo:
+
+1. Copy `.claude/hooks/setup-security-tools/` and `.claude/commands/setup-security-tools.md`
+2. Run `cd .claude/hooks/setup-security-tools && npm install`
+3. Ensure `.claude/hooks/` is not gitignored (add `!/.claude/hooks/` to `.gitignore`)
+4. Ensure `ecc-agentshield` is a devDep in the target repo
+
+## Troubleshooting
+
+**"AgentShield not found"** — Run `pnpm install`. It's the `ecc-agentshield` devDependency.
+
+**"zizmor found but wrong version"** — The script downloads the expected version to `~/.socket/zizmor/bin/`. Your system version (e.g. from brew) will be ignored in favor of the correct version.
+
+**"No supported package managers found"** — SFW only creates shims for package managers found on your PATH. Install npm/pnpm/etc. first.
+
+**SFW shims not intercepting** — Make sure `~/.socket/sfw/shims` is at the *front* of PATH. Run `which npm` — it should point to the shim, not the real binary.
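The PATH-order check in that last tip can be scripted. A minimal POSIX-shell sketch (the `shims_first` helper is hypothetical, not part of the setup script; it assumes the shim directory the script creates at `~/.socket/sfw/shims`):

```shell
# Hypothetical helper: report whether the SFW shim dir leads PATH.
shims_first() {
  shim_dir="$HOME/.socket/sfw/shims"
  case "$PATH" in
    "$shim_dir":*) echo "shims active" ;;
    *) echo "shims NOT first on PATH" ;;
  esac
}

# Prepend the shim dir (same as the activation step), then verify:
PATH="$HOME/.socket/sfw/shims:$PATH"
shims_first   # prints: shims active
```

If it reports the shims are not first, re-run the `export PATH="$HOME/.socket/sfw/shims:$PATH"` activation line from setup.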
diff --git a/.claude/hooks/setup-security-tools/index.mts b/.claude/hooks/setup-security-tools/index.mts
new file mode 100644
index 000000000..3d349b5a1
--- /dev/null
+++ b/.claude/hooks/setup-security-tools/index.mts
@@ -0,0 +1,342 @@
+#!/usr/bin/env node
+// Setup script for Socket security tools.
+//
+// Configures three tools:
+// 1. AgentShield — scans Claude AI config for prompt injection / secrets.
+//    Already a devDep (ecc-agentshield); this script verifies it's installed.
+// 2. Zizmor — static analysis for GitHub Actions workflows. Downloads the
+//    correct binary, verifies SHA-256, caches at ~/.socket/zizmor/bin/zizmor.
+// 3. SFW (Socket Firewall) — intercepts package manager commands to scan
+//    for malware. Downloads binary, verifies SHA-256, creates PATH shims.
+//    Enterprise vs free determined by SOCKET_API_KEY in env / .env / .env.local.
+
+import { createHash } from 'node:crypto'
+import { existsSync, createReadStream, readFileSync, promises as fs } from 'node:fs'
+import { tmpdir } from 'node:os'
+import path from 'node:path'
+import process from 'node:process'
+
+import { whichSync } from '@socketsecurity/lib/bin'
+import { downloadBinary } from '@socketsecurity/lib/dlx/binary'
+import { httpDownload } from '@socketsecurity/lib/http-request'
+import { getDefaultLogger } from '@socketsecurity/lib/logger'
+import { getSocketHomePath } from '@socketsecurity/lib/paths/socket'
+import { spawn, spawnSync } from '@socketsecurity/lib/spawn'
+
+const logger = getDefaultLogger()
+
+// ── Zizmor constants ──
+
+const ZIZMOR_VERSION = '1.23.1'
+
+const ZIZMOR_CHECKSUMS: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'zizmor-aarch64-apple-darwin.tar.gz':
+    '2632561b974c69f952258c1ab4b7432d5c7f92e555704155c3ac28a2910bd717',
+  'zizmor-aarch64-unknown-linux-gnu.tar.gz':
+    '3725d7cd7102e4d70827186389f7d5930b6878232930d0a3eb058d7e5b47e658',
+  'zizmor-x86_64-apple-darwin.tar.gz':
+    '89d5ed42081dd9d0433a10b7545fac42b35f1f030885c278b9712b32c66f2597',
+  'zizmor-x86_64-pc-windows-msvc.zip':
+    '33c2293ff02834720dd7cd8b47348aafb2e95a19bdc993c0ecaca9c804ade92a',
+  'zizmor-x86_64-unknown-linux-gnu.tar.gz':
+    '67a8df0a14352dd81882e14876653d097b99b0f4f6b6fe798edc0320cff27aff',
+}
+
+const ZIZMOR_ASSET_MAP: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'darwin-arm64': 'zizmor-aarch64-apple-darwin.tar.gz',
+  'darwin-x64': 'zizmor-x86_64-apple-darwin.tar.gz',
+  'linux-arm64': 'zizmor-aarch64-unknown-linux-gnu.tar.gz',
+  'linux-x64': 'zizmor-x86_64-unknown-linux-gnu.tar.gz',
+  'win32-x64': 'zizmor-x86_64-pc-windows-msvc.zip',
+}
+
+// ── SFW constants ──
+
+const SFW_ENTERPRISE_CHECKSUMS: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'linux-arm64': '671270231617142404a1564e52672f79b806f9df3f232fcc7606329c0246da55',
+  'linux-x86_64': '9115b4ca8021eb173eb9e9c3627deb7f1066f8debd48c5c9d9f3caabb2a26a4b',
+  'macos-arm64': 'acad0b517601bb7408e2e611c9226f47dcccbd83333d7fc5157f1d32ed2b953d',
+  'macos-x86_64': '01d64d40effda35c31f8d8ee1fed1388aac0a11aba40d47fba8a36024b77500c',
+  'windows-x86_64': '9a50e1ddaf038138c3f85418dc5df0113bbe6fc884f5abe158beaa9aea18d70a',
+}
+
+const SFW_FREE_CHECKSUMS: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'linux-arm64': 'df2eedb2daf2572eee047adb8bfd81c9069edcb200fc7d3710fca98ec3ca81a1',
+  'linux-x86_64': '4a1e8b65e90fce7d5fd066cf0af6c93d512065fa4222a475c8d959a6bc14b9ff',
+  'macos-arm64': 'bf1616fc44ac49f1cb2067fedfa127a3ae65d6ec6d634efbb3098cfa355e5555',
+  'macos-x86_64': '724ccea19d847b79db8cc8e38f5f18ce2dd32336007f42b11bed7d2e5f4a2566',
+  'windows-x86_64': 'c953e62ad7928d4d8f2302f5737884ea1a757babc26bed6a42b9b6b68a5d54af',
+}
+
+const SFW_PLATFORM_MAP: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'darwin-arm64': 'macos-arm64',
+  'darwin-x64': 'macos-x86_64',
+  'linux-arm64': 'linux-arm64',
+  'linux-x64': 'linux-x86_64',
+  'win32-x64': 'windows-x86_64',
+}
+
+const SFW_FREE_ECOSYSTEMS = ['npm', 'yarn', 'pnpm', 'pip', 'uv', 'cargo']
+const SFW_ENTERPRISE_EXTRA = ['gem', 'bundler', 'nuget']
+
+// ── Shared helpers ──
+
+function findApiKey(): string | undefined {
+  const envKey = process.env['SOCKET_API_KEY']
+  if (envKey) return envKey
+  for (const filename of ['.env.local', '.env']) {
+    const filepath = path.join(process.cwd(), filename)
+    if (existsSync(filepath)) {
+      try {
+        const content = readFileSync(filepath, 'utf8')
+        const match = /^SOCKET_API_KEY\s*=\s*(.+)$/m.exec(content)
+        if (match) {
+          return match[1]!
+            .replace(/\s*#.*$/, '') // Strip inline comments.
+            .trim() // Strip whitespace before quote removal.
+            .replace(/^["']|["']$/g, '') // Strip surrounding quotes.
+        }
+      } catch {
+        // Ignore read errors.
+      }
+    }
+  }
+  return undefined
+}
+
+async function sha256File(filePath: string): Promise<string> {
+  return new Promise((resolve, reject) => {
+    const hash = createHash('sha256')
+    const stream = createReadStream(filePath)
+    stream.on('data', (chunk: Buffer) => hash.update(chunk))
+    stream.on('end', () => resolve(hash.digest('hex')))
+    stream.on('error', reject)
+  })
+}
+
+// ── AgentShield ──
+
+function setupAgentShield(): boolean {
+  logger.log('=== AgentShield ===')
+  const bin = whichSync('agentshield', { nothrow: true })
+  if (bin && typeof bin === 'string') {
+    const result = spawnSync(bin, ['--version'], { stdio: 'pipe' })
+    const ver = typeof result.stdout === 'string'
+      ? result.stdout.trim()
+      : result.stdout.toString().trim()
+    logger.log(`Found: ${bin} (${ver})`)
+    return true
+  }
+  logger.warn('Not found. Run "pnpm install" to install ecc-agentshield.')
+  return false
+}
+
+// ── Zizmor ──
+
+async function checkZizmorVersion(binPath: string): Promise<boolean> {
+  try {
+    const result = await spawn(binPath, ['--version'], { stdio: 'pipe' })
+    const output = typeof result.stdout === 'string'
+      ? result.stdout.trim()
+      : result.stdout.toString().trim()
+    return output.includes(ZIZMOR_VERSION)
+  } catch {
+    return false
+  }
+}
+
+async function setupZizmor(): Promise<boolean> {
+  logger.log('=== Zizmor ===')
+
+  // Check PATH first (e.g. brew install).
+  const systemBin = whichSync('zizmor', { nothrow: true })
+  if (systemBin && typeof systemBin === 'string') {
+    if (await checkZizmorVersion(systemBin)) {
+      logger.log(`Found on PATH: ${systemBin} (v${ZIZMOR_VERSION})`)
+      return true
+    }
+    logger.log(`Found on PATH but wrong version (need v${ZIZMOR_VERSION})`)
+  }
+
+  // Check cached binary.
+  const ext = process.platform === 'win32' ? '.exe' : ''
+  const binDir = path.join(getSocketHomePath(), 'zizmor', 'bin')
+  const binPath = path.join(binDir, `zizmor${ext}`)
+  if (existsSync(binPath) && await checkZizmorVersion(binPath)) {
+    logger.log(`Cached: ${binPath} (v${ZIZMOR_VERSION})`)
+    return true
+  }
+
+  // Download.
+  const platformKey = `${process.platform}-${process.arch}`
+  const asset = ZIZMOR_ASSET_MAP[platformKey]
+  if (!asset) throw new Error(`Unsupported platform: ${platformKey}`)
+  const expectedSha = ZIZMOR_CHECKSUMS[asset]
+  if (!expectedSha) throw new Error(`No checksum for: ${asset}`)
+  const url = `https://github.com/woodruffw/zizmor/releases/download/v${ZIZMOR_VERSION}/${asset}`
+  const isZip = asset.endsWith('.zip')
+
+  logger.log(`Downloading zizmor v${ZIZMOR_VERSION} (${asset})...`)
+  const tmpFile = path.join(tmpdir(), `zizmor-${Date.now()}-${asset}`)
+  try {
+    await httpDownload(url, tmpFile, { sha256: expectedSha })
+    logger.log('Download complete, checksum verified.')
+
+    // Extract.
+    const extractDir = path.join(tmpdir(), `zizmor-extract-${Date.now()}`)
+    await fs.mkdir(extractDir, { recursive: true })
+    if (isZip) {
+      await spawn('powershell', ['-NoProfile', '-Command',
+        `Expand-Archive -Path '${tmpFile}' -DestinationPath '${extractDir}' -Force`], { stdio: 'pipe' })
+    } else {
+      await spawn('tar', ['xzf', tmpFile, '-C', extractDir], { stdio: 'pipe' })
+    }
+
+    // Install.
+    const extractedBin = path.join(extractDir, `zizmor${ext}`)
+    if (!existsSync(extractedBin)) throw new Error(`Binary not found after extraction: ${extractedBin}`)
+    await fs.mkdir(binDir, { recursive: true })
+    await fs.copyFile(extractedBin, binPath)
+    await fs.chmod(binPath, 0o755)
+    await fs.rm(extractDir, { recursive: true, force: true })
+
+    logger.log(`Installed to ${binPath}`)
+    return true
+  } finally {
+    if (existsSync(tmpFile)) await fs.unlink(tmpFile).catch(() => {})
+  }
+}
+
+// ── SFW ──
+
+async function setupSfw(apiKey: string | undefined): Promise<boolean> {
+  const isEnterprise = !!apiKey
+  logger.log(`=== Socket Firewall (${isEnterprise ? 'enterprise' : 'free'}) ===`)
+
+  // Platform.
+  const platformKey = `${process.platform}-${process.arch}`
+  const sfwPlatform = SFW_PLATFORM_MAP[platformKey]
+  if (!sfwPlatform) throw new Error(`Unsupported platform: ${platformKey}`)
+
+  // Checksum + asset.
+  const checksums = isEnterprise ? SFW_ENTERPRISE_CHECKSUMS : SFW_FREE_CHECKSUMS
+  const sha256 = checksums[sfwPlatform]
+  if (!sha256) throw new Error(`No checksum for: ${sfwPlatform}`)
+  const prefix = isEnterprise ? 'sfw' : 'sfw-free'
+  const suffix = sfwPlatform.startsWith('windows') ? '.exe' : ''
+  const asset = `${prefix}-${sfwPlatform}${suffix}`
+  const repo = isEnterprise ? 'SocketDev/firewall-release' : 'SocketDev/sfw-free'
+  const url = `https://github.com/${repo}/releases/latest/download/${asset}`
+  const binaryName = isEnterprise ? 'sfw' : 'sfw-free'
+
+  // Download (with cache + checksum).
+  const { binaryPath, downloaded } = await downloadBinary({ url, name: binaryName, sha256 })
+  logger.log(downloaded ? `Downloaded to ${binaryPath}` : `Cached at ${binaryPath}`)
+
+  // Create shims.
+  const isWindows = process.platform === 'win32'
+  const shimDir = path.join(getSocketHomePath(), 'sfw', 'shims')
+  await fs.mkdir(shimDir, { recursive: true })
+  const ecosystems = [...SFW_FREE_ECOSYSTEMS]
+  if (isEnterprise) {
+    ecosystems.push(...SFW_ENTERPRISE_EXTRA)
+    if (process.platform === 'linux') ecosystems.push('go')
+  }
+  const cleanPath = (process.env['PATH'] ?? '').split(path.delimiter)
+    .filter(p => p !== shimDir).join(path.delimiter)
+  const created: string[] = []
+  for (const cmd of ecosystems) {
+    const realBin = whichSync(cmd, { nothrow: true, path: cleanPath })
+    if (!realBin || typeof realBin !== 'string') continue
+
+    // Bash shim (macOS/Linux).
+    const bashLines = [
+      '#!/bin/bash',
+      `export PATH="$(echo "$PATH" | tr ':' '\\n' | grep -vxF '${shimDir}' | paste -sd: -)"`,
+    ]
+    if (isEnterprise) {
+      // Read API key from env at runtime — never embed secrets in scripts.
+      bashLines.push(
+        'if [ -z "$SOCKET_API_KEY" ]; then',
+        '  for f in .env.local .env; do',
+        '    if [ -f "$f" ]; then',
+        '      _val="$(grep -m1 "^SOCKET_API_KEY\\s*=" "$f" | sed "s/^[^=]*=\\s*//" | sed "s/\\s*#.*//" | sed "s/^[\"\\x27]\\(.*\\)[\"\\x27]$/\\1/")"',
+        '      if [ -n "$_val" ]; then SOCKET_API_KEY="$_val"; break; fi',
+        '    fi',
+        '  done',
+        '  export SOCKET_API_KEY',
+        'fi',
+      )
+    }
+    if (!isEnterprise) {
+      // Workaround: sfw-free does not yet set GIT_SSL_CAINFO (temporary).
+      bashLines.push('export GIT_SSL_NO_VERIFY=true')
+    }
+    bashLines.push(`exec "${binaryPath}" "${realBin}" "$@"`)
+    const bashContent = bashLines.join('\n') + '\n'
+    const bashPath = path.join(shimDir, cmd)
+    if (!existsSync(bashPath) || await fs.readFile(bashPath, 'utf8').catch(() => '') !== bashContent) {
+      await fs.writeFile(bashPath, bashContent, { mode: 0o755 })
+    }
+    created.push(cmd)
+
+    // Windows .cmd shim (strips shim dir from PATH, then execs through sfw).
+    if (isWindows) {
+      const cmdContent =
+        `@echo off\r\n` +
+        `set "PATH=;%PATH%;"\r\n` +
+        `set "PATH=%PATH:;${shimDir};=%"\r\n` +
+        `set "PATH=%PATH:~1,-1%"\r\n` +
+        `"${binaryPath}" "${realBin}" %*\r\n`
+      const cmdPath = path.join(shimDir, `${cmd}.cmd`)
+      if (!existsSync(cmdPath) || await fs.readFile(cmdPath, 'utf8').catch(() => '') !== cmdContent) {
+        await fs.writeFile(cmdPath, cmdContent)
+      }
+    }
+  }
+
+  if (created.length) {
+    logger.log(`Shims: ${created.join(', ')}`)
+    logger.log(`Shim dir: ${shimDir}`)
+    logger.log(`Activate: export PATH="${shimDir}:$PATH"`)
+  } else {
+    logger.warn('No supported package managers found on PATH.')
+  }
+  return true
+}
+
+// ── Main ──
+
+async function main(): Promise<void> {
+  logger.log('Setting up Socket security tools...\n')
+
+  const apiKey = findApiKey()
+
+  const agentshieldOk = setupAgentShield()
+  logger.log('')
+  const zizmorOk = await setupZizmor()
+  logger.log('')
+  const sfwOk = await setupSfw(apiKey)
+  logger.log('')
+
+  logger.log('=== Summary ===')
+  logger.log(`AgentShield: ${agentshieldOk ? 'ready' : 'NOT AVAILABLE'}`)
+  logger.log(`Zizmor: ${zizmorOk ? 'ready' : 'FAILED'}`)
+  logger.log(`SFW: ${sfwOk ? 'ready' : 'FAILED'}`)
+
+  if (agentshieldOk && zizmorOk && sfwOk) {
+    logger.log('\nAll security tools ready.')
+  } else {
+    logger.warn('\nSome tools not available. See above.')
+  }
+}
+
+main().catch((e: unknown) => {
+  logger.error(e instanceof Error ?
e.message : String(e)) + process.exitCode = 1 +}) diff --git a/.claude/hooks/setup-security-tools/package-lock.json b/.claude/hooks/setup-security-tools/package-lock.json new file mode 100644 index 000000000..e5070b268 --- /dev/null +++ b/.claude/hooks/setup-security-tools/package-lock.json @@ -0,0 +1,31 @@ +{ + "name": "@socketsecurity/hook-setup-security-tools", + "lockfileVersion": 3, + "requires": true, + "packages": { + "": { + "name": "@socketsecurity/hook-setup-security-tools", + "dependencies": { + "@socketsecurity/lib": "5.15.0" + } + }, + "node_modules/@socketsecurity/lib": { + "version": "5.15.0", + "resolved": "https://registry.npmjs.org/@socketsecurity/lib/-/lib-5.15.0.tgz", + "integrity": "sha512-+I7+lR0WBCXWgRxMTQx+N70azONVGr68ndi25pz53D6QLdIQ8gfBgOgC34opECXL9lPUqVCMYNr3XFS/bHABIQ==", + "license": "MIT", + "engines": { + "node": ">=22", + "pnpm": ">=10.25.0" + }, + "peerDependencies": { + "typescript": ">=5.0.0" + }, + "peerDependenciesMeta": { + "typescript": { + "optional": true + } + } + } + } +} diff --git a/.claude/hooks/setup-security-tools/package.json b/.claude/hooks/setup-security-tools/package.json new file mode 100644 index 000000000..37fee40ad --- /dev/null +++ b/.claude/hooks/setup-security-tools/package.json @@ -0,0 +1,9 @@ +{ + "name": "@socketsecurity/hook-setup-security-tools", + "private": true, + "type": "module", + "main": "./index.mts", + "dependencies": { + "@socketsecurity/lib": "5.15.0" + } +} diff --git a/.claude/hooks/setup-security-tools/update.mts b/.claude/hooks/setup-security-tools/update.mts new file mode 100644 index 000000000..e9e6d390f --- /dev/null +++ b/.claude/hooks/setup-security-tools/update.mts @@ -0,0 +1,512 @@ +#!/usr/bin/env node +// Update script for Socket security tools. +// +// Checks for new releases of zizmor and sfw, respecting the pnpm +// minimumReleaseAge cooldown (read from pnpm-workspace.yaml) for third-party tools. +// Socket-owned tools (sfw) are excluded from cooldown. 
+// +// Updates embedded checksums in index.mts when new versions are found. + +import { createHash } from 'node:crypto' +import { existsSync, readFileSync, promises as fs } from 'node:fs' +import { tmpdir } from 'node:os' +import path from 'node:path' +import { fileURLToPath } from 'node:url' + +import { httpDownload, httpRequest } from '@socketsecurity/lib/http-request' +import { getDefaultLogger } from '@socketsecurity/lib/logger' +import { spawn } from '@socketsecurity/lib/spawn' + +const logger = getDefaultLogger() + +const __filename = fileURLToPath(import.meta.url) +const __dirname = path.dirname(__filename) +const INDEX_FILE = path.join(__dirname, 'index.mts') + +const MS_PER_MINUTE = 60_000 +const DEFAULT_COOLDOWN_MINUTES = 10_080 + +// Read minimumReleaseAge from pnpm-workspace.yaml (minutes → ms). +function readCooldownMs(): number { + let dir = __dirname + for (let i = 0; i < 10; i += 1) { + const candidate = path.join(dir, 'pnpm-workspace.yaml') + if (existsSync(candidate)) { + try { + const content = readFileSync(candidate, 'utf8') + const match = /^minimumReleaseAge:\s*(\d+)/m.exec(content) + if (match) return Number(match[1]) * MS_PER_MINUTE + } catch { + // Read error. 
+      }
+      logger.warn(`Could not read minimumReleaseAge from ${candidate}, defaulting to ${DEFAULT_COOLDOWN_MINUTES} minutes`)
+      return DEFAULT_COOLDOWN_MINUTES * MS_PER_MINUTE
+    }
+    const parent = path.dirname(dir)
+    if (parent === dir) break
+    dir = parent
+  }
+  logger.warn(`pnpm-workspace.yaml not found, defaulting cooldown to ${DEFAULT_COOLDOWN_MINUTES} minutes`)
+  return DEFAULT_COOLDOWN_MINUTES * MS_PER_MINUTE
+}
+
+const COOLDOWN_MS = readCooldownMs()
+
+// ── GitHub API helpers ──
+
+interface GhRelease {
+  assets: GhAsset[]
+  published_at: string
+  tag_name: string
+}
+
+interface GhAsset {
+  browser_download_url: string
+  name: string
+}
+
+async function ghApiLatestRelease(repo: string): Promise<GhRelease> {
+  const result = await spawn(
+    'gh',
+    ['api', `repos/${repo}/releases/latest`, '--cache', '1h'],
+    { stdio: 'pipe' },
+  )
+  const stdout =
+    typeof result.stdout === 'string'
+      ? result.stdout
+      : result.stdout.toString()
+  return JSON.parse(stdout) as GhRelease
+}
+
+function isOlderThanCooldown(publishedAt: string): boolean {
+  const published = new Date(publishedAt).getTime()
+  return Date.now() - published >= COOLDOWN_MS
+}
+
+function versionFromTag(tag: string): string {
+  return tag.replace(/^v/, '')
+}
+
+// ── Checksum computation ──
+
+async function computeSha256(filePath: string): Promise<string> {
+  const content = await fs.readFile(filePath)
+  return createHash('sha256').update(content).digest('hex')
+}
+
+async function downloadAndHash(url: string): Promise<string> {
+  const tmpFile = path.join(tmpdir(), `security-tools-update-${Date.now()}-${Math.random().toString(36).slice(2)}`)
+  try {
+    await httpDownload(url, tmpFile, { retries: 2 })
+    return await computeSha256(tmpFile)
+  } finally {
+    await fs.unlink(tmpFile).catch(() => {})
+  }
+}
+
+// ── Index file manipulation ──
+
+function readIndexFile(): string {
+  return readFileSync(INDEX_FILE, 'utf8')
+}
+
+async function writeIndexFile(content: string): Promise<void> {
+  await fs.writeFile(INDEX_FILE, content, 'utf8')
+}
+
+function replaceConstant(
+  source: string,
+  name: string,
+  oldValue: string,
+  newValue: string,
+): string {
+  const escaped = oldValue.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
+  const pattern = new RegExp(`(const ${name}\\s*=\\s*')${escaped}'`)
+  return source.replace(pattern, `$1${newValue}'`)
+}
+
+function replaceChecksumValue(
+  source: string,
+  assetName: string,
+  oldHash: string,
+  newHash: string,
+): string {
+  // Match the specific asset line in a checksums object.
+  const escaped = assetName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
+  const pattern = new RegExp(
+    `('${escaped}':\\s*\\n\\s*')${oldHash}'`,
+  )
+  if (pattern.test(source)) {
+    return source.replace(pattern, `$1${newHash}'`)
+  }
+  // Single-line format: 'asset-name': 'hash',
+  const singleLine = new RegExp(
+    `('${escaped}':\\s*')${oldHash}'`,
+  )
+  return source.replace(singleLine, `$1${newHash}'`)
+}
+
+// ── Zizmor update ──
+
+interface UpdateResult {
+  reason: string
+  skipped: boolean
+  tool: string
+  updated: boolean
+}
+
+// Map from index.mts asset names to zizmor release asset names.
+const ZIZMOR_ASSETS: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'zizmor-aarch64-apple-darwin.tar.gz':
+    'zizmor-aarch64-apple-darwin.tar.gz',
+  'zizmor-aarch64-unknown-linux-gnu.tar.gz':
+    'zizmor-aarch64-unknown-linux-gnu.tar.gz',
+  'zizmor-x86_64-apple-darwin.tar.gz':
+    'zizmor-x86_64-apple-darwin.tar.gz',
+  'zizmor-x86_64-pc-windows-msvc.zip':
+    'zizmor-x86_64-pc-windows-msvc.zip',
+  'zizmor-x86_64-unknown-linux-gnu.tar.gz':
+    'zizmor-x86_64-unknown-linux-gnu.tar.gz',
+}
+
+async function updateZizmor(source: string): Promise<{
+  result: UpdateResult
+  source: string
+}> {
+  const tool = 'zizmor'
+  logger.log(`=== Checking ${tool} ===`)
+
+  let release: GhRelease
+  try {
+    release = await ghApiLatestRelease('woodruffw/zizmor')
+  } catch (e) {
+    const msg = e instanceof Error ?
e.message : String(e)
+    logger.warn(`Failed to fetch zizmor releases: ${msg}`)
+    return {
+      result: { tool, skipped: true, updated: false, reason: `API error: ${msg}` },
+      source,
+    }
+  }
+
+  const latestVersion = versionFromTag(release.tag_name)
+  // Extract current version from source.
+  const currentMatch = /const ZIZMOR_VERSION = '([^']+)'/.exec(source)
+  const currentVersion = currentMatch ? currentMatch[1] : ''
+
+  logger.log(`Current: v${currentVersion}, Latest: v${latestVersion}`)
+
+  if (latestVersion === currentVersion) {
+    logger.log('Already current.')
+    return {
+      result: { tool, skipped: false, updated: false, reason: 'already current' },
+      source,
+    }
+  }
+
+  // Respect cooldown for third-party tools.
+  if (!isOlderThanCooldown(release.published_at)) {
+    const daysOld = ((Date.now() - new Date(release.published_at).getTime()) / 86_400_000).toFixed(1)
+    const cooldownDays = (COOLDOWN_MS / 86_400_000).toFixed(0)
+    logger.log(`v${latestVersion} is only ${daysOld} days old (need ${cooldownDays}). Skipping.`)
+    return {
+      result: { tool, skipped: true, updated: false, reason: `too new (${daysOld} days, need ${cooldownDays})` },
+      source,
+    }
+  }
+
+  logger.log(`Updating to v${latestVersion}...`)
+
+  // Try to get checksums from the release's checksums.txt asset first.
+  let checksumMap: Record<string, string> | undefined
+  const checksumsAsset = release.assets.find(a => a.name === 'checksums.txt')
+  if (checksumsAsset) {
+    try {
+      const resp = await httpRequest(checksumsAsset.browser_download_url)
+      if (resp.ok) {
+        checksumMap = { __proto__: null } as unknown as Record<string, string>
+        for (const line of resp.text().split('\n')) {
+          const match = /^([a-f0-9]{64})\s+(.+)$/.exec(line.trim())
+          if (match) {
+            checksumMap[match[2]!] = match[1]!
+          }
+        }
+      }
+    } catch {
+      // Fall through to per-asset download.
+    }
+  }
+
+  // Compute checksums for each platform asset.
+ let updated = source + let allFound = true + for (const assetName of Object.keys(ZIZMOR_ASSETS)) { + let newHash: string | undefined + + // Try checksums.txt first. + if (checksumMap && checksumMap[assetName]) { + newHash = checksumMap[assetName] + } else { + // Download and compute. + const asset = release.assets.find(a => a.name === assetName) + if (!asset) { + logger.warn(` Asset not found in release: ${assetName}`) + allFound = false + continue + } + logger.log(` Computing checksum for ${assetName}...`) + try { + newHash = await downloadAndHash(asset.browser_download_url) + } catch (e) { + const msg = e instanceof Error ? e.message : String(e) + logger.warn(` Failed to download ${assetName}: ${msg}`) + allFound = false + continue + } + } + + if (!newHash) { + allFound = false + continue + } + + // Find and replace the old hash. + const oldHashMatch = new RegExp( + `'${assetName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}':\\s*\\n\\s*'([a-f0-9]{64})'`, + ).exec(updated) + const oldHashSingle = new RegExp( + `'${assetName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}':\\s*'([a-f0-9]{64})'`, + ).exec(updated) + const oldHash = oldHashMatch?.[1] ?? oldHashSingle?.[1] + if (oldHash && oldHash !== newHash) { + updated = replaceChecksumValue(updated, assetName, oldHash, newHash) + logger.log(` ${assetName}: ${oldHash.slice(0, 12)}... -> ${newHash.slice(0, 12)}...`) + } else if (oldHash === newHash) { + logger.log(` ${assetName}: unchanged`) + } + } + + if (!allFound) { + logger.warn('Some assets could not be verified. Skipping version bump.') + return { + result: { tool, skipped: true, updated: false, reason: 'incomplete asset checksums' }, + source, + } + } + + // Update version constant. 
+  updated = replaceConstant(updated, 'ZIZMOR_VERSION', currentVersion!, latestVersion)
+  logger.log(`Updated ZIZMOR_VERSION: ${currentVersion} -> ${latestVersion}`)
+
+  return {
+    result: { tool, skipped: false, updated: true, reason: `${currentVersion} -> ${latestVersion}` },
+    source: updated,
+  }
+}
+
+// ── SFW update ──
+
+const SFW_FREE_ASSET_NAMES: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'linux-arm64': 'sfw-free-linux-arm64',
+  'linux-x86_64': 'sfw-free-linux-x86_64',
+  'macos-arm64': 'sfw-free-macos-arm64',
+  'macos-x86_64': 'sfw-free-macos-x86_64',
+  'windows-x86_64': 'sfw-free-windows-x86_64.exe',
+}
+
+const SFW_ENTERPRISE_ASSET_NAMES: Record<string, string> = {
+  __proto__: null as unknown as string,
+  'linux-arm64': 'sfw-linux-arm64',
+  'linux-x86_64': 'sfw-linux-x86_64',
+  'macos-arm64': 'sfw-macos-arm64',
+  'macos-x86_64': 'sfw-macos-x86_64',
+  'windows-x86_64': 'sfw-windows-x86_64.exe',
+}
+
+async function fetchSfwChecksums(
+  repo: string,
+  label: string,
+  assetNames: Record<string, string>,
+  currentChecksums: Record<string, string>,
+): Promise<{
+  checksums: Record<string, string>
+  changed: boolean
+}> {
+  let release: GhRelease
+  try {
+    release = await ghApiLatestRelease(repo)
+  } catch (e) {
+    const msg = e instanceof Error ? e.message : String(e)
+    logger.warn(`Failed to fetch ${label} releases: ${msg}`)
+    return { checksums: currentChecksums, changed: false }
+  }
+
+  logger.log(`  ${label}: latest ${release.tag_name} (published ${release.published_at.slice(0, 10)})`)
+
+  const newChecksums: Record<string, string> = { __proto__: null } as unknown as Record<string, string>
+  let changed = false
+
+  for (const { 0: platform, 1: assetName } of Object.entries(assetNames)) {
+    const asset = release.assets.find(a => a.name === assetName)
+    if (!asset) {
+      // Asset not listed; fall back to the stable /releases/latest/download/ URL.
+      const url = `https://github.com/${repo}/releases/latest/download/${assetName}`
+      logger.log(`  Computing checksum for ${assetName}...`)
+      try {
+        const hash = await downloadAndHash(url)
+        newChecksums[platform] = hash
+        if (currentChecksums[platform] !== hash) {
+          logger.log(`  ${platform}: ${(currentChecksums[platform] ?? '').slice(0, 12)}... -> ${hash.slice(0, 12)}...`)
+          changed = true
+        }
+      } catch (e) {
+        const msg = e instanceof Error ? e.message : String(e)
+        logger.warn(`  Failed to download ${assetName}: ${msg}`)
+        newChecksums[platform] = currentChecksums[platform] ?? ''
+      }
+    } else {
+      logger.log(`  Computing checksum for ${assetName}...`)
+      try {
+        const hash = await downloadAndHash(asset.browser_download_url)
+        newChecksums[platform] = hash
+        if (currentChecksums[platform] !== hash) {
+          logger.log(`  ${platform}: ${(currentChecksums[platform] ?? '').slice(0, 12)}... -> ${hash.slice(0, 12)}...`)
+          changed = true
+        }
+      } catch (e) {
+        const msg = e instanceof Error ? e.message : String(e)
+        logger.warn(`  Failed to download ${assetName}: ${msg}`)
+        newChecksums[platform] = currentChecksums[platform] ?? ''
+      }
+    }
+  }
+
+  return { checksums: newChecksums, changed }
+}
+
+function extractChecksums(
+  source: string,
+  objectName: string,
+): Record<string, string> {
+  const result: Record<string, string> = { __proto__: null } as unknown as Record<string, string>
+  // Find the object in source.
+  const objPattern = new RegExp(
+    `const ${objectName}[^{]*\\{[^}]*?(?:'([^']+)':\\s*'([a-f0-9]{64})'[,\\s]*)+`,
+    's',
+  )
+  const objMatch = objPattern.exec(source)
+  if (!objMatch) return result
+
+  const block = objMatch[0]
+  const entryPattern = /'([^']+)':\s*\n?\s*'([a-f0-9]{64})'/g
+  let match: RegExpExecArray | null
+  while ((match = entryPattern.exec(block)) !== null) {
+    if (match[1] !== '__proto__') {
+      result[match[1]!] = match[2]!
+ } + } + return result +} + +async function updateSfw(source: string): Promise<{ + results: UpdateResult[] + source: string +}> { + logger.log('=== Checking SFW ===') + // Socket-owned tools: no cooldown. + logger.log('Socket-owned tool: cooldown excluded.') + + const results: UpdateResult[] = [] + + // Extract current checksums from source. + const currentFree = extractChecksums(source, 'SFW_FREE_CHECKSUMS') + const currentEnterprise = extractChecksums(source, 'SFW_ENTERPRISE_CHECKSUMS') + + // Check sfw-free. + logger.log('') + const free = await fetchSfwChecksums( + 'SocketDev/sfw-free', + 'sfw-free', + SFW_FREE_ASSET_NAMES, + currentFree, + ) + + let updated = source + if (free.changed) { + for (const { 0: platform, 1: hash } of Object.entries(free.checksums)) { + if (currentFree[platform] && currentFree[platform] !== hash) { + updated = replaceChecksumValue(updated, platform, currentFree[platform]!, hash) + } + } + results.push({ tool: 'sfw-free', skipped: false, updated: true, reason: 'checksums updated' }) + } else { + results.push({ tool: 'sfw-free', skipped: false, updated: false, reason: 'already current' }) + } + + // Check sfw enterprise. 
+  logger.log('')
+  const enterprise = await fetchSfwChecksums(
+    'SocketDev/firewall-release',
+    'sfw-enterprise',
+    SFW_ENTERPRISE_ASSET_NAMES,
+    currentEnterprise,
+  )
+
+  if (enterprise.changed) {
+    for (const { 0: platform, 1: hash } of Object.entries(enterprise.checksums)) {
+      if (currentEnterprise[platform] && currentEnterprise[platform] !== hash) {
+        updated = replaceChecksumValue(updated, platform, currentEnterprise[platform]!, hash)
+      }
+    }
+    results.push({ tool: 'sfw-enterprise', skipped: false, updated: true, reason: 'checksums updated' })
+  } else {
+    results.push({ tool: 'sfw-enterprise', skipped: false, updated: false, reason: 'already current' })
+  }
+
+  return { results, source: updated }
+}
+
+// ── Main ──
+
+async function main(): Promise<void> {
+  logger.log('Checking for security tool updates...\n')
+
+  let source = readIndexFile()
+  const allResults: UpdateResult[] = []
+
+  // 1. Check zizmor (third-party, respects cooldown).
+  const zizmor = await updateZizmor(source)
+  source = zizmor.source
+  allResults.push(zizmor.result)
+  logger.log('')
+
+  // 2. Check sfw (Socket-owned, no cooldown).
+  const sfw = await updateSfw(source)
+  source = sfw.source
+  allResults.push(...sfw.results)
+  logger.log('')
+
+  // Write updated index.mts if anything changed.
+  const anyUpdated = allResults.some(r => r.updated)
+  if (anyUpdated) {
+    await writeIndexFile(source)
+    logger.log('Updated index.mts with new checksums.\n')
+  }
+
+  // Report.
+  logger.log('=== Summary ===')
+  for (const r of allResults) {
+    const status = r.updated ? 'UPDATED' : r.skipped ? 'SKIPPED' : 'CURRENT'
+    logger.log(`  ${r.tool}: ${status} (${r.reason})`)
+  }
+
+  if (!anyUpdated) {
+    logger.log('\nNo updates needed.')
+  }
+}
+
+main().catch((e: unknown) => {
+  logger.error(e instanceof Error ?
e.message : String(e)) + process.exitCode = 1 +}) diff --git a/.claude/settings.json b/.claude/settings.json new file mode 100644 index 000000000..ac130fc10 --- /dev/null +++ b/.claude/settings.json @@ -0,0 +1,15 @@ +{ + "hooks": { + "PreToolUse": [ + { + "matcher": "Edit|Write", + "hooks": [ + { + "type": "command", + "command": "node .claude/hooks/check-new-deps/index.mts" + } + ] + } + ] + } +} diff --git a/.claude/skills/security-scan/SKILL.md b/.claude/skills/security-scan/SKILL.md index 161fb5bfa..640bf210d 100644 --- a/.claude/skills/security-scan/SKILL.md +++ b/.claude/skills/security-scan/SKILL.md @@ -7,6 +7,13 @@ description: Runs a multi-tool security scan — AgentShield for Claude config, Multi-tool security scanning pipeline for the repository. +## Related: check-new-deps Hook + +This repo includes a pre-tool hook (`.claude/hooks/check-new-deps/`) that automatically +checks new dependencies against Socket.dev's malware API before Claude adds them. +The hook runs on every Edit/Write to manifest files — see its README for details. +This skill covers broader security scanning; the hook provides real-time dependency protection. + ## When to Use - After modifying `.claude/` config, settings, hooks, or agent definitions diff --git a/.claude/skills/updating/SKILL.md b/.claude/skills/updating/SKILL.md index 8d8d3b207..f8d50f96f 100644 --- a/.claude/skills/updating/SKILL.md +++ b/.claude/skills/updating/SKILL.md @@ -26,10 +26,13 @@ Your task is to update all dependencies in socket-cli: npm packages via `pnpm ru 1. **Validate Environment** - Verify clean working directory; detect CI vs interactive mode. 2. **Update npm Packages** - Run `pnpm run update`; commit if changes detected. 3. **Update External Tool Checksums** - Invoke the `updating-checksums` skill. +3b. **Update Security Tools** - Run `node .claude/hooks/setup-security-tools/update.mts` to check for new zizmor/sfw releases. 
Respects pnpm `minimumReleaseAge` cooldown for third-party tools (zizmor) but updates Socket tools (sfw) immediately. Updates embedded checksums in the setup hook. +3c. **Sync Claude Code version** - Run `claude --version` to get the installed version. If it's newer than the `@anthropic-ai/claude-code` entry in `pnpm-workspace.yaml` catalog, update both the catalog entry AND the `minimumReleaseAgeExclude` pinned version. This bypasses cooldown since we're the ones running it. Then run `pnpm install` to update the lockfile. 4. **Final Validation** - In interactive mode: `pnpm run fix --all`, `pnpm run check --all`, `pnpm test`. Skipped in CI. 5. **Report Summary** - List updates applied, commits created, validation results, and next steps. ## Coordinates - `updating-checksums` skill for external tool checksums +- `node .claude/hooks/setup-security-tools/update.mts` for security tool version updates - `pnpm run update` for npm packages diff --git a/.git-hooks/pre-push b/.git-hooks/pre-push new file mode 100755 index 000000000..2bf9a70e6 --- /dev/null +++ b/.git-hooks/pre-push @@ -0,0 +1,222 @@ +#!/bin/bash +# Socket Security Pre-push Hook +# Security enforcement layer for all pushes. +# Validates all commits being pushed for security issues and AI attribution. + +set -e + +# Colors for output. +RED='\033[0;31m' +YELLOW='\033[1;33m' +GREEN='\033[0;32m' +NC='\033[0m' + +printf "${GREEN}Running mandatory pre-push validation...${NC}\n" + +# Allowed public API key (used in socket-lib). +ALLOWED_PUBLIC_KEY="sktsec_t_--RAN5U4ivauy4w37-6aoKyYPDt5ZbaT5JBVMqiwKo_api" + +# Get the remote name and URL. 
+remote="$1" +url="$2" + +TOTAL_ERRORS=0 + +# ============================================================================ +# PRE-CHECK 1: AgentShield scan on Claude config (blocks push on failure) +# ============================================================================ +if command -v agentshield >/dev/null 2>&1 || [ -x "$(pnpm bin 2>/dev/null)/agentshield" ]; then + AGENTSHIELD="$(command -v agentshield 2>/dev/null || echo "$(pnpm bin)/agentshield")" + if ! "$AGENTSHIELD" scan --quiet 2>/dev/null; then + printf "${RED}✗ AgentShield: security issues found in Claude config${NC}\n" + printf "Run 'pnpm exec agentshield scan' for details\n" + TOTAL_ERRORS=$((TOTAL_ERRORS + 1)) + fi +fi + +# ============================================================================ +# PRE-CHECK 2: zizmor scan on GitHub Actions workflows +# ============================================================================ +ZIZMOR="" +if command -v zizmor >/dev/null 2>&1; then + ZIZMOR="$(command -v zizmor)" +elif [ -x "$HOME/.socket/zizmor/bin/zizmor" ]; then + ZIZMOR="$HOME/.socket/zizmor/bin/zizmor" +fi +if [ -n "$ZIZMOR" ] && [ -d ".github/" ]; then + if ! "$ZIZMOR" .github/ 2>/dev/null; then + printf "${RED}✗ Zizmor: workflow security issues found${NC}\n" + printf "Run 'zizmor .github/' for details\n" + TOTAL_ERRORS=$((TOTAL_ERRORS + 1)) + fi +fi + +# Read stdin for refs being pushed. +while read local_ref local_sha remote_ref remote_sha; do + # Skip tag pushes: tags point to existing commits already validated. + if echo "$local_ref" | grep -q '^refs/tags/'; then + printf "${GREEN}Skipping tag push: %s${NC}\n" "$local_ref" + continue + fi + + # Skip delete pushes. + if [ "$local_sha" = "0000000000000000000000000000000000000000" ]; then + continue + fi + + # Get the range of commits being pushed. + if [ "$remote_sha" = "0000000000000000000000000000000000000000" ]; then + # New branch - only check commits not on the default remote branch. 
+ default_branch=$(git symbolic-ref refs/remotes/origin/HEAD 2>/dev/null | sed 's@^refs/remotes/origin/@@') + if [ -z "$default_branch" ]; then + default_branch="main" + fi + if git rev-parse "origin/$default_branch" >/dev/null 2>&1; then + range="origin/$default_branch..$local_sha" + else + # No remote default branch, fall back to release tag. + latest_release=$(git tag --list 'v*' --sort=-version:refname --merged "$local_sha" | head -1) + if [ -n "$latest_release" ]; then + range="$latest_release..$local_sha" + else + range="$local_sha" + fi + fi + else + # Existing branch - check new commits since remote. + # Limit scope to commits after the latest published release on this branch. + latest_release=$(git tag --list 'v*' --sort=-version:refname --merged "$remote_sha" | head -1) + if [ -n "$latest_release" ]; then + # Only check commits after the latest release that are being pushed. + range="$latest_release..$local_sha" + else + # No release tags found, check new commits only. + range="$remote_sha..$local_sha" + fi + fi + + # Validate the computed range before using it. + if ! git rev-list "$range" >/dev/null 2>&1; then + printf "${RED}✗ Invalid commit range: %s${NC}\n" "$range" >&2 + exit 1 + fi + + ERRORS=0 + + # ============================================================================ + # CHECK 1: Scan commit messages for AI attribution + # ============================================================================ + printf "Checking commit messages for AI attribution...\n" + + # Check each commit in the range for AI patterns. 
+ while IFS= read -r commit_sha; do + full_msg=$(git log -1 --format='%B' "$commit_sha") + + if echo "$full_msg" | grep -qiE "(Generated with.*(Claude|AI)|Co-Authored-By: Claude|Co-Authored-By: AI|🤖 Generated|AI generated|@anthropic\.com|Assistant:|Generated by Claude|Machine generated)"; then + if [ $ERRORS -eq 0 ]; then + printf "${RED}✗ BLOCKED: AI attribution found in commit messages!${NC}\n" + printf "Commits with AI attribution:\n" + fi + printf " - %s\n" "$(git log -1 --oneline "$commit_sha")" + ERRORS=$((ERRORS + 1)) + fi + done < <(git rev-list "$range") + + if [ $ERRORS -gt 0 ]; then + printf "\n" + printf "These commits were likely created with --no-verify, bypassing the\n" + printf "commit-msg hook that strips AI attribution.\n" + printf "\n" + printf "To fix:\n" + printf " git rebase -i %s\n" "$remote_sha" + printf " Mark commits as 'reword', remove AI attribution, save\n" + printf " git push\n" + fi + + # ============================================================================ + # CHECK 2: File content security checks + # ============================================================================ + printf "Checking files for security issues...\n" + + # Get all files changed in these commits. + CHANGED_FILES=$(git diff --name-only "$range" 2>/dev/null || echo "") + + if [ -n "$CHANGED_FILES" ]; then + # Check for sensitive files. + if echo "$CHANGED_FILES" | grep -qE '^\.env(\.local)?$'; then + printf "${RED}✗ BLOCKED: Attempting to push .env file!${NC}\n" + printf "Files: %s\n" "$(echo "$CHANGED_FILES" | grep -E '^\.env(\.local)?$')" + ERRORS=$((ERRORS + 1)) + fi + + # Check for .DS_Store. + if echo "$CHANGED_FILES" | grep -q '\.DS_Store'; then + printf "${RED}✗ BLOCKED: .DS_Store file in push!${NC}\n" + printf "Files: %s\n" "$(echo "$CHANGED_FILES" | grep '\.DS_Store')" + ERRORS=$((ERRORS + 1)) + fi + + # Check for log files. 
+    if echo "$CHANGED_FILES" | grep -E '\.log$' | grep -v 'test.*\.log' | grep -q .; then
+      printf "${RED}✗ BLOCKED: Log file in push!${NC}\n"
+      printf "Files: %s\n" "$(echo "$CHANGED_FILES" | grep -E '\.log$' | grep -v 'test.*\.log')"
+      ERRORS=$((ERRORS + 1))
+    fi
+
+    # Check file contents for secrets.
+    while IFS= read -r file; do
+      if [ -f "$file" ]; then
+        # Skip test files, example files, and hook scripts.
+        if echo "$file" | grep -qE '\.(test|spec)\.(m?[jt]s|tsx?)$|\.example$|/test/|/tests/|fixtures/|\.git-hooks/|\.husky/'; then
+          continue
+        fi
+
+        # Check for hardcoded user paths. ([[:space:]] is used instead of
+        # \s, which is not special inside POSIX bracket expressions.)
+        if grep -E '(/Users/[^/[:space:]]+/|/home/[^/[:space:]]+/|C:\\Users\\[^\\]+\\)' "$file" 2>/dev/null | grep -q .; then
+          printf "${RED}✗ BLOCKED: Hardcoded personal path found in: %s${NC}\n" "$file"
+          grep -n -E '(/Users/[^/[:space:]]+/|/home/[^/[:space:]]+/|C:\\Users\\[^\\]+\\)' "$file" | head -3
+          ERRORS=$((ERRORS + 1))
+        fi
+
+        # Check for Socket API keys.
+        if grep -E 'sktsec_[a-zA-Z0-9_-]+' "$file" 2>/dev/null | grep -v "$ALLOWED_PUBLIC_KEY" | grep -v 'your_api_key_here' | grep -v 'SOCKET_SECURITY_API_KEY=' | grep -v 'fake-token' | grep -v 'test-token' | grep -q .; then
+          printf "${RED}✗ BLOCKED: Real API key detected in: %s${NC}\n" "$file"
+          grep -n 'sktsec_' "$file" | grep -v "$ALLOWED_PUBLIC_KEY" | grep -v 'your_api_key_here' | grep -v 'fake-token' | grep -v 'test-token' | head -3
+          ERRORS=$((ERRORS + 1))
+        fi
+
+        # Check for AWS keys.
+        if grep -iE '(aws_access_key|aws_secret|AKIA[0-9A-Z]{16})' "$file" 2>/dev/null | grep -q .; then
+          printf "${RED}✗ BLOCKED: Potential AWS credentials found in: %s${NC}\n" "$file"
+          grep -n -iE '(aws_access_key|aws_secret|AKIA[0-9A-Z]{16})' "$file" | head -3
+          ERRORS=$((ERRORS + 1))
+        fi
+
+        # Check for GitHub tokens.
+ if grep -E 'gh[ps]_[a-zA-Z0-9]{36}' "$file" 2>/dev/null | grep -q .; then + printf "${RED}✗ BLOCKED: Potential GitHub token found in: %s${NC}\n" "$file" + grep -n -E 'gh[ps]_[a-zA-Z0-9]{36}' "$file" | head -3 + ERRORS=$((ERRORS + 1)) + fi + + # Check for private keys. + if grep -E '-----BEGIN (RSA |EC |DSA )?PRIVATE KEY-----' "$file" 2>/dev/null | grep -q .; then + printf "${RED}✗ BLOCKED: Private key found in: %s${NC}\n" "$file" + ERRORS=$((ERRORS + 1)) + fi + fi + done <<< "$CHANGED_FILES" + fi + + TOTAL_ERRORS=$((TOTAL_ERRORS + ERRORS)) +done + +if [ $TOTAL_ERRORS -gt 0 ]; then + printf "\n" + printf "${RED}✗ Push blocked by mandatory validation!${NC}\n" + printf "Fix the issues above before pushing.\n" + exit 1 +fi + +printf "${GREEN}✓ All mandatory validation passed!${NC}\n" +exit 0 diff --git a/.gitignore b/.gitignore index 240ac97eb..0da4c3ab5 100644 --- a/.gitignore +++ b/.gitignore @@ -77,7 +77,9 @@ yarn-error.log* /.claude/* !/.claude/agents/ !/.claude/commands/ +!/.claude/hooks/ !/.claude/ops/ +!/.claude/settings.json !/.claude/skills/ # ============================================================================ diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index df3f6b4bf..c857e39ec 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -7,8 +7,8 @@ settings: catalogs: default: '@anthropic-ai/claude-code': - specifier: 2.1.92 - version: 2.1.92 + specifier: 2.1.98 + version: 2.1.98 '@babel/core': specifier: 7.28.4 version: 7.28.4 @@ -97,8 +97,8 @@ catalogs: specifier: 2.0.2 version: 2.0.2 '@socketsecurity/sdk': - specifier: 3.4.1 - version: 3.4.1 + specifier: 4.0.0 + version: 4.0.0 '@types/adm-zip': specifier: 0.5.7 version: 0.5.7 @@ -335,7 +335,7 @@ importers: devDependencies: '@anthropic-ai/claude-code': specifier: 'catalog:' - version: 2.1.92 + version: 2.1.98 '@babel/core': specifier: 'catalog:' version: 7.28.4 @@ -425,7 +425,7 @@ importers: version: 2.0.2(typescript@5.9.3) '@socketsecurity/sdk': specifier: 'catalog:' - version: 
3.4.1(typescript@5.9.3) + version: 4.0.0(typescript@5.9.3) '@types/cmd-shim': specifier: 'catalog:' version: 5.0.2 @@ -665,7 +665,7 @@ importers: version: 2.0.2(typescript@5.9.3) '@socketsecurity/sdk': specifier: 'catalog:' - version: 3.4.1(typescript@5.9.3) + version: 4.0.0(typescript@5.9.3) '@types/adm-zip': specifier: 'catalog:' version: 0.5.7 @@ -794,8 +794,8 @@ packages: engines: {node: '>=20'} hasBin: true - '@anthropic-ai/claude-code@2.1.92': - resolution: {integrity: sha512-mNGw/IK3+1yHsQBeKaNtdTPCrQDkUEuNTJtm3OBTXs4bBkUVdIgRme/34ZnbZkl2VMMYPoNaTvqX2qJZ9EdSxQ==} + '@anthropic-ai/claude-code@2.1.98': + resolution: {integrity: sha512-qecREauMWXHplkpjqsuDuUv4ww+NprMl71k9sMuLkZU7qwjLMkTPxRBjuKvZWWMrAPvZWdGZE9LljUTfCQ1lWQ==} engines: {node: '>=18.0.0'} hasBin: true @@ -2242,10 +2242,6 @@ packages: resolution: {integrity: sha512-DM81ydAjO2GJKkNf2Vn17InJ37sEYLK1YyhxpDX16OdbOpYlsDIw8QyeFEUZtc7GqsQXbcPKJmz3j/2qS+BhKQ==} engines: {node: '>=18'} - '@socketregistry/packageurl-js@1.3.5': - resolution: {integrity: sha512-Fl4GNUJ/z3IBJBGj4IsJfuRGUBCRMgX0df0mb5x5buaCPDKC+NhMhAFuxpc3viLSHV12CO2rGaNCf4fBYWI0FA==} - engines: {node: '>=18', pnpm: '>=10.16.0'} - '@socketregistry/packageurl-js@1.4.1': resolution: {integrity: sha512-t/UrOd1DMYXcGuKo2v07WMbuHCMlKBKOriTHu4cn9OIxfj1qWKoF/kpOswGHOWkG5zwj2Ke/2+qLiDugmx5z+A==} engines: {node: '>=18.20.4', pnpm: '>=10.25.0'} @@ -2292,9 +2288,9 @@ packages: typescript: optional: true - '@socketsecurity/sdk@3.4.1': - resolution: {integrity: sha512-Znpqi0GPBNk1j6QzKzcnP069Umpdn4mOuYtalux1qnz8/9X7CEcOFk8z8gUwaeQfsfwSP4NEgRcQvZZDkcg8wQ==} - engines: {node: '>=18', pnpm: '>=10.25.0'} + '@socketsecurity/sdk@4.0.0': + resolution: {integrity: sha512-e7MAVhjkeCMVoqYC8lmFk8GdwlNp8ZYTq9izkOrFf2ZZJMPaREC83lbk0xKTYIJKc09lxVhFLYLtDT/n4LgA4A==} + engines: {node: '>=18.20.8', pnpm: '>=10.33.0'} '@standard-schema/spec@1.0.0': resolution: {integrity: sha512-m2bOd0f2RT9k8QJx1JN85cZYyH1RqFBdlwtkSlf4tBDYLCiiZnv1fIIwacK6cqwXavOydf0NPToMQgpKq+dVlA==} @@ 
-4634,7 +4630,7 @@ snapshots: tinyexec: 1.0.2 tinyglobby: 0.2.15 - '@anthropic-ai/claude-code@2.1.92': + '@anthropic-ai/claude-code@2.1.98': optionalDependencies: '@img/sharp-darwin-arm64': 0.34.5 '@img/sharp-darwin-x64': 0.34.5 @@ -5978,8 +5974,6 @@ snapshots: '@socketregistry/isarray@1.0.8': {} - '@socketregistry/packageurl-js@1.3.5': {} - '@socketregistry/packageurl-js@1.4.1': dependencies: picomatch: 4.0.3 @@ -6011,9 +6005,8 @@ snapshots: optionalDependencies: typescript: 5.9.3 - '@socketsecurity/sdk@3.4.1(typescript@5.9.3)': + '@socketsecurity/sdk@4.0.0(typescript@5.9.3)': dependencies: - '@socketregistry/packageurl-js': 1.3.5 '@socketsecurity/lib': 5.15.0(typescript@5.9.3) form-data: 4.0.5 transitivePeerDependencies: diff --git a/pnpm-workspace.yaml b/pnpm-workspace.yaml index 3a26a4c69..6b6423870 100644 --- a/pnpm-workspace.yaml +++ b/pnpm-workspace.yaml @@ -1,7 +1,7 @@ # Wait 7 days (10080 minutes) before installing newly published packages. minimumReleaseAge: 10080 minimumReleaseAgeExclude: - - '@anthropic-ai/claude-code@2.1.92' + - '@anthropic-ai/claude-code@2.1.98' - '@socketaddon/*' - '@socketbin/*' - '@socketregistry/*' @@ -12,7 +12,7 @@ packages: - '!packages/package-builder/build' catalog: - '@anthropic-ai/claude-code': 2.1.92 + '@anthropic-ai/claude-code': 2.1.98 '@babel/core': 7.28.4 '@babel/generator': 7.28.5 '@babel/parser': 7.28.4 @@ -47,7 +47,7 @@ catalog: '@socketsecurity/config': 3.0.1 '@socketsecurity/lib': 5.15.0 '@socketsecurity/registry': 2.0.2 - '@socketsecurity/sdk': 3.4.1 + '@socketsecurity/sdk': 4.0.0 '@types/adm-zip': 0.5.7 '@types/cmd-shim': 5.0.2 '@types/js-yaml': 4.0.9