mirror of
https://github.com/supabase/supabase.git
synced 2026-05-10 02:39:56 -04:00
b3dc867f90
Proposed mitigation. The obvious tradeoff is that clearing the cache makes compilation slower on subsequent dev server starts, but more consistent.

Various people have observed `next-server` using up to ~34 GB of memory. I've observed 12.59 GB of memory, with ~1.5k `postcss` processes:

```
ps aux | grep postcss | grep -v grep | wc -l
1526
```

After clearing the cache, this goes down to 3 `postcss` processes and 4.71 GB of memory:

```
ps aux | grep postcss | grep -v grep | wc -l
3
```

<!-- This is an auto-generated comment: release notes by coderabbit.ai -->
## Summary by CodeRabbit

* **Chores**
  * Development infrastructure: adds an automated pre-development step that clears the local dev cache before starting the development server by introducing a new lifecycle hook and supporting cleanup script; purely maintenance-oriented with no user-facing changes or functional impact.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: Ivan Vasilov <vasilov.ivan@gmail.com>
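The "lifecycle hook" mentioned in the summary likely relies on npm's built-in `pre<script>` convention, where a script named `predev` runs automatically before `dev`. A sketch of what the wiring could look like in `package.json` (the script path and names here are illustrative, not necessarily what this PR uses):

```json
{
  "scripts": {
    "predev": "node ./scripts/clean-turbopack-cache.mjs",
    "dev": "next dev"
  }
}
```

With this in place, `npm run dev` first executes the cleanup script, then starts the dev server.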
22 lines
688 B
JavaScript
import { existsSync, readdirSync, rmSync, statSync } from 'fs'
import { join } from 'path'

// This script cleans up the Turbopack cache by removing files that haven't been modified in the last 3 days. This is to
// prevent the cache from growing indefinitely and consuming too much RAM.
const dir = '.next/dev/cache/turbopack'
const cutoff = Date.now() - 3 * 24 * 60 * 60 * 1000 // 3 days in milliseconds

function clean(d) {
  if (!existsSync(d)) return
  for (const entry of readdirSync(d, { withFileTypes: true })) {
    const p = join(d, entry.name)
    if (entry.isDirectory()) {
      clean(p)
    } else if (statSync(p).mtimeMs < cutoff) {
      rmSync(p)
    }
  }
}

clean(dir)