JavaScript Bundle Size: How to Shrink It by 60% Without Removing a Single Feature
JavaScript is the most expensive resource type on the web — not because of download time, but because of parse and execution time. A 500KB JavaScript bundle requires the browser to:
- Download it (fast on broadband, slow on 3G)
- Parse it (CPU-intensive, blocks the main thread)
- Compile it to machine code (more CPU)
- Execute it (your actual code finally runs)
For users on mid-range Android devices — which is the majority of the global market — a large JavaScript bundle is the difference between a 2-second and a 6-second interactive time. This is not a performance optimization exercise. It is a UX and conversion rate exercise.
Step 1: Measure Before You Optimize
You cannot optimize what you haven't measured. Start with Next.js's built-in bundle analysis:
```bash
# Install the analyzer
npm install --save-dev @next/bundle-analyzer

# Generate the analysis (opens a browser automatically)
ANALYZE=true npm run build
```
Configure it in next.config.mjs:
```js
// next.config.mjs
import bundleAnalyzer from '@next/bundle-analyzer';

const withBundleAnalyzer = bundleAnalyzer({
  enabled: process.env.ANALYZE === 'true',
  openAnalyzer: true,
});

export default withBundleAnalyzer({
  // Your normal Next.js config
});
```
The analyzer produces an interactive treemap. Look for:
- Large rectangles: dependencies taking up disproportionate space
- Duplicated rectangles: the same library appearing multiple times (different versions or un-deduplicated imports)
- Known-heavy packages: moment.js, lodash, date-fns full bundle, pdf.js
The Typical Offenders
After analyzing hundreds of Next.js bundles, these are the most common heavy packages that can be replaced or trimmed:
| Package | Typical Size | Leaner Alternative |
|---|---|---|
| moment | 329KB | date-fns (tree-shakeable, ~20KB per function) |
| lodash (full) | 71KB | lodash-es + tree shaking, or individual methods |
| xlsx / SheetJS | 800KB+ | Lazy-load on user interaction only |
| pdf.js | 600KB+ | Dynamic import only when needed |
| recharts | 200KB | recharts lazy-loaded, or visx with tree shaking |
| aws-sdk (v2, full) | 400KB+ | @aws-sdk/client-s3 and other v3 modular clients |
| firebase (full) | 300KB+ | Individual Firebase packages |
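For simple cases you may not need date-fns either: the native Date and Intl APIs cover common formatting at 0KB added. A sketch of replacing a typical moment chain with built-ins (the date and format choices here are illustrative):

```javascript
// Native replacement for moment('2024-01-15').add(3, 'days').format('MMM D, YYYY')
const addDays = (date, days) =>
  new Date(date.getTime() + days * 86_400_000);

const due = addDays(new Date('2024-01-15T00:00:00Z'), 3);
const label = new Intl.DateTimeFormat('en-US', {
  month: 'short',
  day: 'numeric',
  year: 'numeric',
  timeZone: 'UTC', // pin the zone so the output is deterministic
}).format(due);

console.log(label); // "Jan 18, 2024"
```

This only pays off for simple arithmetic and formatting; once you need parsing of arbitrary formats or timezone math, a library earns its bytes.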
Code Splitting: Route Level and Beyond
Next.js automatically code-splits at the page level. But within pages, heavy components (charts, editors, file uploaders) should be split further:
```mermaid
flowchart TD
    A[User visits /dashboard] --> B[Dashboard JS loaded]
    B --> C{User clicks Analytics tab?}
    C -->|No| D[Heavy chart library never loaded]
    C -->|Yes| E[Dynamic import triggers]
    E --> F[Chart library downloads and renders]
    style D fill:#22c55e,color:#fff
    style F fill:#3b82f6,color:#fff
```
```tsx
// components/Dashboard.tsx
import dynamic from 'next/dynamic';
import { Suspense, useState } from 'react';

// ✅ Only loads when this component renders
const HeavyChartComponent = dynamic(() => import('./AnalyticsChart'), {
  loading: () => <ChartSkeleton />,
  ssr: false, // Charts often don't need SSR
});

// ✅ Even more control: load only on interaction
const PdfExporter = dynamic(() => import('./PdfExporter'), {
  ssr: false,
});

export function Dashboard() {
  const [showPdf, setShowPdf] = useState(false);
  return (
    <div>
      <button onClick={() => setShowPdf(true)}>Export PDF</button>
      {showPdf && (
        <Suspense fallback={<Spinner />}>
          <PdfExporter />
        </Suspense>
      )}
    </div>
  );
}
```
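A further refinement on the click-to-load pattern: start the dynamic import on hover or focus, so the chunk is often already in the HTTP cache by the time the user clicks. A small sketch of a share-one-promise helper (the helper name and module path are illustrative, not a Next.js API):

```javascript
// Wraps a loader so repeated calls share a single in-flight promise;
// safe to fire from onMouseEnter, onFocus, and onClick alike.
function once(loader) {
  let promise = null;
  return () => (promise ??= loader());
}

// With a real bundler you would pass () => import('./PdfExporter').
// Stub loader for demonstration:
let calls = 0;
const loadPdfExporter = once(async () => {
  calls += 1;
  return { render: () => 'pdf' };
});

loadPdfExporter(); // hover
loadPdfExporter(); // focus
loadPdfExporter(); // click
console.log(calls); // 1 (the module is only fetched once)
```

Wire the same function to both the preload event and the click handler; dynamic `import()` calls for the same module resolve to the same module instance anyway, so the helper mainly avoids redundant network churn before the first resolution.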
Tree Shaking: Fixing Bad Import Patterns
Tree shaking eliminates unused code from your bundle — but only if you import correctly:
```ts
// ❌ Imports the ENTIRE lodash library (71KB)
import _ from 'lodash';
const result = _.groupBy(items, 'category');

// ✅ Imports only the groupBy function (~3KB)
import groupBy from 'lodash-es/groupBy';
const result = groupBy(items, 'category');

// Even better — native JavaScript (0KB added)
const result = items.reduce(
  (acc, item) => {
    (acc[item.category] ??= []).push(item);
    return acc;
  },
  {} as Record<string, Item[]>,
);
```
```ts
// ⚠️ In date-fns v2 (CommonJS builds), a named import like this could
// pull in far more of the library than you used (200KB+):
import { format, parseISO, addDays } from 'date-fns';
// ✅ In date-fns v3, which is ESM and tree-shakeable by default, the
// exact same import ships only the three functions you use.
```
The critical step: verify your tsconfig.json and bundler are configured for ESM:
```jsonc
{
  "compilerOptions": {
    "moduleResolution": "bundler", // or "node16"
    "module": "ESNext",
    "target": "ES2022"
  }
}
```
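Webpack's tree shaking is also gated on knowing which files are free of import-time side effects. For internal workspace packages, a sideEffects field in package.json lets the bundler drop unused re-exports from barrel files. A sketch, assuming a hypothetical package name and that only CSS imports carry side effects:

```json
{
  "name": "@acme/ui",
  "sideEffects": ["*.css"]
}
```

Only set this if the claim is true; marking a module with genuine import-time side effects as side-effect-free will silently break it.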
Measuring Bundle Size in CI
Track bundle size automatically so regressions are caught before merge:
```yaml
# .github/workflows/bundle-size.yml
name: Bundle Size Check
on: [pull_request]

jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v4 # must run before setup-node's pnpm cache
      - uses: actions/setup-node@v4
        with:
          node-version: '22'
          cache: 'pnpm'
      - name: Install dependencies
        run: pnpm install
      - name: Build and analyze
        run: |
          pnpm run build
          # Extract bundle sizes from Next.js build output
          cat .next/build-manifest.json | \
            jq '[.pages | to_entries[] | {page: .key, chunks: .value}]' \
            > bundle-stats.json
      - name: Check bundle size budget
        run: node scripts/check-bundle-budget.js
        # Fails CI if any page chunk exceeds the defined budget
      - uses: andresz1/size-limit-action@v1
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
```
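The workflow above references scripts/check-bundle-budget.js without showing it. A minimal sketch of what such a script might look like; the stats shape and budget numbers are assumptions, and a real version would read bundle-stats.json and stat the chunk files on disk:

```javascript
// Hypothetical budget check: sizes and budgets are bytes keyed by page.
function checkBudget(sizes, budgets) {
  const failures = [];
  for (const [page, limit] of Object.entries(budgets)) {
    const size = sizes[page] ?? 0;
    if (size > limit) {
      failures.push(`${page}: ${size} bytes exceeds budget of ${limit}`);
    }
  }
  return failures;
}

// Demo with inline numbers instead of real build output:
const failures = checkBudget(
  { '/': 140_000, '/products': 230_000 },
  { '/': 150_000, '/products': 200_000 },
);
console.log(failures.length); // 1 (only /products is over budget)
// In CI: if (failures.length) { console.error(failures.join('\n')); process.exit(1); }
```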
Define size budgets in .size-limit.json:
```json
[
  {
    "path": ".next/static/chunks/pages/index*.js",
    "limit": "150 kB",
    "name": "Homepage JS"
  },
  {
    "path": ".next/static/chunks/pages/products-*.js",
    "limit": "200 kB",
    "name": "Product page JS"
  },
  {
    "path": ".next/static/css/*.css",
    "limit": "50 kB",
    "name": "Global CSS"
  }
]
```
The Optimization Priority Matrix
When deciding where to start, prioritize by impact-to-effort ratio:
| Optimization | Bundle Reduction | Effort | Priority |
|---|---|---|---|
| Replace moment with date-fns | 300KB+ | Low | 🔴 Do first |
| Fix lodash imports to lodash-es | 50–70KB | Low | 🔴 Do first |
| Dynamic import for PDF/Excel exporters | 500KB–1MB | Medium | 🔴 Do first |
| Route-based code splitting (Next.js default) | Automatic | None | Already done |
| Lazy load chart libraries | 100–200KB | Medium | 🟡 High value |
| Audit and remove unused dependencies | 50–300KB | Medium | 🟡 High value |
| Switch icon library to tree-shakeable version | 50–150KB | Low | 🟡 High value |
| Subpath imports for heavy SDKs | 100–400KB | Medium | 🟡 High value |
A focused two-day bundle audit typically yields a 30–50% reduction in JavaScript for pages that haven't been optimized before.
Related articles: see the full performance roadmap that JavaScript bundle reduction feeds into, how bundle bloat blocks LCP element loading, and caching optimisations that maximise the gains from a leaner bundle.
Tracking Performance Over Time
Bundle analysis is not a one-time activity; it needs to become a continuous habit. The combination of:
- Size budgets in CI (catches regressions before merge)
- Lighthouse/Web Vitals monitoring in production (catches real-user impact)
- Quarterly bundle audits (proactively finds new bloat)
...is what separates teams that maintain fast sites from teams who periodically do "performance sprints" to undo accumulated bloat.
JavaScript bundle size directly affects your Core Web Vitals — particularly Time to Interactive and Total Blocking Time, which feed into Google's page experience signals and your search ranking.
Further Reading
- Optimize JavaScript Bundle Size — web.dev: Google's guide to code splitting, lazy loading, and removing unused JavaScript
- webpack Bundle Analyzer: The most widely-used tool for visualizing webpack bundle composition and identifying large dependencies
- Bundlephobia: Check the size cost of any npm package before adding it to your bundle
- Total Blocking Time — web.dev: Understanding Total Blocking Time and how large JavaScript payloads block the main thread
Monitor your site's performance metrics continuously: Try ScanlyApp free and set up automated Lighthouse-based performance checks on every deploy.
