Engineering · Mar 1, 2026 · 10 min read

7 Next.js App Router Mistakes That Kill Your SaaS Performance

The App Router is genuinely powerful. But it comes with a real learning curve — and most Next.js 15 SaaS codebases I review have the same seven mistakes baked in from day one. Each one costs you performance, bundle size, or both. Here's the full list, with fixes you can apply today.


TL;DR — The 7 Mistakes

1. Putting everything in Client Components: ships unnecessary JS to the browser.
2. Not using cache() for repeated DB fetches: the same query runs multiple times per request.
3. Missing Suspense boundaries: the entire page blocks on your slowest fetch.
4. Over-fetching in layouts: session fetches waterfall on every page render.
5. Not using generateStaticParams: dynamic routes re-render on every single request.
6. Importing heavy client libraries in Server Components: breaks or bloats your bundle unnecessarily.
7. No cache headers on Route Handlers: API routes return fresh data on every request.

The Next.js App Router — released stable in Next.js 13, matured in 14, and now the recommended way to build in 15 — completely changes how you think about rendering. Server Components run on the server. No client JavaScript is sent for them. Data fetching happens at the component level. Streaming is built in.

The problem: most developers come from the Pages Router (or plain React), where every component is a client component by default. Old habits die hard. The seven mistakes below are a direct result of that mental model being carried into a paradigm that works very differently.


Mistake 1

Making Everything a Client Component

Adding 'use client' at the top of every file is the single most common App Router mistake. It's understandable — in the Pages Router, everything was a client component. But in the App Router, the default is a Server Component, and that default exists for a good reason: Server Components ship zero JavaScript to the browser.

Every time you add 'use client' to a component that doesn't need interactivity, you push that component — and all of its imports — into the client bundle. Do this enough times and you've negated most of the App Router's performance benefits.

The fix is simple: only use 'use client' when you genuinely need browser APIs — onClick, useState, useEffect, event listeners. Everything else should stay as a Server Component. When you need a small interactive piece inside a larger server-rendered component, extract just the interactive part:

// ✅ Server Component — fetches data, no 'use client'
// app/dashboard/page.tsx
import { SubscribeButton } from '@/components/SubscribeButton';

export default async function DashboardPage() {
  const user = await getUser(); // runs on server, no client JS

  return (
    <div>
      <h1>Welcome, {user.name}</h1>
      <p>Plan: {user.plan}</p>
      {/* Only this tiny button is a Client Component */}
      <SubscribeButton userId={user.id} />
    </div>
  );
}

// ✅ Client Component — only what NEEDS interactivity
// components/SubscribeButton.tsx
'use client';
import { useState } from 'react';

export function SubscribeButton({ userId }: { userId: string }) {
  const [loading, setLoading] = useState(false);

  return (
    <button onClick={() => { setLoading(true); /* ... */ }}>
      {loading ? 'Processing...' : 'Subscribe'}
    </button>
  );
}

The parent page stays on the server. The interactive button island lives on the client. You get the best of both worlds — server-rendered data with client-side interactivity — without shipping the entire page to the browser.
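A related composition pattern is worth knowing: a Client Component can accept Server Components as children. The client boundary wraps only the interactivity, while the children stay server-rendered. A sketch, with hypothetical component names (CollapsiblePanel, ServerRenderedStats):

```typescript
// components/CollapsiblePanel.tsx (hypothetical name)
// ✅ Client Component that wraps server-rendered children
'use client';
import { useState, type ReactNode } from 'react';

export function CollapsiblePanel({ children }: { children: ReactNode }) {
  const [open, setOpen] = useState(true);

  return (
    <section>
      <button onClick={() => setOpen(!open)}>Toggle</button>
      {/* children arrive as already-rendered server output */}
      {open && children}
    </section>
  );
}

// Usage in a Server Component page:
// <CollapsiblePanel>
//   <ServerRenderedStats />   {/* stays a Server Component */}
// </CollapsiblePanel>
```

The key insight: passing a Server Component as a child prop does not convert it to a Client Component, because the server renders it before the client boundary ever sees it.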


Mistake 2

Not Using cache() for Repeated DB Fetches

Here's a scenario that happens all the time: your layout.tsx fetches the current user to render the nav. Your page.tsx fetches the current user again to get their subscription status. A component inside that page fetches the current user a third time to check their role. Three identical database queries on a single request.

React's cache() function was built for exactly this. It memoizes the result of an async function for the duration of a single server request. Call the same cached function ten times with the same arguments — the database is only hit once.

// lib/queries.ts
import { cache } from 'react';
import { db } from '@/lib/db';

// Deduplicated across an entire request tree
export const getUser = cache(async (id: string) => {
  console.log('DB hit:', id); // Only logs ONCE per request
  return db.users.findUnique({ where: { id } });
});

export const getSubscription = cache(async (userId: string) => {
  return db.subscriptions.findUnique({ where: { userId } });
});

// layout.tsx — calls getUser
const user = await getUser(userId); // → DB hit

// page.tsx — calls getUser again
const user = await getUser(userId); // → cache hit, no DB query

// SomeComponent.tsx — calls getUser yet again
const user = await getUser(userId); // → cache hit, no DB query

cache() is request-scoped, not global. Each new request starts with a fresh cache. This is intentional — you never want user A's data leaking into user B's request. Use a separate global cache layer (like Redis) if you want cross-request memoization.
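If you want cross-request caching inside Next.js itself rather than an external Redis layer, unstable_cache from next/cache memoizes across requests with a revalidation window. A minimal sketch, reusing the db client assumed elsewhere in this post and a hypothetical plans table:

```typescript
// lib/cached-queries.ts — cross-request cache, unlike React's cache()
import { unstable_cache } from 'next/cache';
import { db } from '@/lib/db'; // assumed db client, as elsewhere in this post

// Shared across ALL requests; refreshed in the background at most
// every 5 minutes. Only safe for data that is NOT user-specific.
export const getPublicPlans = unstable_cache(
  async () => db.plans.findMany({ where: { public: true } }),
  ['public-plans'],     // cache key parts
  { revalidate: 300 }   // seconds before background refresh
);
```

Keep the two tools straight: cache() deduplicates within one request; unstable_cache persists across requests, so never put per-user data behind it.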

Mistake 3

Missing Suspense Boundaries

Without Suspense, an async Server Component blocks rendering until all of its awaited data resolves. If your page has three data fetches — user, posts, analytics — the entire page waits for the slowest one before sending a single byte to the browser.

React's Suspense and Next.js's streaming work together to fix this. Wrap each async component in a <Suspense> boundary with a fallback. The fast parts render and stream immediately; the slow parts stream in as their data resolves.

// ❌ Without Suspense — entire page waits for slowest fetch
export default async function DashboardPage() {
  const [user, posts, analytics] = await Promise.all([
    getUser(),         // 50ms
    getPosts(),        // 200ms
    getAnalytics(),    // 800ms ← page is blocked until this resolves
  ]);

  return <div>...</div>;
}
// ✅ With Suspense — fast content streams first
import { Suspense } from 'react';

export default function DashboardPage() {
  return (
    <div>
      {/* Renders immediately (no async data) */}
      <PageHeader />

      {/* Streams in ~50ms */}
      <Suspense fallback={<UserSkeleton />}>
        <UserSection />
      </Suspense>

      {/* Streams in ~200ms */}
      <Suspense fallback={<PostsSkeleton />}>
        <PostsList />
      </Suspense>

      {/* Streams in ~800ms — only THIS section waits */}
      <Suspense fallback={<AnalyticsSkeleton />}>
        <AnalyticsPanel />
      </Suspense>
    </div>
  );
}

// Each of these is an async Server Component
async function UserSection() {
  const user = await getUser(); // 50ms
  return <div>{user.name}</div>;
}

async function AnalyticsPanel() {
  const data = await getAnalytics(); // 800ms
  return <div>{data.views}</div>;
}

The user sees content progressively appear instead of staring at a blank screen for 800ms. Time to first byte drops dramatically. The loading skeletons show exactly where content is still loading, which is far better UX than a full-page spinner.
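Next.js also gives you a route-level shortcut: a loading.tsx file next to page.tsx wraps the entire page in an implicit Suspense boundary, so you get an instant loading state without wiring Suspense by hand. A minimal sketch:

```typescript
// app/dashboard/loading.tsx — implicit Suspense boundary for this route
// Rendered instantly while app/dashboard/page.tsx resolves its data
export default function Loading() {
  return <div className="skeleton">Loading dashboard…</div>;
}
```

Use loading.tsx for the coarse route-level state, and explicit <Suspense> boundaries inside the page when different sections resolve at different speeds.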


Mistake 4

Over-Fetching in Layouts for Per-Request Data

Layouts in Next.js are persistent — they wrap every page under their route segment. That sounds like a great place to fetch the user session once and share it everywhere. The problem: on the server, a dynamically rendered layout re-runs for every request under its segment, which means your session fetch runs on every page render, creating an unavoidable waterfall.

Even with cache(), a layout that awaits an async fetch adds that latency to every single page load under it. The layout must resolve before any child page can start rendering.

// ❌ Problematic — layout fetch waterfalls with every page
// app/dashboard/layout.tsx
export default async function DashboardLayout({ children }) {
  const session = await getSession(); // Runs on every navigation

  return (
    <div>
      <Nav user={session.user} />
      {children}
    </div>
  );
}

Better approaches, in order of preference:

  • Read session data from a cookie or header directly — no database round-trip, no waterfall. Next.js middleware can write user data into a signed cookie on login.
  • Use Next.js middleware to validate auth and attach user context to request headers, then read from headers() in your layout — no database hit, effectively instant.
  • If you must fetch, use React cache() and make sure child pages share the same cached call so the DB is only hit once total.
// ✅ Read from the cookie — no DB hit, no waterfall
// app/dashboard/layout.tsx
import { cookies } from 'next/headers';

export default async function DashboardLayout({ children }) {
  // In Next.js 15, cookies() is async and must be awaited — but it
  // reads from the incoming request, so there's no database round-trip
  const session = (await cookies()).get('session')?.value;
  const user = session ? JSON.parse(atob(session)) : null;

  return (
    <div>
      <Nav user={user} />
      {children}
    </div>
  );
}
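The middleware approach from the list above can be sketched like this. The x-user-id header name is a hypothetical choice, and verifySession stands in for whatever token validation you actually use:

```typescript
// middleware.ts — runs before every matched request
import { NextResponse, type NextRequest } from 'next/server';

// Stand-in for your real token validation (e.g. JWT verification)
declare function verifySession(token: string): string | null;

export function middleware(request: NextRequest) {
  const token = request.cookies.get('session')?.value;
  const userId = token ? verifySession(token) : null;

  if (!userId) {
    return NextResponse.redirect(new URL('/login', request.url));
  }

  // Forward the user id to layouts and pages as a request header
  const headers = new Headers(request.headers);
  headers.set('x-user-id', userId);
  return NextResponse.next({ request: { headers } });
}

export const config = { matcher: ['/dashboard/:path*'] };
```

A layout can then call (await headers()).get('x-user-id') to recover the user id with no database round-trip.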

Mistake 5

Not Using generateStaticParams for Dynamic Routes

If your SaaS has blog posts, product pages, docs, or any route with a dynamic segment like [slug], every request to those pages hits your server by default. For content that changes infrequently, this is wasted compute — you're re-rendering the same HTML on every request.

generateStaticParams tells Next.js which dynamic paths to pre-render at build time. Those pages are then served as static HTML — zero server compute per request, near-instant response times, and full CDN caching.

// app/blog/[slug]/page.tsx

// Tell Next.js which slugs to pre-render at build time
export async function generateStaticParams() {
  const posts = await getPosts();
  return posts.map((post) => ({ slug: post.slug }));
}

// This page is now static HTML — served from the CDN edge
// (note: in Next.js 15, params is a Promise and must be awaited)
export default async function BlogPost({ params }: { params: Promise<{ slug: string }> }) {
  const { slug } = await params;
  const post = await getPost(slug);

  return (
    <article>
      <h1>{post.title}</h1>
      <div dangerouslySetInnerHTML={{ __html: post.content }} />
    </article>
  );
}

For pages that update occasionally but don't need to be perfectly fresh, combine with revalidate to get static generation with time-based regeneration:

// Regenerate this page in the background every 60 seconds
export const revalidate = 60;

export async function generateStaticParams() {
  const posts = await getPosts();
  return posts.map((post) => ({ slug: post.slug }));
}

Any slug not in the generateStaticParams list falls through to dynamic rendering by default, so new content published after the last build still works — it just gets rendered on first request and then cached.
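If you'd rather return a 404 for unknown slugs instead of rendering them on demand, the dynamicParams segment option controls that fallback behavior:

```typescript
// app/blog/[slug]/page.tsx

// Default (true): slugs not returned by generateStaticParams are
// rendered on first request, then cached. Set to false to 404 them.
export const dynamicParams = false;
```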


Mistake 6

Importing Heavy Client Libraries in Server Components

Some libraries are client-only — they access window, document, or browser APIs that don't exist on the server. Importing them directly in a Server Component either causes a build error or forces that entire module into the client bundle when it shouldn't be there.

Common offenders: framer-motion, recharts, react-map-gl, @tiptap/react, canvas-heavy libraries, and most data visualization packages. These can be hundreds of kilobytes and have no business running at server render time.

// ❌ Wrong — heavy library imported directly in a Server Component
import { BarChart } from 'recharts'; // ~150KB client-only library
import { motion } from 'framer-motion';

export default async function AnalyticsPage() {
  const data = await getAnalyticsData();
  return <BarChart data={data} />; // Will break or bloat bundle
}

// ✅ Correct — lazy-loaded via next/dynamic, from a Client Component
// In the App Router, ssr: false is only allowed inside Client
// Components, so the dynamic() calls live in their own file.
// components/LazyCharts.tsx
'use client';
import dynamic from 'next/dynamic';

// Only loaded in the browser, after hydration
export const AnalyticsChart = dynamic(
  () => import('@/components/AnalyticsChart'),
  {
    ssr: false,                              // Never runs on server
    loading: () => <ChartSkeleton />,        // Show while loading
  }
);

export const AnimatedHero = dynamic(
  () => import('@/components/AnimatedHero'),
  { ssr: false }
);

// app/analytics/page.tsx — still a Server Component
import { AnalyticsChart } from '@/components/LazyCharts';

export default async function AnalyticsPage() {
  const data = await getAnalyticsData(); // Runs on server ✅

  return (
    <div>
      <h1>Analytics</h1>
      {/* Chart loads client-side only, no server error */}
      <AnalyticsChart data={data} />
    </div>
  );
}

Note: ssr: false means the component is not included in the server render at all — the loading fallback is shown server-side, and the real component hydrates client-side. This prevents server errors and keeps the library's weight out of your server bundle.

Mistake 7

No Cache Headers on Route Handlers

Route Handlers (the files in app/api/) are not cached by default — as of Next.js 15, even GET handlers run fresh on every request, so each one hits your Node.js server cold. For data that doesn't change frequently — a list of public posts, a pricing page, a product catalog — this is a lot of unnecessary compute.

Next.js gives you two ways to add caching to Route Handlers. For simple time-based revalidation, use the route segment revalidate config:

// app/api/posts/route.ts

// Cache this response — revalidate in the background every 60 seconds.
// (Only applies if the handler doesn't read cookies, headers, or the
// request body — those force dynamic rendering.)
export const revalidate = 60;

export async function GET() {
  const posts = await db.posts.findMany({ where: { published: true } });
  return Response.json(posts);
}
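Time-based revalidation pairs well with on-demand invalidation: when the underlying data changes, purge the cached route immediately instead of waiting out the TTL. A sketch using revalidatePath from a hypothetical publish Server Action, with the same assumed db client:

```typescript
// app/actions/publish.ts — hypothetical Server Action
'use server';
import { revalidatePath } from 'next/cache';
import { db } from '@/lib/db'; // assumed db client, as elsewhere in this post

export async function publishPost(postId: string) {
  await db.posts.update({
    where: { id: postId },
    data: { published: true },
  });

  // Invalidate the cached route right away, instead of waiting
  // up to 60 seconds for the next background revalidation
  revalidatePath('/api/posts');
}
```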

For finer-grained control — different TTLs for fresh vs. stale data — set Cache-Control headers directly on the response:

// app/api/pricing/route.ts
export async function GET() {
  const plans = await getPricingPlans();

  return Response.json(plans, {
    headers: {
      // Serve from CDN for 60s, allow stale for up to 5 minutes in background
      'Cache-Control': 'public, s-maxage=60, stale-while-revalidate=300',
    },
  });
}

// For user-specific data — cache privately in the browser
return Response.json(userData, {
  headers: {
    // Never cache on CDN, but browser can cache for 30s
    'Cache-Control': 'private, max-age=30',
  },
});

The key distinction: public caches on CDN edges (Vercel Edge Network, Cloudflare, etc.), private caches only in the user's browser. Never send public cache headers for authenticated or user-specific responses — you'll serve user A's data to user B.


Bonus: The Right Mental Model

All seven mistakes trace back to the same root cause: applying a client-first mental model to a server-first framework. Once the mental model clicks, the right choices become obvious.

Here's the model that works:

1. Server Components = runs on server, ships no JS
The default. Use for data fetching, layout, rendering static content. These components have direct access to your database, file system, and secrets. Nothing they do appears in the client bundle.

2. Client Components = interactive islands
Only for components that need onClick, onChange, useState, useEffect, or browser APIs. Mark with "use client". They can still receive data from Server Components as props — the Server Component fetches, the Client Component handles interaction.

3. The data flow: Server fetches → passes props → Client handles interaction
Server Components fetch all data and pass it down as props to Client Components. Client Components never fetch directly on mount if it can be avoided — they receive ready data and make it interactive. This pattern keeps fetching fast (server-side) and bundles lean (no fetch libraries in client).

Applied consistently, this model gives you a fast initial load (server-rendered HTML), a lean client bundle (only interactive components ship JS), efficient data fetching (deduplicated, cached, streamed), and a codebase that scales well as features are added.

The App Router isn't harder than the Pages Router — it's different. Once the mental model is solid, it's substantially cleaner for the kind of data-heavy, authenticated apps that most SaaS products are.


Quick Reference

Mistake → Fix

  • 'use client' everywhere → only on interactive components
  • Duplicate DB fetches → wrap with cache() from react
  • No Suspense → wrap async components in <Suspense>
  • Async data in layouts → use cookies/headers or middleware
  • Dynamic routes render on every request → use generateStaticParams
  • Heavy libs in Server Components → use next/dynamic with ssr: false
  • API routes with no caching → set revalidate or Cache-Control headers
Abanoub Rodolf Boctor
Founder & CTO, ThynkQ · Mar 1, 2026

Ready to build?

I turn ideas into shipped products. Fast.

Free 30-minute discovery call. Tell me what you're building — I'll tell you exactly how I'd approach it.

Book a free strategy call →
