It is popular to talk about how Server-Side Rendering (SSR) in Next.js is good for SEO (Search Engine Optimization). While SSR can be a net positive for SEO, it is not strictly necessary, and it can hurt if you're not careful. I'll give you an example of how using the Next.js App Router made SEO worse, and I'll also show you how to fix it.
When using React, the browser receives mostly JavaScript. The browser must execute that JavaScript to dynamically generate the HTML that makes up the page content. Consider this simple React app:
import ReactDOM from "react-dom/client"

const root = ReactDOM.createRoot(document.getElementById("root") as HTMLElement)
root.render(
  <div>
    <main>
      <h1>Hello World React</h1>
      <p>Hello world. This is the simplest app I could think of.</p>
    </main>
  </div>
)
When we serve it, the browser only gets this:
<!doctype html>
<html lang="en">
  <head>
    ...
  </head>
  <body>
    <noscript>You need to enable JavaScript to run this app.</noscript>
    <div id="root"></div>
  </body>
</html>
When we enable server-side React rendering with Next.js, instead of the nearly empty document above, the browser receives this:
<!doctype html>
<html lang="en">
  <head>
    ...
  </head>
  <body>
    <div>
      <main>
        <h1>Hello World Next.js</h1>
        <p>Hello world. This is the simplest app I could think of.</p>
      </main>
    </div>
    <script
      src="/_next/static/chunks/webpack-641da8633e7bf30d.js"
      crossorigin=""
      async=""
    ></script>
    <script>
      ...
    </script>
  </body>
</html>
In the above, note how the content we want the user to see is already in the HTML sent to the browser, without the browser needing to execute any JavaScript.
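For reference, here is a rough sketch of the kind of App Router page that produces markup like this. The app/page.tsx path and component name follow the framework's default convention and are illustrative, not pulled from my example project:

// app/page.tsx — a server component by default in the App Router.
// Next.js renders this to HTML on the server, so the heading and
// paragraph arrive in the initial response without client JavaScript.
export default function Page() {
  return (
    <div>
      <main>
        <h1>Hello World Next.js</h1>
        <p>Hello world. This is the simplest app I could think of.</p>
      </main>
    </div>
  )
}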
It is easy to assume that search engine crawlers (Googlebot, Bingbot, et al.) only “see” that too. I assumed the same, but as I’ve recently been thinking about SEO while working on a project exploring price/performance of GPUs for machine learning, I’ve learned otherwise.
The reality is that Google’s and Bing’s crawlers do render JavaScript, and your dynamically generated front-end app will be indexed.12 Google’s indexing process has two steps. The processing step indexes the server-side-rendered content and queues the page for the rendering step, where client-side-rendered content is rendered and then indexed.3 Google even provides a video about how to optimize client-side-rendered React for SEO.
While these two major crawlers index client-rendered content, both Google and Bing recommend server-side-rendered content for their crawlers,4 presumably because it takes less compute for them to process and they can index the content sooner than if a full render is required.
Google will index up to 15 MB of a page,5 and although Bing Webmaster Tools explicitly warns on any page over 125 KB (albeit with a low-priority warning), it does index much larger pages.6 So while a reasonably large page likely won’t prevent indexing, large pages do hurt performance and push your main content further down the document. And it is easier than you might think to generate a large page with server-side rendering.
Below is a simple example of an interactive table in React. It is a simplified example, but one representative of many web applications. I have a table similar to this on my GPU price/performance ranking for AI project here.
"use client"

import { Suspense, useState } from "react"

export const RandomTable = ({ initialRows }: { initialRows: Row[] }) => {
  const [rows, setRows] = useState(initialRows)
  return (
    <Suspense>
      <table>
        <thead>
          <tr>
            <th>
              Column 1{" "}
              <button
                onClick={() => {
                  // Sort by a random column, in a random direction
                  const col = Math.floor(4 * Math.random())
                  const newRows = [...rows]
                  const reverse = Math.random() < 0.5 ? 1 : -1
                  newRows.sort(
                    (a, b) => reverse * a[col].text.localeCompare(b[col].text)
                  )
                  setRows(newRows)
                }}
              ></button>
            </th>
            <th>Column 2</th>
            <th>Column 3</th>
            <th>Column 4</th>
          </tr>
        </thead>
        <tbody>
          {rows.map((row, i) => (
            <tr key={i}>
              {row.map((cell, j) => (
                <td key={j}>{cell.text}</td>
              ))}
            </tr>
          ))}
        </tbody>
      </table>
    </Suspense>
  )
}
In my test, this interactive React table rendered client-side comes in at 547 bytes. Rendering it server-side with Next.js arrives at the browser roughly 150x larger, at 84 kB!7
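If you want to sanity-check the raw HTML transfer size of your own pages, a rough sketch like the following works; the URLs are placeholders for wherever each build happens to be running:

// Compare raw HTML transfer sizes of two builds (Node 18+, run as an ES module).
const urls = [
  "http://localhost:3000/table", // placeholder: server-rendered Next.js build
  "http://localhost:5173/table", // placeholder: client-only React build
]

for (const url of urls) {
  const res = await fetch(url)
  const html = await res.text()
  console.log(`${url}: ${Buffer.byteLength(html, "utf8")} bytes of HTML`)
}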
Here are screenshots of the loading times and transfer sizes of each:
Both the client-rendered and server-rendered examples still earn high scores in Lighthouse.8 The lesson is that using server-rendered components can have surprising costs that lead to a meaningful performance penalty. The biggest thing I’ve found you can do is ensure that all of your stateful objects are as small as possible. In my example, the table displays rows, with each Row shaped as follows:
export type Cell = { text: string; extra?: string }
export type Row = [Cell, Cell, Cell, Cell]
For each table cell, only the Cell.text
value is used in the table, but the Cell.extra
field and all of its data is embedded into the page anyway. Despite never being rendered, the extra
field ends up in the table state that gets serialized into the page’s <script>
tags, which adds to the page size and lengthens the main thread’s blocking JavaScript execution time. Once I experienced this it made sense, but it isn’t intuitive, since we think of the content as being rendered on the server. The reality is that server-side-rendered content is rendered on the server and the client.
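One fix is to trim the props down to only what the client component actually renders before handing them over. Here is a rough sketch of that idea; the page component, the loadRows function, and the import path are illustrative placeholders rather than code from my project:

// Hypothetical server component (e.g. app/table/page.tsx) that strips the
// unused `extra` field before passing rows to the client component, so only
// the `text` values are serialized into the page for hydration.
import { RandomTable, type Row } from "./RandomTable"

// Stand-in for however the full rows are actually loaded.
declare function loadRows(): Promise<Row[]>

export default async function TablePage() {
  const fullRows = await loadRows()

  // Keep only the fields the table actually renders.
  const slimRows: Row[] = fullRows.map((row) => [
    { text: row[0].text },
    { text: row[1].text },
    { text: row[2].text },
    { text: row[3].text },
  ])

  return <RandomTable initialRows={slimRows} />
}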
In extreme scenarios, the hydration HTML or extra state that React/Next.js SSR embeds into the page’s script tags can cause a response to exceed the 4.5 MB maximum response size for Vercel serverless functions. If that happens you’ll see this error:
LAMBDA_RUNTIME Failed to post handler success response. Http response code: 413.
So server-rendered React isn’t unconditionally better, and its behavior isn’t always intuitive, so pay close attention to how it actually renders. If you discover a scenario where you want to force a component to be rendered only on the client, you can do that in Next.js as follows.
In the example below, SiteHeaderNavItems
is defined as a "use client"
client component, and the use of dynamic
and import
with { ssr: false } ensures it is rendered only on the client.
import dynamic from "next/dynamic"
import Link from "next/link"
// SvgIcon and SiteHeaderNavToggler are this project's own components;
// their imports are omitted here.

// Dynamically import the client component and disable SSR for it.
const SiteHeaderNavItems = dynamic(
  () => import("./SiteHeaderClientComponents"),
  { ssr: false },
)

export const SiteHeader = () => {
  return (
    <nav className="navbar navbar-expand-md bg-body-tertiary">
      <div className="container-fluid">
        <Link className="navbar-brand" href="/">
          <SvgIcon icon="coinpoet-card" svgViewBox="0 0 16 16" /> Coin Poet
        </Link>
        <SiteHeaderNavToggler />
        <div className="collapse navbar-collapse" id="navbarNav">
          <SiteHeaderNavItems />
        </div>
      </div>
    </nav>
  )
}
In conclusion, this is all a lesson to be mindful of what gets rendered into your pages and to monitor page size, as these technologies sometimes have surprising side effects that can become an increasingly important factor in page performance.
As of 2019, Googlebot regularly updates its Chrome version.↩︎
https://developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics↩︎
Google says “server-side or pre-rendering is still a great idea because it makes your website faster for users and crawlers, and not all bots can run JavaScript”. Bing “recommends to switch between client-side rendered and pre-rendered content for specific user agents such as Bingbot”.↩︎
There are debates about how large an indexable file can be, but I found Google documentation indicating Googlebot “considers the first 15MB of the file for indexing”.↩︎
On Bing Webmaster Tools I see the following: Evaluated size of HTML is estimated to be over 125 KB and risks not being fully cached.↩︎
"use client"
is required for interactivity. Though marked as "use client"
, Next.js still renders it on the server, it just hydrates it on the client later. More at https://nextjs.org/docs/app/building-your-application/rendering/client-components#full-page-load↩︎
Interestingly, the client-rendered React app got four 100s, while Next.js got a 99 on performance for “Reduce unused JavaScript”. I’m guessing that would turn into used JavaScript as the app scaled up into a real app.↩︎