How we passed our Core Web Vitals (LCP) by fragment caching our server-side rendered React (SSR)

Kevin McCarthy
3 min read · Apr 14, 2021

TL;DR: When moving to SSR, we cached our prerendered HTML to hit our LCP target.

Caching server-side rendered React

When we server-side render React, we move the job of turning our React JS into HTML from the customer’s browser to our server.

This is faster to do on the server, but it is still slow. Therefore we cache the HTML output so we don’t need to do the hard work on every request.

How we do it

Rails makes fragment caching straightforward, so we can wrap our react_component call in a cache block.

BEFORE CACHING
<%= react_component "ListingCarousel", { listing: @listing }, prerender: true %>

AFTER CACHING
<% cache @listing do %>
  <%= react_component "ListingCarousel", { listing: @listing }, prerender: true %>
<% end %>

Now Rails will save the HTML output in our Redis cache, and the next time someone hits this page it’ll pull the HTML from the cache instead of prerendering the React component. As long as the listing record in our database doesn’t change, we can keep reusing that HTML over and over again.
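For context on why this invalidates correctly: the cache helper derives the fragment key from the record itself, so anything that touches the listing’s updated_at produces a new key and the stale HTML is simply never read again. A minimal sketch of what those keys look like in a Rails console (the Listing model name is assumed from @listing, and the exact key format depends on your Rails version):

listing = Listing.find(12434)
listing.cache_key_with_version
# => "listings/12434-20210414093045000000"   (id plus updated_at)

listing.update!(price: 199)                  # touches updated_at
listing.cache_key_with_version
# => "listings/12434-20210415110203000000"   (new key, old fragment ignored)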

Easy peasy!

Complications — invalidating cache on PLP

This works pretty well on our Product Detail Page (PDP), where we’ve got a single listing.

When we look at our Product Listings Page (PLP) we have 100 listings. And we are getting the data from Algolia, not our database, so we don’t have those 100 listings loaded in memory to check easily.

So even though we are caching the HTML output, we still make the call to Algolia on every request and build our cache key from a combination of attributes for each listing.

See the example below with two listings.

listings = [
  { id: 12434, price: 210, on_sale: false },
  { id: 2341, price: 150, on_sale: true }
]

The cache key is built as:
<first_listing.id><first_listing.price><first_listing.on_sale><second_listing.id><second_listing.price><second_listing.on_sale>

which for these two listings gives:
12434210false2341150true

So we do the exact same thing for all 100 listings and end up with a super long string for our cache key.
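A minimal sketch of how such a key could be built from the Algolia response, assuming each hit exposes id, price and on_sale (the helper name and attribute access here are illustrative, not our exact code):

def plp_cache_key(hits)
  # concatenate the attributes that should bust the cache when they change
  hits.map { |hit| "#{hit[:id]}#{hit[:price]}#{hit[:on_sale]}" }.join
end

plp_cache_key(listings)
# => "12434210false2341150true"  (and much longer for 100 hits)

That string then goes straight into the view, e.g. <% cache plp_cache_key(@hits) do %> … <% end %>.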

I was originally concerned about the length of the cache key, but based on this Stack Overflow article it seems not to be an issue.

Complications — passing analytics queryID to frontend

Every request we make to Algolia returns a queryID that we use for our analytics. We need this to be fresh every single time.

So in order to populate it, we add the fresh queryID to the DOM on every request, then read it from the React component.
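A minimal sketch of the server side of that, assuming the fresh queryID is exposed as @query_id and attached to a #hits element that sits outside the cached fragment (the instance variable, component name and props here are illustrative, and plp_cache_key is the helper sketched above):

<div id="hits" data-queryid="<%= @query_id %>">
  <% cache plp_cache_key(@hits) do %>
    <%= react_component "ListingHits", { hits: @hits }, prerender: true %>
  <% end %>
</div>

The data-queryid attribute is rendered fresh on every request, while the prerendered HTML inside the cache block stays cached.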

We needed to learn about managing server vs browser rendering in SSR React, and we ended up with this:

constructor(props) {
  super(props);
  this.state = { isClient: false };
}

componentDidMount() {
  // this lifecycle hook won't run on the server, so we use it to track where the code is running
  this.setState({ isClient: true });
}

relevantHits() {
  // on the server (and the very first client render) just use the hits as passed in
  if (!this.state.isClient) return this.props.hits;

  // a blank __queryID on the client means the markup was server-side rendered,
  // so patch in the fresh queryID we stashed in the DOM
  if (this.props.hits[0].__queryID === "") {
    const queryID = document.getElementById("hits").getAttribute("data-queryid");
    return this.props.hits.map(hit => ({ ...hit, __queryID: queryID }));
  }

  return this.props.hits;
}
