After building the Localstreet web app and vendor dashboard in React, I’ve learned what actually makes React apps slow — and it’s usually not what tutorials say.
The most common React performance advice: “add useMemo and useCallback everywhere.” This is wrong. Over-memoization is a real problem, adds code complexity, and sometimes makes things slower because of memo comparison overhead.
Let’s talk about what actually works.
Measure Before You Optimise
Before writing a single line of optimisation code, open React DevTools Profiler and record a slow interaction. Without profiling data, you’re guessing.
What you’re looking for:
- Components that re-render frequently with the same props
- Renders that take >16ms (you’re dropping frames)
- Commit phases with many simultaneous re-renders
React DevTools → Profiler tab → Record → Perform slow action → Stop
Only after seeing the flamegraph should you start optimising.
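The same >16ms check can also run continuously in development via React's <Profiler> component, which calls an onRender callback with timing data for each commit. A sketch of a callback body — the threshold and message format here are my own, not a React convention:

```typescript
// Matches the first three arguments React passes to <Profiler onRender={...}>.
type ProfilerPhase = 'mount' | 'update' | 'nested-update';

// Returns a warning string for renders longer than one 60fps frame, else null.
function logSlowRender(
  id: string,
  phase: ProfilerPhase,
  actualDuration: number, // ms spent rendering this Profiler subtree
): string | null {
  return actualDuration > 16
    ? `[perf] <${id}> ${phase} took ${actualDuration.toFixed(1)}ms`
    : null;
}

// Usage (illustrative):
// <Profiler id="ProductList" onRender={(id, phase, d) => {
//   const msg = logSlowRender(id, phase, d);
//   if (msg) console.warn(msg);
// }}>
```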
The Real Source of Slow React Apps
In my experience, 80% of React performance issues come from three sources:
- Too many components re-rendering when they don’t need to (prop drilling, context overuse)
- Expensive computations running on every render (a legitimate useMemo use case)
- Large lists rendering all items at once (virtualisation fixes this)
Let’s tackle each.
Stop Unnecessary Re-renders: Context Architecture
The most common performance issue I see in production React apps is context abuse. Putting frequently-changing values in a single context means every consumer re-renders on every change.
The Problem
// Bad: one big context — every consumer re-renders when ANYTHING changes
const AppContext = createContext<{
  user: User;
  cart: CartItem[];
  notifications: Notification[];
  theme: 'dark' | 'light';
}>({} as any);
The Fix: Split Contexts by Update Frequency
// Stable context — user data rarely changes
const UserContext = createContext<User | null>(null);
// Volatile context — cart changes often
const CartContext = createContext<CartItem[]>([]);
// Separate dispatch from state — components that only dispatch
// don't re-render when state changes
const CartDispatchContext = createContext<Dispatch<CartAction>>(() => {});
The dispatch/state split is underused. An <AddToCartButton /> only needs dispatch — it should never re-render because someone else added an item. With the split pattern, it won’t.
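Here is the state logic that sits behind the split: a pure reducer wired up with useReducer, whose dispatch goes into CartDispatchContext and whose state goes into CartContext. The item shape and action names are illustrative, not Localstreet's real types:

```typescript
// Illustrative cart types — your real state will differ.
interface CartItem { id: number; qty: number; }

type CartAction =
  | { type: 'add'; id: number }
  | { type: 'remove'; id: number };

// Pure reducer: this is what the dispatch in CartDispatchContext feeds.
// Because dispatch from useReducer is referentially stable, dispatch-only
// consumers never re-render when the cart state changes.
function cartReducer(items: CartItem[], action: CartAction): CartItem[] {
  switch (action.type) {
    case 'add': {
      const existing = items.find((i) => i.id === action.id);
      return existing
        ? items.map((i) => (i.id === action.id ? { ...i, qty: i.qty + 1 } : i))
        : [...items, { id: action.id, qty: 1 }];
    }
    case 'remove':
      return items.filter((i) => i.id !== action.id);
  }
}

// Provider wiring (sketch):
// const [items, dispatch] = useReducer(cartReducer, []);
// <CartContext.Provider value={items}>
//   <CartDispatchContext.Provider value={dispatch}>{children}</CartDispatchContext.Provider>
// </CartContext.Provider>
```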
useMemo: Use It For These Specific Cases
useMemo has two legitimate use cases. Just two.
Use Case 1: Referentially stable derived data passed to memo’d children
// Without useMemo: new array reference on every render = child always re-renders
const filteredProducts = products.filter(p => p.category === selectedCategory);
// With useMemo: stable reference when deps don't change
const filteredProducts = useMemo(
  () => products.filter(p => p.category === selectedCategory),
  [products, selectedCategory]
);
Use Case 2: Genuinely expensive computations (>1ms)
// Computing search index over 10k items — genuinely expensive
const searchIndex = useMemo(
  () => buildFuseIndex(products), // takes ~50ms
  [products]
);
Don’t useMemo things that are already fast. Sorting 10 items, formatting a date, mapping a small array — these are microsecond operations. The memo bookkeeping costs more than the computation.
React.memo: The Right Pattern
React.memo prevents a component from re-rendering if its props haven’t changed (shallow comparison). It’s useful, but most tutorials miss the key caveat: object and function props bypass memo unless they’re stable references.
// This memo does NOTHING — new object literal every render
<ProductCard product={{ id: 1, name: 'Apple' }} onAdd={() => addToCart(1)} />
// This works — stable references
const product = useMemo(() => ({ id: 1, name: 'Apple' }), []);
const handleAdd = useCallback(() => addToCart(1), [addToCart]);
<ProductCard product={product} onAdd={handleAdd} />
The practical rule: before adding React.memo to a component, verify its parent passes stable prop references. Otherwise you’re adding complexity for zero benefit.
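The caveat becomes concrete if you sketch the comparison React.memo performs by default. This is an approximation of shallow prop equality, not React's internal source:

```typescript
// Approximation of React.memo's default shallow prop comparison.
function shallowEqual(
  a: Record<string, unknown>,
  b: Record<string, unknown>,
): boolean {
  if (Object.is(a, b)) return true;
  const aKeys = Object.keys(a);
  if (aKeys.length !== Object.keys(b).length) return false;
  // Each prop must be the SAME reference (or equal primitive) — a fresh
  // object literal or inline arrow function fails this check every render.
  return aKeys.every((k) => Object.is(a[k], b[k]));
}
```

Two renders that pass `product={{ id: 1 }}` produce two distinct objects, so the check fails and memo re-renders; the same stable reference passes.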
Virtualising Long Lists
This is the highest-impact optimisation for data-heavy UIs. Rendering 1,000 DOM nodes is slow — virtualisation renders only the ~20 items visible in the viewport.
At Localstreet, our vendor product list had 500+ items. Before virtualisation, initial render took 1.2 seconds. After, 90ms.
import { useRef } from 'react';
import { useVirtualizer } from '@tanstack/react-virtual';

function ProductList({ products }: { products: Product[] }) {
  const parentRef = useRef<HTMLDivElement>(null);
  const virtualizer = useVirtualizer({
    count: products.length,
    getScrollElement: () => parentRef.current,
    estimateSize: () => 80, // estimated row height in px
    overscan: 5, // render 5 extra items outside viewport
  });
  return (
    <div ref={parentRef} style={{ height: '600px', overflow: 'auto' }}>
      {/* position: relative, so the absolutely-positioned rows offset from here */}
      <div style={{ height: virtualizer.getTotalSize(), position: 'relative' }}>
        {virtualizer.getVirtualItems().map((vItem) => (
          <div
            key={vItem.key}
            style={{
              position: 'absolute',
              top: vItem.start,
              left: 0,
              width: '100%',
              height: vItem.size,
            }}
          >
            <ProductRow product={products[vItem.index]} />
          </div>
        ))}
      </div>
    </div>
  );
}
@tanstack/react-virtual is my go-to — it handles variable heights, horizontal lists, and grid layouts.
Code Splitting: Split at Route Boundaries First
Don’t micro-split. Split at the biggest natural boundaries: routes and heavy modal dialogs.
import { lazy, Suspense } from 'react';
import { Routes, Route } from 'react-router-dom';
// Each route chunk is loaded on demand
const VendorDashboard = lazy(() => import('./pages/VendorDashboard'));
const OrderHistory = lazy(() => import('./pages/OrderHistory'));
const Analytics = lazy(() => import('./pages/Analytics'));
function App() {
  return (
    <Suspense fallback={<PageSkeleton />}>
      <Routes>
        <Route path="/dashboard" element={<VendorDashboard />} />
        <Route path="/orders" element={<OrderHistory />} />
        <Route path="/analytics" element={<Analytics />} />
      </Routes>
    </Suspense>
  );
}
This reduced our initial JS bundle from 890KB to 210KB — roughly a 4x reduction, which translated directly into faster parse and compile times on mid-range Android devices.
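One refinement worth layering on top: memoize the dynamic import so a route chunk can be fetched on hover or focus, before React.lazy ever mounts it. This helper is a sketch of that idea, not a React API — `preloadable` and the usage names are mine:

```typescript
type Loader<T> = () => Promise<T>;

// Wraps a dynamic import so the chunk can be fetched ahead of render.
// The same memoized loader is passed to React.lazy, so preloading and
// rendering share one network request.
function preloadable<T>(loader: Loader<T>): Loader<T> & { preload: Loader<T> } {
  let chunk: Promise<T> | undefined;
  const load: Loader<T> = () => (chunk ??= loader());
  return Object.assign(load, { preload: load });
}

// Usage (illustrative):
// const loadAnalytics = preloadable(() => import('./pages/Analytics'));
// const Analytics = lazy(loadAnalytics);
// <NavLink to="/analytics" onMouseEnter={() => loadAnalytics.preload()}>Analytics</NavLink>
```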
Image Optimisation: The Overlooked 80%
For most apps, images account for 70%+ of the byte weight. Fix images before you fix JavaScript.
// Don't render full-res images in thumbnails
// Bad:
<img src="https://cdn.example.com/product-1.jpg" alt="Product" width={80} height={80} />
// Good: request a sized variant from your CDN/image service
const imgUrl = `https://cdn.example.com/product-1.jpg?w=160&q=75&fm=webp`;
<img src={imgUrl} alt="Product" width={80} height={80} loading="lazy" decoding="async" />
For our Cloudflare CDN setup, we use image transformation workers. For Next.js projects, next/image handles this automatically. For vanilla React, a simple utility that appends CDN params is enough.
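A sketch of that utility — the query parameter names (`w`, `q`, `fm`) follow common image-service conventions rather than any specific CDN's API, and the 2x width multiplier assumes you want crisp rendering on hi-DPI screens:

```typescript
interface ImgOpts {
  width: number;               // CSS display width in px
  quality?: number;            // 0-100, lower = smaller files
  format?: 'webp' | 'avif';
}

// Builds a sized-variant URL; requests 2x the display width for retina screens.
function cdnImageUrl(
  src: string,
  { width, quality = 75, format = 'webp' }: ImgOpts,
): string {
  const url = new URL(src);
  url.searchParams.set('w', String(width * 2));
  url.searchParams.set('q', String(quality));
  url.searchParams.set('fm', format);
  return url.toString();
}
```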
The Performance Checklist I Use Before Every Release
- Profile with React DevTools — no component taking >16ms
- Check bundle size with source-map-explorer — no unexpected large deps
- Test on a mid-range Android device (Moto G series) — real users don’t have MacBook Pros
- Run Lighthouse in incognito — LCP under 2.5s, TBT under 300ms
- Verify image dimensions match render size — no 2000px images in 80px thumbnails
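The Lighthouse thresholds in the checklist can be enforced in CI rather than checked by hand. A minimal Lighthouse CI config (`lighthouserc.json`) that fails the build on budget regressions — the URL is a placeholder for your local build server:

```json
{
  "ci": {
    "collect": { "url": ["http://localhost:3000"] },
    "assert": {
      "assertions": {
        "largest-contentful-paint": ["error", { "maxNumericValue": 2500 }],
        "total-blocking-time": ["error", { "maxNumericValue": 300 }]
      }
    }
  }
}
```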
The biggest performance gains I’ve found aren’t from React tricks — they’re from measuring carefully, fixing images, and splitting code at natural boundaries. Start there.