From 6 Seconds to 600ms - Hunting Down React Performance Issues

react performance debugging optimization web-development

A war story of debugging and fixing performance issues in a large-scale React application


The Wake-Up Call

"This feels sluggish."

Four words in a client feedback email that made my heart sink. We'd just delivered a major feature for our enterprise dashboard—a real-time analytics page that visualized complex data sets—and I was proud of what we'd built. Everything had worked flawlessly in our testing.

But there it was: "sluggish."

Our product manager forwarded the email with a simple note: "Can you look into this? The client's thinking about pausing the next phase until we address it."

For context, this wasn't just any client. They were our biggest account, responsible for nearly 40% of our annual revenue. If they paused the next phase of development, our quarterly targets would be in jeopardy.

I needed to fix this, and fast.

The Diagnosis Phase

I started by reproducing the issue. The client was right—navigating through the dashboard did feel slow, especially on the analytics page. But saying something "feels slow" doesn't help identify the cause. I needed data.

My first step was to use React's built-in Profiler to measure component rendering times:

import { Profiler } from "react";
 
function onRenderCallback(
  id, // the "id" prop of the Profiler tree
  phase, // "mount" or "update"
  actualDuration, // time spent rendering
  baseDuration, // estimated time to render the entire subtree without memoization
  startTime, // when React began rendering
  commitTime // when React committed the updates
) {
  console.log(`${id} [${phase}] took ${actualDuration.toFixed(2)}ms`);
}
 
function App() {
  return (
    <Profiler id="Application" onRender={onRenderCallback}>
      <Dashboard />
    </Profiler>
  );
}

I wrapped key sections of our application with Profiler components and deployed a debug build to a staging environment that mirrored our production setup.

After a few minutes of clicking around, I had a console full of timing data. Three issues immediately stood out:

  1. The data table component was re-rendering entirely on every row selection, taking up to 800ms
  2. The filter sidebar was re-rendering when unrelated parts of the app updated
  3. Several chart components were re-rendering simultaneously when only one needed to update

This was a good start, but I needed more specific insights, so I turned to the React DevTools Profiler for a visual representation of the renders.

The First Culprit: Wasteful Renders

The most alarming discovery was our data table. It contained thousands of rows of data, each with multiple columns and interactive elements. When a user selected a row, the entire table re-rendered—not just the selected row.

Here's what the component looked like (simplified):

function DataTable({ data, onRowSelect }) {
  const [selectedRow, setSelectedRow] = useState(null);
 
  // This function was recreated on every render
  const handleRowSelect = (rowId) => {
    setSelectedRow(rowId);
    onRowSelect(rowId);
  };
 
  return (
    <table className="data-table">
      <thead>
        <tr>{/* Table headers */}</tr>
      </thead>
      <tbody>
        {data.map((row) => (
          <DataRow
            key={row.id}
            row={row}
            isSelected={selectedRow === row.id}
            onSelect={handleRowSelect}
          />
        ))}
      </tbody>
    </table>
  );
}
 
// This component wasn't memoized
function DataRow({ row, isSelected, onSelect }) {
  return (
    <tr
      className={isSelected ? "selected" : ""}
      onClick={() => onSelect(row.id)}
    >
      {Object.values(row).map((cell, index) => (
        <td key={index}>{cell}</td>
      ))}
    </tr>
  );
}

Two key issues were causing unnecessary renders:

  1. The handleRowSelect function was recreated on every render, so every DataRow received a brand-new onSelect prop each time
  2. The DataRow component wasn't memoized, so it re-rendered whenever its parent did, even when its props hadn't changed

Neither fix works in isolation: memoizing DataRow achieves nothing while one of its props is a new function on every render, and stabilizing the callback achieves nothing while the component re-renders unconditionally anyway.

The fix was relatively straightforward:

function DataTable({ data, onRowSelect }) {
  const [selectedRow, setSelectedRow] = useState(null);
 
  // Memoize the function so it doesn't change on every render
  const handleRowSelect = useCallback(
    (rowId) => {
      setSelectedRow(rowId);
      onRowSelect(rowId);
    },
    [onRowSelect]
  );
 
  return (
    <table className="data-table">
      <thead>
        <tr>{/* Table headers */}</tr>
      </thead>
      <tbody>
        {data.map((row) => (
          <MemoizedDataRow
            key={row.id}
            row={row}
            isSelected={selectedRow === row.id}
            onSelect={handleRowSelect}
          />
        ))}
      </tbody>
    </table>
  );
}
 
// Memoize the component to prevent unnecessary renders
const MemoizedDataRow = React.memo(function DataRow({
  row,
  isSelected,
  onSelect
}) {
  return (
    <tr
      className={isSelected ? "selected" : ""}
      onClick={() => onSelect(row.id)}
    >
      {Object.values(row).map((cell, index) => (
        <td key={index}>{cell}</td>
      ))}
    </tr>
  );
});

This change alone reduced the rendering time from 800ms to around 150ms—a significant improvement, but not enough.

The Second Culprit: Context Overuse

As I dug deeper, I discovered our application was suffering from an architectural issue. We had implemented a global context for user settings, which included filter preferences, display options, and other UI state.

This context was being used by most components, even those that only needed a small subset of the data. Whenever any part of the context changed, all consumers would re-render.

Here's a simplified version of what we had:

// A massive context with many unrelated pieces of state
const AppContext = createContext();
 
function AppProvider({ children }) {
  const [filters, setFilters] = useState({});
  const [selectedItems, setSelectedItems] = useState([]);
  const [uiPreferences, setUiPreferences] = useState({});
  const [userData, setUserData] = useState(null);
  // Many more state variables...
 
  const value = {
    filters,
    setFilters,
    selectedItems,
    setSelectedItems,
    uiPreferences,
    setUiPreferences,
    userData,
    setUserData
    // Everything bundled together
  };
 
  return <AppContext.Provider value={value}>{children}</AppContext.Provider>;
}
 
// Components would use the entire context even when they only needed a small part
function FilterSidebar() {
  const { filters, setFilters, uiPreferences } = useContext(AppContext);
 
  // Component logic
}

The solution was to split this monolithic context into smaller, more focused contexts:

const FilterContext = createContext();
const SelectionContext = createContext();
const PreferencesContext = createContext();
const UserContext = createContext();
 
function AppProvider({ children }) {
  return (
    <UserContext.Provider value={useUserState()}>
      <PreferencesContext.Provider value={usePreferencesState()}>
        <SelectionContext.Provider value={useSelectionState()}>
          <FilterContext.Provider value={useFilterState()}>
            {children}
          </FilterContext.Provider>
        </SelectionContext.Provider>
      </PreferencesContext.Provider>
    </UserContext.Provider>
  );
}
 
// Custom hooks to manage each slice of state
function useUserState() {
  const [userData, setUserData] = useState(null);
  return { userData, setUserData };
}
 
// Similar hooks for other state slices...
 
// Components would only consume the contexts they needed
function FilterSidebar() {
  const { filters, setFilters } = useContext(FilterContext);
  const { uiPreferences } = useContext(PreferencesContext);
 
  // Component logic
}

By splitting the context, components would only re-render when the specific context they consumed changed, not whenever any state in the application changed.
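
One subtlety worth noting: each provider's value needs to be referentially stable too, or every consumer of that context will still re-render whenever AppProvider does. Here's one way a hook like useFilterState can handle that—a sketch, where the useMemo detail is the key point:

import { useMemo, useState } from "react";
 
function useFilterState() {
  const [filters, setFilters] = useState({});
 
  // setFilters is stable across renders, so the value object only
  // changes identity when filters actually changes
  return useMemo(() => ({ filters, setFilters }), [filters]);
}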

The Third Culprit: Expensive Calculations

While investigating the chart components, I noticed we were performing complex data transformations during render:

function AnalyticsChart({ rawData, type }) {
  // This expensive calculation ran on every render
  const processedData = transformDataForChart(rawData, type);
 
  return <ChartComponent data={processedData} />;
}
 
function transformDataForChart(data, type) {
  // Complex calculations involving lots of array manipulations
  // This could take 100-200ms for large datasets
  // ...
}

The solution was to memoize these calculations using useMemo:

function AnalyticsChart({ rawData, type }) {
  // Only recalculate when rawData or type changes
  const processedData = useMemo(() => {
    return transformDataForChart(rawData, type);
  }, [rawData, type]);
 
  return <ChartComponent data={processedData} />;
}

Now the expensive transformation ran only when its inputs actually changed, not on every render. One caveat: useMemo compares dependencies by reference, so this only helps if rawData itself is referentially stable, which turned out to be exactly the next problem I found.

The Hidden Performance Killer: Anonymous Objects

After applying the previous optimizations, the app was much faster—but still not as snappy as I wanted. Using the React DevTools Profiler, I noticed some components were still re-rendering unnecessarily.

The issue turned out to be subtle: we were passing new object references as props on every render, even though the content of these objects didn't change.

For example:

function ParentComponent() {
  // This creates a new object reference on every render
  const chartConfig = {
    height: 400,
    width: 600,
    animation: true,
    theme: "light"
  };
 
  return <Chart config={chartConfig} />;
}

Even if Chart was memoized with React.memo, it would still re-render on every parent render because chartConfig was a new object each time.

The fix was to memoize these objects:

function ParentComponent() {
  // This creates a stable object reference that only changes when dependencies change
  const chartConfig = useMemo(
    () => ({
      height: 400,
      width: 600,
      animation: true,
      theme: "light"
    }),
    []
  );
 
  return <Chart config={chartConfig} />;
}

We applied this pattern throughout the codebase, focusing on props passed to memoized components.
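
And when an object like chartConfig has no dependencies at all, there's an even simpler option than useMemo: hoist it out of the component entirely, so the reference never changes in the first place. A quick sketch:

// Defined once at module scope, so the reference is stable by construction
const CHART_CONFIG = {
  height: 400,
  width: 600,
  animation: true,
  theme: "light"
};
 
function ParentComponent() {
  return <Chart config={CHART_CONFIG} />;
}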

The Virtualization Breakthrough

Despite all these improvements, the data table was still not as fast as I wanted when displaying thousands of rows. That's when I turned to virtualization.

Instead of rendering all rows at once, we would render only the rows visible in the viewport, plus a small buffer. As the user scrolled, rows entering the viewport would mount and rows leaving it would unmount, keeping the number of rendered rows small regardless of dataset size.

I implemented this using react-window:

import { FixedSizeList } from "react-window";
 
function VirtualizedTable({ data, onRowSelect }) {
  const [selectedRow, setSelectedRow] = useState(null);
 
  const handleRowSelect = useCallback(
    (rowId) => {
      setSelectedRow(rowId);
      onRowSelect(rowId);
    },
    [onRowSelect]
  );
 
  // Note: selectedRow is a dependency below, so Row gets a new identity on
  // every selection and the visible rows remount. That's acceptable here,
  // since only the handful of rows on screen exist at any one time.
  const Row = useCallback(
    ({ index, style }) => {
      const row = data[index];
      return (
        <div
          style={style}
          className={selectedRow === row.id ? "row selected" : "row"}
          onClick={() => handleRowSelect(row.id)}
        >
          {Object.values(row).map((cell, cellIndex) => (
            <div key={cellIndex} className="cell">
              {cell}
            </div>
          ))}
        </div>
      );
    },
    [data, selectedRow, handleRowSelect]
  );
 
  return (
    <FixedSizeList
      height={500}
      width="100%"
      itemCount={data.length}
      itemSize={50}
    >
      {Row}
    </FixedSizeList>
  );
}

This change dramatically improved performance, especially for large datasets. Even with 10,000 rows, scrolling was smooth and row selection was instantaneous.
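
The "small buffer" I mentioned is configurable in react-window through the overscanCount prop, which sets how many extra rows are rendered beyond the visible area, trading a little extra rendering for smoother fast scrolls. On our list it would look like this:

<FixedSizeList
  height={500}
  width="100%"
  itemCount={data.length}
  itemSize={50}
  overscanCount={5} // render 5 extra rows beyond the viewport edges
>
  {Row}
</FixedSizeList>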

The Final Touch: Code Splitting

Our analytics dashboard had many features, but users typically only used a subset of them in any given session. Yet, we were loading the code for all features upfront, increasing initial load time.

I implemented code splitting to load features only when needed:

import { lazy, Suspense } from "react";
 
// Instead of direct imports
// import DataVisualization from './DataVisualization';
 
// Lazy load components
const DataVisualization = lazy(() => import("./DataVisualization"));
const AdvancedFilters = lazy(() => import("./AdvancedFilters"));
const ExportTools = lazy(() => import("./ExportTools"));
 
function Dashboard() {
  const [activeTab, setActiveTab] = useState("visualization");
 
  return (
    <div>
      <TabSelector activeTab={activeTab} onChange={setActiveTab} />
 
      <Suspense fallback={<LoadingSpinner />}>
        {activeTab === "visualization" && <DataVisualization />}
        {activeTab === "filters" && <AdvancedFilters />}
        {activeTab === "export" && <ExportTools />}
      </Suspense>
    </div>
  );
}

This reduced our initial JavaScript bundle size by nearly 60%, leading to much faster initial page loads.
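
One refinement worth considering on top of this: lazy() doesn't fetch a chunk until the component first renders, so the first visit to each tab shows the fallback spinner. Since dynamic import() caches its result, you can warm a chunk ahead of time, for example on hover. A sketch, with a hypothetical TabButton component:

// Keeping a reference to the import lets us trigger it early
const loadExportTools = () => import("./ExportTools");
const ExportTools = lazy(loadExportTools);
 
function TabButton({ label, onSelect }) {
  return (
    <button
      onMouseEnter={loadExportTools} // start downloading the chunk on hover
      onFocus={loadExportTools} // and for keyboard users
      onClick={onSelect}
    >
      {label}
    </button>
  );
}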

The Result: From 6 Seconds to 600ms

After implementing all these optimizations, the results were dramatic:

  1. Initial page load: Reduced from 4.5 seconds to 1.8 seconds
  2. Time to interactivity: Reduced from 6.2 seconds to 2.1 seconds
  3. Table row selection: Reduced from 800ms to under 50ms
  4. Filter application: Reduced from 1.2 seconds to 200ms
  5. Chart rendering: Reduced from 900ms to 300ms

Overall, the application felt like an entirely different product—responsive, snappy, and pleasant to use.

The best part? I received an email from our product manager a week after deploying the optimized version: "Client loves the performance improvements. They're ready to proceed with the next phase."

Lessons Learned: A Performance Optimization Checklist

This experience taught me several valuable lessons about React performance optimization:

  1. Measure before optimizing:

    • Use the React Profiler to identify actual bottlenecks
    • Don't guess what's slow—measure it
  2. Prevent unnecessary re-renders:

    • Memoize components with React.memo
    • Memoize callback functions with useCallback
    • Memoize computed values with useMemo
  3. Split context providers:

    • Create focused contexts for specific concerns
    • Avoid putting unrelated state in the same context
  4. Optimize lists:

    • Use virtualization for long lists
    • Ensure list items are properly memoized
    • Use stable keys for list items
  5. Handle expensive calculations properly:

    • Move expensive calculations out of render
    • Memoize results of expensive functions
    • Consider using web workers for truly intensive operations (see the sketch after this list)
  6. Optimize bundle size:

    • Implement code splitting for large feature sets
    • Lazy load components that aren't needed immediately
    • Analyze your bundle to find unexpected large dependencies
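
On the web worker point in item 5, here's a minimal sketch of what offloading transformDataForChart to a worker could look like. The file name, message shape, and bundler-specific worker setup are assumptions:

// chartWorker.js: runs off the main thread
// (transformDataForChart would need to be imported or defined here)
self.onmessage = (event) => {
  const { rawData, type } = event.data;
  self.postMessage(transformDataForChart(rawData, type));
};
 
// AnalyticsChart.js: post the work and render when the worker replies
import { useEffect, useState } from "react";
 
function AnalyticsChart({ rawData, type }) {
  const [processedData, setProcessedData] = useState(null);
 
  useEffect(() => {
    // The URL pattern below is what most modern bundlers expect
    const worker = new Worker(new URL("./chartWorker.js", import.meta.url));
    worker.onmessage = (event) => setProcessedData(event.data);
    worker.postMessage({ rawData, type });
    return () => worker.terminate(); // clean up on unmount or input change
  }, [rawData, type]);
 
  if (!processedData) return <LoadingSpinner />;
  return <ChartComponent data={processedData} />;
}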

I now apply this checklist to all new features we develop, preventing performance issues before they arise rather than fixing them after customers complain.

The Preventative Approach: Performance Budgets

Following this experience, we implemented performance budgets for our application, setting thresholds for the metrics we had just fought to improve: component render times, initial bundle size, page load time, and time to interactivity.

We added automated performance tests to our CI pipeline that would fail the build if these budgets were exceeded, forcing us to address performance issues before they reached production.

// Example of a simple render performance test
import { render } from "@testing-library/react";
import { Profiler } from "react";
import DataTable from "./DataTable";
 
// A representative dataset; it should mirror production data sizes
const testData = Array.from({ length: 1000 }, (_, i) => ({
  id: i,
  name: `Item ${i}`,
  value: i * 10
}));
 
test("DataTable renders within performance budget", () => {
  let renderTime = 0;
 
  // Capture the duration of the initial mount
  const onRender = (id, phase, actualDuration) => {
    renderTime = actualDuration;
  };
 
  render(
    <Profiler id="performance-test" onRender={onRender}>
      <DataTable data={testData} onRowSelect={jest.fn()} />
    </Profiler>
  );
 
  expect(renderTime).toBeLessThan(50); // 50ms budget
});

One caveat with tests like this: actualDuration varies with the hardware running them, so budgets need some headroom (or dedicated CI runners) to avoid flaky failures.

Final Thoughts

Performance isn't just a technical concern—it directly impacts user experience, customer satisfaction, and ultimately, business outcomes. In our case, it was the difference between losing and retaining our biggest client.

The React ecosystem provides powerful tools for building complex applications, but with that power comes the responsibility to use it wisely. Understanding React's rendering behavior and optimization techniques isn't optional for serious React developers—it's essential.

So the next time someone says your application feels "sluggish," don't despair. See it as an opportunity to dive deep into React's performance model and emerge with both a faster application and a stronger understanding of how React works under the hood.

Your users—and your business stakeholders—will thank you.