
How we built Lightstep Metrics: a new charting library with React and Canvas


by Ted Pennings



02-09-2021


When adding Lightstep Metrics to our platform, our teams built a fresh, tailored charting library on top of React and Canvas.

Our UI is entirely React. For our charts, we chose Canvas to ensure that we can support the data-rich charts at the foundation of our Change Intelligence product. Browser charting has two main approaches: vector graphics (SVG) and raster graphics (Canvas 2D or WebGL). Canvas scales much more reliably with large data sets, and it has a much smaller DOM footprint for complex charts.

SVG pairs effortlessly with React because SVG, at its core, is just DOM elements inside an <svg> element. JSX makes this seamless — it’s just like putting <p> tags in a <div>. (SVG can also be displayed from an <img> tag pointing to a .svg file or a data URI.)
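For example, a hand-rolled sparkline is just JSX (the component name and data points here are made up for illustration):

```jsx
// A minimal SVG chart in JSX: the <polyline> is an ordinary React element,
// styled with props just like any other tag.
function Sparkline() {
  return (
    <svg width={120} height={40} viewBox="0 0 120 40">
      <polyline
        points="0,30 30,10 60,20 90,5 120,15"
        fill="none"
        stroke="#039c49"
        strokeWidth={2}
      />
    </svg>
  );
}
```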

Canvas is completely different. This post will discuss how to manage Canvas elements inside React.

Unlike SVG, Canvas does not have any children and its contents are not part of the DOM. Canvas is an imperative graphics technology. A <canvas> is a completely empty, transparent element when it first mounts in the DOM.

Lightstep Metrics Charting - empty canvas

Note: the axes and gridlines are SVG, which has better text performance.

Canvas must be manipulated directly by JavaScript to fill it with content:

const ctx = element.getContext('2d');   // or elementRef.current.getContext('2d');
ctx.fillStyle = '#039c49';
ctx.fillRect(10, 25, 250, 100); // x, y, width, height; note that (0,0) is the top left.

Lightstep Metrics Charting - green

So, let’s put this in React. Canvas requires imperative rendering like the above, performed inside the React render cycle. With hooks this is easy: useEffect(). With class components, componentDidUpdate is likely the best choice.

The canvas element will need a ref to access DOM functions like the .getContext() call above. With the ref, it’s fairly easy to render with useEffect().

useEffect(() => {
  if (!canvasElRef.current || height === 0 || width === 0) {
    return; // Element is not mounted, or has no dimensions
  }
  const ctx = canvasElRef.current.getContext('2d');
  // rendering code...
}, []);

This block will usually need other information like element height and width. We use useMeasure to access these dimensions for the useEffect() deps. (react-use is a wonderful library with many helpful hooks.)
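A sketch of the wiring (assuming react-use’s useMeasure attached to a wrapper element around the canvas; the component name is illustrative):

```jsx
import { useEffect, useRef } from 'react';
import { useMeasure } from 'react-use';

function Chart() {
  // measureRef reports the wrapper's size; width/height feed the effect deps
  const [measureRef, { width, height }] = useMeasure();
  const canvasElRef = useRef(null);

  useEffect(() => {
    // rendering code, as above
  }, [height, width]);

  return (
    <div ref={measureRef} style={{ width: '100%', height: '100%' }}>
      <canvas ref={canvasElRef} />
    </div>
  );
}
```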

Height and width are needed because the actual canvas content is a bitmap sized to the element’s height and width. When a <canvas> element resizes, its content will not resize gracefully. It must be re-rendered. (It can also be stretched, with distortion.) Adding height and width as dependencies of your useEffect() block will re-run the drawing code when the element resizes.

useEffect(() => {
  if (!canvasElRef.current || height === 0 || width === 0) {
    return; // Element is not mounted, or has no dimensions
  }

  const ctx = canvasElRef.current.getContext('2d');
  ctx.fillStyle = '#039c49';
  ctx.fillRect(10, 25, 250, 100);

  // render a chart (or draw an owl)
}, [height, width]);

Lightstep Metrics Charting - chart

Be on the watch for small, unintentional layout shifts — even a subpixel layout shift can trigger an expensive, complete re-render. Periodic profiling with the browser dev tools, or even a temporary console.log() in the useEffect(), is the easiest way to find this.
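One mitigation is to round measured sizes to whole pixels so a subpixel shift does not change the effect’s dependencies. This is a sketch; snapDims is a hypothetical helper, not part of any library:

```javascript
// Round measured dimensions so a subpixel layout shift (e.g. 300 → 300.4px)
// produces identical deps and does not retrigger the drawing effect.
function snapDims({ width, height }) {
  return { width: Math.round(width), height: Math.round(height) };
}
```

Feeding the snapped width and height into the useEffect() deps means only whole-pixel resizes cause a redraw.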

devicePixelRatio is another consideration. It represents the ratio of physical pixels to logical (CSS) pixels on a display. Basically, it measures how Retina a display is. A MacBook has a device pixel ratio of 2.0, recent iPhones have 3.0, and newer Android phones go up to 4.0 (more info). Canvas defaults to a pixel ratio of 1.0, which is a poor default because 1.0 displays are becoming uncommon. Graphics will appear blurry on higher-density screens if the <canvas> is not scaled when it is initialized, before rendering.

Here are some examples of 1.0 vs 2.0 pixel ratio, taken from a MacBook Pro (2.0):

Lightstep Metrics Charting - devicePixelRatio

Lightstep Metrics Charting - devicePixelRatio 2

Lightstep Metrics Charting - devicePixelRatio 3

This is bad for data visualization because you lose the contours of intersecting lines, and the layering of a dense bar graph turns into a moss-like fuzz.

It’s easy to scale to the correct devicePixelRatio. Add the following code to your initialization useEffect() to scale the canvas by the devicePixelRatio:

const pixelRatio = window.devicePixelRatio || 1;

const adjustedHeight = height * pixelRatio;
const adjustedWidth = width * pixelRatio;

element.setAttribute('width', adjustedWidth);
element.setAttribute('height', adjustedHeight);
ctx.scale(pixelRatio, pixelRatio);
element.style.width = `${width}px`;
element.style.height = `${height}px`;

The ratio only needs to be set in one place, as above. Graphics pixel coordinates remain the same. A canvas that is 200px by 300px will remain that size, and its coordinate plane will span (0,0) to (200,300), not larger, and is not changed by the devicePixelRatio.
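The relationship can be captured in a small pure function (canvasSizes is an illustrative name, not from our codebase): the width/height attributes grow with the pixel ratio, while the CSS size — and therefore the drawing coordinate plane — stays put.

```javascript
// Backing-store size scales with devicePixelRatio; CSS (layout) size does not.
function canvasSizes(cssWidth, cssHeight, pixelRatio) {
  return {
    attrWidth: Math.round(cssWidth * pixelRatio),   // <canvas width="..."> value
    attrHeight: Math.round(cssHeight * pixelRatio), // <canvas height="..."> value
    styleWidth: `${cssWidth}px`,                    // element.style.width
    styleHeight: `${cssHeight}px`,                  // element.style.height
  };
}
```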

Note that the code above will clear the canvas: canvas data is discarded whenever the dimensions are set. This can be very useful, and it avoids a ctx.clearRect(). Keep it in mind, though, because it may not always be necessary to clear before rendering, and scaling only needs to happen once.

Ok. Almost there! A <canvas> usually cannot be shown immediately. The canvas is blank until it is first rendered to, and often that is not immediate (useLayoutEffect vs useEffect). So it’s best to use CSS to hide the element until you’ve finished rendering to it. (visibility: 'hidden' means the element will still occupy space in the layout, whereas display: 'none' does not.) A simple useState() flag that tracks whether the component has rendered works great for this.

<canvas style={{ visibility: rendered ? 'visible' : 'hidden' }} height={200} width={300} />

One last thing: the browser paints at most 60 times per second, but many events (like mousemove) can fire much more frequently than that. Do not call canvas rendering code more than 60 times per second, to avoid wasted work. This can be done by placing canvas rendering code in a requestAnimationFrame(). It acts like setTimeout(), where the browser calls your function as part of the render and paint cycles. (It does not take a timeout.)

requestAnimationFrame() can also provide deduping, because it returns a request ID that can be stored (likely in a ref) to cancel a pending render. For example, if requestAnimationFrame() has been called three times before the next frame has started to render, the first two can be canceled with cancelAnimationFrame() (like clearTimeout()). Using requestAnimationFrame() also synchronizes work on multiple canvases into the same paint.
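The dedupe pattern can be sketched as a small factory. makeRenderScheduler is a hypothetical name; in the browser you would pass window.requestAnimationFrame and window.cancelAnimationFrame, and in a React component you would likely keep pendingId in a ref:

```javascript
// Coalesce many schedule() calls into at most one draw per animation frame.
function makeRenderScheduler(raf, caf) {
  let pendingId = null;
  return function schedule(draw) {
    if (pendingId !== null) {
      caf(pendingId); // cancel the stale request, like clearTimeout()
    }
    pendingId = raf(() => {
      pendingId = null;
      draw();
    });
  };
}
```

Calling schedule() from a mousemove handler then costs at most one render per frame, no matter how often the event fires.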

So that’s it! I hope this overview of one foundational aspect of building data visualizations in React has been helpful.

To see our new charts in action, head over to our Lightstep Sandbox and let us know what you think!

Interested in joining our team? See our open positions here.
