I joined LightStep this past summer as a Product Manager. Having transitioned to Product from Engineering, I was intrigued by LightStep's technology. LightStep's unique satellite architecture ingests massive volumes of high-cardinality trace data to give engineers fast, novel ways to build a clear picture of the interactions between their microservices.
However, Observability is still in its early stages, with some teams relying heavily on logs and using tools built for monoliths to monitor their now distributed systems. The challenge of driving a much-needed shift in monitoring, at a time when 'tool fatigue' is increasingly common, called for a better first-time user experience — one that allows engineers to easily understand the value of Observability without any instrumentation effort on their part.
Announcing LightStep Sandbox
LightStep Sandbox is our first step towards building an in-product onboarding experience. It is a guided, interactive demo environment aimed at simplifying LightStep for engineers with varying levels of familiarity with tracing. The Sandbox, accessible here, was built on the product principles below.
Zero barriers to entry
The Sandbox is accessible to users without authentication. Entering your email in an enterprise product often results in an influx of sales emails. We wanted to make it clear to users that our sandbox is for exploratory purposes; there are no strings attached to checking out our product.
The Sandbox is built on mock data, which allows users to try LightStep before instrumenting anything themselves. To create a realistic experience, our data mimics that of an e-commerce company with several familiar microservices. Regressions are built into the data to create scenarios for users to solve. These scenarios cover problems engineers face daily — resolving a latency regression, debugging an error, and monitoring a deployment, with many more to come. Analogous to a game that creates a sense of urgency and competition, each scenario begins by prompting users to respond to an alert.
Games are widely regarded as having the best onboarding experiences. They let users learn by doing instead of reading, reward them for completing levels, and provide guardrails for progress. Consumer apps, like Duolingo, have seen success in gamifying their onboarding experiences. This was a major motivation for building a gamified sandbox. Users work through progressive goals, or levels, to successfully respond to an alert. Each goal asks users to dig deeper into the problem using our analytical features — goals entail identifying correlations across data attributes, hunting for problematic service dependencies, and pinpointing performance bottlenecks. Visible progress through the goals motivates users to push forward. The Sandbox has guardrails as well — if users get stuck on a goal, they can ask for hints; if they get lost within our product, they are prompted to steer back on track.
Recognizing that Observability and tracing are still nascent, we aimed to target engineers with little to no familiarity with these concepts. As a result, the prompts for the goals do not go deep into the technical details of our product, nor do they teach users how to use its individual features. Instead, they walk users through the workflows they are completing and implicitly highlight the analyses they are performing with trace data. This makes it easier for users to understand the problems that are solvable with LightStep instead of getting caught in the weeds of how to read a latency histogram or trace map. At the same time, making the messaging accessible to beginners posed the challenge of keeping more advanced users engaged. This led us to build hotspots in the Sandbox — users who need more explanation can click on the pulsing animations to learn more, while experts can power through without doing so.
By gamifying our onboarding experience and casting a playful light on the often stressful task of responding to an alert, we hope to have created a memorable first-time user experience. We also hope to have simplified the seemingly complex concepts of Observability and tracing for those who are unfamiliar. Check out LightStep Sandbox here. If you’re interested in solving similar problems, we’re hiring!
Thanks to the engineering team that built LightStep Sandbox: Katia Bazzi, Chris Heinen, Joe Blubaugh, Nate Bisbee, Daniel Roberts, Casie Chen, and Jacob Esparza.