Real Time
Using Your Analytics Data

Simplicity is Powerful

Example showing sample data and some supported ingest sources

Ingest Data

Attach Your Pipeline

Connect your existing analytics events from a variety of sources; there's no need to transform or pre-aggregate. Choose from a range of input schemas, or we can craft a new one. As long as it's JSON, we'll take it.
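Ingestion is just a JSON POST. A minimal sketch of what an event payload might look like (the endpoint URL and field names here are illustrative assumptions, not the documented schema):

```python
import json
from datetime import datetime, timezone

# Hypothetical ingest endpoint -- the real URL comes from your account settings.
INGEST_URL = "https://ingest.example.com/v1/events"

def build_event(name: str, properties: dict) -> str:
    """Serialize an analytics event as JSON, exactly as your app already emits it."""
    event = {
        "event": name,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "properties": properties,
    }
    return json.dumps(event)

payload = build_event("video_play", {"plan": "pro", "region": "us-east"})
print(payload)
```

From there, any HTTP client can POST the payload as-is; no reshaping or pre-aggregation step sits in between.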

Filter & Aggregate

Using Events & Properties, exactly as they were sent

Set up filters using JSONPath to decide which events to aggregate. Then choose the properties to group by and the ones to use in your calculations. Need to exclude some obsolete data? Sending a new event to consider? No problem: just modify your filter, with no code changes or deploys required.
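The filter-then-aggregate flow can be sketched in plain Python. The dotted-path helper below stands in for a real JSONPath engine, and the event and property names are illustrative:

```python
from collections import defaultdict

# Sample events, exactly as they might arrive from your pipeline.
events = [
    {"event": "purchase", "properties": {"plan": "pro", "amount": 30}},
    {"event": "purchase", "properties": {"plan": "free", "amount": 0}},
    {"event": "page_view", "properties": {"plan": "pro"}},
]

def get_path(obj, path):
    """Resolve a dotted path like 'properties.plan' (a simplified stand-in for JSONPath)."""
    for key in path.split("."):
        if not isinstance(obj, dict) or key not in obj:
            return None
        obj = obj[key]
    return obj

# Filter: keep only purchase events. Group by plan; sum the amount.
totals = defaultdict(int)
for e in events:
    if get_path(e, "event") == "purchase":
        totals[get_path(e, "properties.plan")] += get_path(e, "properties.amount")

print(dict(totals))  # {'pro': 30, 'free': 0}
```

Changing which events count, or what they group by, is a matter of editing the filter expression and the grouped property, not redeploying code.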

Grafana Visualization Example

Visualize & Alert

Powered by Grafana

Set up dashboards and alerts using our Grafana Plugin or roll your own using the Metrics API.
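For the roll-your-own route, a metrics query is just a parameterized HTTP GET. This sketch only builds the request URL; the endpoint and parameter names are assumptions for illustration, so check the Metrics API reference for the real shape:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameters -- not the documented Metrics API.
base = "https://api.example.com/v1/metrics/query"
params = {
    "metric": "purchases_total",  # hypothetical metric name
    "group_by": "plan",
    "from": "now-1h",
    "to": "now",
}
query_url = f"{base}?{urlencode(params)}"
print(query_url)
```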

Simple Pricing

We strive for understandable and predictable pricing. No tiered services, no differentiated features. Just pay for the quantity you need. Everyone gets the same great service.

Starting at $60

Billed Monthly
5M Events
10 Filters
14 Day Free Trial (no credit card required)
View Pricing
Why Choose

JSON means never having to say "I wish I'd added a tag for that."


Don't worry about cardinality or pre-aggregates or rates of sums or counters vs gauges.


We don't hold onto your raw data; we don't want it.

Frequently Asked Questions
How long is my raw data retained?

Our pipeline retains raw data for up to 12 hours, solely to recover from emergencies that interrupt processing. This data lives only in a raw, compressed queue and is not easily queryable or accessible.

Is my data secure?

Yes! We build our product on battle-tested third-party services and ensure that encryption is enabled both in transit and at rest in every scenario.

Are there limits?

Some, yes, but ideally you won't notice them. One of our motivating factors was to remove the burden of thinking about cardinality. Read more about cardinality here

Can I REALLY use my existing data pipeline?

Almost definitely yes! If you’re using Segment or a similar CDP, they probably have a webhook destination. If you’re using Kafka, you can set up an HTTP Sink. As long as you can POST us JSON, we can probably ingest it. If you have a tricky situation, reach out and we can probably figure it out.
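For the Kafka route mentioned above, an HTTP sink can forward a topic to the ingest endpoint. A sketch of a connector config, assuming Confluent's HTTP Sink Connector (the ingest URL is a placeholder, and exact property names depend on the connector you use):

```json
{
  "name": "analytics-http-sink",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSinkConnector",
    "topics": "analytics-events",
    "http.api.url": "https://ingest.example.com/v1/events",
    "request.method": "POST",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```

Segment's webhook destination works the same way: point it at the ingest URL and the events flow through unchanged.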

Why should I trust you?

We’re a new company, so it makes sense to be wary. We have worked in Data, Operations, Analytics, Product & more at some of the largest companies in the world. We built this product based on that experience, and we’re trying to provide everyone with tooling similar to what those companies have internally.

Latest Docs & Blog Posts

Load Balancing Data Ingest With Cloudflare Workers

Building an API for analytics event ingestion using Cloudflare workers as a means to load-balance traffic spikes and enable high-availability services.

Analytics Without Events: DynamoDB

In this use case, we look at how one company achieves real-time monitoring with no existing data pipeline and no new code.

Measuring Concurrency Using Heartbeat Events

Explore how a video game with a simple "heartbeat" event enables numerous metrics using the Grafana plugin.