Frequently Asked Questions


How long is my raw data retained?

Our pipeline retains raw data for up to 12 hours, solely to recover from emergencies that interrupt processing. This data lives only in a compressed raw queue and is not easily queryable or accessible.

Is my data secure?

Yes! We build our product on battle-tested third-party infrastructure and ensure encryption is enabled both in transit and at rest.

Are there limits?

Some, yes, but ideally you won’t notice them. One of our motivating factors was to remove the burden of thinking about cardinality. Read more about cardinality here

Can I REALLY use my existing data pipeline?

Almost certainly! If you’re using Segment or a similar CDP, it probably offers a webhook destination. If you’re using Kafka, you can set up an HTTP Sink. As long as you can POST us JSON, we can likely ingest it. If you have a tricky setup, reach out and we’ll figure it out together.
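As a rough sketch, a JSON POST from any pipeline looks something like the following. The endpoint URL, API key, and payload shape here are illustrative placeholders, not our actual API.

```python
import json
import urllib.request

# Hypothetical ingest endpoint and key -- substitute your real values.
INGEST_URL = "https://ingest.example.com/v1/events"
API_KEY = "your-api-key"

def build_event_request(event: dict) -> urllib.request.Request:
    """Package an event as a JSON POST -- the same shape a Segment
    webhook destination or a Kafka HTTP Sink would emit."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        INGEST_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_event_request({"event": "signup", "user_id": "u_123"})
# urllib.request.urlopen(req)  # uncomment to actually send
```

Anything that can produce this kind of request, from a CDP webhook to a plain curl command, should be able to reach us.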

Why should I trust you?

We’re a new company, so it makes sense to be wary. We’ve worked in Data, Operations, Analytics, Product, and more at some of the largest companies in the world. We built on that experience, and we’re trying to make the kind of tooling those companies have internally available to everyone.


Do you offer custom plans?

If you need to ingest a substantial amount of events or have particular requirements, we may be able to work something out. Get in touch.

What happens if I exceed my event count?

You can enable overages for your account. Each time you exceed your threshold, you’re granted an additional 10% of your plan’s event count, billed at a 10% premium over your subscription rate. The charge will appear on your next month’s invoice.
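To make the math concrete, here is our reading of that policy as a short sketch (an illustration, not the billing engine itself; the plan size and price are made up):

```python
def overage_charge(plan_events: int, plan_price: float, events_used: int):
    """Each overage block grants 10% of the plan's event count and is
    billed at a 10% premium on the corresponding slice of the rate."""
    if events_used <= plan_events:
        return 0, 0.0
    block_size = plan_events // 10                # 10% of the plan's events
    excess = events_used - plan_events
    blocks = -(-excess // block_size)             # ceiling division
    per_block_price = (plan_price / 10) * 1.10    # 10% of rate, +10% premium
    return blocks, blocks * per_block_price

# e.g. a 1,000,000-event plan at $100/mo, with 1,150,000 events used:
blocks, charge = overage_charge(1_000_000, 100.0, 1_150_000)
```

In this example, 150,000 excess events triggers two 100,000-event overage blocks, each billed at $11 instead of $10.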

Will you let me know when I'm approaching my limit?

Absolutely! Admins of the organization will receive an email when your usage surpasses 50%, 75% & 90% of your event count.
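The rule can be sketched as a simple threshold check between two usage readings (our illustration of the behavior described above, not the notifier itself):

```python
def thresholds_crossed(prev_used: int, now_used: int, plan_events: int):
    """Return which notification marks (50/75/90% of the plan's event
    count) were crossed between two consecutive usage readings."""
    crossed = []
    for pct in (50, 75, 90):
        cutoff = plan_events * pct / 100
        if prev_used < cutoff <= now_used:
            crossed.append(pct)
    return crossed

# Jumping from 40% to 80% usage crosses the 50% and 75% marks:
hits = thresholds_crossed(400_000, 800_000, 1_000_000)
```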

Real Time

What is a filter?

A filter is the building block for your real-time aggregation. It defines what data matters, how you want to group your aggregations and at what interval. Learn more here

How many aggregations do I get per filter?

You can add up to 10 aggregations per filter. Each aggregation can contain one to six calculations, so you don’t need separate aggregations for counts, sums, and so on.


How do I define an "event"?

You can define an event by providing up to 3 properties that go into an event definition. Read more in the docs

If I accidentally send PII, can it be removed from AutoDocs samples?

Yes! Security and privacy are of the utmost importance. Review the docs on how to do this.

Are there limits to the number of events or versions?

There are. To safeguard the system (for example, if you accidentally define a user_id field as an event descriptor), any config with more than 250 unique events or 200 unique versions will be disabled. These limits are flexible; if you need more events or versions, please contact us.
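As a sketch of the safeguard (an illustration using the limits above; the real check happens server-side):

```python
# Assumed limits from the FAQ above.
MAX_UNIQUE_EVENTS = 250
MAX_UNIQUE_VERSIONS = 200

def config_enabled(seen_events: set, seen_versions: set) -> bool:
    """A config stays enabled only while both cardinalities are in bounds."""
    return (len(seen_events) <= MAX_UNIQUE_EVENTS
            and len(seen_versions) <= MAX_UNIQUE_VERSIONS)

# A user_id used as an event descriptor quickly blows past the limit:
ok = config_enabled({f"user_{i}" for i in range(300)}, {"v1"})
```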

How deeply nested does AutoDocs scan?

By default, AutoDocs scans events up to 10 levels deep, including nested arrays. This limit is flexible; please contact us if you need it increased.
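One way to picture the depth count is a recursive walk over the decoded JSON, where objects and arrays each add a level (this is our illustration; AutoDocs’ internal counting may differ):

```python
def json_depth(value, level: int = 1) -> int:
    """Nesting depth of a decoded JSON value. Objects and arrays each
    add a level; a scalar leaf counts as the level it sits at."""
    if isinstance(value, dict):
        return max((json_depth(v, level + 1) for v in value.values()),
                   default=level)
    if isinstance(value, list):
        return max((json_depth(v, level + 1) for v in value),
                   default=level)
    return level

event = {"user": {"address": {"geo": [{"lat": 1.0}]}}}
depth = json_depth(event)
```

Under this counting, the example event above is six levels deep, comfortably inside the default of 10.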

© Data Stuff, LLC