
Iteratively Raises $5.4M to Revolutionize Data Pipelines

February 4, 2021

The Growing Importance of Data Quality in Analytics

With the increasing volume of data collected by businesses, maintaining the reliability and quality of that information is paramount. The effectiveness of an analytics pipeline is directly tied to the accuracy of the data it receives, and inconsistencies or errors can create significant problems further along in the process.

Iteratively, a company based in Seattle, is dedicated to assisting organizations in building trustworthy data pipelines. Recently, the company secured $5.4 million in seed funding, with Google’s Gradient Ventures fund leading the investment. Additional participation came from Fika Ventures and PSL Ventures, a previous investor in Iteratively. Gradient Ventures partner Zach Bratun-Glennon has also joined Iteratively’s board of directors.

Iteratively's Origins and Customer Discovery

Patrick Thompson, Iteratively’s co-founder and CEO, began developing the platform approximately two years ago. Prior to this, he gained experience at both Atlassian and Syncplicity, where he connected with his co-founder, Ondrej Hrebicek. The initial phase involved six months of customer research, revealing a widespread lack of confidence in the data companies were collecting.

“Numerous companies we interviewed had attempted to create internal solutions to address this very issue. We even developed one at Atlassian, so I was acutely aware of the challenges involved. This led us to create a product designed to alleviate this pain point,” Thompson explained.


A common issue within many organizations is a disconnect between those who generate data and those who analyze it. Communication often occurs through tools like spreadsheets or wikis, which can be inefficient. Iteratively aims to foster a collaborative environment, establishing a unified source of truth for all stakeholders.

“Often, requirements are transferred via systems like JIRA tickets, Confluence pages, or spreadsheets, and these implementations are frequently flawed, leading to downstream complications,” Thompson clarified.
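To make the contrast concrete, here is a minimal sketch of what a shared, machine-readable event definition might look like in place of a spreadsheet row or wiki page. The TypeScript shape and the `songPlayed` entry are illustrative assumptions; the article does not describe Iteratively's actual tracking-plan format.

```typescript
// Hypothetical tracking-plan entry: one shared, versioned definition of an
// analytics event that engineers and analysts both reference, instead of
// passing requirements around in tickets or spreadsheets.
type PropType = "string" | "number" | "boolean";

interface EventDefinition {
  name: string;                            // canonical event name
  description: string;                     // what the event means, for analysts
  requiredProps: Record<string, PropType>; // expected payload shape
}

// Example entry serving as the single source of truth for this event.
const songPlayed: EventDefinition = {
  name: "Song Played",
  description: "Fired when a user starts playback of a track.",
  requiredProps: {
    songId: "string",
    durationSeconds: "number",
    isPremiumUser: "boolean",
  },
};

console.log(`Tracking plan defines: ${songPlayed.name}`);
```

Because a definition like this lives in source control rather than a wiki, the people producing the data and the people analyzing it are at least working from the same contract.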

Focus on Event Streaming Data and Data Privacy

Currently, Iteratively concentrates on event streaming data used for product and marketing analytics – the kind of data typically sent to platforms like Mixpanel, Amplitude, or Segment. The tool sits at the point of data origin, such as within an application, and validates the data before routing it to the customer's chosen third-party solutions.

This placement ensures that the tool operates where the data is initially created, while also guaranteeing that no data passes through Iteratively’s servers.


“We do not directly access the data itself,” Thompson emphasized. “We are not a data processing service. Instead, we function as a layer over your existing analytics pipeline or SaaS tools, verifying the data payloads as they flow through our SDK on the client-side.”
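As a rough illustration of that layer, the sketch below shows client-side payload validation against a shared event definition before the event is forwarded downstream. This is an assumption-based approximation, not Iteratively's SDK: `track`, `validate`, and `sendToAnalytics` are hypothetical names, and `sendToAnalytics` stands in for whatever destination (Mixpanel, Amplitude, Segment, etc.) is configured.

```typescript
// Hypothetical client-side validation layer: check each payload against its
// shared definition at the point of origin, then forward it directly to the
// configured analytics destination. Nothing is routed through a separate
// validation server.
type PropType = "string" | "number" | "boolean";

interface EventDefinition {
  name: string;
  requiredProps: Record<string, PropType>;
}

// Returns a list of schema violations; an empty list means the payload is valid.
function validate(def: EventDefinition, props: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const [key, expected] of Object.entries(def.requiredProps)) {
    if (!(key in props)) {
      errors.push(`missing property "${key}"`);
    } else if (typeof props[key] !== expected) {
      errors.push(`"${key}" should be ${expected}, got ${typeof props[key]}`);
    }
  }
  return errors;
}

// Stand-in for the real destination SDK call (hypothetical, not a real API).
function sendToAnalytics(name: string, props: Record<string, unknown>): void {
  console.log("forwarding event:", name, props);
}

// The wrapper an application would call instead of the raw analytics SDK.
function track(def: EventDefinition, props: Record<string, unknown>): void {
  const errors = validate(def, props);
  if (errors.length > 0) {
    // Surface violations at the source, before bad data reaches dashboards.
    console.warn(`event "${def.name}" failed validation:`, errors);
    return;
  }
  sendToAnalytics(def.name, props);
}

// Usage: a well-formed payload is forwarded; a malformed one is caught locally.
const songPlayed: EventDefinition = {
  name: "Song Played",
  requiredProps: { songId: "string", durationSeconds: "number" },
};
track(songPlayed, { songId: "abc123", durationSeconds: 214 }); // forwarded
track(songPlayed, { songId: "abc123" });                       // missing prop
```

The design point mirrors Thompson's description: invalid payloads are flagged where they are produced, and valid ones travel straight to the destination without a detour through a vendor's servers.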

While the current model avoids data processing, Thompson acknowledged that this may evolve over time, potentially incorporating metadata and observability features.

Pricing and Future Growth

Because the company does not process the data, its pricing structure is based on the number of user seats, rather than the volume of events processed. This model may be adjusted as Iteratively explores data processing capabilities.

The company currently employs around 10 individuals and intends to expand its team to 20 by year-end, with hiring focused on research and development, sales, and marketing.

“Iteratively’s software offers a distinctive approach to promoting company-wide collaboration and ensuring data quality,” stated Gradient’s Bratun-Glennon. “We anticipate that sophisticated analytics and data-driven decision-making will be crucial for the success of future businesses and products. Iteratively’s mission, product, and team are well-positioned to provide these capabilities to their customers.”

Tags: data pipelines, data quality, data trust, funding, iteratively, data engineering