Senior Backend Engineer (Data Platform)

Tel Aviv office
Full-time

Description

Our team is looking for a Senior Backend Engineer who enjoys working on complex, data-heavy systems and thinking beyond the immediate task.

This role sits between backend engineering and data infrastructure, and focuses on building the systems that power how data is collected, processed, and used across the product.

Our story

Anecdotes is redefining enterprise GRC for the AI era. We are building an AI-native platform that continuously collects and structures data from across organizational systems, helping companies manage risk, compliance, and audits in a more automated and reliable way.

We work with large enterprises and auditors, and the scale and messiness of real-world data is a core part of what we deal with every day.

What you’ll do

• Design and build core systems that power our data platform, from ingestion through processing to serving

• Own the architecture of data pipelines across multiple systems and environments

• Work with APIs, distributed processing frameworks, and analytical storage systems

• Make decisions around data modeling, storage formats, and system boundaries

• Build backend services that interact with both operational and analytical data stores

• Continuously improve system reliability, performance, and scalability

• Collaborate with product and R&D teams on features that rely on high-quality data

• Take part in design discussions, mentor others, and help raise the engineering bar

What we’re looking for

• 6+ years of experience building backend systems in production

• Strong hands-on experience with Python or a similar language

• Experience designing and building data pipelines end to end, including ingestion, processing, and serving

• Solid understanding of data modeling and the differences between transactional and analytical systems

• Experience working with lakehouse technologies such as Iceberg or Delta Lake, and with data warehouse and query technologies such as Athena, Snowflake, BigQuery, or Redshift

• Experience with production databases such as Postgres and at least one NoSQL system

• Familiarity with distributed data processing tools such as Spark or Dataflow

• Experience with workflow orchestration tools such as Airflow or Temporal

• Experience working in cloud environments such as AWS, GCP, or Azure

• Strong system design skills and the ability to think through scale, reliability, and data correctness

Nice to have

• Experience working with multi-cloud or cross-environment data systems

• Familiarity with data formats like Parquet and concepts such as partitioning and schema evolution

• Experience building internal data platforms or developer-facing infrastructure

• Hands-on experience with AI coding tools such as Copilot or Cursor

How we work

We value engineers who take ownership, ask good questions, and are comfortable working in areas that are not fully defined. People who succeed here are those who understand how systems behave in production and think about the long-term impact of their decisions.

If you enjoy building systems that deal with real data complexity and want to influence both architecture and product, this role could be a great fit.
