- ...maintain cloud-native data platforms with automated ingestion, transformation, and governance pipelines using modern tools like DBT, Apache Spark, Delta Lake, Airflow, and Databricks. Work with stakeholders, including the Product, BI, and Support teams, to assist with data-...
  Tags: Remote job, Long term contract, Shift work
- ...Points: deep Golang expertise; deep Kubernetes knowledge; experience with modern data engineering technologies (Spark, Trino, Iceberg, Parquet, ClickHouse, DBT); DBA background (relational, OLAP, columnar); expertise in telemetry and time series...
  Tags: Remote job, Long term contract, Full time, For contractors, Worldwide, Home office, Flexible hours
- ...engineering. Familiarity with the MLOps stack, including Databricks, Feature Stores, Kubernetes, Kubeflow, MLflow, etc. Expertise in Spark for large-scale data processing and distributed workflows. Proficient in Git and version control best practices. Highly...
  Tags: Remote job, Permanent employment
- ...modern Web technologies (single-page web app with Angular/AngularJS) and integrates with a very large ecosystem of big data technologies (Spark, Hadoop, MPP databases, Cloud services, ...). Our software testing stack is based on Python, Pytest, Selenium, and Allure for the...
  Tags: Full time, Remote work
- ...Hands-on, production experience with cloud (AWS, Azure, GCP). Experience with data engineering: streaming and batch processing, Spark, Trino, Iceberg, ClickHouse, Parquet. Are you a Do'er? Be your truest self. Work on your terms. Make a difference. We are...
  Tags: Remote job, Full time, For contractors, Worldwide, Home office, Flexible hours, 1 day week
- ...familiarity with Rust is a plus). Proven experience designing and implementing large-scale batch data processing workflows (e.g., Pandas, Spark, Polars, SQL). Hands-on experience developing and operating real-time, low-latency, streaming data pipelines (e.g., Apache Flink,...
  Tags: Full time, Local area, Remote work
- ...targeting systems. The Daily To-Do: independently implement, optimize, and maintain robust ETL/ELT pipelines using Python, Airflow, Spark, Iceberg, Snowflake, Aerospike, Docker, Kubernetes (EKS), AWS, and real-time streaming technologies like Kafka and Flink. Engage...
  Tags: Remote job, Permanent employment