Senior Data Engineer
Expert Python and SQL skills, deep Databricks or Snowflake expertise, ability to design and justify end-to-end data architectures, strong cloud knowledge (AWS/GCP/Azure), presales experience, and ML/LLM data readiness.
Role Summary
Lead data strategy and engineering excellence for our global clients. We are looking for a Senior Data Engineer who acts as a trusted technical advisor: someone who can design complex Lakehouse architectures, defend technical decisions before stakeholders, and drive presales activities. You will combine deep technical expertise in Python, SQL, and distributed computing with high-level architecture skills, ensuring our solutions are scalable, secure, and future-proof.
The Mission
To be the architect of value. Your mission is to design scalable, secure, and cost-effective data platforms that solve critical business problems. You will lead technical audits, define best practices for the team, and build the foundational infrastructure that enables advanced analytics and AI/GenAI capabilities for our clients.
The Tech Stack
Core Languages: Python or Scala (Expert/Patterns), SQL (Expert/Internals).
Compute & Storage: Databricks (Unity Catalog), Snowflake, BigQuery, Synapse.
Processing: Spark/PySpark (Deep internals, Tuning, Streaming), dbt (Enterprise patterns).
Architecture Patterns: Data Mesh, Lakehouse (Delta Lake/Iceberg), Lambda/Kappa.
Infrastructure & DevOps: Advanced Terraform/IaC, CI/CD, Docker/Kubernetes.
Emerging Tech: Feature Stores, Vector Databases, MLOps basics.
Your Skills
Engineering Mastery: Expert-level proficiency in Python (design patterns, library development) and SQL. Ability to optimize code others wrote.
Deep Platform Expertise: Mastery of Databricks (Spark memory management, partitioning strategies) or Snowflake (Warehouse tuning, RBAC, Zero-Copy Cloning). You understand how they work under the hood.
Architectural Vision: Ability to design end-to-end data solutions, select the right tools (e.g., “Why Snowflake over Redshift?”), and defend decisions to client leadership.
Cloud Mastery: Expert-level knowledge of AWS, GCP, or Azure. Deep understanding of networking (VPC, PrivateLink), security (IAM), and integration limits.
Consulting & Business: Experience participating in Presales, technical audits, or discovery phases. Translating business needs into technical specs.
AI/ML Readiness: Understanding of engineering data for Machine Learning (Feature Engineering, pipelines for LLM/RAG).
Your Responsibilities
Architecture & Leadership: Lead the design and implementation of scalable data pipelines and Lakehouse architectures. Act as the Design Authority.
Advanced Engineering: Solve the hardest technical challenges: optimizing high-load streaming pipelines, debugging complex Spark jobs, and designing reusable, generic frameworks.
Consulting & Presales: Participate in technical assessments, audits of existing systems, and proposal estimations. Explain the ROI of technical modernization.
Performance & Security: Ensure all solutions are production-ready: secure, monitored, cost-efficient (FinOps), and documented.
Mentorship: Define coding standards, conduct code reviews, and mentor Middle/Junior engineers to foster a culture of engineering excellence.
Nice to Have
Streaming: Deep experience with Kafka, Kinesis, or Spark Structured Streaming.
GenAI Stack: Experience with Vector Databases (Pinecone, pgvector, Weaviate) or frameworks like LangChain.
Certifications: Professional-level cloud certifications (e.g., AWS Solutions Architect Pro, Databricks Certified DE Professional).
NoSQL: Advanced modeling for DynamoDB, Cosmos DB, or MongoDB.
What we offer
Long-term career stability with a competitive salary paid in USD.
An environment built for steady career development.
Development supported by dedicated mentors and a variety of programs focused on expertise and innovation.
Private medical insurance provided after successful completion of the probationary period.
A well-equipped, cozy office that supports comfort and productivity across all project stages.
Welcoming atmosphere and a friendly corporate culture.
If you feel this opportunity resonates with you, apply now — we’re looking forward to getting to know you!