
Job description
We are seeking a consultative, analytical, and highly communicative Senior Data Engineer capable of leading data architecture and engineering initiatives from design to delivery.
The ideal candidate will design, implement, and optimize Data Lake architectures and data pipelines on AWS, with a strong focus on workload migration, real-time data integration, and governance. The role requires both technical expertise and strategic vision: you will collaborate with cross-functional teams to translate business needs into scalable, data-driven solutions.
Job requirements
Key Responsibilities
Design and implement modern Data Lake / Data Mesh architectures.
Lead workload and pipeline migrations from on-premises or hybrid environments to cloud platforms (AWS, Azure, GCP).
Build and optimize batch and streaming data pipelines ensuring scalability and reliability.
Implement Change Data Capture (CDC) and real-time streaming with Kafka and Kinesis.
Ensure observability, security, and governance across data pipelines.
Collaborate with Data Scientists, Architects, and Stakeholders to deliver scalable, data-driven solutions.
Provide technical leadership and support architectural modernization strategies.
Required Qualifications
Proven experience with AWS Data Services (S3, Glue, Athena, Redshift, Kinesis, Lambda, Step Functions, Lake Formation, IAM).
Hands-on experience with CDC frameworks (e.g., Debezium, AWS DMS).
Strong background in Kafka, orchestration tools such as Airflow, and data integration tools such as Airbyte.
Proficiency in Python and SQL for ETL/ELT development.
Experience in data workload migration and pipeline modernization.
Familiarity with Git-based workflows and CI/CD practices.
Excellent communication skills with both technical and non-technical teams.
Nice to Have
Experience in consulting or client-facing roles.
Strong analytical and solution-oriented mindset.
Experience with Databricks, Spark, and Delta Lake.
Knowledge of BigQuery (GCP) and Azure Data Services (Synapse, Data Factory).
Understanding of MLOps and ML pipeline integrations.
Experience with Infrastructure as Code (Terraform, Pulumi, AWS CDK).
English fluency (technical or conversational).
Location: Brazil
