AZKAIT is a Mexican company that sources top IT talent and connects it with companies in Latin America and the United States.
We are looking for your talent as a Data Engineer.
Main Objective: As part of the International Data Engineering team, you will be responsible for the design, development, and operation of large-scale data systems. You will focus on real-time data pipelines, streaming analytics, distributed big data, and machine learning infrastructure. You'll work with engineers, product managers, BI developers, and architects to deliver scalable, robust technical solutions.
Requirements:
- Minimum 3 years of big data development experience.
- Demonstrated, up-to-date expertise in data engineering and complex data pipeline development.
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, with a focus on scalability, low latency, and fault tolerance in every system built.
- Experience with Java and Python for writing data pipelines and data processing layers.
- Experience writing MapReduce jobs.
- Demonstrated expertise in writing complex, highly optimized queries across large data sets.
- Proven hands-on experience with big data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase, Automic, and Aorta.
- Highly proficient in SQL.
- Experience with Cloud Technologies (GCP, Azure).
- Experience with relational, NoSQL, and in-memory data stores (Oracle, Cassandra, Druid) is a big plus.
- Support the implementation and operation of data pipelines and analytical solutions.
- Experience performance-tuning systems that work with large data sets.
- Experience with streaming data processing.
- Experience with metadata management tools such as MITI and monitoring tools such as Ambari.
- Experience developing REST API data services.
- Retail experience is a huge plus.