Data Engineer
Design and implement modern, scalable data solutions together with leads and architects, using a range of new and emerging technologies.
Work within Agile frameworks and implementation approaches throughout delivery.
Minimum of 3 years of experience working as a Data Engineer.
Solid understanding of cloud-based technologies (at least one of Azure, GCP, or AWS).
Experience with Python / R / Scala.
Knowledge of SQL / NoSQL databases.
Demonstrated experience in building data pipelines in data analytics implementations such as Data Lake / Data Warehouse / Lakehouse.
Ability to collaborate closely with business analysts, architects, and client stakeholders to create technical specifications.
Ability to analyze and profile data, and assess data quality in the context of business rules.
Extensive experience with data migration.
Proven experience in end-to-end implementation of data processing pipelines.
Experience configuring or developing custom code components for data ingestion, data processing, and data provisioning, using Big Data and distributed computing platforms such as Hadoop / Spark.
Proficiency in data modeling for both structured and unstructured data, across the various storage layers.
Experience using and understanding infrastructure-as-code tooling such as Terraform or Cloud Build.
Hands-on experience with streaming data.
Understanding of design patterns (Lambda architecture / Data Lake / Microservices).