Principal Data Engineer

  • Lahore, Multan, Karachi, Islamabad
  • WFH Flexible
  • Full-time
  • Delivery

We are looking for a Principal Data Engineer to lead our data engineering teams. In this role, you will provide guidance, mentorship, and technical expertise to team members, fostering a collaborative and innovative work environment. Collaboration is key: you will work closely with stakeholders to define project requirements, so a commitment to collaborative problem solving, sound design, and the creation of quality products is essential.

Responsibilities:

  • Collaborate with Big Data Solution Architects to design, prototype, implement, and optimize data ingestion pipelines so that data is shared effectively across various business systems.
  • Build and maintain ETL/ELT ingestion pipelines and analytics solutions using cloud technologies.
  • Ensure the design, code, and procedural aspects of the solution are production-ready in terms of operational, security, and compliance standards.
  • Implement monitoring and logging solutions to ensure data pipelines run smoothly and issues are promptly identified.
  • Ensure all data processing complies with security policies and data protection regulations.
  • Take the lead in ensuring that data is accurate, consistent, and reliable.
  • Participate in day-to-day project and production delivery status meetings, and provide technical support for faster resolution of issues.
  • Work with stakeholders to understand their requirements.

Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent work experience).
  • 5+ years of overall experience in the data engineering and data warehousing domain, focusing on building large-scale data pipelines.
  • 3+ years of experience with distributed big data systems such as Microsoft Fabric, Databricks, or AWS EMR.
  • Strong experience in developing data pipelines using PySpark and ETL tools such as ADF and Talend.
  • Proficiency in Python & SQL
  • Solid understanding of data modelling, ETL processes, and best practices in the data warehousing domain.
  • Strong problem-solving and critical thinking skills.
  • Experience writing effective and maintainable unit and integration tests for ingestion pipelines.
  • Working knowledge of CI/CD pipelines for deployments.
  • Working knowledge of using code version control repositories like Git.
  • Knowledge of designing, developing, and managing Kafka-based data pipelines.
  • Knowledge of security best practices, including encryption of sensitive data at rest and in transit.

We have an amazing team of 700+ individuals working on highly innovative enterprise projects & products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintech, and multiple Silicon Valley startups.

What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015 (QMS), ISO 27001:2022 (ISMS), ISO 20000-1:2018 (ITSM) and ISO 14001:2015 (EMS) Certified. We have a vibrant culture of learning through collaboration and making the workplace fun.

People who work with us use cutting-edge technologies while contributing to the company's success as well as their own.

To know more about Confiz Limited, visit: https://www.linkedin.com/company/confiz-pakistan/