Senior Data Engineer

BH-267325
  • Competitive Remuneration Package
  • Kuala Lumpur, Malaysia
  • Permanent
  • Oil & Gas
Job Requirements:
The position entails designing, developing, and maintaining robust, scalable, and sustainable data products. This involves building and optimizing data pipelines and infrastructure using technologies like Synapse, Azure Data Factory (ADF), Apache Spark, Apache Kafka, or similar tools. Collaboration with stakeholders is crucial to understanding their data requirements and translating them into technical solutions. Implementing data quality monitoring and validation processes using these technologies ensures data integrity. Additionally, enforcing data governance best practices, including data lineage, documentation, and security, is essential. Supporting data analysts and architects, contributing to the organization's data strategy, and evaluating new technologies to enhance the data engineering ecosystem are also part of the role.

The role also involves troubleshooting and resolving data-related issues, performance bottlenecks, and scalability challenges specifically on the Azure platform. It requires a solid grasp of DevOps principles and expertise in infrastructure automation using tools like Terraform or AWS CloudFormation. Hands-on experience with the Azure cloud platform and related services such as Azure Synapse, Azure Data Lake, etc., is essential.
Additionally, familiarity with data warehousing concepts and best practices is necessary. Collaborating closely with stakeholders to grasp business requirements and translating them into data solution designs is a key aspect. A strong understanding of data architecture principles, data modeling techniques, and data integration patterns is also vital for success in this role.

Educational Qualification & Experience:
  • BA/BS degree in Computer Science, Computer Engineering, Electrical Engineering or related technical field
  • 4–8 years of total IT experience, preferably in the field of data engineering
  • Experience with Azure services including IAM, Synapse, Data Lake, SQL Server, ADF, etc.
  • Experience in creating and deploying Docker containers on Kubernetes.
  • Experience in supporting development teams on Kubernetes best practices, troubleshooting, and performance optimization.
  • Experience with CI/CD pipeline tools such as Jenkins and GitHub Actions
  • Experience with Synapse data warehousing and data lake solutions
  • Strong programming skills in Python, PySpark, and SQL
  • Experience in scripting and automation using languages such as Bash, Python, or Go
  • Experience with infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation and containerization technologies (e.g., Docker, Kubernetes).
  • Knowledge of Agile methodologies and software development lifecycle processes
  • Proven experience in designing and implementing large-scale data solutions, including data pipelines and ETL processes on Azure.


With over 90 years' combined experience, NES Fircroft (NES) is proud to be the world's leading engineering staffing provider spanning the Oil & Gas, Power & Renewables, Chemicals, Construction & Infrastructure, Life Sciences, Mining and Manufacturing sectors worldwide. With more than 80 offices in 45 countries, we are able to provide our clients with the engineering and technical expertise they need, wherever and whenever it is needed. We offer contractors far more than a traditional recruitment service, supporting with everything from securing visas and work permits, to providing market-leading benefits packages and accommodation, ensuring they are safely and compliantly able to support our clients.

Apply for this role