Company:
Location: Ottawa
Closing Date: 30/11/2024
Salary: £100 - £125 Per Annum
Hours: Full Time
Type: Permanent
Job Requirements / Description
Position Description
Security clearance: Reliability Clearance is a MUST
Location: Ottawa, ON
Remote: Yes
Are you a data engineering professional interested in working within a talented and collaborative team developing Cloud, Data, and AI solutions for leading global clients? Do you often find yourself in discussions about disparate and scattered data sources and feel the need to create robust data engineering infrastructure to remedy this disorder? If you have the technical expertise to build pipelines and enable data engineering or ETL within organizations, a genuine problem-solving aptitude, and the mindset required to collaborate with diverse clients and team members, we are looking for you.
Your future duties and responsibilities:
Pipeline Construction: Build and maintain scalable and high-performance data pipelines using Azure Data Factory, Azure Databricks, or other Azure services to transport data seamlessly across systems.
Data Storage and Management: Manage data storage in Azure, selecting appropriate storage solutions (like Azure Data Lake, Azure SQL Database) to ensure data is stored efficiently and securely.
Data Cleaning and Transformation: Develop scripts or use ETL tools for data cleaning and transformation to prepare data for analysis, ensuring data quality and accessibility.
Automation and Optimization: Automate pipelines and optimize data retrieval using Python, Azure Data Factory, and Azure Databricks.
Monitoring and Maintenance: Monitor data pipelines and performance, troubleshoot any issues, and ensure systems are up-to-date with the latest security patches.
Collaborating strategically with clients to explore and deliver key data engineering solutions.
Delivering data preparation and transformation work, as well as ETL or ELT development, whether on-premises or in the cloud.
Building data engineering pipelines on the cloud (Azure, AWS, GCP, etc.).
Working in a professional environment with large, complex, and voluminous datasets, as well as scattered data sources.
Utilizing SQL (including advanced SQL) and Python skills as needed.
Gathering client requirements, coding, and implementing data solutions.
Implementing proofs of concept to demonstrate value, then packaging and scaling full-fledged data engineering solutions both on-premises and in cloud environments.
Supporting and collaborating with other data engineers and data scientists on the team, sharing technical knowledge.
Required qualifications to be successful in this role:
Python experience is a must.
5 years or more of practical experience in data engineering.
Experience working on at least one modern cloud platform (Azure, AWS, GCP, etc.).
Excellent communication and teamwork skills.
Desirable:
Experience in data engineering with ML projects.
Experience with pipeline management and data workflow tools such as Airflow and Jenkins.
Experience with a variety of tools such as Azure Data Factory, Google Dataflow, Qlik Compose, AWS Data Pipeline or AWS Glue, Talend, Microsoft SSIS, IBM DataStage, and Informatica.
Skills:
Data Engineering
English
ETL
MS SQL Server
Python
Amazon CloudFront
French