Data Engineer Position

OPS Angels is a US-based consulting company. We develop end-to-end Google Cloud Platform solutions tailored to our clients' business needs.

We are looking for an individual with strong attention to detail who enjoys applying rules and logic while making use of new tools and technologies.

As a Data Engineer you will develop infrastructure components within GCP that turn data into useful insights for decision-making. As part of an agile delivery team, you will design, develop, deploy, and support data ingestion pipelines and ETL/ELT processes, and build warehouse storage solutions.

We take a DevOps approach and strive for a CI/CD style of development. We use GitLab and Terraform to deploy our code; since our infrastructure is managed through Terraform, any Terraform experience would be advantageous.

Here are the details for the position:

Key Areas of Work:

• Helping to design, build, and support our new Google Cloud-based infrastructure

• Continuing the migration of data sources to Google Cloud Platform

• Designing and implementing solutions to meet business requirements (e.g., bringing new data sources into the platform)

• Performing data modelling based on business/reporting requirements

• Creating and maintaining ETL/ELT process documentation (e.g., data lineage, data flows, mappings)

• Proactively ensuring that deliverables meet or exceed functional, technical, and performance requirements

• Working alongside other developers to deploy regular changes to the platform

• Performing root cause analysis and recommending fixes for any defects

• Conducting code reviews for other team members

Minimum Requirements:

• Ability to work in an English-speaking, international environment largely aligned with US time zones;

• Knowledge of Git and the ability to work in a CI/CD environment;

• Google Certified Associate Cloud Engineer;

• Experience with data processing software (e.g., Hadoop, Spark, Pig, Hive, Dataflow);

• Experience in developing and deploying infrastructure on GCP, troubleshooting technical issues, and debugging code;

• Experience with programming languages: Python, SQL;

• Experience working with data warehouses (preferably BigQuery), including data warehouse technical architectures, infrastructure components, ETL/ELT, reporting/analytics tools and environments, and data structures.

Preferred Experience and Qualifications:

• Google Certified Professional Data Engineer;

• Experience in Big Data, information retrieval, data mining, or machine learning;

• Ability to contribute to all aspects of a solution – design, infrastructure, development, testing and maintenance;

• Track record of taking initiative and delivering projects end-to-end; clear evidence of being self-driven/motivated;

• Knowledge of and experience with infrastructure automation using Terraform.

Apply for this job