25 March 2024, Gauteng
Posted 6 months ago
A client in Gauteng is seeking an Intermediate Data Engineer on AWS, as follows:
- Design and build data pipelines on AWS that BI Devs will use for visualization
- Design pipelines using traditional Data Modelling techniques and Big Data concepts
- Extract data from sources such as Oracle, SQL and DynamoDB databases, Big Data stores and S3 (via Boto3), using ETL tools
- Verify and clean source data using Data Quality tools and methods
- Build data pipelines using Terraform, Python, PySpark, Glue, Kinesis, Kafka, CloudFormation and EMR
- In-depth knowledge of data formats is required, including XML, CSV, JSON / RESTful APIs and Apache Parquet
- Automate Linux / Unix server setup with Bash / Shell scripts and containerize code using Docker
- BI dev exp. is useful but not required, as you'll work with Analysts / Devs who will be building BI reports
Skills Required
- 4+ yrs exp. in Data Engineering (or similar Back-End Dev exp.)
- 2 to 3 yrs exp. in AWS and most of the above tech stack
- Solid Agile / DevOps work method exp. is required
- Relevant AWS and Data Eng. certs are a big benefit
- An IT / Eng. degree is also a big benefit
This is a long-term hybrid work contract with a few days per week at their offices in Gauteng
If you'd like to apply, please complete the application form below and upload your latest CV