DataOps Engineer (P688)
About Us:
As a Mid-Level DataOps Engineer at Kenility, you’ll join a tight-knit family of creative developers, engineers, and designers who strive to build and deliver the highest-quality products to market.
Technical Requirements:
- Bachelor’s degree in Engineering, Computer Science, or a related field.
- Experience with AWS services, including DMS, Glue, Lake Formation, DataZone, Redshift, S3, KMS, Macie, CloudWatch, CloudTrail, IAM, and AWS Organizations.
- Proficiency in Python and SQL for scripting, query optimization, and application development.
- Hands-on experience with Terraform or CloudFormation for infrastructure automation.
- Skilled in setting up CI/CD pipelines, managing version control with Git, and orchestrating workflows with Apache Airflow or AWS Step Functions.
- Competence in PySpark for advanced ETL transformations and in shell scripting for automation tasks.
- Familiarity with Agile/Scrum workflows, automated testing processes, and documentation practices.
- Strong knowledge of monitoring, alerting, and data governance frameworks.
- AWS Certified Solutions Architect - Associate certification.
- AWS Certified Data Analytics - Specialty or AWS Certified Data Engineer - Associate certification.
Soft Skills:
- Responsibility
- Proactivity
- Flexibility
- Strong communication skills