DataOps Engineer

Cape Town CBD, South Africa

Job Description

Our DataOps Team is focused on delivering value faster by creating predictable delivery and change management of data, data models and related artifacts. DataOps uses technology to automate the design, deployment and management of data delivery with appropriate levels of governance, and it uses metadata to improve the usability and value of data in a dynamic environment. Our client is growing quickly, which brings a number of unique and interesting challenges: data volumes within the organization are expanding fast, and there is significant opportunity to shape the tools, technologies and culture of data in the company.

Job Industry

Software Engineering, Programming

Job Salary Currency

ZAR

Job Salary Fixed

No

Key Deliverables

  • Using Terraform to manage cloud infrastructure and Chef to manage virtual servers
  • Building and deploying systems for metrics, monitoring, and logging
  • Operations for Kafka, Kubernetes, and more
  • CI/CD Build Systems to ensure our teams can deploy frequently and safely
  • Code management and review
  • Hardening servers, and building security into the platform
  • Developing automation so we can focus on the hard problems
  • Implementing features, technology, and processes that move us towards industry best practices, improving on scalability, efficiency, reliability, and security
  • Responding to incidents and requests

Professional Qualifications

Industry Qualification
Software Engineering, Programming, DataOps, Python, Google Cloud, Kubernetes

Essential Qualities

  • Bachelor's Degree or Advanced Diploma in Information Systems, Computer Science, Mathematics, or Engineering, and a minimum of 3 years of DataOps experience in a software/technology environment is required.
  • Candidates without a bachelor's degree or an advanced diploma (in Information Systems, Computer Science, Mathematics, or Engineering) must instead have a minimum of 5 years of DataOps experience in a software/technology environment.
  • An understanding of computer science fundamentals, including Linux, operating systems, and networking
  • Solid grasp of development fundamentals such as data structures and algorithms
  • Can write code (preferably in Python)
  • Experience with open source relational database systems (MySQL, PostgreSQL)
  • Practical experience with other database systems such as BigQuery, Redis, and Elasticsearch will be beneficial
  • Has experience with Kafka, Google Cloud Pub/Sub, or other event-based systems
  • Has experience with Google Cloud, or another cloud provider (architecture, operations)
  • Understands cost and implications of scaling
  • Has experience managing Kubernetes Clusters (certificates, users, kubeadm, kubectl etc.)
  • Understands networking deeply (TCP/IP, Calico/Weave, VLANs, tcpdump, routing, etc.)
  • Understands Linux deeply (kernel tuning, the /proc filesystem, cgroups, OS scheduling, etc.)
  • Has experience with build systems (Jenkins, GitLab, Spinnaker)
  • Has a reasonable understanding of Networking (TCP, UDP, IP)
  • Has experience with Linux administration (Processes, Networking, Storage, Security)
  • Has experience with at least one configuration management system (Chef, Puppet, Ansible)
  • Has experience managing production systems


Close Date

15/05/2024