Data Engineer - DataCamp

August 30, 2021

Job Description

Join our team

DataCamp is building the best platform to learn and teach data skills. We create technology for personalized learning experiences and bring the power of data fluency to millions of people around the world. Our learners get real hands-on experience by completing self-paced, interactive data science courses, practice sessions, and projects from the best instructors in the world, right in the browser.

We are an international team with backgrounds in education, data science, design, psychology, biology, linguistics, engineering and more. We are united by our passion for impacting the future of education.

About the role

DataCamp is a data-driven organisation: we run a data warehouse on AWS Redshift, and reports are created in Metabase and in custom Shiny applications. These reports help the company's teams and leadership take action on the data. DataCamp's Airflow cluster runs a thousand tasks each day, from data-ingestion pipeline tasks to data-processing tasks that produce the datasets and data models used by DataCamp's team of data scientists.
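
To give a flavour of what those daily tasks look like, below is a minimal, hypothetical Airflow DAG with one ingestion task feeding one processing task. The DAG name, schedule, and callables are illustrative assumptions for this posting, not our actual pipeline code.

```python
# A minimal, illustrative Airflow DAG: one ingestion task feeding one
# processing task. All names and callables here are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_events():
    """Pull raw events from a source system (placeholder)."""
    print("ingesting raw events...")


def build_dataset():
    """Transform raw events into an analyst-facing dataset (placeholder)."""
    print("building refreshed dataset...")


with DAG(
    dag_id="example_daily_refresh",  # hypothetical DAG name
    start_date=datetime(2021, 8, 1),
    schedule_interval="@daily",  # datasets are refreshed daily
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_events", python_callable=ingest_events)
    build = PythonOperator(task_id="build_dataset", python_callable=build_dataset)

    ingest >> build  # processing runs only after ingestion succeeds
```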

To facilitate data processing, we have a highly automated pipeline built with Terraform that provisions the infrastructure for all of our data engineering tooling. This lets us provide DataCamp's data scientists with the latest datasets, refreshed on a daily basis. Through good documentation and continuous improvement, we want to keep enhancing the data engineering capability at DataCamp.
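
As a rough sketch of what automated provisioning can look like, the snippet below drives the standard Terraform CLI from Python. The module path and wrapper are assumptions for illustration only, not a description of our actual pipeline.

```python
# Illustrative only: drive the Terraform CLI from Python to provision a
# module of data engineering tooling. The directory path is hypothetical.
import subprocess

TF_DIR = "infra/data-engineering"  # hypothetical Terraform module path


def terraform(*args: str) -> None:
    """Run a Terraform command in TF_DIR, failing loudly on errors."""
    subprocess.run(["terraform", *args], cwd=TF_DIR, check=True)


def provision() -> None:
    # Standard Terraform workflow: init, plan, then apply the saved plan.
    terraform("init", "-input=false")
    terraform("plan", "-input=false", "-out=tfplan")
    terraform("apply", "-input=false", "tfplan")


if __name__ == "__main__":
    provision()
```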

As part of the cross-functional Infrastructure team, you will work directly with the data science team on all data engineering initiatives from the business. You will maintain and create data pipelines, manage the company-wide shared data resources that support our data architecture, and build on our internal processes, with the creative freedom to shape the processes and roadmap for data engineering at DataCamp.

The team has a strong bias towards providing self-serve systems for deployment and infrastructure provisioning, and aims to support other teams using these services, keeping them available and functional rather than becoming a central bottleneck in the company. You will play a key part in planning future improvements and will own your day-to-day work.

Besides bringing data engineering skills to DataCamp, you will be adept at writing Python, understand how to author data models, and have a passion for data science, data management, and security across the platform (Python, R, SQL, ...). We envision evolving towards regional deployment models, which will be pivotal for the growth of DataCamp and its data engineering capability.

The ideal candidate

  • Has 2+ years of experience with data warehousing (e.g., Redshift, BigQuery, or Snowflake) and related data engineering tools (e.g., Airflow, Metabase, Fivetran)
  • Has 2+ years of experience administering and maintaining DevOps-related tools (AWS, Docker, CI/CD, Kubernetes)
  • Can develop in Python
  • Has excellent oral and written communication skills
  • Is interested in understanding and scaling complex data pipelines
  • Is interested in monitoring and self healing systems
  • Is highly organised with a flexible, can-do attitude and a willingness/aptitude for learning
  • Improves the team with code reviews, technical discussions and documentation
  • Is able to work collaboratively in teams and develop meaningful relationships to achieve common goals

It’s a plus if

  • You have an entrepreneurial spirit
  • You have experience with infrastructure-as-code (Terraform, Ansible, etc.)
  • You have experience with API gateways or service meshes (Kong, Istio, etc.)
  • You are passionate about data science and education

What’s in it for you

In addition to joining a creative and international start-up, you’ll enjoy:

  • A very competitive salary and stock options
  • An exciting job that will offer you technical challenges every day
  • Flexible working hours
  • International company retreats
  • Conference and hardware budget
  • Working with a great team (everyone says this, but we’re serious—we’re pretty great)