Senior Data Engineer - Big Data

a connected planning company

Full time

San Francisco, CA
Expected Pay Rate:
$100.00 - $110.00 per hour
Assignment Length:
Contract to hire
Job Description

HireArt is helping a connected planning company find a Senior Data Engineer - Big Data. 

The Big Data team owns and controls all data activities: it owns the core data collection system that drives hundreds of millions of data points, and it develops and maintains data pipelines, including AI models, to deliver quality data and signals to customers.

As a member of the team, you’ll build the services, libraries, and tools that power these data pipelines. You’ll work closely with all engineering, data science, and analyst teams.

You will be able to work anywhere, which means you can choose the environment that works best for you. Whether you work remotely from home or in an office is up to you.

What you’ll be doing:

  • Own, develop, and maintain data processes that you and others have built:

    • ETL

    • Data Pipeline

    • Microservices

  • Design and implement core components that will be used by data-related teams

  • Perform ongoing optimization improvements

  • Implement best practices and follow high coding standards

  • Use Python, Spark, and SQL

The type of person we’re looking for:

  • Passion for data and its massive effect on customers and the business

  • You love to know what is happening in your applications at any given time

  • You are eager to solve technological and business challenges

  • Well-rounded, hands-on experience as a data engineer, with a good grasp of the AI and infrastructure world

  • A people person: you’ll showcase work and support and mentor developers both within the team and across other teams

  • Positive energy and enthusiasm

  • B.Sc. or M.Sc. in a quantitative field

    • Engineering, Computer Science, Information Systems, etc.

  • At least 5 years of experience with Python, Java, or an equivalent OOP language

  • Deep understanding of data, metrics, data modeling, and business needs

  • Good familiarity with cloud providers (AWS/GCP/Azure)

  • You are able to write simple, clean and testable code

  • Familiarity with big data tools

    • Spark, AWS Redshift, MongoDB, Redis, AWS Athena, etc.

  • Past experience with container-orchestration systems:

    • Kubernetes (K8s)

    • Docker, DC/OS

  • Solid understanding of task orchestration systems such as

    • Airflow, Luigi, etc.
