Senior Data Engineer

Chicago, IL

Direct Hire

Salary Range: $150,000 - $210,000

Position Overview

We are looking for an experienced Senior Data Engineer to join our data platform team. This role involves designing and building efficient data systems that support our analytics engine. The ideal candidate will be proficient in Python and SQL, with a strong background in data analytics. You will collaborate closely with implementation teams, data scientists, and product stakeholders to advance the analytics product.

Key Responsibilities

  • Develop scalable batch data processing frameworks for analytical transformations
  • Produce foundational datasets for machine learning models and support deploying those models into production
  • Enhance the performance of data pipelines and database queries
  • Monitor and resolve data quality issues

Qualifications

  • 5-10 years of experience with:
    • Python
    • SQL
    • Apache Airflow
    • Snowflake
    • AWS services (RDS, EMR, S3)
    • Git / GitHub
    • Spark (PySpark)
  • Extensive background in building data processing systems and analytical data products
  • Knowledge of machine learning algorithms and data analysis techniques

Experience and Skills

  • Python Expertise

    • Advanced proficiency in Python for data engineering, with over 5 years of experience developing optimized ETL pipelines for data transformation and analysis. Skilled in using Pandas, NumPy, and SQLAlchemy for data manipulation and database interaction.
    • Proven ability to design and execute complex data models and ETL processes in Snowflake, ensuring high performance and scalability. Experience with Python-based frameworks for analytics and machine learning is a plus.
    • Experience integrating Python applications with cloud services and data orchestration tools, such as AWS and Apache Airflow.
  • Batch Data Processing

    • Experience in designing and implementing batch data processing pipelines (ETL/ELT) to extract, transform, and load data.
  • Snowflake Experience

    • Proficient in designing and implementing data models in Snowflake, optimizing for performance and scalability. Familiar with Snowflake’s architecture, including data sharing and warehousing features.
    • Experience with Snowpark and Snowpipe is advantageous.
  • AWS Proficiency

    • Extensive experience with AWS services (RDS, EMR, S3), including provisioning and management through Boto3, CDK, CLI, or Apache Airflow operators.
  • Spark (PySpark)

    • Experience with Spark (PySpark) is a plus.
  • Version Control

    • Proficient in using Git / GitHub, with experience in PyCharm or similar IDEs.
  • Apache Airflow

    • Experience building DAGs for ETL/ELT, including single-use case DAGs and reusable frameworks.
  • SQL Expertise

    • Advanced SQL skills, including optimization techniques such as explain plans, join optimization, indexing, and advanced functions.
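To illustrate the kind of batch ETL/ELT work described above, here is a minimal sketch in plain Python using the standard-library sqlite3 module. The table and column names (`raw_orders`, `customer_totals`) are illustrative only, not part of the team's actual stack; a production pipeline would use tools like Airflow and Snowflake instead.

```python
# Minimal batch ETL sketch: extract rows from a raw source table,
# transform them with an aggregation, and load the result into an
# analytical target table. Names here are hypothetical examples.
import sqlite3

def run_batch_etl(conn: sqlite3.Connection) -> int:
    """Aggregate raw order rows into per-customer totals; return rows loaded."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customer_totals ("
        "customer_id INTEGER PRIMARY KEY, total_spend REAL)"
    )
    conn.execute("DELETE FROM customer_totals")  # full-refresh batch load
    # Transform and load in one step: push the aggregation down into SQL.
    cur = conn.execute(
        "INSERT INTO customer_totals (customer_id, total_spend) "
        "SELECT customer_id, SUM(amount) FROM raw_orders GROUP BY customer_id"
    )
    conn.commit()
    return cur.rowcount

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer_id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?)",
        [(1, 10.0), (1, 5.0), (2, 7.5)],
    )
    print(run_batch_etl(conn))  # number of customer rows loaded
```

The full-refresh pattern shown here (truncate then reload) is the simplest batch strategy; incremental loads keyed on a watermark column are the usual next step when source tables grow large.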

Apply Now

