We’re looking for a Data Engineer with 2–3 years of experience to join our team and help build
scalable, reliable, and high-performing data pipelines. If you’re Snowflake certified, that’s a big plus.
Roles & Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes.
- Work with Snowflake to design efficient schemas, optimize performance, and manage workloads.
- Ensure data quality, consistency, and governance across multiple data sources.
- Monitor, troubleshoot, and optimize existing data pipelines for performance and cost efficiency.
- Implement best practices for security, compliance, and data management.
What it takes to be one of us!
Required Skills & Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Strong expertise in Snowflake (data modeling, query performance tuning, warehouse scaling, security, and resource optimization).
- Proficiency in SQL and data transformation techniques.
- Experience with ETL/ELT tools and workflow orchestration (e.g., dbt, Airflow, Workato, Informatica).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data storage solutions.
- Strong problem-solving skills and ability to work in collaborative, agile environments.
Good to have:
- Snowflake certification (SnowPro Core or Advanced).
- Experience with scripting languages (Python, Scala, or Java) for data transformation.
- Knowledge of CI/CD pipelines and version control (Git).
- Exposure to BI/analytics tools such as Tableau, Power BI, or Looker.