About the Role
We’re looking for a highly motivated and experienced Senior Data Engineer with a passion for building robust, scalable data solutions. This role is perfect for someone who thrives on tackling complex data challenges and wants to make a real impact on our data architecture. You’ll be a key driver in keeping our data infrastructure efficient and reliable and in empowering data-driven decision-making across the organization. This position begins as a 6-month contract with a strong possibility of extension and conversion to a full-time role.
What You'll Do:
- Architect and optimize our data warehouse on AWS or Azure, focusing on scalability, performance, and cost-effectiveness.
- Design and implement robust data models that support analytics, reporting, and operational needs.
- Build and maintain ETL/ELT pipelines using cloud-based ETL tools (e.g., Fivetran, Matillion, Azure Data Factory).
- Leverage Databricks, Spark, and PySpark to process and transform large datasets efficiently.
- Develop and optimize complex SQL queries for data transformation and reporting, ensuring performance and accuracy.
- Champion data governance, quality, and best practices across the organization.
- Collaborate closely with data analysts, engineers, and business stakeholders to understand their data requirements and deliver effective solutions.
- Implement and maintain version control and CI/CD workflows using tools like GitHub, Docker, and Terraform.
What You'll Need:
- 5+ years of experience in data engineering, with a strong emphasis on data warehousing and data modeling.
- Deep expertise in AWS or Azure cloud platforms.
- Advanced SQL skills, including performance tuning and optimization.
- Proven experience with Databricks, Spark, and PySpark for large-scale data processing.
- Hands-on experience with at least one cloud-based ETL tool.
- A solid understanding of data governance principles, data lineage, and best practices.
- Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.
Bonus Points:
- Experience with DevOps tools (Docker, Terraform, Airflow, GitHub Actions).
- Familiarity with CI/CD pipelines and infrastructure as code.
- A proven track record of optimizing data processes for cost and performance efficiency.
What We Offer:
- 100% remote work – work from anywhere!
- The opportunity to lead high-impact data projects in a rapidly growing organization.
- Competitive compensation and excellent career growth potential.
- Flexible work arrangements, with a clear path from the initial contract to a full-time, permanent position.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.