Remote Senior Data Engineer - AI/ML Focus
About the Role
We are seeking a Senior Data Engineer to join our team at Federato. In this fully remote role, you will build the data infrastructure behind our AI-native platform, helping insurers offer affordable coverage against large-scale risks such as the climate crisis and cyber-attacks. Join us in bringing modern technology to the insurance industry.
What You'll Do
- Collaborate with Data Scientists, Product Managers, and Software Engineers to build robust ETL pipelines that serve the Product Support team.
- Contribute to architecture decisions and observability tooling to enhance system reliability.
- Work closely with machine learning engineers to develop and deploy AI-powered features.
- Ensure data and tooling are scalable and aligned with the needs of our AI-native platform.
- Participate in code reviews and provide mentorship to junior engineers.
Requirements
- 5+ years of experience as a Data Engineer, with a focus on building ETL pipelines.
- Proficiency in Python and SQL.
- Experience with cloud platforms (AWS, GCP, or Azure) and data warehousing solutions.
- Strong understanding of data modeling and database design.
- Familiarity with machine learning concepts and tools.
Nice to Have
- Experience with big data technologies like Hadoop or Spark.
- Knowledge of data visualization tools.
- Previous experience in the insurance or financial services industry.
What We Offer
- Competitive salary ranging from $140,000 to $180,000 per year.
- Fully remote work environment with flexible hours.
- Opportunities for professional development and growth.
- Health, dental, and vision insurance.
- Generous paid time off and holidays.
If this sounds like you, we'd love to hear from you.
Who Will Succeed Here
- Proficiency in Python for building robust data pipelines and ETL processes, with hands-on experience in orchestration frameworks like Apache Airflow or Luigi.
- Strong familiarity with cloud platforms such as AWS, GCP, or Azure, and the ability to leverage their data services, such as AWS Redshift or GCP BigQuery, for data warehousing and modeling.
- A problem-solving mindset focused on using data engineering to support AI/ML initiatives, including experience preparing data for machine learning models and an understanding of data governance practices.