Data Pipeline Engineer - Fund Data (Remote)
About the Role
We’re looking for a hands-on Data Pipeline Engineer to join our team remotely and build, run, and maintain robust data pipelines that power our fund data operations. In this role, you will implement business rules directly in pipelines, operationalize and schedule scripts, monitor data quality, and ensure timely, reliable delivery for downstream stakeholders.
What You'll Do
- Design, implement, and operate batch/streaming data pipelines (ingestion, transformation, validation, and delivery) across cloud/on-prem environments.
- Translate fund data requirements into executable rules (e.g., validation, enrichment, exception handling) directly in code and/or orchestration layers.
- Own daily runs, scheduling, and monitoring, including incident response, root-cause analysis, and recovery.
- Define and maintain data quality checks (completeness, timeliness, accuracy, conformity); implement automated alerting, lineage, and audit trails.
- Develop and maintain Ruby/Python scripts, reusable components, and helper functions; automate routine tasks and reduce manual touchpoints.
- Collaborate closely with Operations, IT, and Data/AI teams to align logic with fund data standards (NAVs, share classes, benchmarks, holdings, fees, EMT/EPT, ESG/SFDR).
- Maintain technical documentation, runbooks, and standard operating procedures; support change management and release readiness.
- Follow bank-grade standards for access control, PII handling, segregation of duties, and production change workflows (ITIL).
- Proactively identify stability, performance, and cost optimization opportunities.
Requirements
- 3–5 years in data engineering or production data operations, including at least 2 years working with fund data at a bank, asset manager, or fund administrator.
- Hands-on scripting experience with Python and/or Ruby, plus advanced SQL.
- Experience building validation rules, reference/master data controls, and reconciliation (e.g., vendor data vs. internal golden sources).
- Comfortable with SLAs and incident management.
- Ability to translate business rules into technical logic and explain trade-offs to non-technical partners.
Nice to Have
- Familiarity with NAV production cycles, share-class metadata, benchmarks, holdings, fees, corporate actions; vendor feeds such as Bloomberg, Refinitiv, Morningstar, fund docs (KIID/KID), and templates (EMT/EPT).
- Awareness of MiFID II, SFDR, UCITS/AIFMD data implications and downstream publication requirements.
- Experience with ITIL, SDLC, change/release management in regulated environments.
- Knowledge of secrets management, key vaults, least-privilege access, data masking.
What We Offer
- Competitive salary and benefits package.
- Remote work flexibility.
- Opportunities for professional development and growth.
- Collaborative and innovative work environment.
- Access to cutting-edge technology and tools.
This remote Data Pipeline Engineer position at Deutsche Börse Group offers a unique opportunity to shape fund data infrastructure in the FinTech industry, with a competitive salary and flexible work environment.