Overview
Responsibilities:
- Pipeline Architecture: Build and maintain scalable Python ETL pipelines for core financial datasets (Actuals, Budgets, Headcount, Allocations)
- Data Integration: Design strategies to extract data from various APIs and databases, implementing incremental loads and efficient warehousing logic
- Transformation & Modeling: Use NumPy, Pandas, or Polars to transform raw logs into clean, documented, modeling-ready datasets for Power BI
- Data Quality Assurance: Engineer automated quality checks (reconciliation logic, mapping validation, anomaly detection) to ensure trust in the numbers
- Operational Excellence: Own the deployment routines, including scheduling, logging, error handling, and retry mechanisms
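The retry and logging mechanisms mentioned above could be sketched roughly as follows. This is a minimal illustration, not the team's actual tooling; the decorator name, backoff strategy, and the placeholder `extract_actuals` step are all hypothetical.

```python
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def with_retries(max_attempts=3, backoff_seconds=0.1):
    """Retry a pipeline step on failure, logging each attempt."""
    def decorator(step):
        @wraps(step)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return step(*args, **kwargs)
                except Exception as exc:
                    log.warning("step %s failed (attempt %d/%d): %s",
                                step.__name__, attempt, max_attempts, exc)
                    if attempt == max_attempts:
                        raise
                    # exponential backoff before the next attempt
                    time.sleep(backoff_seconds * 2 ** (attempt - 1))
        return wrapper
    return decorator

@with_retries(max_attempts=3)
def extract_actuals():
    # placeholder for a real extract step (API call, database query, ...)
    return [{"account": "4000", "amount": 125.0}]
```

A real pipeline would typically combine this with a scheduler (cron, Airflow, etc.) and structured log output, but the shape is the same: each step is wrapped, failures are logged, and transient errors are retried.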
Required Qualifications:
- Advanced Python skills for data processing, with the ability to write clean, modular, and reusable code
- Experience with Pandas is required; familiarity with Polars is a plus
- 3+ years of experience building automated reporting systems or data products
- Strong SQL skills (joins, window functions, CTEs) and an understanding of performance optimization
- Experience working with APIs, including handling pagination and authentication
- Understanding of financial reporting concepts (e.g., reconciliation, general ledgers, and dimensional modeling)
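The SQL skills listed above (CTEs and window functions) can be illustrated with a small self-contained example. The toy `actuals` table and the running-total query are hypothetical, assuming an SQLite-compatible engine with window-function support:

```python
import sqlite3

# Toy ledger table; a CTE plus a window function computes
# per-department running totals across months.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE actuals (dept TEXT, month TEXT, amount REAL);
INSERT INTO actuals VALUES
  ('Sales', '2024-01', 100.0),
  ('Sales', '2024-02', 150.0),
  ('Ops',   '2024-01',  80.0);
""")

query = """
WITH monthly AS (
    -- aggregate raw postings to one row per dept and month
    SELECT dept, month, SUM(amount) AS total
    FROM actuals
    GROUP BY dept, month
)
SELECT dept, month, total,
       SUM(total) OVER (
           PARTITION BY dept ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY dept, month;
"""
rows = conn.execute(query).fetchall()
```

The same pattern (CTE for staging, window function for cumulative or ranked measures) carries over directly to warehouse engines such as Postgres, BigQuery, or Snowflake.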
Note:
This job was republished by an aggregator; the job information may be incorrect or incomplete, and the announcement remains the property of its original publisher. To view the original job and its full details, please visit the job's URL on the owner's page.
Please mention that you heard about this job opportunity on https://ijob.am.