Overview

We’re looking for a highly analytical Data Engineer to join our Data Engineering & Cloud Operations team, which sits at the core of the end-to-end data lifecycle: from ingesting raw vendor data to delivering high-quality datasets to clients. This isn’t just a pipeline-building role. We need someone who goes beyond quick fixes and digs into the why behind system behavior. You’ll own systems end-to-end and directly shape how they perform, scale, and behave under pressure. You’ll work with real-time, high-volume financial market data from 100+ global exchanges, where reliability, performance, and precision are non-negotiable. If you enjoy root cause analysis, performance tuning, and working deep in Linux-based environments, this role is for you.

Responsibilities:
  • Design and maintain scalable ETL pipelines in Linux environments using Python and Apache Airflow to process time-sensitive, high-frequency data
  • Analyze workflows, identify bottlenecks, and optimize storage and access patterns for large-scale time-series datasets
  • Collaborate with Operations and Product teams to meet strict SLAs
  • Shape how systems are built, improved, and maintained long-term
  • Reduce manual work, improve observability and make systems easier to operate
  • Act as a Tier-3 escalation point for production issues
  • Perform deep Root Cause Analysis (RCA), identifying not just what broke but why, and ensuring it stays fixed
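The pipeline work described above follows the classic extract/transform/load pattern. A minimal sketch in plain Python (the CSV columns, tick format, and in-memory sink are illustrative assumptions, not this team's actual schema; in production each stage would typically run as an Airflow task):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse a raw vendor CSV feed into records (assumed columns: symbol,price,ts)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Normalize types and drop malformed rows; bad ticks are common in raw feeds."""
    clean = []
    for r in records:
        try:
            clean.append({
                "symbol": r["symbol"].upper(),
                "price": float(r["price"]),
                "ts": int(r["ts"]),
            })
        except (KeyError, ValueError):
            continue  # skip the bad row rather than fail the whole batch
    return clean

def load(records: list[dict], sink: list) -> int:
    """Append to a sink (a stand-in for S3 or a time-series store); return row count."""
    sink.extend(records)
    return len(records)

# One malformed row in the sample feed: "not_a_price" fails float() and is dropped.
raw = "symbol,price,ts\naapl,189.5,1700000000\nmsft,not_a_price,1700000001\n"
sink: list = []
loaded = load(transform(extract(raw)), sink)
```

Isolating each stage behind a small function is what makes the pipeline easy to test and to wire into an orchestrator later.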

Required Qualifications:
  • Bachelor’s degree or higher in Computer Science, Engineering, Mathematics, Physics or a related discipline
  • Advanced Python Skills: Strong programming foundation; comfortable with sophisticated data structures and libraries, and able to write clean, efficient code
  • Orchestration Expert: Practical experience managing and developing complex Apache Airflow DAGs
  • Cloud Infrastructure: Experience with AWS (S3, EFS, EC2) is highly preferable
  • Linux Power User: Proficiency with Linux environments and Bash scripting, comfortable navigating systems and manipulating data via the command line
  • The Investigative Mindset: Proven ability to perform Root Cause Analysis (RCA) and genuine enjoyment of troubleshooting complex, distributed systems
  • ETL Fundamentals: Solid understanding of ETL/ELT patterns
  • Growth Mindset: An obsession with learning new technologies and improving existing processes
  • Analytical Thinking: High-level problem-solving skills with the ability to look at a data discrepancy and work backward to the source
  • Communication & English: Strong communication skills with professional working proficiency in English, able to clearly articulate technical topics, document decisions and collaborate effectively with cross-functional and distributed teams
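The Linux bullet above is about exactly this kind of one-liner data wrangling. A hypothetical example (the file path, column layout, and sample data are invented for illustration), counting ticks per symbol in a CSV feed:

```shell
# Build a tiny sample feed (header + three ticks).
printf 'symbol,price\nAAPL,189.5\nMSFT,412.1\nAAPL,189.6\n' > /tmp/ticks.csv

# Skip the header, tally rows per symbol, and sort busiest-first.
awk -F, 'NR>1 {count[$1]++} END {for (s in count) print count[s], s}' /tmp/ticks.csv \
  | sort -rn > /tmp/top_symbols.txt

cat /tmp/top_symbols.txt
```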

Benefits:
  • Meaningful work with challenging tasks and real impact
  • Flexible working hours and hybrid work options (home or office)
  • Annual salary reviews and performance-based growth
  • Health insurance (for you and your family)
  • Reimbursement for professional development (training, courses, certifications)
  • Referral bonus program

Note:

✨ Our intelligent job search engine discovered this job and republished it for your convenience.
Please be aware that the job information may be incorrect or incomplete. The job announcement remains the property of its original publisher. To view the original job and its full details, please visit the job's URL on the owner’s page.

Please clearly mention that you have heard of this job opportunity on https://ijob.am.