Overview
The Data Analytics Engineer will be responsible for understanding business objectives, assessing the current data landscape, and writing SQL transformations in DBT that turn raw data into comprehensive Looker dashboards.
Effective collaboration requires at least a 3-hour daily overlap with U.S. Eastern Time (ET).
Client:
Our client is a U.S.-based healthcare marketplace focused on connecting providers with patients to improve access to medical services. The company fosters a strong engineering culture and values proactive individuals who are eager to explore new technologies and implement ready-made solutions.
Project Overview:
The client is expanding their Product Analytics team and seeking a Data Analytics Engineer to develop data visualization dashboards in Looker.
The primary objective is to enable product teams to adopt a data-driven approach by providing actionable insights, supporting hypothesis generation, and measuring the impact of product changes on key KPIs.
Responsibilities:
- Build and maintain reliable, scalable data models and pipelines using SQL, DBT, and Dagster
- Support self-service analytics by managing tools like Looker and Amplitude, driving adoption and training
- Partner with analysts, product teams, and engineers to align data infrastructure with business needs
- Improve data ingestion and integration in collaboration with data engineering
- Define and promote best practices in data governance, quality, privacy, and security
Requirements:
- Expert-level SQL skills for building performant, production-grade transformations
- Proficiency with a modern BI tool (e.g., Tableau, Power BI, Qlik) is mandatory; hands-on experience with Looker is a plus
- Python scripting for automation, testing, and developer productivity
- DBT (Core or Cloud) as the data modeling and transformation framework
- Experience with Snowflake as the primary cloud data warehouse
- Orchestration and workflow management using Dagster (preferred) or Airflow
- Looker (LookML) for semantic layer design, project architecture, and governance
- Experience with Redshift or BigQuery
- Amplitude for event-based product analytics and user journey modeling
- Implementation of data governance frameworks and KPI management tools to maintain quality, lineage, and ownership
- Use of AI-assisted development tools (e.g., GitHub Copilot, CodeWhisperer) to improve workflow efficiency
- Strong understanding of data modeling and data warehousing principles
- Excellent communication skills for collaborating with cross-functional stakeholders to gather requirements and clarify data context and meaning
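To give candidates a feel for the day-to-day SQL work described above, here is a minimal sketch of the kind of transformation a DBT model might encode: rolling raw product events up into a daily KPI table that a Looker dashboard could sit on top of. The table and column names (raw_events, event_date, user_id, event_type) are hypothetical, and Python's built-in sqlite3 stands in for Snowflake purely for demonstration.

```python
import sqlite3

# Hypothetical raw event data; in production this would live in Snowflake,
# and the aggregation below would be a DBT model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (
        event_date TEXT,
        user_id    INTEGER,
        event_type TEXT
    );
    INSERT INTO raw_events VALUES
        ('2024-01-01', 1, 'search'),
        ('2024-01-01', 2, 'search'),
        ('2024-01-01', 1, 'booking'),
        ('2024-01-02', 3, 'search');
""")

# Daily active users and bookings -- the shape of a KPI rollup
# that product teams would explore in a Looker dashboard.
daily_kpis = conn.execute("""
    SELECT
        event_date,
        COUNT(DISTINCT user_id)                                 AS daily_active_users,
        SUM(CASE WHEN event_type = 'booking' THEN 1 ELSE 0 END) AS bookings
    FROM raw_events
    GROUP BY event_date
    ORDER BY event_date
""").fetchall()

for row in daily_kpis:
    print(row)
```

In the real stack, a model like this would be version-controlled in DBT, orchestrated by Dagster, and exposed to stakeholders through a LookML semantic layer rather than queried directly.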
Please clearly mention that you heard about this job opportunity on https://ijob.am.