Description and Requirements
The Data & Insights (D&I) team is involved in every aspect of data at EA, using it to deliver valuable insights and solutions for game teams and players. We drive strategy, build data and AI tools, and support personalization, experimentation, and analytics. Join D&I to work with experts dedicated to improving understanding, workflows, and player experiences.
We are looking for an experienced Data Engineer to join our team on a temporary basis, with potential conversion to a permanent role after 12 months. This role is based in Vancouver, BC, with a hybrid or onsite work model.
You will collaborate with the Game, Analytics, and Product teams to implement data strategies and develop complex ETL pipelines that power dashboards, promoting a deeper understanding of our games. You will be part of the data modernization efforts for NHL and UFC, which include migrating data features from a legacy architecture to a modern medallion architecture using the latest data engineering tech stack.
You will have experience developing and establishing scalable, efficient, automated processes for large-scale data analyses, and you will stay informed of the latest trends and research across data engineering and analytics. You will work with leaders from an internal Game Studio, providing them with data that supports game and player insights, and you will report to the Technical Lead for this group.
Key Responsibilities:
As a Data Engineer, you will be involved in the entire development life cycle, from brainstorming ideas to implementing elegant solutions that deliver data insights. You will:
- Gather requirements, then model and design solutions to support the product analytics, business analytics, and data science teams
- Design, implement, and maintain efficient, scalable, and robust data pipelines using cloud-native and open-source technologies
- Develop and optimize ETL/ELT processes to ingest, transform, and deliver data from diverse sources
- Work with analysts to understand requirements and develop technical specifications for ETLs, including documentation
- Support production code to produce comprehensive and accurate datasets
- Automate deployment and monitoring of data workflows using CI/CD best practices
- Guide communications between our users and studio engineers to deliver scalable end-to-end solutions
- Promote strategies to improve our data modelling, quality, and architecture
- Participate in code reviews, mentor junior engineers, and contribute to team knowledge sharing
- Document data processes, architecture, and workflows for transparency and maintainability
- Work with big data solutions and data modelling, and understand our ETL pipelines and dashboard tools
- Explore data feature designs and identify opportunities for cost and compute optimization
Required Qualifications:
- 4+ years of relevant industry experience in a data engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Proficiency in writing SQL queries and knowledge of cloud-based databases such as Snowflake, Redshift, or BigQuery, or other big data solutions
- Experience with data modelling, ETL processes, data warehousing, and tools such as dbt
- Experience with at least one programming language such as Python or Java
- Experience with version control and code review tools such as Git
- Knowledge of modern data pipeline orchestration tools such as Airflow
- Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools (e.g., Docker, Terraform, and CloudFormation)
- Familiarity with data quality, data governance, and observability tools (e.g., Great Expectations, Monte Carlo)
- Comfortable working with a multi-functional team, both local and remote, and understanding the perspectives of each partner
- Experience with BI and data visualization tools (e.g., Looker, Tableau, Power BI)
- Experience working in an Agile development environment and familiarity with process management tools such as JIRA, Targetprocess, Trello, or similar
Pluses:
- Experience in gaming and with telemetry data or data from similar sources
- Experience with big data platforms and technologies such as EMR, Databricks, Kafka, Spark, and Iceberg
- Experience developing engineering solutions based on near-real-time/streaming datasets
- Exposure to AI/ML and MLOps concepts, and collaboration with data science or AI teams
- Experience integrating data solutions with AI/ML platforms or supporting AI-driven analytics