Description & Requirements
The Data & Insights (D&I) team is involved in every aspect of data at EA, harnessing its power to deliver valuable insights and solutions for game teams and players. We drive strategy, build data and AI tools, and support personalization, experimentation, and analytics. When you join D&I, you’re joining a team of passionate experts working to enable more profound understanding, better ways of working, and more meaningful experiences for our players.
EA SPORTS is one of the leading sports entertainment brands in the world, with top-selling video game franchises, award-winning interactive technology, fan programs, and cross-platform digital experiences. We are looking for an experienced Data Engineer with broad technical skills and ability to work with large amounts of data and metrics for Mobile Sports titles. You will collaborate with the Game, Analytics and Product teams to implement data strategies and develop complex ETL pipelines that support dashboards for promoting deeper understanding of Sports Mobile games. You will be part of the data modernization efforts to implement medallion architecture using the latest data engineering tech stack.
You will have experience developing and establishing scalable, efficient, automated processes for large-scale data analyses. You will also stay informed of the latest trends and research across all aspects of data engineering and analytics. You will work with leaders from an internal Game Studio, providing them with data for understanding game and player insights, and you will report to the Technical Lead for this group.
You will have an impact on shaping the gaming experience of players.
Key Responsibilities:
As a Data Engineer you will be involved in the entire development life cycle, from brainstorming ideas to implementing elegant solutions that deliver data insights.
- Gather requirements, then model and design solutions that support the product analytics, business analytics, and data science teams
- Design, implement, and maintain efficient, scalable, and robust data pipelines using cloud-native and open-source technologies
- Develop and optimize ETL/ELT processes to ingest, transform, and deliver data from diverse sources
- Work with analysts to understand requirements and develop technical specifications for ETLs, including documentation
- Support production code to produce comprehensive and accurate datasets
- Automate deployment and monitoring of data workflows using CI/CD best practices
- Guide communications between our users and studio engineers to provide scalable end-to-end solutions
- Promote strategies to improve our data modelling, quality, and architecture
- Participate in code reviews, mentor junior engineers, and contribute to team knowledge sharing
- Document data processes, architecture, and workflows for transparency and maintainability
- Work with big data solutions, data modelling, ETL pipelines, and dashboard tools
- Explore data feature designs and suggest new opportunities for cost and compute optimization
Required Qualifications:
- 4+ years of relevant industry experience in a data engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Proficiency in writing SQL queries and knowledge of cloud-based databases such as Snowflake, Redshift, BigQuery, or other big data solutions
- Experience in data modelling and tools such as dbt, ETL processes, and data warehousing
- Experience with at least one programming language such as Python or Java
- Experience with version control and code review tools such as Git
- Knowledge of modern data pipeline orchestration tools such as Airflow
- Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools (e.g., Docker, Terraform, CloudFormation)
- Familiarity with data quality, data governance, and observability tools (e.g., Great Expectations, Monte Carlo)
- Comfort working with a multi-functional team, both local and remote, and understanding the perspectives of each partner
- Experience with BI and data visualization tools (e.g., Looker, Tableau, Power BI)
- Excellent communication and collaboration skills
- Experience working in a fast-paced environment
- Experience working in an Agile development environment and familiarity with process management tools such as JIRA, Targetprocess, Trello, or similar
Nice to Have:
- Experience in gaming and working with its telemetry data or data from similar sources
- Experience with big data platforms and technologies such as EMR, Databricks, Kafka, Spark, and Iceberg
- Experience developing engineering solutions based on near real-time/streaming datasets
- Exposure to AI/ML and MLOps concepts and collaboration with data science or AI teams
- Experience integrating data solutions with AI/ML platforms or supporting AI-driven analytics