Description and Requirements
We are EA
EA inspires the world to play, and the Worldwide Localization Team makes that happen! We are accountable for delivering high-quality localized versions of our titles to over 30 countries around the world. You will find a vibrant, multicultural environment with a great mixture of professionals. From software engineers, project managers, and testers to audio, text, and linguistic specialists, each employee has an important role to play. EA has built a cutting-edge Localization powerhouse that covers the needs of both current and next-generation platforms.
We are hiring an Associate Big Data Engineer who will report to our Localization Data & AI Manager. Loc Data & AI's mission is to support and foster the use of data from all perspectives, along with developing policies and procedures for the monitoring and active management of data in EA Localization.
This includes an understanding of data extraction, transformation, and loading (ETL) techniques.
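To make the ETL concept concrete, here is a minimal sketch of an extract-transform-load flow in Python with Pandas (one of the frameworks listed below). The dataset, column names, and in-memory "target store" are illustrative assumptions for this posting, not EA systems:

```python
import io
import pandas as pd

# Hypothetical raw localization data; in practice this would come
# from a database, API, or file store.
RAW_CSV = """title,locale,word_count
TitleA,es-ES,1200
TitleA,de-DE,1150
TitleB,es-ES,980
"""

def extract(source: str) -> pd.DataFrame:
    """Extract: read raw records from a CSV source."""
    return pd.read_csv(io.StringIO(source))

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: drop incomplete rows, aggregate word counts per locale."""
    return df.dropna().groupby("locale", as_index=False)["word_count"].sum()

def load(df: pd.DataFrame) -> dict:
    """Load: write results to a target store (here, an in-memory dict)."""
    return dict(zip(df["locale"], df["word_count"]))

result = load(transform(extract(RAW_CSV)))
print(result)  # {'de-DE': 1150, 'es-ES': 2180}
```

In a production pipeline the same three stages would typically be scheduled and monitored by an orchestrator such as Apache Airflow, with the load step targeting a warehouse like Snowflake, BigQuery, or Redshift.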
Responsibilities
- Design, develop, optimize, and maintain scalable data pipelines to support Data Science and AI solutions.
- Work with structured and unstructured datasets, ensuring data integrity, quality, and accessibility for AI applications.
- Build and manage ETL/ELT pipelines to process, clean, and transform data for machine learning models.
- Optimize data storage and retrieval processes, leveraging cloud-based solutions (AWS, GCP, or Azure) and distributed computing frameworks (Spark).
- Collaborate with data scientists and ML engineers to ensure efficient deployment of AI models in production.
- Monitor and troubleshoot data pipeline performance, implementing improvements for scalability and efficiency.
- Maintain and improve data governance, security, and compliance standards.
Qualifications
- 2+ years of industry experience as a Data Engineer or in a related field.
- Bachelor’s degree in Engineering, Computer Science, Mathematics, or a related field.
- Proficiency in Python, SQL, and data processing frameworks (Pandas, Spark, PySpark).
- Experience with data pipeline orchestration tools (Apache Airflow).
- Strong knowledge of ETL processes, data modeling, and data warehousing systems (Snowflake, BigQuery, Redshift).
- Familiarity with cloud platforms (AWS and Azure) and data storage solutions.
- Experience with APIs and microservices for data integration.
- Strong analytical mindset and ability to work with large-scale datasets.
- Excellent communication and collaboration skills, with the ability to translate data insights into business impact.
Connect your future to ours. Inspire. Dream. Play.