San Diego, California, United States · $139,500-258,100/yr
Data Engineer - Software Delivery & Technologies
Do you want to help define the future of delivering Apple software to customers? Join the Software Delivery – Insights & Release Technologies team and work on new technologies used to deliver Apple platforms to millions of customers. Our team thrives on innovation and engineering excellence, and we seek individuals with a genuine excitement to collaborate, solve sophisticated problems, and deliver outstanding user experiences.

You will be a pivotal member of a team building the next generation of software release workflows, enabling the software development lifecycle for an ever-growing number of platforms and teams contributing to Apple's software products. Our applications integrate seamlessly with developer workflows, from source code integration through the release of Apple platforms and assets to customers. Our engineers are dedicated to reusable design, robust architecture, and delivering elegant, extensible, and high-quality engineering solutions.

In this role, you will collaborate closely with software developers, data engineers, and project managers. You will translate complex requirements into scalable, reliable, and secure data pipelines, data processing workflows, and machine learning pipelines that produce actionable insights and integrate with AIML initiatives. Success in this role involves clear technical communication and collaboration across teams: explaining complex systems, contributing to technical decision-making, and helping align teams on shared engineering initiatives. We're seeking individuals who share these values and are passionate about improving and extending our services and products.
Required Qualifications
- Bachelor's degree or equivalent practical experience in Computer Science, Software Engineering, Data Engineering, or a related field
- 2+ years of hands-on data engineering experience in production systems
- Designing, building, and maintaining large-scale ETL/ELT data pipelines
- End-to-end pipeline ownership (ingestion, transformation, storage, validation)
- Query performance optimization and tuning at scale
- Strong proficiency in Python and SQL, used in production environments
- Hands-on experience with workflow orchestration tools (e.g., Airflow, Spark workflows, or similar)
- Experience working with modern data platforms (big data platforms, data lakes, and/or data warehouses)
- Experience building and operating data systems in cloud environments (AWS/GCP)
- Experience with CI/CD tooling and production deployment practices (e.g., Jenkins or similar tools)
- Experience with one or more compiled programming languages and solid software engineering fundamentals, including data structures and algorithms, object-oriented design, and concurrency
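To give a concrete sense of the "end-to-end pipeline ownership" listed above (ingestion, transformation, storage, validation), here is a minimal, hypothetical Python sketch of those stages. It is purely illustrative of the kind of work described, not part of the role; all names (`Record`, `run_pipeline`, the field names) are invented for the example.

```python
# Hypothetical sketch of an ETL stage: ingest raw rows, transform them,
# and validate the output before it would be written to storage.
from dataclasses import dataclass


@dataclass
class Record:
    user_id: int
    event: str
    duration_ms: int


def ingest(raw_rows):
    """Parse raw dicts into typed records, skipping malformed rows."""
    records = []
    for row in raw_rows:
        try:
            records.append(
                Record(int(row["user_id"]), row["event"], int(row["duration_ms"]))
            )
        except (KeyError, ValueError):
            continue  # in production, bad rows would be logged or quarantined
    return records


def transform(records):
    """Example transformation: normalize durations to seconds."""
    return [
        {"user_id": r.user_id, "event": r.event, "duration_s": r.duration_ms / 1000}
        for r in records
    ]


def validate(rows):
    """Simple validation gate: keep only rows with non-negative durations."""
    return [row for row in rows if row["duration_s"] >= 0]


def run_pipeline(raw_rows):
    """Ingest -> transform -> validate, end to end."""
    return validate(transform(ingest(raw_rows)))
```

In a production setting, each of these stages would typically become a task in an orchestrator such as Airflow, with retries, logging, and dead-letter handling around the parts this sketch elides.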
Preferred Qualifications
- Hands-on experience supporting machine learning data workflows, including data preparation, feature engineering, and ensuring data quality, freshness, and schema consistency for ML systems
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes/EKS)
- Knowledge of data governance principles, data security best practices, and data privacy regulations
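As an illustration of the "data quality, freshness and schema" checks mentioned in the preferred qualifications, the sketch below shows two tiny, hypothetical validation helpers for an ML feature table. The schema, column names, and 24-hour freshness window are all assumptions made for the example.

```python
# Hypothetical data-quality checks for an ML feature table:
# a schema check and a freshness check, using only the standard library.
from datetime import datetime, timedelta, timezone

# Assumed feature-table schema for this example.
EXPECTED_SCHEMA = {"user_id": int, "session_count": int, "avg_duration_s": float}


def check_schema(rows):
    """Every row must carry exactly the expected columns with the right types."""
    return all(
        set(row) == set(EXPECTED_SCHEMA)
        and all(isinstance(row[col], typ) for col, typ in EXPECTED_SCHEMA.items())
        for row in rows
    )


def check_freshness(last_updated, max_age=timedelta(hours=24)):
    """Features must have been refreshed within the allowed window."""
    return datetime.now(timezone.utc) - last_updated <= max_age
```

Checks like these usually run as a gate between the feature pipeline and model training, so that stale or malformed features fail fast instead of silently degrading a model.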