Multimodal LLMs Research Engineer
Apple
Sunnyvale, California
Posted 1 week ago
Qualifications
Education
Ph.D. with relevant research background, or Master of Science
Responsibilities
Primary Duties
- Developing, implementing, and enhancing multimodal foundation models
- Collaborating closely with cross-functional partners to define data and infrastructure requirements
- Staying at the forefront of advancements in AI, machine learning, and computer vision
Experience Requirements
Required
2 years of relevant industry experience
Benefits & Perks
Benefits Package
- Comprehensive medical and dental coverage
- Retirement benefits
- Discounted products and free services
- Reimbursement for certain educational expenses including tuition
Required Skills
Technical Skills
- Strong Python programming experience
- Strong PyTorch programming experience
- Strong JAX programming experience
Full Job Description
Multimodal LLMs Research Engineer
Sunnyvale, California, United States
Machine Learning and AI
Posted: Dec 15, 2025
Role Number: 200636347
Summary
We are actively seeking exceptional individuals who thrive in collaborative environments and are driven to push the boundaries of what is currently achievable with multimodal inputs and large language models. Our centralized applied research and engineering group is dedicated to developing cutting-edge Computer Vision and Machine Perception technologies across Apple products. We balance advanced research with product delivery, ensuring Apple quality and pioneering experiences. A successful candidate will possess deep expertise and hands-on experience across the full lifecycle of Multimodal LLM development, encompassing early ideation, data definition, model training, and fine-tuning.
Description
We are seeking a candidate with a proven track record demonstrated through academic research, industry contributions, or a combination of both in developing multimodal LLMs and advanced topics such as agentic AI, reasoning, and large-scale model evaluation. This role offers the opportunity to drive groundbreaking research projects, spanning foundational concepts to practical applications.
Responsibilities
- Model Design & Development: Developing, implementing, and enhancing multimodal foundation models. This encompasses training MM-LLMs from scratch or fine-tuning and building on existing models to optimize performance and capabilities.
- Model Evaluation: Collaborating closely with cross-functional partners to define data and infrastructure requirements crucial for robust evaluation of model designs and developments.
- Innovation & Dissemination: Staying at the forefront of advancements in AI, machine learning, and computer vision, and applying this knowledge to foster continuous innovation within the company. Depending on your research expertise and project scope, you will have the opportunity to publish papers and/or patents to positively impact the broader community.
Minimum Qualifications
- Ph.D. with relevant research background, or Master of Science and a minimum of 2 years of relevant industry experience
- Demonstrated track record through publications, patents, and/or shipping relevant features
- Strong Python programming experience
- Strong PyTorch and/or JAX programming experience
- Ability to effectively utilize AI code development tools to accelerate the development process
Preferred Qualifications
- Strong publication record in relevant venues such as CVPR, ICCV, ECCV, NeurIPS, ICML, and ICLR
- Technical leadership experience guiding technical efforts across diverse teams and individuals
- Experience shipping MM-LLMs in products
At Apple, base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within a role. The base pay range for this role is between $147,400 and $272,100, and your base pay will depend on your skills, qualifications, experience, and location. Apple employees also have the opportunity to become an Apple shareholder through participation in Apple's discretionary employee stock programs. Apple employees are eligible for discretionary restricted stock unit awards, and can purchase Apple stock at a discount if voluntarily participating in Apple's Employee Stock Purchase Plan. You'll also receive benefits including: Comprehensive medical and dental coverage, retirement benefits, a range of discounted products and free services, and for formal education related to advancing your career at Apple, reimbursement for certain educational expenses including tuition. Additionally, this role might be eligible for discretionary bonuses or commission payments as well as relocation.
Apple is an equal opportunity employer that is committed to inclusion and diversity. We seek to promote equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics.