Job Location: Boston, MA, USA
Overview
Machine Learning Engineer, vLLM Inference role at Red Hat. Red Hat's Inference team accelerates AI for the enterprise and provides a stable platform for open-source LLM deployments. This role focuses on vLLM, improving model performance and inference efficiency within our open-source software stack.
What You Will Do
What You Will Bring
Pay Transparency
The salary range for this position is $133,650.00 - $220,680.00. The actual offer will be based on your qualifications. This position may also be eligible for bonus, commission, and/or equity. For Remote-US locations, the actual salary range may differ by location but will be commensurate with job duties and relevant experience.
About Red Hat
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact. We are committed to an open, inclusive environment where ideas come from people with diverse backgrounds and perspectives.
Benefits
Equal Opportunity Policy (EEO)
Red Hat is proud to be an equal opportunity workplace and an affirmative action employer. We review applications without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, disability, medical condition, marital status, or any other basis prohibited by law. Red Hat provides reasonable accommodations to applicants and will respond to inquiries regarding application status via the designated channels.