Description Annapurna Labs, Inc., the silicon innovation arm of Amazon Web Services, is seeking a dedicated, multitasking, and customer-obsessed individual to support our growing U.S. organization. This role requires superior attention to d...
Location:
Austin, TX | 06/02/2026 03:02:17 AM | Salary: $74,200 - $129,800 per year | Company: Amazon

Description As a member of the Cloud-Scale Machine Learning Acceleration team, you'll be responsible for the design and optimization of hardware in our data centers, including AWS Inferentia, our custom-designed machine learning inference d...
Description We're seeking a Safety Innovation Laboratory Lab Manager to join the Global Safety Engineering team... oriented background in the management of research equipment, managing laboratory safety programs, running lab facilities...
Location:
Kent, WA | 17/02/2026 22:02:56 PM | Salary: Not specified | Company: Amazon

Description Do you want to be part of the AI revolution? At AWS, our vision is to make deep learning pervasive for everyday developers and to democratize access to AI hardware and software infrastructure. In order to deliver on that vision, we...
Description The Annapurna Labs team at Amazon Web Services (AWS) builds AWS Neuron, the software development kit used to accelerate deep learning and GenAI workloads on Amazon's custom machine learning accelerators, Inferentia and Trainiu...
Description What if you could help build something that's never existed before? At Annapurna Labs, an Amazon company, we're not just participating in the AI revolution - we're accelerating it. We design custom silicon and revolutionary so...
Description The Product: AWS Machine Learning accelerators are at the forefront of AWS innovation. The Inferentia chip delivers best-in-class ML inference performance at the lowest cost in the cloud. Trainium delivers best-in-class ML tra...
Description The Product: AWS Machine Learning accelerators are at the forefront of AWS innovation and one of several AWS tools used for building Generative AI on AWS. The Inferentia chip delivers best-in-class ML inference performance at ...
Description Custom SoCs (Systems on Chip) are the brains behind AWS's Machine Learning servers. Our team builds C++ and SystemC functional models of these custom-designed accelerator SoCs for use by AWS internal teams to significantly left-...