Our mission at Argmax is to empower developers and enterprises everywhere who are eager to migrate inference from cloud servers to user devices, or to turbocharge their existing on-device inference. We believe that achieving this will pull forward a future where AI is private and convenient for users, and profitable and maintainable for the companies building it.
To deliver on this mission, we are currently building our:
- Cross-platform Inference Engine using technologies such as Metal, CUDA, Triton, OpenCL
- Swift packages for end-to-end inference pipelines
- Python toolkit packaging our model compression and inference efficiency R&D
- Developer community
Open Positions
How to Apply
Why join us?
So you can:
- work closely with and learn from industry-leading colleagues, partners, and collaborators
- open-source code and publish technical blog posts
- earn autonomy aligned with business objectives
- openly argue, based on data, for an increased budget for your R&D project