- ...of Experts (MoE) architectures; Constitutional AI and alignment techniques; efficient inference optimization (quantization, distillation); real-time streaming ML systems. We're particularly interested in candidates who can demonstrate both theoretical depth and...
- ...Evaluate, fine-tune, and integrate state-of-the-art models (open-source and proprietary). Lead advanced experimentation: model distillation, hybrid search strategies, prompt optimization, hallucination detection, etc. Build frameworks for automatic evaluation,...
- ...TeamCity, and Azure, and strong policy management skills, particularly in DevSecOps. Exceptional Communication Skills: Ability to distil complex technical issues into simple terms. Resource Management: Experience in resource allocation and adherence to...
- ...pipelines (training from scratch and transfer learning). Deep understanding of model optimization: quantization, pruning, distillation, FLOPs analysis, CUDA profiling, mixed precision, and inference performance trade-offs. Proven ability to design and train models...
- ...visualization tools such as Power Query, Power BI, or equivalent analytics platforms. Strong analytical rigor with the ability to distill complex data sets into strategic insights that influence executive decision-making and operational performance. Executive-level...
- ...test sets, adversarial probing, and model behavior tracking over time. Apply and advance model compression and efficiency: distillation from LLMs, quantization-aware considerations, pruning/sparsity (including MoE routing behavior), and inference-time optimization...
- ...SageMaker, MLflow, Vertex AI, W&B, Docker, and Kubernetes. Apply advanced model optimization techniques including quantization, distillation, batching, and GPU/TPU acceleration. Conduct experiments, research emerging AI techniques (LLMs, RAG, multimodal AI, vector...
- ...executive stakeholders. Proven track record of managing client workshops, design walkthroughs, and feedback loops. Ability to distill complex design trade-offs into concise, persuasive recommendations. Proficiency in planning and conducting user research,...
- ...open-source and commercial) for production use. Implement inference optimization techniques (quantization, batching, caching, distillation). Build and maintain RAG pipelines (embeddings, vector databases, retrieval strategies). Evaluate and improve model quality (latency...
- ...Formal Verification of Software, and contribute to data pipelines. You'll also support data curation, dataset creation and distillation, and participate in training and improving large language models, including continuous pre-training, supervised fine-...
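Knowledge distillation recurs across nearly every listing above. As a rough illustration only (not taken from any posting, all names hypothetical), the classic Hinton-style distillation objective combines a temperature-softened term against the teacher's output distribution with the usual cross-entropy on hard labels; a minimal NumPy sketch:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style distillation loss (illustrative sketch):
    alpha * soft cross-entropy against softened teacher targets
    (equal to KL(teacher || student) up to a constant) plus
    (1 - alpha) * hard-label cross-entropy. T is the temperature."""
    p_teacher = softmax(teacher_logits, T)                   # softened teacher targets
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    # T**2 rescales the soft term so its gradient magnitude is comparable
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * (T ** 2)
    log_p = np.log(softmax(student_logits) + 1e-12)          # T = 1 for hard labels
    hard = -log_p[np.arange(len(labels)), labels].mean()
    return alpha * soft + (1 - alpha) * hard
```

In practice the same structure appears inside a training loop with framework tensors (e.g. PyTorch's `kl_div` plus `cross_entropy`); the temperature and `alpha` weighting are the tunable knobs the postings allude to when they mention "distillation from LLMs".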