  •  ...of Experts (MoE) architectures; Constitutional AI and alignment techniques; efficient inference optimization (quantization, distillation); real-time streaming ML systems. We're particularly interested in candidates who can demonstrate both theoretical depth and... 

    Pixalate, Inc.

    Lahore
    4 days ago
  •  ...Evaluate, fine-tune, and integrate state-of-the-art models (open-source and proprietary). Lead advanced experimentation: model distillation, hybrid search strategies, prompt optimization, hallucination detection, etc. Build frameworks for automatic evaluation,... 

    INTECH Process Automation

    Lahore
    13 days ago
  •  ...TeamCity, and Azure, and strong policy management skills, particularly in DevSecOps. Exceptional Communication Skills: Ability to distil complex technical issues into simple terms. Resource Management: Experience in resource allocation and adherence to... 

    Aqovia

    Lahore
    more than 2 months ago
  •  ...pipelines (training from scratch and transfer learning). Deep understanding of model optimization: quantization, pruning, distillation, FLOPs analysis, CUDA profiling, mixed precision, and inference performance trade-offs. Proven ability to design and train models... 

    Devsinc

    Lahore
    more than 2 months ago
  •  ...visualization tools such as Power Query, Power BI, or equivalent analytics platforms. Strong analytical rigor with the ability to distill complex data sets into strategic insights that influence executive decision-making and operational performance. Executive-level... 

    Nysonian

    Islamabad
    a month ago
  •  ...test sets, adversarial probing, and model behavior tracking over time. Apply and advance model compression and efficiency: distillation from LLMs, quantization-aware considerations, pruning/sparsity (including MoE routing behavior), and inference-time optimization... 

    Skylabs AI

    Islamabad
    a month ago
  •  ...SageMaker, MLflow, Vertex AI, W&B, Docker, and Kubernetes. Apply advanced model optimization techniques including quantization, distillation, batching, and GPU/TPU acceleration. Conduct experiments, research emerging AI techniques (LLMs, RAG, multimodal AI, vector... 

    Devsinc

    Lahore
    more than 2 months ago
  •  ...executive stakeholders. Proven track record of managing client workshops, design walkthroughs, and feedback loops. Ability to distill complex design trade-offs into concise, persuasive recommendations. Proficiency in planning and conducting user research,... 

    VentureDive

    Pakistan
    28 days ago
  •  ...open-source and commercial) for production use. Implement inference optimization techniques (quantization, batching, caching, distillation). Build and maintain RAG pipelines (embeddings, vector databases, retrieval strategies). Evaluate and improve model quality (latency... 

    HR POD - Hiring Talent Globally

    Lahore
    27 days ago
  •  ...Formal Verification of Software, and contribute to data pipelines. You'll also support data curation, dataset creation and distillation, and participate in training and improving large language models, including continuous pre-training, supervised fine-... 

    Skylabs AI

    Islamabad
    a month ago