【About Us】
HIGGS HK TECHNOLOGY LIMITED is a leading provider of artificial intelligence solutions for the financial trading industry. In rapidly changing capital markets, the greatest challenge lies in accurately forecasting market movements: traditional models often fail to capture the complexity of financial markets, leading to missed trading opportunities and heightened risk. We are dedicated to helping the Chinese financial sector overcome these hurdles by delivering high-performance, AI-driven infrastructure. Our solutions enable even the most demanding traders to manage diverse models and datasets effectively within regulatory boundaries, so they can concentrate on deeper research, make better-informed trading decisions, and ultimately achieve exceptional returns. Our branches in Kuala Lumpur and Ho Chi Minh City together form an international development team.
【Responsibilities】
- Design and develop Large Language Models (LLMs), including model pre-training, efficient fine-tuning, and performance optimization
- Develop and optimize model training frameworks, implementing key technologies such as distributed training and Parameter-Efficient Fine-Tuning (PEFT)
- Build LLM evaluation systems and design domain-specific benchmarks
- Optimize model inference performance, implement model quantization, pruning, and deployment optimization
【Requirements】
- Master’s degree or above in Computer Science or related fields
- 2+ years of deep learning project development experience, including large-scale model training practice
- Solid foundation in machine learning algorithms
- Excellent experimental design and results analysis capabilities
- Strong coding standards and documentation skills
【Large Model Development】
- Expert in LLM fine-tuning techniques (such as LoRA, QLoRA, adapters, and other PEFT methods; see the sketch after this list)
- Deep understanding of the Transformer architecture and of the principles and implementation of mainstream pre-trained models (such as LLaMA and Mistral)
- Familiar with low-level optimization techniques such as FlashAttention and gradient compression
- Experience in model quantization and compression (such as INT4/INT8 quantization, model pruning, knowledge distillation)
- Experience in inference performance optimization, with an understanding of vLLM, TensorRT-LLM, and other inference acceleration frameworks
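As a reference point for the PEFT expectations above, the following is a minimal sketch of attaching LoRA adapters to a causal language model with the Hugging Face peft library; the checkpoint name, target modules, and hyperparameters are illustrative placeholders, not project settings.

```python
# Minimal LoRA fine-tuning setup sketch (hypothetical checkpoint and hyperparameters).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")  # placeholder checkpoint

lora_cfg = LoraConfig(
    r=8,                                   # low-rank dimension
    lora_alpha=16,                         # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)     # freezes base weights, injects LoRA adapters
model.print_trainable_parameters()         # typically well under 1% of parameters remain trainable
```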
【Distributed Training】
- Expert in PyTorch, with a deep understanding of distributed training mechanisms such as DistributedDataParallel (DDP) and FSDP (see the sketch after this list)
- Familiar with large-scale training frameworks like DeepSpeed and Megatron-LM
- Mastery of 3D parallelism (data, tensor, and pipeline parallelism) training techniques
- Experience in multi-GPU/multi-machine training system design and performance tuning
- Familiar with memory optimization methods like gradient checkpointing and mixed precision training
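For the distributed-training items above, here is a minimal single-node sketch of wrapping a toy model in PyTorch FSDP with a bf16 mixed-precision policy; the model, tensor sizes, and optimizer settings are illustrative only, and it assumes a torchrun launch with one process per GPU.

```python
# Minimal single-node FSDP sketch with bf16 mixed precision (launch with torchrun).
import os

import torch
import torch.distributed as dist
from torch import nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP, MixedPrecision

dist.init_process_group("nccl")                 # torchrun starts one process per GPU
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

model = nn.Sequential(nn.Linear(4096, 4096), nn.GELU(), nn.Linear(4096, 4096)).cuda()

bf16 = MixedPrecision(param_dtype=torch.bfloat16,
                      reduce_dtype=torch.bfloat16,
                      buffer_dtype=torch.bfloat16)

# FSDP shards parameters, gradients, and optimizer state across ranks; activation
# (gradient) checkpointing can be layered on top to trade compute for memory.
fsdp_model = FSDP(model, mixed_precision=bf16)

optimizer = torch.optim.AdamW(fsdp_model.parameters(), lr=1e-4)
x = torch.randn(8, 4096, device="cuda")
loss = fsdp_model(x).pow(2).mean()              # dummy loss for illustration
loss.backward()
optimizer.step()
dist.destroy_process_group()
```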
【System Optimization】
- Expert in Linux systems and CUDA programming
- Deep understanding of GPU architecture and memory management
- Capable of analyzing and optimizing training and inference performance (see the profiling sketch after this list)
- Familiar with distributed storage systems (such as S3, HDFS)
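As an example of the kind of performance analysis expected, the sketch below profiles a toy forward/backward pass with torch.profiler; the model and tensor shapes are placeholders.

```python
# Sketch of profiling a forward/backward pass with torch.profiler (hypothetical toy model).
import torch
from torch import nn
from torch.profiler import profile, ProfilerActivity

model = nn.Sequential(nn.Linear(2048, 2048), nn.ReLU(), nn.Linear(2048, 2048)).cuda()
x = torch.randn(64, 2048, device="cuda")

with profile(activities=[ProfilerActivity.CPU, ProfilerActivity.CUDA],
             profile_memory=True) as prof:
    model(x).sum().backward()

# Rank kernels by GPU time and inspect memory usage to locate optimization targets.
print(prof.key_averages().table(sort_by="cuda_time_total", row_limit=10))
```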
【Model Evaluation】
- Expert in model performance and effectiveness evaluation methods
- Familiar with A/B testing and statistical analysis techniques (see the sketch after this list)
- Experience in model interpretability analysis
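To illustrate the statistical side of A/B testing, the sketch below runs Welch's two-sample t-test on synthetic per-query metrics for two model variants; the numbers are made up for illustration and would normally come from an evaluation pipeline.

```python
# Sketch of a significance test for an A/B comparison of two model variants (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
metric_a = rng.normal(loc=0.71, scale=0.05, size=200)  # e.g., per-query accuracy, variant A
metric_b = rng.normal(loc=0.73, scale=0.05, size=200)  # variant B

t_stat, p_value = stats.ttest_ind(metric_a, metric_b, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # a small p-value suggests a real difference
```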
【Core Technologies】
- Expert in Python data processing (NumPy, pandas, scikit-learn, etc.)
- Proficient in using PySpark for large-scale data processing
- Capable of designing and implementing custom loss functions (see the sketch after this list)
- Familiar with data visualization and experimental analysis tools
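As an example of a custom loss function, the sketch below implements label-smoothed cross-entropy by hand in PyTorch; it is purely illustrative (PyTorch's built-in CrossEntropyLoss already supports label smoothing).

```python
# Sketch of a custom loss: cross-entropy with uniform label smoothing.
import torch
import torch.nn.functional as F
from torch import nn

class SmoothedCrossEntropy(nn.Module):
    def __init__(self, smoothing: float = 0.1):
        super().__init__()
        self.smoothing = smoothing

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        log_probs = F.log_softmax(logits, dim=-1)
        # Negative log-likelihood of the true class.
        nll = -log_probs.gather(dim=-1, index=target.unsqueeze(-1)).squeeze(-1)
        # Cross-entropy against the uniform distribution over classes.
        uniform = -log_probs.mean(dim=-1)
        return ((1 - self.smoothing) * nll + self.smoothing * uniform).mean()

loss_fn = SmoothedCrossEntropy(0.1)
loss = loss_fn(torch.randn(4, 10), torch.randint(0, 10, (4,)))  # toy logits and labels
```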
【Bonus Qualifications】
- Published papers or contributions to open-source projects related to large models
- Familiar with the low-level implementation of core architectures such as the Transformer
- Experience in financial institutions or quantitative investment
- Understanding of financial market mechanisms and trading strategies
【Benefits】
- Two days of remote work per week and up to 25 days of overseas remote work per year
- Competitive base salary and bonuses
- Flat organizational structure, positive team atmosphere
- Multiple company overseas trips annually
- Recreational activities including sports and board games
【Location】
Kuala Lumpur, Malaysia & Ho Chi Minh City, Vietnam