【About Us】
HIGGS HK TECHNOLOGY LIMITED is a leading provider of artificial intelligence solutions for the financial trading industry. In rapidly changing capital markets, the greatest challenge lies in accurately forecasting market movements. Traditional models often fail to capture the complexity of financial markets, leading to missed trading opportunities and heightened risk. We are dedicated to helping the Chinese financial sector overcome these hurdles by delivering high-performance, AI-driven infrastructure. Our solutions enable even the most demanding traders to manage diverse models and datasets within regulatory boundaries, freeing them to concentrate on deeper research, make better-informed trading decisions, and ultimately achieve exceptional returns. We have branches in Kuala Lumpur and Ho Chi Minh City, forming an international development team.
【Responsibilities】
ETL Process Design and Implementation
Design, develop, and maintain ETL pipelines focusing on data integration and data governance.
Extract, transform, and load data from various sources into MySQL/PostgreSQL-based systems for reporting, analytics, and operations.
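For illustration, a minimal sketch of such a pipeline in Python, assuming a CSV feed, placeholder credentials, and a hypothetical trades table:

    import csv
    import psycopg2

    def run_etl(csv_path: str) -> None:
        # Connect to the target warehouse (credentials are placeholders).
        conn = psycopg2.connect(host="localhost", dbname="analytics",
                                user="etl_user", password="secret")
        try:
            with conn, conn.cursor() as cur, open(csv_path, newline="") as f:
                for row in csv.DictReader(f):
                    # Transform: coerce types and skip rows missing a symbol.
                    if not row["symbol"]:
                        continue
                    cur.execute(
                        "INSERT INTO trades (symbol, price, qty) VALUES (%s, %s, %s)",
                        (row["symbol"].upper(), float(row["price"]), int(row["qty"])),
                    )
        finally:
            conn.close()

    run_etl("daily_trades.csv")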
Relational Database Development
Leverage advanced features of MySQL and PostgreSQL (e.g., partitioning, replication, indexing) to ensure optimal performance and scalability.
Collaborate with data analysts and business stakeholders to define data schemas, relationships, and queries that support business needs.
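As a concrete example of the partitioning and indexing features mentioned above, a sketch of PostgreSQL declarative range partitioning issued from Python (table and column names are illustrative only):

    import psycopg2

    DDL = """
    CREATE TABLE IF NOT EXISTS trades (
        trade_id  bigint,
        symbol    text,
        traded_at timestamptz NOT NULL
    ) PARTITION BY RANGE (traded_at);

    -- One partition per quarter keeps scans and maintenance bounded.
    CREATE TABLE IF NOT EXISTS trades_2024q1
        PARTITION OF trades
        FOR VALUES FROM ('2024-01-01') TO ('2024-04-01');

    CREATE INDEX IF NOT EXISTS idx_trades_2024q1_symbol
        ON trades_2024q1 (symbol);
    """

    with psycopg2.connect(dbname="analytics", user="etl_user") as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)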
Redis & In-Memory Caching
Integrate Redis as a high-performance caching layer to reduce latency and improve data access.
Configure and maintain Redis clusters, including sharding, replication, and failover strategies.
Monitor Redis performance, troubleshoot issues, and implement optimizations for high-throughput scenarios.
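The caching pattern this typically involves is cache-aside; a minimal redis-py sketch, where load_from_db stands in for a real MySQL/PostgreSQL query:

    import json
    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    def load_from_db(symbol: str) -> dict:
        # Placeholder for a real MySQL/PostgreSQL lookup.
        return {"symbol": symbol, "price": 101.25}

    def get_quote(symbol: str) -> dict:
        key = f"quote:{symbol}"
        cached = r.get(key)
        if cached is not None:
            return json.loads(cached)        # cache hit
        quote = load_from_db(symbol)         # cache miss: go to the database
        r.setex(key, 30, json.dumps(quote))  # populate with a 30-second TTL
        return quote

    print(get_quote("AAPL"))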
Apache Flink & Real-time Data Processing
Build and maintain streaming data pipelines using Apache Flink for real-time analytics and event processing.
Implement Flink’s stateful stream processing features, windowing, and checkpointing to ensure fault tolerance and low-latency data flows.
Collaborate with other teams to integrate Flink with data sources, including relational databases, message queues, and Redis.
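A minimal PyFlink sketch of a keyed, stateful aggregation with checkpointing enabled; the in-memory source is a stand-in for a real connector such as Kafka or CDC from MySQL:

    from pyflink.datastream import StreamExecutionEnvironment

    env = StreamExecutionEnvironment.get_execution_environment()
    env.enable_checkpointing(10_000)  # checkpoint every 10 seconds

    trades = env.from_collection([
        ("AAPL", 100), ("MSFT", 50), ("AAPL", 25),
    ])

    # Running sum of quantity per symbol; Flink keeps the reduce state
    # and restores it from the latest checkpoint after a failure.
    (trades
        .key_by(lambda t: t[0])
        .reduce(lambda a, b: (a[0], a[1] + b[1]))
        .print())

    env.execute("per_symbol_volume")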
Data Quality and Profiling
Perform data profiling and analysis to identify data quality issues across diverse datasets.
Implement data cleansing strategies and validation rules within MySQL/PostgreSQL environments.
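A simple profiling sketch of the kind these checks involve, counting rows, NULL keys, and duplicate business keys in a hypothetical staging table:

    import psycopg2

    CHECKS = {
        "row_count": "SELECT count(*) FROM staging_trades",
        "null_symbols": "SELECT count(*) FROM staging_trades WHERE symbol IS NULL",
        "dup_trade_ids": """
            SELECT count(*) FROM (
                SELECT trade_id FROM staging_trades
                GROUP BY trade_id HAVING count(*) > 1
            ) d
        """,
    }

    with psycopg2.connect(dbname="analytics", user="etl_user") as conn:
        with conn.cursor() as cur:
            for name, sql in CHECKS.items():
                cur.execute(sql)
                print(name, cur.fetchone()[0])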
Performance Optimization
Monitor query performance and optimize SQL queries, indexing strategies, and resource usage in MySQL/PostgreSQL.
Ensure ETL jobs run efficiently and reliably, with minimal downtime or latency.
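For example, a slow query's plan can be inspected with EXPLAIN ANALYZE from Python (connection and table names are placeholders):

    import psycopg2

    with psycopg2.connect(dbname="analytics", user="etl_user") as conn:
        with conn.cursor() as cur:
            cur.execute("EXPLAIN ANALYZE SELECT * FROM trades WHERE symbol = %s",
                        ("AAPL",))
            for (line,) in cur.fetchall():
                # A sequential scan on a large table here usually points
                # to a missing index on the filtered column.
                print(line)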
Documentation and Collaboration
Create and maintain comprehensive documentation of ETL processes, data flows, and transformation rules.
Work closely with cross-functional teams (data engineers, analysts, developers) to ensure alignment on project requirements and timelines.
Maintenance and Troubleshooting
Proactively monitor and troubleshoot ETL jobs, database performance, and system health.
Provide timely resolutions to production incidents and data quality issues.
【Requirements】
Educational Background
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Technical Expertise
Proven experience as an ETL Developer, Database Engineer, or in a similar role within the financial industry or a related field.
Strong proficiency in MySQL and PostgreSQL with hands-on experience in database design, optimization, indexing, and replication.
Familiarity with ETL tools such as Informatica, DataStage, SSIS, or equivalent frameworks.
Redis Skills
Hands-on experience with Redis for caching or real-time data storage.
Understanding of Redis cluster setup, partitioning, and replication.
Ability to troubleshoot memory usage, latency, and concurrency issues in a Redis environment.
Apache Flink Skills
Knowledge of Apache Flink for real-time streaming and data processing.
Familiarity with Flink’s event-time processing, state management, and checkpointing.
Experience integrating Flink pipelines with databases (MySQL/PostgreSQL) or in-memory data stores (Redis) is a plus.
SQL & Data Modeling
In-depth knowledge of the SQL language and best practices for query optimization.
Solid understanding of data modeling concepts, normalization/denormalization, and data warehouse design.
Data Quality & Analysis
Ability to analyze complex data sets, perform data profiling, and implement data quality checks.
Experience handling large-scale transactional data or high-volume data pipelines is a plus.
Soft Skills & Domain Knowledge
Excellent problem-solving and communication skills.
Strong attention to detail and ability to handle multiple tasks in a fast-paced environment.
Knowledge of banking systems or financial data is preferred, though not mandatory.
【Benefits】
Option to work remotely within Malaysia, Spain, and Barbados, up to 100% if desired, plus up to 25 days of working abroad per year.
Competitive base salary plus bonus.
A flat organizational structure with a positive team spirit.
Multiple company-sponsored overseas trips each year.
Various leisure activities including sports, board games, and more.
【Location】
Kuala Lumpur, Malaysia & Ho Chi Minh City, Vietnam