ETL Developer-2 (Time Series Databases, Redis & Flink)

【About Us】
HIGGS HK TECHNOLOGY LIMITED is a leading provider of artificial intelligence solutions for the financial trading industry. In rapidly changing capital markets, the greatest challenge lies in accurately forecasting market movements. Traditional models often fail to capture the complexity of financial markets, leading to missed trading opportunities and heightened risk. We are dedicated to helping the Chinese financial sector overcome these hurdles by delivering high-performance, AI-driven infrastructure. Our solutions empower even the most demanding traders to manage diverse models and datasets effectively within regulatory boundaries, enabling them to concentrate on deeper research and make more informed trading decisions—ultimately achieving exceptional returns. We have branches in Kuala Lumpur and Ho Chi Minh City, forming an international development team.

【Responsibilities】
Real-time Data Processing

Build and optimize real-time data stream processing systems using Apache Flink for millisecond-level market data, option Greeks, volatility, and news sentiment calculations.

Design and implement complex event processing logic using Flink CEP for real-time market anomaly detection and trading signal generation.
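Flink CEP patterns are written in Java/Scala (PyFlink does not expose the CEP library), so as a language-neutral sketch of the kind of pattern involved, here is a minimal pure-Python detector that flags a symbol after three consecutive up-moves above a threshold—roughly what a CEP pattern like `begin("up").times(3).consecutive()` expresses. The names `Tick` and `SpikeDetector` are illustrative, not part of any Flink API.

```python
from dataclasses import dataclass


@dataclass
class Tick:
    symbol: str
    price: float


class SpikeDetector:
    """Flag a symbol once `window` consecutive ticks each rise by more
    than `pct` — an illustrative stand-in for a Flink CEP pattern."""

    def __init__(self, window: int = 3, pct: float = 0.01):
        self.window = window
        self.pct = pct
        self.last: dict[str, float] = {}    # last seen price per symbol
        self.streak: dict[str, int] = {}    # consecutive up-moves per symbol

    def on_tick(self, tick: Tick) -> bool:
        prev = self.last.get(tick.symbol)
        self.last[tick.symbol] = tick.price
        if prev is not None and tick.price > prev * (1 + self.pct):
            self.streak[tick.symbol] = self.streak.get(tick.symbol, 0) + 1
        else:
            self.streak[tick.symbol] = 0  # any non-qualifying move resets the pattern
        return self.streak[tick.symbol] >= self.window
```

In a real Flink job this state would live in keyed state (keyed by symbol) so the detector scales across parallel subtasks; the reset-on-break behaviour mirrors CEP's `consecutive()` contiguity.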

In-Memory Storage and Performance Optimization

Design efficient real-time data storage and query architectures using Redis to support second-level aggregation and access of massive tick data.

Optimize Redis cluster performance, including data sharding, persistence configuration, memory management, and high concurrency processing.
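To make "second-level aggregation of tick data" concrete, here is a minimal pure-Python sketch of folding raw ticks into per-second OHLCV bars. The `"SYMBOL:epoch_second"` key layout is an assumption for illustration—in a Redis deployment each bar might be mirrored into a hash under that key (e.g. via `HSET`), but this is one possible layout, not a prescription.

```python
from dataclasses import dataclass


@dataclass
class Bar:
    open: float
    high: float
    low: float
    close: float
    volume: int


def aggregate_ticks(ticks):
    """Fold (timestamp_ms, symbol, price, size) ticks into per-second
    OHLCV bars keyed "SYMBOL:epoch_second" — a key scheme one might
    mirror into Redis hashes for second-level queries."""
    bars: dict[str, Bar] = {}
    for ts_ms, symbol, price, size in ticks:
        key = f"{symbol}:{ts_ms // 1000}"   # truncate to the second bucket
        bar = bars.get(key)
        if bar is None:
            bars[key] = Bar(price, price, price, price, size)
        else:
            bar.high = max(bar.high, price)
            bar.low = min(bar.low, price)
            bar.close = price               # ticks arrive in time order here
            bar.volume += size
    return bars
```

Keeping one small hash per (symbol, second) keeps reads O(1) and lets expired seconds be evicted with TTLs; sharding by symbol spreads the load across a Redis cluster.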

Time Series Database Integration

Evaluate, implement, and optimize time series databases such as AWS Timestream, Apache Doris, and Alicloud TSDB to handle large-scale, high-throughput data ingestion and real-time analytics.

Collaborate with the data engineering team to ensure end-to-end data pipelines are robust, scalable, and capable of meeting ultra-low-latency requirements for trading operations.

Quantitative Strategy Support

Support real-time generation and dynamic adjustment of quantitative trading strategies, such as minute-level 3-Way Collar strategies.

Work closely with quantitative analysts and algorithm teams to design low-level technical frameworks supporting high-frequency trading.
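One common reading of a 3-way collar is: long the underlying, long a protective put, short an out-of-the-money call, and short a further out-of-the-money put that helps finance the hedge. As a hedged sketch of that payoff (premiums assumed to net near zero, which is the point of the structure; all strikes below are illustrative):

```python
def three_way_collar_payoff(spot_at_expiry: float,
                            entry_price: float,
                            put_strike: float,
                            call_strike: float,
                            low_put_strike: float) -> float:
    """P&L at expiry of: long 1 share bought at entry_price,
    long put(put_strike), short call(call_strike), short put(low_put_strike).
    Option premiums are ignored (assumed to roughly cancel)."""
    s = spot_at_expiry
    stock = s - entry_price
    long_put = max(put_strike - s, 0.0)        # protection below put_strike
    short_call = -max(s - call_strike, 0.0)    # upside capped at call_strike
    short_low_put = -max(low_put_strike - s, 0.0)  # protection lost below low_put_strike
    return stock + long_put + short_call + short_low_put
```

Minute-level generation of such a strategy then amounts to re-solving for the three strikes as spot and implied volatility move.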

Kafka & Messaging Pipelines

Plan, set up, and maintain Kafka-based messaging systems to handle high-volume real-time data ingestion and distribution.

Configure Kafka topics, partitions, and consumer groups to achieve high throughput and low latency based on business needs.

Monitor and troubleshoot Kafka clusters, ensuring data pipeline reliability and minimal downtime.
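Two of the routing decisions above can be sketched in a few lines. Kafka's default partitioner hashes the record key (murmur2) so that all messages with one key land on one partition and stay ordered; the code below mimics that routing with Python's builtin `hash()` (not Kafka's actual murmur2) and shows a simple round-robin spread of partitions over a consumer group. Topic names and counts are illustrative.

```python
def partition_for(symbol: str, num_partitions: int) -> int:
    """Route all ticks for a symbol to one partition so per-symbol
    ordering is preserved. Kafka's default partitioner applies murmur2
    to the key bytes; builtin hash() is a stand-in here."""
    return hash(symbol) % num_partitions


def assign_partitions(num_partitions: int, consumers: list[str]) -> dict[str, list[int]]:
    """Round-robin partition assignment across a consumer group —
    an illustration of why partition count caps group parallelism."""
    out: dict[str, list[int]] = {c: [] for c in consumers}
    for p in range(num_partitions):
        out[consumers[p % len(consumers)]].append(p)
    return out


# A producer would then send with the symbol as the record key, e.g.
#   producer.send("market.ticks", key=symbol.encode(), value=payload)
# and a group of N consumers shares the topic's partitions among themselves.
```

The practical consequence: a topic with 12 partitions supports at most 12 active consumers in one group, so partition count is sized up front for peak throughput.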

System Stability and Scalability

Build highly available distributed systems ensuring stability and low latency for real-time trading systems.

Design data flow and storage scaling solutions supporting real-time computation needs for thousands of stocks and options.

Technical Innovation and Optimization

Continuously research the latest big data processing technologies and database optimization approaches.

Create high-quality technical documentation and provide technical support and training.

【Requirements】
Core Skills

Bachelor’s degree or above in Computer Science or a related field; 2+ years of experience in distributed systems or financial big data processing.

Expert in Redis, familiar with data structures, persistence mechanisms, and cluster deployment.

Expert in Apache Flink, familiar with stream processing architecture, state management, and Complex Event Processing (CEP).

Experience or strong interest in time series databases (e.g., AWS Timestream, Alicloud TSDB, Apache Doris) is highly preferred.

Programming Skills

Proficient in C++, Java, Python, or Scala, with a solid programming foundation.

Experience in high-throughput, low-latency development preferred.

Familiar with multithreaded programming and asynchronous frameworks.

Data Skills (Bonus)

Proficient in setting up Apache Doris.

Experienced with AWS Timestream, Alicloud TSDB, Kafka, Minio, and other common components for large-scale data ingestion and real-time analytics.

Familiar with Bloomberg/Reuters/FactSet financial data product structures.

Hands-on experience with Kafka cluster installation, configuration, and management.

Proficiency in setting up Kafka topics, partitions, consumer groups, and handling offset management.

Ability to monitor and troubleshoot Kafka performance issues (e.g., latency, throughput, broker health).

Familiarity with securing Kafka (e.g., SSL, ACLs) and integrating it into real-time data pipelines.

Quantitative Trading Experience (Bonus)

Experience in the financial industry and in quantitative trading system development.

Understanding of tick data calculation and option pricing models.

Familiar with Greeks and implied volatility calculations.

Knowledge of common option trading strategy calculations.
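The Greeks and implied-volatility items above boil down to the standard Black-Scholes textbook formulas. A minimal sketch (zero dividend yield assumed, calls only; illustrative, not production pricing code):

```python
from math import erf, exp, log, pi, sqrt


def _norm_cdf(x: float) -> float:
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))


def _norm_pdf(x: float) -> float:
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)


def bs_call_greeks(s, k, t, r, sigma):
    """Black-Scholes call price, delta, gamma, vega (no dividends)."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    price = s * _norm_cdf(d1) - k * exp(-r * t) * _norm_cdf(d2)
    delta = _norm_cdf(d1)
    gamma = _norm_pdf(d1) / (s * sigma * sqrt(t))
    vega = s * _norm_pdf(d1) * sqrt(t)
    return price, delta, gamma, vega


def implied_vol_call(price, s, k, t, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Implied volatility by bisection: the call price is monotone in
    sigma, so bracket the quote and halve until it is matched."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call_greeks(s, k, t, r, mid)[0] < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

In a streaming context these closed-form evaluations are cheap enough to recompute per tick; the bisection solver is the simple, robust choice, traded against Newton's method when vega is well-behaved.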

Personal Skills

Strong problem analysis and solving abilities.

Excellent team collaboration and communication skills.

Fluency in English preferred.

【Benefits】
Optional remote work up to 100%, at your discretion; up to 25 days of working abroad annually.

Competitive base salary and bonuses.

Flat organizational structure, positive team atmosphere.

Multiple company-organized overseas trips annually.

Recreational activities including sports and board games.

【Location】
Kuala Lumpur, Malaysia & Ho Chi Minh City, Vietnam