【About Us】
HIGGS HK TECHNOLOGY LIMITED is a leading provider of artificial intelligence solutions for the financial trading industry. In rapidly changing capital markets, the greatest challenge lies in accurately forecasting market movements: traditional models often fail to capture the complexity of financial markets, leading to missed trading opportunities and heightened risk. We are dedicated to helping the Chinese financial sector overcome these hurdles by delivering high-performance, AI-driven infrastructure. Our solutions enable even the most demanding traders to manage diverse models and datasets within regulatory boundaries, so they can concentrate on deeper research, make better-informed trading decisions, and ultimately achieve exceptional returns. Our branches in Kuala Lumpur and Ho Chi Minh City form an international development team.
【Responsibilities】
Elasticsearch Ecosystem (ELK)
Drive operational readiness and resolve production issues in Elasticsearch, Logstash, and Kibana (ELK).
Conduct capacity planning and analysis for Elasticsearch clusters.
Perform regular health checks of Elasticsearch clusters and indices; collect and analyze slow logs to identify poorly performing queries.
Troubleshoot performance issues, scale indices, and optimize cluster configurations.
Work with multiple stakeholders to analyze requirements, clarify design dependencies, create test plans, and support both functional and non-functional activities.
Set up and configure the Elastic Stack and ensure secure data transfers (e.g., via Beats, Logstash pipelines).
Configure ELK stack components to collect, store, and visualize data to meet business requirements.
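For illustration, a minimal cluster health-check and slow-log sketch along these lines, assuming the Python "elasticsearch" client (8.x); the endpoint, credentials, certificate path, and index pattern below are placeholders, not our actual environment:

```python
# Minimal health-check sketch, assuming the Python "elasticsearch" client (8.x);
# the address, credentials, and paths are illustrative placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch(
    "https://localhost:9200",
    basic_auth=("elastic", "change-me"),               # placeholder credentials
    ca_certs="/etc/elasticsearch/certs/http_ca.crt",   # hypothetical CA path
)

# Cluster-level health: status (green/yellow/red) and pending tasks.
health = es.cluster.health()
print(health["status"], health["number_of_pending_tasks"])

# Per-index overview (doc counts, store size) for capacity planning.
for idx in es.cat.indices(format="json"):
    print(idx["index"], idx["docs.count"], idx["store.size"])

# Enable search slow logs on a hypothetical index pattern so poorly
# performing queries can be collected and analyzed later.
es.indices.put_settings(
    index="logs-*",
    settings={"index.search.slowlog.threshold.query.warn": "2s"},
)
```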
Redis
Design and implement Redis architectures to support high-throughput and low-latency caching needs.
Optimize Redis performance, including data sharding, persistence configurations, memory management, and high concurrency access.
Monitor and troubleshoot Redis clusters, proactively identifying and resolving performance bottlenecks.
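As a sketch of the low-latency caching work described above, a minimal cache-aside example assuming the "redis" Python client; the key scheme, TTL, and the fetch_quote_from_db helper are illustrative placeholders:

```python
# Cache-aside sketch, assuming the "redis" Python client; key names, TTLs,
# and fetch_quote_from_db() are invented for illustration only.
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_quote_from_db(symbol: str) -> dict:
    # Placeholder for the real (slow) lookup, e.g. a database or upstream API.
    return {"symbol": symbol, "price": 123.45}

def get_quote(symbol: str, ttl_seconds: int = 5) -> dict:
    key = f"quote:{symbol}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)                  # cache hit: low-latency path
    quote = fetch_quote_from_db(symbol)            # cache miss: load from source
    r.setex(key, ttl_seconds, json.dumps(quote))   # write back with a short TTL
    return quote

print(get_quote("HSI"))
```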
Kafka & Messaging Systems
Design, deploy, and maintain Apache Kafka clusters for real-time data ingestion and event streaming.
Configure Kafka topics, partitions, and consumer groups to optimize throughput and scalability.
Ensure reliability of data pipelines by monitoring and troubleshooting Kafka performance, latency, and offset management.
Collaborate with cross-functional teams to integrate Kafka with ETL workflows and other real-time data processing frameworks (e.g., Flink, Redis).
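For illustration, a minimal topic-provisioning sketch assuming the "kafka-python" package; the broker address, topic name, partition count, and configuration values are placeholders:

```python
# Topic-provisioning sketch, assuming the "kafka-python" package and a broker
# at the hypothetical address below; all names and settings are illustrative.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

topic = NewTopic(
    name="market-ticks",        # hypothetical topic
    num_partitions=12,          # more partitions -> more consumer parallelism
    replication_factor=3,       # tolerate broker failures
    topic_configs={
        "retention.ms": "86400000",   # keep one day of data
        "cleanup.policy": "delete",
    },
)

admin.create_topics([topic])
admin.close()
```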
Flink & Real-time Data Processing
Build real-time data ingestion and processing pipelines using Apache Flink.
Develop and maintain streaming ETL jobs that handle large-scale data flows in near real time.
Implement Flink’s stateful stream processing, checkpointing, and fault-tolerance features to ensure data reliability and low latency.
Work closely with data engineering and analytics teams to integrate Flink pipelines with other components, including Elasticsearch and Redis.
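As an illustrative sketch of such a pipeline, assuming the "apache-flink" (PyFlink) package; the in-memory source and simple transformation stand in for a real Kafka-backed streaming ETL job:

```python
# Streaming-job sketch, assuming the "apache-flink" (PyFlink) package; the
# source data and transformation are stand-ins for a real pipeline.
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
env.set_parallelism(2)
env.enable_checkpointing(60_000)   # checkpoint every 60s for fault tolerance

# Stand-in source; in production this would typically be a Kafka source.
ticks = env.from_collection([("HSI", 1.0), ("HSI", 2.0), ("HSCEI", 3.0)])

# Simple stateless transformation for illustration.
ticks.map(lambda t: f"{t[0]}:{t[1]:.2f}").print()

env.execute("illustrative_streaming_job")
```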
Data Governance and Security
Implement and manage security settings (e.g., encryption at rest, encryption in transit) across Elasticsearch, Redis, and Flink.
Work on vulnerability fixes and maintain strict compliance with security standards.
Coordinate with cross-functional teams to ensure data integrity, consistency, and governance best practices.
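For illustration, a minimal encryption-in-transit sketch assuming TLS-enabled Elasticsearch and Redis endpoints; hostnames, ports, credentials, and certificate paths are placeholders:

```python
# Encryption-in-transit sketch: TLS-verified client connections to hypothetical
# Elasticsearch and Redis endpoints; all connection details are placeholders.
import redis
from elasticsearch import Elasticsearch

es = Elasticsearch(
    "https://es.internal.example:9200",             # hypothetical host
    ca_certs="/etc/pki/tls/certs/internal-ca.pem",  # verify the server certificate
    basic_auth=("svc_reader", "change-me"),
)

r = redis.Redis(
    host="redis.internal.example",                  # hypothetical host
    port=6380,
    ssl=True,
    ssl_ca_certs="/etc/pki/tls/certs/internal-ca.pem",
    password="change-me",
)

print(es.ping(), r.ping())
```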
Infrastructure & Maintenance
Maintain large-scale Linux environments; monitor system health, server resource utilization, and network performance.
Collaborate with infrastructure teams to ensure smooth operation of server, storage, and network services.
Troubleshoot production issues in a timely manner and implement preventive measures.
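As a small illustration of host-level resource monitoring, a sketch assuming the "psutil" package; the alert thresholds are placeholders rather than an agreed operational policy:

```python
# Resource-check sketch, assuming the "psutil" package; thresholds are
# illustrative and would normally come from monitoring policy, not code.
import psutil

cpu = psutil.cpu_percent(interval=1)
mem = psutil.virtual_memory()
disk = psutil.disk_usage("/")

print(f"cpu={cpu:.1f}% mem={mem.percent:.1f}% disk={disk.percent:.1f}%")

# Example alerting rule (placeholder threshold).
if mem.percent > 90 or disk.percent > 90:
    print("WARNING: host approaching resource limits")
```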
Documentation & Collaboration
Maintain comprehensive documentation of configurations, processes, and best practices for Elasticsearch, Redis, and Flink environments.
Provide input on architectural decisions and share insights with cross-functional teams.
Participate in an on-call rotation to respond to production system incidents.
【Requirements】
Elasticsearch & ELK
Hands-on experience in Elasticsearch, Logstash, Kibana (ELK).
Proficient in Elasticsearch and Logstash deployment and configuration, performance tuning, REST APIs, and cluster management.
Ability to set up different types of Beats and establish secure data transfers.
Skilled in creating queries, dashboards, and visualizations in Kibana for log analytics and monitoring.
Redis
Practical experience with Redis clusters, including data partitioning, replication, and persistence.
Strong understanding of caching patterns, memory management, and high concurrency optimization.
Kafka
Proven experience in Apache Kafka administration, including cluster setup, topic configuration, and performance tuning.
Knowledge of consumer group management, offsets, partitioning, and replication strategies.
Ability to troubleshoot issues related to data ingestion, latency, and broker performance.
Familiarity with best practices for securing Kafka (ACLs, encryption, etc.) in production environments.
Apache Flink
Experience with Apache Flink for building streaming ETL pipelines and real-time data processing.
Familiarity with Flink’s CEP, windowing, stateful processing, and checkpointing mechanisms is a plus.
Data & Query Languages
Solid experience with query languages (SQL and/or specialized search languages) for handling large datasets.
Understanding of and experience with Cloudera CDP services (HDFS, HBase, Spark, Ranger) is a bonus.
Ability to write complex queries with joins and aggregations across big datasets.
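To illustrate the kind of join-and-aggregation query meant here, a self-contained sketch using Python's standard-library sqlite3 module; the tables and rows are invented purely to show the query shape:

```python
# Join-and-aggregation sketch using the standard-library sqlite3 module;
# table names and rows are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE trades(symbol TEXT, qty INTEGER, price REAL);
    CREATE TABLE symbols(symbol TEXT, sector TEXT);
    INSERT INTO trades VALUES ('0700.HK', 100, 320.0), ('0005.HK', 200, 60.5),
                              ('0700.HK', 50, 321.5);
    INSERT INTO symbols VALUES ('0700.HK', 'Tech'), ('0005.HK', 'Financials');
""")

rows = con.execute("""
    SELECT s.sector,
           COUNT(*)             AS n_trades,
           SUM(t.qty * t.price) AS notional
    FROM trades t
    JOIN symbols s ON s.symbol = t.symbol
    GROUP BY s.sector
    ORDER BY notional DESC
""").fetchall()

for sector, n_trades, notional in rows:
    print(sector, n_trades, round(notional, 2))
```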
Security & System Administration
Hands-on experience with security configurations in a distributed environment.
Strong background in managing and maintaining large-scale Linux environments.
Familiarity with ITIL processes is advantageous.
Soft Skills
Excellent problem-solving and communication skills.
Attention to detail and ability to work effectively in a fast-paced environment.
Ability to collaborate with cross-functional teams and stakeholders.
【Benefits】
Option to work remotely within Malaysia, Spain, and Barbados (up to 100% if desired), plus the option to work abroad for up to 25 days per year.
Competitive base salary and bonus.
Flat structure with a positive team spirit.
Multiple overseas company trips each year.
Leisure activities such as sports, board games, etc.
【Location】
Kuala Lumpur, Malaysia & Ho Chi Minh City, Vietnam