Company Summary
My client is an Internet of Things (IoT) platform that offers a smart management application for small and large enterprises. Their users, including businesses and organizations, have access to a diverse selection of products, such as energy system performance monitoring, analytics and forecasting, and equipment fault diagnostics. Due to business expansion, my client is currently looking for a Data Engineer to join the HK team.
Responsibilities:
- Plan, build, and operate data pipelines that handle streaming data from IoT devices.
- Maintain and administer existing databases, both relational and non-relational.
- Collaborate with Project Management, the Design Group, QA, and other development teams.
- Develop APIs for data administration and for serving data to other parties.
- Work closely with the development team to deliver new features and upgrades for the company’s digital products.
- Streamline the data architecture on the cloud to deal with massive amounts of incoming data in a scalable manner.
Requirements:
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
- 2+ years of experience in Python and systems integration.
- Solid experience with ETL pipelines and RESTful APIs.
- Knowledge of container technologies, preferably Docker and Kubernetes.
- Strong experience in cloud development, such as on Azure or Amazon Web Services.
- Knowledge of big data frameworks, such as Presto, Kafka, Flink, or Spark.
- Self-motivated, with strong problem-solving and analytical skills.
- Good verbal and written communication skills in Cantonese, Mandarin, and English.