Bachelor’s degree, preferably in Computer Science, Information Systems Management, Engineering, or another related field.
Excellent platform and data engineering skills, with 3-10 years of prior experience delivering enterprise data science or big data platforms within the public service and/or financial services industries
Good understanding of traditional and generative AI / data application development (Python), and/or networking protocols and services
Familiarity with various database technologies such as MSSQL and PostgreSQL. Working experience with both structured and unstructured datasets is a must
Experience setting up AWS or GCP services is a plus
Working experience with version control (e.g. Bitbucket, GitLab) and Continuous Integration / Continuous Deployment platforms (e.g. GitLab, Jenkins), either on premise or in the cloud. Experience with container management (e.g. Kubernetes, Docker, AWS ECS), Terraform, and Ansible is an added advantage
Good appreciation of common cybersecurity risks and controls
Experience in analysing findings from vulnerability management tools (e.g. Amazon Inspector, SonarQube, NexusIQ, Twistlock, and Tenable) and rectifying code, supply-chain, and/or security findings
Knowledge of configuring popular operating systems such as Linux and Windows, and/or middleware such as Apache, Nginx, and Gunicorn against CIS standards
Comfortable working on multifaceted problems in an agile environment. Demonstrated passion and ability to overcome business and technical constraints and deliver the platforms necessary to productise data science products in MAS
Strong presentation and communication skills. Ability to explain technical concepts to a non-technical, business, or senior management audience is a must
As part of the shortlisting process for this role, you may be required to complete a medical declaration and/or undergo further assessment.
This is a 2-year contract. All applicants will be notified of whether they are shortlisted within 4 weeks of the closing date of this job posting.