6-8 years of experience in custom ETL design, implementation, and maintenance using Python, Apache Beam, Apache Airflow, Apache Spark, and Scala.
Required certification: Google Cloud Certified Professional Data Engineer
5+ years of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using technologies such as Spark, Python, and Scala.
5+ years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Cloud Storage, Cloud SQL, Dataproc, Dataflow, BigQuery, Pub/Sub, Cloud Functions, Cloud Run, Spark, Hive, Airflow, etc.
Experience implementing and administering APIs.
5+ years of software development experience using Agile Scrum methodology
Experience with code repository, version control, and code integration tools such as Git/Bitbucket and Jenkins
Hands-on experience with the Unix/Linux command-line interface (CLI) and shell scripting
Strong communication skills (English/Thai)
Good planning and organizational skills
Good-to-have skills:
Understanding of machine learning model development and deployment using the Vertex AI platform
Understanding of how to define enterprise-level data modernization and transformation solutions on Google Cloud, along with adoption strategies for customers
Experience participating in architecture and solution design of cloud data strategies and platforms aligned with business objectives
Ability to evaluate Google Cloud native services and other technologies against customer needs, and to recommend the best fit aligned with the customer's future vision and strategy
Ability to work collaboratively with teams across geographies and contribute to building and evangelizing data architecture and AI/ML best practices, reusable assets, and solutions
Based on customer-specific requirements and use cases, support the data architect in designing solutions and take end-to-end ownership of iterating on them to ensure they are deployed and fit for purpose.