Job Description
We’re looking for a Data Engineer to build and maintain scalable, high-performance data pipelines using Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Dataform/dbt, and Cloud Storage. You’ll design efficient ETL/ELT workflows, optimize data infrastructure, and ensure data quality, security, and compliance.
If you’re passionate about transforming complex data into reliable insights and enabling data-driven decisions across teams, this is your opportunity to make a real impact.
Job Responsibilities
Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow (Apache Beam), Dataform/dbt, and Cloud Storage.
Implement ETL/ELT processes for data ingestion, transformation, and loading.
Optimize data workflows for performance, scalability, and reliability.
Ensure data governance, security, and compliance best practices.
Collaborate with data analysts to enable seamless access to structured and unstructured data.
Troubleshoot and resolve data infrastructure issues, minimizing downtime and performance bottlenecks.
Stay up to date with industry trends and emerging technologies, continuously improving data engineering practices.
Job Requirements
Bachelor's degree in Computer Science, Data Science, Statistics, or a related field
At least 2 years of experience in a Data Engineering or similar role
Strong verbal and written communication skills in Indonesian and English
Strong experience with Google Cloud Platform (BigQuery, Cloud Storage, Cloud Functions, etc.)
Hands-on experience with Dataform (or dbt)
Proficiency in SQL and scripting languages (e.g., Python or JavaScript)
Experience with version control tools (Git) and CI/CD workflows
Solid understanding of data warehousing principles and ETL/ELT best practices
Experience with orchestration tools (e.g., Airflow, Cloud Composer) is a plus