Job Description
What You’ll Do
- Design and build data pipelines using Google BigQuery
- Collaborate with cross-functional teams to understand data needs and project goals
- Develop and maintain large-scale data systems
- Ensure data accuracy and quality through validation and testing
- Optimize data workflows for better performance
- Identify and fix data-related technical issues
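To give a flavor of the validation and quality work listed above, here is a minimal sketch of a row-level data-quality check of the kind a pipeline might run before loading data into BigQuery. The field names and rules are illustrative assumptions, not requirements of the role:

```python
# Illustrative data-quality gate for a batch of records bound for a
# warehouse table. Field names ("id", "event_ts", "amount") and rules
# are hypothetical examples, not taken from this job description.

def validate_row(row, required_fields=("id", "event_ts")):
    """Return a list of validation errors for one record (empty means valid)."""
    errors = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            errors.append(f"missing required field: {field}")
    if "amount" in row and not isinstance(row["amount"], (int, float)):
        errors.append("amount must be numeric")
    return errors

def validate_batch(rows):
    """Split a batch into (valid_rows, rejected_rows_with_errors)."""
    valid, rejected = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            rejected.append((row, errors))
        else:
            valid.append(row)
    return valid, rejected
```

In practice, checks like these would run as a pipeline stage (for example, before a load job or as assertions over a staging table), with rejected rows routed to a dead-letter table for review.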
What You Need
- Strong understanding of data engineering concepts
- Hands-on experience with BigQuery and cloud data platforms
- Strong knowledge of SQL and data pipeline design
- Experience with ETL, data processing, and data warehousing
- Strong problem-solving and analytical skills
- Ability to work well in a team environment
- Clear written and verbal communication skills
- Familiarity with Agile methodology and version control tools
Good to Have
- Experience with other GCP services, Hadoop, Spark, AWS, or Azure
- Knowledge of Python and data analysis tools
- Exposure to Big Data technologies