Job Description
Roles & Responsibilities
- Develop and maintain data pipelines using Python, Java, or Scala
- Write and optimize SQL queries; work with relational and NoSQL databases (PostgreSQL, MongoDB)
- Design, build, and deploy data solutions on cloud platforms such as AWS, Azure, or GCP
- Work with big data and workflow-orchestration tools such as Spark, Hadoop, and Airflow for large-scale data processing
- Design and manage ETL pipelines, data models, and data warehouses (see the pipeline sketch after this list)
- Ensure data security, privacy, and compliance with regulations such as GDPR and CCPA
- Investigate data quality and pipeline issues and deliver efficient, scalable solutions
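
For illustration, here is a minimal sketch of the kind of ETL pipeline this role builds, assuming Apache Airflow 2.4+ with the PythonOperator; the DAG name, schedule, and extract/transform/load steps are hypothetical examples, not taken from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Stand-in for reading rows from a source system (hypothetical data).
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": -5.0}]


def transform(ti):
    # Pull the extracted rows from XCom and drop invalid records.
    rows = ti.xcom_pull(task_ids="extract")
    return [r for r in rows if r["amount"] >= 0]


def load(ti):
    # A real pipeline would write to a warehouse table; this just logs the rows.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} cleaned rows")


with DAG(
    dag_id="daily_orders_etl",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps in order: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```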
Required Skills
- Programming: Python, Java, Scala
- Databases: SQL, PostgreSQL, MongoDB
- Cloud Platforms: AWS / Azure / GCP
- Big Data: Spark, Hadoop, Airflow (see the Spark sketch after this list)
- ETL & Data Warehousing concepts
- Strong analytical and problem-solving skills
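
As a further illustration of the big data skills above, a minimal PySpark sketch of a large-scale aggregation, assuming a standard Spark session; the dataset paths, column names, and output location are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_revenue_aggregation").getOrCreate()

# Read raw order events (hypothetical Parquet dataset).
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")

# Aggregate daily revenue per customer, dropping invalid amounts.
daily_revenue = (
    orders
    .filter(F.col("amount") > 0)
    .groupBy("customer_id", F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

# Write the result back as Parquet partitioned by date (hypothetical path).
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/daily_revenue/"
)

spark.stop()
```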
Job Details
- Industry Type: IT Services & Consulting
- Department: Engineering – Software & QA
- Employment Type: Full Time, Permanent
- Role Category: Software Development
Education
- UG: Any Graduate / B.Tech / B.E. in Any Specialization
Key Skills
AWS, Python, SQL, ETL Pipelines, PostgreSQL, Data Engineering