Job Description

  • Process large volumes of complex financial data from banks, payment gateways, and processors
  • Build and maintain data pipelines with business rules and financial calculations
  • Design event-driven systems to meet strict timelines for transactions and settlements
  • Prepare data for reports, analytics, AI insights, fraud detection, and payment optimization

Required Skills

  • Strong Python skills with a good understanding of object-oriented programming (OOP)
  • Experience with big data tools such as Apache Spark (PySpark), Kafka, Flink, Hudi, or Iceberg
  • Knowledge of cloud platforms: AWS, GCP, or Azure
  • Understanding of data warehousing and data modeling
  • Experience with ETL and workflow orchestration tools such as Apache Airflow
  • Hands-on experience with data lakes and cloud storage (S3, GCS, Azure Blob)
  • Good knowledge of Git for version control
  • Ability to work on complex systems and solve challenging problems

Qualifications

  • Bachelor’s degree in Computer Science / IT or equivalent experience
  • 1–5 years of experience in Data Engineering, ETL, or Database Management
  • Experience working in cloud-based environments is preferred
  • Proven ability to build and optimize high-performance systems

Key Skills

Python, Data Processing, Apache Spark, Kafka, ETL, Data Modeling, Git, Cloud Platforms, Performance Tuning