TQUSI0525_5330 - Data AI with GCP

Job Type: C2H

Work Mode: Hybrid (3 days from office)


Role - Data AI with GCP

Location - Chennai/Hyderabad/Bangalore

Experience - 8-12 Years


Key Responsibilities

  • Design and implement end-to-end data pipelines on GCP
  • Build and manage data lakes and data warehouses using BigQuery and Cloud Storage
  • Develop and deploy AI/ML models and analytics solutions
  • Implement data ingestion, processing, and transformation using Dataflow, Dataproc, and Pub/Sub
  • Enable real-time and batch data processing
  • Apply MLOps & DataOps best practices for model and pipeline automation
  • Ensure data quality, security, governance, and performance optimization
  • Collaborate with business, analytics, and cloud teams to deliver insights-driven solutions


Required Skills

  • 8+ years of overall experience in Data Engineering / AI / Analytics
  • Strong hands-on experience with GCP data & AI services:
      • BigQuery, Cloud Storage
      • Dataflow, Dataproc, Pub/Sub
      • Vertex AI, BigQuery ML (good exposure)
  • Strong programming skills in Python & SQL
  • Experience with ETL/ELT frameworks and large-scale data processing
  • Understanding of cloud-native and distributed data architectures


Good to Have

  • Experience with AI/ML use cases (predictive analytics, NLP, recommendation systems)
  • Knowledge of GenAI / LLM integration on GCP (Vertex AI, Gemini, RAG)
  • Hands-on experience with Docker and Kubernetes (GKE)
  • GCP Certifications (Data Engineer / ML Engineer / Cloud Architect)
  • Domain experience in BFSI, Healthcare, Retail, or Telecom

