TQUSI0005_4567 - Data Architect (GCP)

Job Description:


  • Gaining a deep understanding of business domain data and how it is used in metrics, analytics, and AI/ML data designs for strategic projects and data products
  • Ensuring the integrity of data designs and governance for data products
  • Initiating data design: conceptual, logical, and physical data models, with proficiency in dimensional data design
  • Proficiency in all modeling techniques: data flow diagrams, ER diagrams, and conceptual, logical, and physical models
  • Metadata and data taxonomy management
  • Designing and implementing data security classification and protection
  • Building strong working relationships and developing deep partnerships with the business, the Data Engineering team, and other stakeholders
  • Working collaboratively with clients, developers, and QA analysts to design and implement solutions in production
  • Mapping data from disparate sources (Teradata, Oracle, SQL Server, and semi-structured data sources) and translating it into source-to-target mappings with complex business rules
  • Mandatory skills: data profiling and analysis, and pivoting results into Excel
  • Updating Jira tickets on a regular basis

 

What we’re looking for...

  • SME-level knowledge of, and experience with, data architecture and design on Google Cloud BigQuery is mandatory
  • Google Cloud Platform (GCP) certification (or comparable experience)
  • Hands-on experience with data design and migration to Google Cloud Platform
  • Subject matter expertise with Google Cloud services for streaming and batch processing: Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery, and Bigtable
  • Subject matter expertise with Dataproc and Dataflow using Java on Google Cloud Platform
  • Subject matter expertise with serverless data warehousing concepts on Google Cloud Platform
  • Experience working with both structured and unstructured data sources
  • Familiarity with the technology stack available in the industry for data cataloging, ingestion, capture, processing, and curation: Kafka, StreamSets, Collibra, MapReduce, Hadoop, Spark, Flume, Hive, Impala, Spark SQL
  • Five or more years of relevant experience creating data architecture diagrams/flows, presentation slides, and other architecture artifacts from requirements and business-user input, using Erwin Data Modeler or a similar toolset
  • Working knowledge of Teradata, AWS, and GCP data environments

