Information Technology

Azure Big Data Architect

Contract
Remote

About the position

Job Description

The Azure Big Data Architect will be responsible for Enterprise data warehouse design, development, implementation, migration, maintenance, and operation activities. The position is a key resource for various Enterprise data warehouse projects, building critical data marts, and data ingestion into Big Data platforms.

Responsibilities
  • Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.
  • Perform data analysis, data profiling, data quality checks, and data ingestion across various layers using big data queries, PySpark programs, and UNIX shell scripts.
  • Follow organizational coding-standard documents; create mappings, sessions, and workflows as outlined in mapping specification documents.
  • Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.
  • Create mock data, perform unit testing, and capture result sets for jobs developed in lower environments.
  • Update the production support runbook.
  • Create and update design documents; provide detailed descriptions of workflows after every production release.
  • Continuously monitor production data loads, fix issues, log them in the tracker document, and identify performance issues.
  • Performance-tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches (see the sketch after this list).
  • Perform quality assurance checks and post-load reconciliation, and communicate with vendors to receive corrected data.
  • Participate in ETL/ELT code reviews and design re-usable frameworks.
  • Create a re-usable audit-control framework that captures reconciliation results, mapping parameters, and variables, and serves as a single point of reference for workflows.
  • Participate in meetings to build functional and technical expertise.
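
To illustrate the partition-based tuning approach mentioned in the list above, here is a minimal PySpark sketch. It is illustrative only: the source path, the target table (edw.claims_partitioned), and the load_date column are assumed placeholders, not details from this posting.

    # Minimal PySpark sketch of partition-based tuning for a long-running
    # ELT load. All names (source path, load_date column, target table)
    # are hypothetical placeholders, not from the posting.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("partitioned-elt-load")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read the raw feed, then write it partitioned by load_date so that
    # downstream jobs can prune partitions instead of scanning everything.
    df = spark.read.parquet("/data/raw/claims_raw")

    (
        df.repartition("load_date")          # group rows by partition value
          .write
          .mode("overwrite")                 # full load, per the bullet above
          .partitionBy("load_date")
          .saveAsTable("edw.claims_partitioned")
    )

Partitioning by a load-date column lets downstream queries prune to just the partitions they touch, which is typically the first lever pulled when tuning long-running loads.
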
Technical Skills and Experience
  • Expertise in implementing complex ETL/ELT logic.
  • Ability to develop and enforce a strong reconciliation process.
  • Accountable for ETL/ELT design documentation.
  • Good knowledge of Big Data, Hadoop, Hive, data security and dimensional model design.
  • Basic knowledge of UNIX/LINUX shell scripting.
  • Utilize ETL/ELT standards and practices to establish and maintain a centralized metadata repository.
  • Effective communication, presentation, & organizational skills.
  • 6+ years of experience with Big Data and Hadoop on Data Warehousing or Data Integration projects, and familiarity with PHI and PII data.
  • Analysis, design, development, support, and enhancement of ETL/ELT in a data warehouse environment with Azure technologies.
  • Strong development experience creating PowerShell scripts and PySpark programs, and working with HDFS commands and HDFS file formats.
  • Experience writing Hadoop/Hive scripts to gather table statistics after data loads (see the sketch after this list).
  • Experience writing complex SQL queries and tuning them based on Hadoop/Hive explain-plan results.
  • Proven ability to write high quality code.
  • 6+ years of experience building data sets.
  • Familiarity with project management methodologies (Agile).
  • Ability to establish priorities & follow through on projects, paying close attention to detail with minimal supervision.
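
As a rough illustration of the stats-gathering and explain-plan items above, the sketch below runs standard Spark/Hive SQL through PySpark (Spark 3.x for the formatted explain mode). The table and column names (edw.claims_partitioned, load_date) are hypothetical placeholders.

    # Minimal sketch of post-load stats gathering and explain-plan review,
    # run through Spark SQL against Hive tables. Table and column names
    # (edw.claims_partitioned, load_date) are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # Collect table- and column-level statistics so the optimizer can
    # choose sensible join strategies.
    spark.sql("ANALYZE TABLE edw.claims_partitioned COMPUTE STATISTICS")
    spark.sql("ANALYZE TABLE edw.claims_partitioned "
              "COMPUTE STATISTICS FOR COLUMNS load_date")

    # Review the plan of a candidate query before attempting any tuning.
    spark.sql(
        "SELECT load_date, COUNT(*) "
        "FROM edw.claims_partitioned GROUP BY load_date"
    ).explain(mode="formatted")

Fresh statistics give the optimizer realistic row counts, which is usually a prerequisite for meaningful explain-plan tuning.
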
Certifications Required
  • Microsoft Azure DP-200: Implementing an Azure Data Solution
  • Microsoft Azure DP-201: Designing an Azure Data Solution
Education
  • Relevant Bachelor of Science or Bachelor of Arts degree, or equivalent experience.