Mamata R.’s Post


Technical Recruiter at Clairvoyant, an EXL Company

Hiring Notification!! We're looking for an Azure Data Architect at Clairvoyant, an EXL Company.

Experience Needed: Minimum 9 to 14 years
Location: Noida, Gurgaon, Pune, Bengaluru, Hyderabad (Hybrid)
Email your CV to mamata.rawool@exlservice.com with the subject line "Azure Data Architect".

Must-have skills:
- 8+ years of proven experience as a Software Technical Architect in Big Data Engineering.
- Strong understanding of Data Warehousing, Data Modelling, Cloud, and ETL concepts.
- Experience with Azure Cloud technologies, including Azure Data Factory, Azure Data Lake Storage, Databricks, Event Hub, Azure Monitor, and Azure Synapse Analytics.
- Proficiency in Python, PySpark, Hadoop, and SQL.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
- Strong experience with common data warehouse modelling principles, including Kimball and Inmon.

Responsibilities:
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
- Collaborate with project managers on project/sprint planning by estimating technical tasks and deliverables.
- Data Modelling: Develop and maintain data models to represent our complex data structures, ensuring data accuracy, consistency, and efficiency.
- Technical Leadership: Provide technical leadership and guidance to development teams, promoting best practices in software architecture and design.
- Solution Design: Collaborate with stakeholders to define technical requirements and create solution designs that align with business goals and objectives.
- Programming: Develop and maintain software components using Python, PySpark, and Hadoop to process and analyze large datasets efficiently.
- Big Data Ecosystem: Work with components of the Hadoop ecosystem, such as HDFS, Hive, and Spark, to build data pipelines and perform data transformations.
- SQL Expertise: Use SQL for data querying, analysis, and optimization of database performance.
- Performance Optimization: Identify and address performance bottlenecks, ensuring the system meets required throughput and latency targets.
- Scalability: Architect scalable and highly available data solutions, considering both batch and real-time processing.
- Documentation: Create and maintain comprehensive technical documentation to support the development and maintenance of data solutions.
- Security and Compliance: Ensure that data solutions adhere to security and compliance standards, implementing necessary controls and encryption mechanisms.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Take responsibility for estimating, planning, and managing all tasks, and report on progress.

#dataarchitect #azure #python #hiring #exl

Andre Samuels

Experienced Service Desk Analyst and Desktop Support Specialist

7mo

Great opportunity!
