We're #hiring a new Data Architect in Hyderabad, Telangana. Apply today or share this post with your network.
Allwyn Corporation’s Post
-
Hi all, I hope you are all doing well. 😊 I would like to connect with you regarding a job opportunity at one of the Big 4.
Job Role: Data Engineer (Data Modeler)
Experience: 6+ years
Location: Mumbai
Job Description: Design and implement data models to manage and analyse data effectively. Work closely with data architects, data analysts, and database administrators to create and manage data systems that support efficient data management and analysis. Understand business requirements, translate them into technical designs, deal with the complexities of integrating disparate systems, investigate the quality of the data in those systems, and put it all together so it can be accessed, manipulated, and turned into business intelligence. Design conceptual, logical, and physical entity-relationship and/or relational models and multidimensional models for an EDW that implements flexible reporting capability in the context of one or more reporting toolsets. Provide multidimensional modelling expertise and leadership by designing data structures to support the growth and improved reporting functionality of the Data Lake/EDW. Guide the development and maintenance of data standards, policies, and practices, and guide the design of the data models in alignment with the solution architecture. Work closely with database administration, ETL, and BI development staff to optimize overall design at an enterprise level.
Responsibilities:
- Analyse business needs: Understand and translate business requirements into data models.
- Develop data models: Create conceptual, logical, and physical data models to support business processes.
- Database design: Design and implement effective database solutions to store and retrieve company data.
- Data migration: Oversee the migration of data from legacy systems to new solutions.
- Performance monitoring: Monitor system performance through regular tests, troubleshooting, and the integration of new features.
- Compliance: Ensure that database implementation procedures comply with internal and external regulations.
- Data security: Understand authentication and authorization techniques; develop the model to support data-level security.
Please connect with Renu Indoriya, or you can mail your resume to [email protected]. Sanjeev Anand Techsist Solution Mohd Qaaid Riya Agarwal Showket Rahman Priti Yadav Nidhi Suryavanshi Shalima Singh Mamtesh Tiwari
#DataEngineer #DataModeler #DataModelling #DataMigration #ETL #BusinessNeeds #DatabaseDesign
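The star-schema style of multidimensional modelling this role centres on can be illustrated with a minimal sketch: one fact table joined to its dimensions for reporting. All table and column names here are hypothetical, chosen for illustration only, not taken from the posting.

```python
import sqlite3

# Minimal star schema: a central fact table referencing two dimension tables.
# Names (dim_customer, fact_sales, ...) are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date     (date_id     INTEGER PRIMARY KEY, day  TEXT);
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id     INTEGER REFERENCES dim_date(date_id),
        amount      REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme');
    INSERT INTO dim_date     VALUES (1, '2024-01-01');
    INSERT INTO fact_sales   VALUES (1, 1, 1, 99.5);
""")

# A reporting query joins the fact table back to its dimensions.
row = conn.execute("""
    SELECT c.name, d.day, f.amount
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    JOIN dim_date     d ON d.date_id     = f.date_id
""").fetchone()
print(row)  # ('Acme', '2024-01-01', 99.5)
```

The point of the shape is that reporting tools can slice the single fact table by any dimension without restructuring the data.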
-
Hi, we are #hiringimmediately for a Data Governance Consultant. If interested, please share your resume to [email protected].
Role: Data Governance Consultant
Location: Chennai/Coimbatore (work from office)
Experience: 5+ years
Notice period: Immediate to 15 days max
Data skills:
- Data Modeling: Understanding of data models (star schema, snowflake schema) and the ability to identify data entities, attributes, and relationships relevant to the financial industry.
- Data Quality: Knowledge of data quality principles, metrics, and techniques for data cleansing, validation, and transformation.
- Data Security: Understanding of data security best practices in finance, including access control, encryption, and data loss prevention.
- Data Catalog Management: Experience managing data catalogs, including data classification, tagging, and metadata management, if Acryl utilizes a data catalog component.
- Data Lineage Tracking: Understanding of data lineage tracking principles and how Acryl implements it to track data origin, transformations, and flow throughout the system.
- Data Access Control: Knowledge of data access control best practices and how Acryl enforces access restrictions based on user roles and permissions.
Technical skills and duties:
- Data engineering knowledge.
- Familiarity with GraphQL, Python, or Java.
- Familiarity with key data platforms: Snowflake, Looker, Aurora Postgres, Fivetran, Airflow, SQL Server.
- Can work cross-functionally to identify dataset owners, gather documentation, and coordinate setting up the business glossary, domains, and similar tasks; this information usually lives across multiple technical teams.
- Tag all tables and fields as PII/non-PII data.
- Specify the relationships between datasets.
- Project management experience to track tasks against milestones.
Nice to have:
- Financial Data Management: Familiarity with financial data regulations and standards, e.g., Know Your Customer (KYC) and Anti-Money Laundering (AML); basic understanding of financial products, services, and regulations relevant to small public financial companies.
- AI/ML: Knowledge of data science, AI/ML, MLOps, data protection, and privacy.
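The "tag all tables and fields, PII/non-PII" duty amounts to column-level classification before the tags are pushed into a catalog. A minimal sketch, assuming a simple name-based heuristic (the field list and the helper `classify_columns` are hypothetical, not part of any catalog tool's API):

```python
# Hypothetical column-level PII classification; real catalogs (e.g. the one
# the posting alludes to) would ingest these tags via their own APIs.
PII_FIELDS = {"email", "phone", "ssn", "name", "address"}

def classify_columns(columns):
    """Tag each column name as 'PII' or 'non-PII' by a name-based heuristic."""
    return {
        col: ("PII" if col.lower() in PII_FIELDS else "non-PII")
        for col in columns
    }

tags = classify_columns(["email", "account_balance", "ssn"])
print(tags)  # {'email': 'PII', 'account_balance': 'non-PII', 'ssn': 'PII'}
```

In practice the heuristic would be supplemented by sampling actual values, since column names alone miss PII hiding in free-text fields.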
-
Dear connections, we are hiring for AD - Project Data Engineer.
Skills: GCP, Data Science, Data Modeling, ETL pipelines, Data Visualization
Location: Bangalore
Experience: 6-10 years
Mode of work: Hybrid
You can share your resume at [email protected].
#DataEngineer #DataScience #DataModeler #GCP #BigQuery #ETLPipelines #DataVisualization
-
Job Title: Big Data Developer
Experience: 3+ years
Contract: 3 months + extension
Location: Onsite - Embassy Tech Square, Bengaluru, India
Expected CTC: 100000 per month
Regards,
Lokesh K
[email protected]
#hiring #BigDataDeveloperHiring #hiringalert #BigData #DataScience #DataAnalytics #BigDataDeveloper #DataEngineering #Hadoop #Spark #Python #Java #Scala #MachineLearning #ArtificialIntelligence #DataArchitecture #DataMining
-
#Hello_connections! #Hiring_alert!!! #Hiring_for_below_mentioned_positions!
Data Engineer
Senior Data Engineer
Lead Data Engineer
Experience: 5+ years
Location: Bangalore
Interested candidates can share their resume to [email protected].
#Datapipelines #Datamodeling #Snowflake #SQLQueries #DataLakes #DataWarehouse #Azure #Python #APIDevelopment #FiveTran #Coding #Documenting #Testing
-
Hiring notification!! We're looking for an Azure Data Architect at Clairvoyant, an EXL Company.
Experience needed: 9 to 14 years
Location: Noida, Gurgaon, Pune, Bengaluru, Hyderabad - Hybrid
Email your CV to [email protected] with the subject line "Azure Data Architect".
Must-have skills:
-8+ years of proven experience as a Software Technical Architect in Big Data Engineering.
-Strong understanding of Data Warehousing, Data Modelling, Cloud, and ETL concepts.
-Experience with Azure Cloud technologies, including Azure Data Factory, Azure Data Lake Storage, Databricks, Event Hub, Azure Monitor, and Azure Synapse Analytics.
-Proficiency in Python, PySpark, Hadoop, and SQL.
-Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
-Strong experience with common data warehouse modelling principles, including Kimball and Inmon.
Responsibilities:
-Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
-Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
-Collaborate with project managers on project/sprint planning by estimating technical tasks and deliverables.
-Data Modelling: Develop and maintain data models to represent our complex data structures, ensuring data accuracy, consistency, and efficiency.
-Technical Leadership: Provide technical leadership and guidance to development teams, promoting best practices in software architecture and design.
-Solution Design: Collaborate with stakeholders to define technical requirements and create solution designs that align with business goals and objectives.
-Programming: Develop and maintain software components using Python, PySpark, and Hadoop to process and analyze large datasets efficiently.
-Big Data Ecosystem: Work with components of the Hadoop ecosystem, such as HDFS, Hive, and Spark, to build data pipelines and perform data transformations.
-SQL Expertise: Use SQL for data querying, analysis, and optimization of database performance.
-Performance Optimization: Identify and address performance bottlenecks, ensuring the system meets required throughput and latency targets.
-Scalability: Architect scalable and highly available data solutions, considering both batch and real-time processing.
-Documentation: Create and maintain comprehensive technical documentation to support the development and maintenance of data solutions.
-Security and Compliance: Ensure that data solutions adhere to security and compliance standards, implementing the necessary controls and encryption mechanisms.
-Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
-Take responsibility for estimating, planning, and managing all tasks, and report on progress.
#dataarchitect #azure #python #hiring #exl
-
#Hiring alert 🔔
Job Title: Data Governance
Experience: 12+ years
Work Location: Bangalore
Notice: Immediate to a max of 30 days
Job Description:
👉 Establish and govern an enterprise-wide data governance, data quality, and data management roadmap.
👉 Serve as a liaison between the business and the Data and Analytics team, ensuring that data-related requirements are clearly defined, prioritized, and well understood.
👉 Proactively champion the value of information (data) as a strategic business asset and revenue generator by providing the business and strategy teams with ideas and promoting those ideas into action, including developing and implementing communication, learning, change management, and adoption plans.
👉 Create and maintain DG policies, playbook(s), processes, and procedures for guiding various data management processes, including the adoption of data security and privacy, data quality control, and data dissemination activities.
👉 Align data management practices with regulatory requirements such as GxP.
👉 Lead the creation, maintenance, and monitoring of the data governance policies, processes, procedures, playbook(s), data catalogs, data dictionary, and business glossary for guiding the data management aspects of the Enterprise Data Lake.
👉 Work with the business units, help solve data quality/availability issues, and work with the data engineers and data scientists.
👉 Define and establish analytics governance for the use cases worked by the Enterprise Data and Analytics team.
If interested, please share your updated resume to [email protected] or comment below.
Follow 🔔 Culminant Outlook 🔔 for more updates.
#DataGovernance #DataQuality #DataStrategy #DataStewardship #Catalog #Lineage