Hello Everyone, urgent requirement on W2.

Job Title: Data Migration Lead – WMS (some clinical-side experience)
Job Location: New Jersey (preferred); remote role
Experience: 10+ years
Project Duration: 6 months

Detailed JD (please share a detailed description; a one-liner JD will not work):
• Determine the scope of data migration.
• Document the data migration strategy.
• Document the Data Migration Protocol to go along with the data migration strategy.
• Produce the Data Migration Summary report based on the data migration strategy and protocol documents.
• Participate in finalizing the code migration strategy in conjunction with the data migration strategy.
• Coordinate the movement of various transports from a WMS perspective.
• Support mapping and construction of data from legacy systems.
• Establish and support pre-load and post-load business sign-off.
• Support internal and external quality control and audit processes.
• Lead data defect triage and remediation.
• Scope, identify, and guide remediation of incomplete or incorrect data in legacy systems.
• Validate all data once it is moved across environments (DEV → PQA → QA → Prod); see the reconciliation sketch after this posting.
• Plan cutover and coordinate mock cutovers, including failover and fallout reconciliation, working closely with the Systems and Performance teams. Points to consider for cutover:
  o Include reversion back to the original PROD state after mock cutover exercises for the first site.
  o Plan the mock cutover for the second site, as FDC will already be in PROD.
• Monitor all data-related activities and the whole cutover plan post go-live.
• Other specific responsibilities include:
  1. Verification of data integrity between platforms – ERP, WMS, Serialization.
  2. Coordination of delivery/creation of BY test data in each environment – DEV, PQA, QA, and Prod.
  3. Oversight of the environment-to-environment data migration strategy – configuration, master, and transaction data as applicable:
     a. Reconciliation of accelerator tool results for uplifts across platforms.
     b. Root-cause analysis for fallout and follow-on corrective actions.
     c. Review and update the approach so that issues are not repeated for each component load to higher environments.
  4. Data conversion strategy from lower to higher environments, including BY configuration preservation where applicable and legacy MARC data conversion:
     a. LPNs with serialized hierarchies
     b. Inventory (LPN) by location
     c. Orders by status
        i. Inbound
        ii. Outbound
     d. Orders not only by status but including progress quantity details at header and line-item level
     e. Inventory by location with LPN details regarding status and committed (all of the sub-tabs)
     f. Inventory reconciliation with ERP and Serialization
     g. Serialization data
     h. Clinical data

Please share profiles at [email protected]
#W2 #USA #DataMigrationLeadWMS
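The reconciliation and validation duties listed above are typically backed by simple, repeatable checks. As a rough, hypothetical illustration only (the file names, column names, and keys below are assumptions, not taken from the posting), an inventory quantity reconciliation between a WMS extract and an ERP extract might be sketched in Python like this:

```python
# Hypothetical reconciliation sketch: compare inventory quantity by item/location
# between a WMS extract and an ERP extract. File and column names are illustrative.
import csv
from collections import defaultdict

def load_quantities(path, item_col, loc_col, qty_col):
    """Sum quantities per (item, location) key from a CSV extract."""
    totals = defaultdict(float)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            key = (row[item_col].strip(), row[loc_col].strip())
            totals[key] += float(row[qty_col] or 0)
    return totals

def reconcile(wms, erp, tolerance=0.0):
    """Yield keys whose quantities differ by more than the tolerance."""
    for key in set(wms) | set(erp):
        diff = wms.get(key, 0.0) - erp.get(key, 0.0)
        if abs(diff) > tolerance:
            yield key, wms.get(key, 0.0), erp.get(key, 0.0), diff

if __name__ == "__main__":
    wms_totals = load_quantities("wms_inventory.csv", "ITEM", "LOCATION", "QTY")
    erp_totals = load_quantities("erp_inventory.csv", "MATERIAL", "STORAGE_BIN", "QUANTITY")
    mismatches = list(reconcile(wms_totals, erp_totals))
    print(f"{len(mismatches)} mismatched item/location keys")
    for key, wms_qty, erp_qty, diff in mismatches[:20]:
        print(key, wms_qty, erp_qty, diff)
```

In practice such a check would be extended to the other objects in the list above (LPN status, order progress quantities, serialization data), but the pattern of keyed totals and a tolerance-based comparison stays the same.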
-
Job Title/Role: Data Migration Lead – WMS (some clinical-side experience)
Location: New Jersey preferred; remote role
Project Duration: 6 months
Visa: USC/GC
Experience: 10+ years

Detailed JD:
• Determine the scope of data migration.
• Document the data migration strategy.
• Document the Data Migration Protocol to go along with the data migration strategy.
• Produce the Data Migration Summary report based on the data migration strategy and protocol documents.
• Participate in finalizing the code migration strategy in conjunction with the data migration strategy.
• Coordinate the movement of various transports from a WMS perspective.
• Support mapping and construction of data from legacy systems.
• Establish and support pre-load and post-load business sign-off.
• Support internal and external quality control and audit processes.
• Lead data defect triage and remediation.
• Scope, identify, and guide remediation of incomplete or incorrect data in legacy systems.
• Validate all data once it is moved across environments (DEV → PQA → QA → Prod).
• Plan cutover and coordinate mock cutovers, including failover and fallout reconciliation, working closely with the Systems and Performance teams. Points to consider for cutover:
  o Include reversion back to the original PROD state after mock cutover exercises for the first site.
  o Plan the mock cutover for the second site, as FDC will already be in PROD.
• Monitor all data-related activities and the whole cutover plan post go-live.
• Other specific responsibilities include:
  1. Verification of data integrity between platforms – ERP, WMS, Serialization.
  2. Coordination of delivery/creation of BY test data in each environment – DEV, PQA, QA, and Prod.
  3. Oversight of the environment-to-environment data migration strategy – configuration, master, and transaction data as applicable:
     a. Reconciliation of accelerator tool results for uplifts across platforms.
     b. Root-cause analysis for fallout and follow-on corrective actions.
     c. Review and update the approach so that issues are not repeated for each component load to higher environments.
  4. Data conversion strategy from lower to higher environments, including BY configuration preservation where applicable and legacy MARC data conversion:
     a. LPNs with serialized hierarchies
     b. Inventory (LPN) by location
     c. Orders by status
        i. Inbound
        ii. Outbound
     d. Orders not only by status but including progress quantity details at header and line-item level
     e. Inventory by location with LPN details regarding status and committed (all of the sub-tabs)
     f. Inventory reconciliation with ERP and Serialization
     g. Serialization data
     h. Clinical data

Drop resumes at: [email protected]
-
Hi Folks, we are #hiring for the below position with a damn-sure #closing. Send profiles to [email protected].

Position: Data Migration Lead – WMS (some clinical-side experience)
Location: New Jersey, for a remote role
Project Duration: 6 months to start with

Detailed JD (please share a detailed description; a one-liner JD will not work):
• Determine the scope of data migration.
• Document the data migration strategy.
• Document the Data Migration Protocol to go along with the data migration strategy.
• Produce the Data Migration Summary report based on the data migration strategy and protocol documents.
• Participate in finalizing the code migration strategy in conjunction with the data migration strategy.
• Coordinate the movement of various transports from a WMS perspective.
• Support mapping and construction of data from legacy systems.
• Establish and support pre-load and post-load business sign-off.
• Support internal and external quality control and audit processes.
• Lead data defect triage and remediation.
• Scope, identify, and guide remediation of incomplete or incorrect data in legacy systems.
• Validate all data once it is moved across environments (DEV → PQA → QA → Prod).
• Plan cutover and coordinate mock cutovers, including failover and fallout reconciliation, working closely with the Systems and Performance teams. Points to consider for cutover:
  o Include reversion back to the original PROD state after mock cutover exercises for the first site.
  o Plan the mock cutover for the second site, as FDC will already be in PROD.
• Monitor all data-related activities and the whole cutover plan post go-live.
• Other specific responsibilities include:
  1. Verification of data integrity between platforms – ERP, WMS, Serialization.
  2. Coordination of delivery/creation of BY test data in each environment – DEV, PQA, QA, and Prod.
  3. Oversight of the environment-to-environment data migration strategy – configuration, master, and transaction data as applicable:
     a. Reconciliation of accelerator tool results for uplifts across platforms.
     b. Root-cause analysis for fallout and follow-on corrective actions.
     c. Review and update the approach so that issues are not repeated for each component load to higher environments.
  4. Data conversion strategy from lower to higher environments, including BY configuration preservation where applicable and legacy MARC data conversion:
     a. LPNs with serialized hierarchies
     b. Inventory (LPN) by location
     c. Orders by status
        i. Inbound
        ii. Outbound
     d. Orders not only by status but including progress quantity details at header and line-item level
     e. Inventory by location with LPN details regarding status and committed (all of the sub-tabs)
     f. Inventory reconciliation with ERP and Serialization
     g. Serialization data
     h. Clinical data

#benchsalesrecruiters #sales #c2c #remote #hiring
Regards, Pavankalyan #atvsllc
-
When upgrading to a cloud-based Warehouse Management System (WMS), the process of data migration often presents hidden challenges. These challenges can impact the efficiency and success of the migration. Here are five key reasons why these challenges occur:

👉 Complex Data Structures and Legacy Systems – Upgrading to a cloud WMS often involves dealing with complex data structures and legacy systems. This complexity can result in overcomplicated migration processes, especially when trying to integrate old data formats with new cloud-based systems.

👉 Lack of Clear Migration Objectives – Similar to over-engineering, which starts with unclear goals, data migration challenges often stem from a lack of clear objectives. Without a well-defined goal for what the migration should achieve, the process can become unnecessarily convoluted, leading to inefficiencies.

👉 Inadequate Communication and Planning – The success of data migration heavily relies on effective communication and meticulous planning. A lack of coordination between different teams and stakeholders can lead to a migration strategy that is more complex than necessary.

👉 Underestimating the Scale and Scope of Migration – Underestimating the scale and scope of migrating to a cloud-based WMS can lead to a bloated and over-engineered process. This is often due to a failure to fully understand the intricacies and requirements of the new cloud environment.

👉 Overlooking User and Operational Needs – Focusing too much on the technical aspects of migration without considering the end-user experience and operational needs can lead to an overly complex system. It's important to align the migration process with practical, on-the-ground requirements.

Here's how to address these challenges:

➡ Set Clear Migration Goals: Define specific, achievable objectives for the data migration to guide the process and prevent scope creep.
➡ Improve Communication and Collaboration: Foster open communication and collaboration between all parties involved in the migration process.
➡ Comprehensive Planning and Assessment: Conduct thorough planning and assessment to understand the full scope and requirements of the migration.
➡ Focus on User and Operational Needs: Prioritize the needs of end users and daily operations in the migration strategy to ensure the new system is practical and efficient.
➡ Adopt a Phased and Iterative Approach: Implement the migration in phases, allowing for adjustments and refinements along the way to avoid over-complication.

Don't let the complexities of data migration in cloud-based WMS upgrades hinder the efficiency and growth of your supply chain operations. Reach out to us and we'll help you streamline your data migration process. With the right technology solutions tailored to your specific needs, we'll help make your transition to a cloud-based WMS smooth and efficient.

#wms #datamigration #warehouse #data #fulfillmentiq
-
A manufacturing company generates indents in their SAP system; we need to extract that data and import it into our B2B website, with their consent. How can we do this via API, etc.? #techerpsolution #8595547240

Here is a theoretical overview of how you can approach integrating SAP data with your B2B website. By understanding SAP's data structure, leveraging appropriate APIs, implementing robust security measures, and adhering to best practices in data integration, you can effectively integrate SAP data with your B2B website to enhance functionality and provide value to your users.

1. Understanding SAP's data structure: SAP typically stores data in structured formats within its system, often using a relational database management system (RDBMS) such as SAP HANA or SAP ASE. It's essential to understand the data model and schema used by SAP to effectively extract the relevant information.
2. API integration: SAP provides various APIs for accessing its data, including the SAP NetWeaver RFC SDK, SAP Gateway, OData services, and SAP Cloud Platform Integration. Each API has its strengths and use cases, so choose the one that best fits your requirements.
3. Authentication and authorization: SAP's APIs typically require authentication and authorization, ensuring that only authorized users or systems can retrieve sensitive information from SAP. Implement mechanisms such as OAuth, API keys, or X.509 certificates to authenticate securely with SAP's systems.
4. Data extraction methods: Depending on the complexity and volume of data you need to extract, choose between synchronous and asynchronous extraction. Synchronous methods query SAP's database in real time via APIs, while asynchronous methods may involve batch processing or data replication techniques.
5. Data transformation and mapping: Once you've retrieved the data from SAP, you may need to transform it into a format compatible with your B2B website. This could involve mapping SAP's data fields to corresponding fields in your website's database schema and converting data types or formats as necessary.
6. Integration middleware: Consider using integration middleware or ETL (Extract, Transform, Load) tools to streamline the data integration process. These tools can automate extraction, transformation, and loading tasks, reducing the complexity of integrating SAP data with your B2B website.
7. Error handling and monitoring: Implement robust error handling for issues such as network timeouts, API rate limits, or data validation errors, and monitor the integration to ensure data consistency, reliability, and performance.

A minimal sketch of pulling such data via an OData service follows this post.

#techerpsolution #abap #sap #mm #fico #professionals #leader #job #professionaldevelopment
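To make the API-integration idea above concrete, here is a minimal, hypothetical Python sketch. The service URL, entity set name (IndentSet), field names, and token handling are placeholders assumed for illustration, not a real SAP endpoint; it combines the OData-over-HTTP pattern from point 2 with the field mapping from point 5.

```python
# Illustrative sketch only: pull indent (requisition) data from an SAP OData
# service and reshape it for a B2B site. The service URL, entity set, field
# names, and token are hypothetical placeholders.
import requests

BASE_URL = "https://sap.example.com/sap/opu/odata/sap/ZINDENT_SRV"  # placeholder URL
ACCESS_TOKEN = "<obtained separately via OAuth, an API key, or basic auth>"

def fetch_indents(top=100):
    """Fetch indent records as JSON from the assumed OData entity set."""
    resp = requests.get(
        f"{BASE_URL}/IndentSet",
        params={"$format": "json", "$top": top},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["d"]["results"]  # OData v2 JSON envelope

def to_website_rows(records):
    """Map assumed SAP field names to the columns the B2B site expects."""
    return [
        {
            "indent_no": r.get("IndentNumber"),
            "material": r.get("Material"),
            "quantity": r.get("Quantity"),
            "plant": r.get("Plant"),
        }
        for r in records
    ]

if __name__ == "__main__":
    rows = to_website_rows(fetch_indents())
    print(f"Fetched {len(rows)} indents ready for import")
```

In a production setup the token acquisition, paging, retry on rate limits, and load into the website's database would sit around this core, typically inside the middleware or ETL layer mentioned in point 6.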
-
Migration of Data with LSMW in SAP

LSMW (Legacy System Migration Workbench) is a valuable tool in SAP for transferring data from external (legacy) systems into an SAP system (typically SAP ECC). It can also be used for some data loads within SAP, but with limitations compared to newer tools. Here's an overview of LSMW for data migration.

Capabilities:
• Import data: LSMW facilitates importing data from various sources such as flat files (spreadsheets) or sequential files.
• Data conversion: It allows you to convert data from the legacy system format to a format compatible with SAP.
• Standard interfaces: LSMW leverages standard SAP interfaces for data import, including:
  o Batch Input / BAPI: uploads data in batches for background processing.
  o Direct Input: uploads data directly into SAP tables.
  o IDocs (Intermediate Documents): enables electronic data interchange (EDI) with external systems.

Process flow — LSMW follows a step-by-step process for data migration:
1. Project definition: Create a project, subproject, and object to define the specific data migration task.
2. Data definition: Specify the source data structure and map it to the corresponding SAP table structure (a rough illustration of this mapping step follows this post).
3. Data conversion (optional): Define conversion rules to transform data from the source format to the SAP format if necessary.
4. Data transfer: Select the data file and configure the import using a chosen standard interface (Batch Input, Direct Input, or IDoc).
5. Data validation and error handling: Validate the data before import and address any errors encountered during the process.

Advantages of LSMW:
• User-friendly: offers a graphical interface for configuring data migration tasks.
• Flexible: supports various data sources and interfaces.
• Cost-effective: a built-in tool within SAP, eliminating the need for additional software.

Disadvantages of LSMW:
• Complexity: the configuration process can be complex for intricate data structures and large data volumes.
• Error-prone: manual configuration increases the risk of errors during setup.
• Limited functionality: not ideal for complex data transformations or real-time data integration scenarios.
• Limited use in S/4HANA: while usable, SAP recommends exploring other tools for data migration in S/4HANA due to potential compatibility issues.

Alternatives to LSMW:
• SAP data migration tools (SDM): a comprehensive suite for complex data migration projects, especially for migrating to S/4HANA.
• Third-party migration tools: offer additional features and capabilities compared to LSMW.

In conclusion, LSMW remains a valuable tool for basic data migration tasks from legacy systems to SAP. However, for complex scenarios or S/4HANA migrations, consider exploring alternative solutions that offer more robust functionality.
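The data definition and conversion steps above can be prototyped outside LSMW before the structures are configured in the workbench. The following is an illustrative Python sketch under assumed file layouts: MATNR, MAKTX, MEINS, and WERKS are standard SAP material-master field names, but the legacy column names, the unit-of-measure rule, and the file names are made up. It simply renames legacy columns and applies one conversion rule so the resulting file could be handed to an LSMW load.

```python
# Illustrative pre-processing sketch for an LSMW upload file. Legacy column
# names, the conversion rule, and file names are hypothetical examples, not an
# actual SAP structure definition.
import csv

# Map assumed legacy column headers to SAP target field names defined in LSMW.
FIELD_MAP = {
    "part_no": "MATNR",     # material number
    "part_desc": "MAKTX",   # material description
    "uom": "MEINS",         # base unit of measure
    "plant_code": "WERKS",  # plant
}

UOM_CONVERSION = {"EACH": "EA", "KILOGRAM": "KG"}  # example conversion rule

def convert(in_path, out_path):
    """Rewrite a legacy CSV with SAP field names and converted unit codes."""
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
        writer.writeheader()
        for row in reader:
            out = {sap: row.get(legacy, "").strip() for legacy, sap in FIELD_MAP.items()}
            out["MEINS"] = UOM_CONVERSION.get(out["MEINS"].upper(), out["MEINS"])
            writer.writerow(out)

if __name__ == "__main__":
    convert("legacy_materials.csv", "lsmw_materials.csv")
```

Inside LSMW itself, the same mapping and conversion rules are maintained in the data definition and conversion steps; a sketch like this is only a way to sanity-check the legacy file beforehand.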
-
💡 Integrating EDI and ETL for Seamless Business Operations

Is your business struggling to streamline data flow and make real-time, informed decisions? Integrating #EDI (Electronic Data Interchange) and #ETL (Extract, Transform, and Load) technologies can transform your operations by automating data transfers and ensuring consistency, minimizing human error, and providing real-time visibility into key activities like supply chain, inventory, and customer behavior.

According to Harvard Business Review, businesses that can make quick, data-driven decisions have a competitive edge. Learn more about how integrating EDI and ETL improves operational #efficiency and decision-making by visiting the link below.

🔗 Read more: https://lnkd.in/eM-KRtJb

At #PremierBPO, we specialize in tailored EDI and ETL integration strategies, providing ongoing support and ensuring your organization remains data-driven. From real-time data access to streamlined processes, we've got you covered. To discover how we can enhance your business outcomes, contact us here: https://lnkd.in/eT9aVqFU

#DataIntegration #OperationalEfficiency #BusinessGrowth #BPO #TechSolutions
-
How to overcome challenges in data quality

If you have ever been involved in operating or maintaining an ERP system, you are certainly familiar with the concept of master data. It is meant to be the ultimate truth of which data is correct, and, unfortunately, it always tends to be broken. The more information systems there are, and the more global and distributed the business is, the greater the master data challenges become. This article discusses tried-and-true principles for master data management that I created and honed over the past 30+ years at a global machinery manufacturer, the leading global developer and supplier of technologies, automation, and services for the pulp, paper, and energy industries.

In short, the key principles are:
1. Centralized control of master data is the only solution that works.
2. Form a cross-organizational master data governance council to resolve master data challenges as they arise.
3. Managing data is managing people: you need clear definitions for roles and processes.
4. Complement centralized control with great transparency.
5. Continuously work on deviations, automating all the checks you can.

In all its simplicity, this list may sound like a no-brainer. Establishing and maintaining rigorous practices that apply to all relevant organizations is, however, a tough call. At our company, one of the keys to success has been the master data governance council.

Our moment of truth came in 1999, in the middle of a major mess. There was no governance. As many as 300 people maintained the common commercial item data. Titles and their translations were not harmonized. I recall "anchor bolt" having been translated from French to Russian as a ship's anchor to be glued… It was not rare to have a single title spelled in a dozen different ways. We put together a team of subject matter experts to fix the master data. Sure, these people would have had "better things" to do, but it was absolutely necessary to use their time and spend a lot of money to get our act together.

I began thinking: we should never need to bring these people together to fix already created and released data; they should have created that data correctly in the first place. Keeping the data intact would be much cheaper than executing regular data-cleaning campaigns. We decided to have a small, dedicated core team responsible for master data. We call it the MDM Center. It manages and maintains globally common item data and objects, including customer and supplier data. Only data that require local input are outside the control of the MDM Center. It wasn't an easy choice, but having done it, I am now convinced that only centralized management works for master data: agree on ways of working, and work as agreed.

... stay tuned for part 2 ...
-
Data Migration in SAP Projects

Data migration refers to the process of transferring data between different storage types, formats, or systems. In the context of SAP, this usually involves moving data from legacy systems, third-party applications, or even older versions of SAP itself into the new SAP environment. Effective data migration ensures that relevant, accurate, and timely data is available for operations and decision-making.

Key stages of data migration:
1. Planning: Define the scope, timelines, resources, and overall strategy. Work closely with stakeholders to identify critical data elements, determine data quality requirements, and establish benchmarks for success.
2. Assessment: Conduct a thorough assessment of the existing data landscape; evaluate the data to be migrated and its formats.
3. Design: Choose the appropriate data migration approach (e.g., Big Bang vs. Phased) and define the mapping of data from the source systems to the target SAP system. Document the mapping specifications to ensure a clear understanding of how data will transition.
4. Data cleansing: Cleanse the data to eliminate duplicates, inconsistencies, and inaccuracies. Ensuring data accuracy before migration is essential for smooth operations post-migration.
5. Migration execution: The actual transfer of data. Use specialized data migration tools such as SAP DMC / RDM / LSMW / EMIGALL or third-party tools, and execute the migration by following the predefined data mapping and transformation rules.
6. Validation: After migration, validate the data to ensure that all required data has been successfully transferred and is functioning as intended within the SAP system (a rough validation sketch follows this post).
7. Post-migration activities: The work doesn't end with migration. Conduct testing to identify any remaining issues, train end users on the new SAP system to maximize adoption and productivity, and monitor the system closely after migration, addressing issues promptly.

Common challenges and safeguards — data migration is not without its challenges:
1. Take a complete backup of the legacy data prior to migration; this acts as a safety net if issues occur during the process.
2. Organizations often face issues such as incomplete data, data corruption, or mismatched data formats.
3. Prepare the organization for the changes that data migration will bring, as it often affects processes and employee roles.
4. Resistance to change from employees or a lack of adequate training can complicate post-migration scenarios.
Addressing these challenges through careful planning and continuous communication is vital to a successful migration.

Data migration is carried out via two approaches:
1. Phase/Wave based
2. Big Bang based

Stay tuned for my upcoming post on the Phase/Wave-based and Big Bang data migration approaches.
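The validation stage lends itself to simple automated checks alongside functional testing. As a hedged illustration (the file names and key columns below are assumptions for the example, not part of any SAP tool), one way to compare a legacy extract with the migrated extract is by row count plus a content fingerprint over the business keys:

```python
# Hypothetical post-migration validation sketch: compare row counts and a
# content fingerprint between a source extract and a target extract.
# File names and key columns are illustrative assumptions.
import csv
import hashlib

def fingerprint(path, key_cols):
    """Return (row_count, digest) where the digest covers the chosen key columns."""
    rows = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rows.append("|".join(row[c].strip() for c in key_cols))
    digest = hashlib.sha256("\n".join(sorted(rows)).encode("utf-8")).hexdigest()
    return len(rows), digest

if __name__ == "__main__":
    key_cols = ["CUSTOMER_ID", "NAME", "CITY"]  # assumed business keys
    src_count, src_digest = fingerprint("legacy_customers.csv", key_cols)
    tgt_count, tgt_digest = fingerprint("sap_customers.csv", key_cols)
    print(f"Source rows: {src_count}, target rows: {tgt_count}")
    print("Content match:", src_digest == tgt_digest)
```

Matching counts and digests do not replace business sign-off, but they quickly flag dropped, duplicated, or altered records before end users start testing in the target SAP system.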