In an interview with Blocks & Files, Russell Fishman, Senior Director of Product Management at NetApp, discussed how intelligent data infrastructure helps enterprises with GenAI inferencing through access to large swathes of connected data. Fishman said connected data for inferencing is squarely in NetApp’s lane and GPU server racks are becoming extraordinarily demanding of electrical power and capex dollars, which will restrict AI training locations. AI training is becoming a specialized niche, whereas AI inferencing is not. Data silos impede more general data access. The more inside data an organization’s LLM has available through RAG, the better its responses. Fishman said, “The point is, data and storage is becoming more of a general concern for AI as we look across companies. So what actually needs to change?” This is where NetApp has an advantage with its pioneering data fabric concepts – an “intelligent data infrastructure” – and the ability for its customers to move and access data in the NetApp ONTAP data estate, with OEM support, connections to virtually any server, running in the public clouds, and tens of thousands of customer implementations. https://ntap.com/3UyiSHw
Tiago Andrade’s Post
NetApp claims 'incumbency is the new cool' in the AI era – Blocks and Files
https://blocksandfiles.com