Rob Strechay: Hello and welcome back to NetApp Converge 2024. Glad to have you on board with us. I'm so excited about this. As I teased a little bit, we have a panel. I'm joined again by Sandeep Singh, SVP and GM for NetApp. I'm also joined by Bob Pette, who's the VP and General Manager for Enterprise Platforms at NVIDIA, and Kamran Amini, who's the Vice President and GM of Lenovo Server, Storage and Software. So welcome on board, guys. This is a really huge announcement today. I think we'd be thrown off the internet if we didn't talk about AI today, and especially the fact that there's an AIPod. As I was saying, I was here when FlexPod started back in the day, that's how old I am, I guess, and how far I go back. But one of the things that has changed is that we're a year and a half out from that ChatGPT moment. People are looking at GenAI and saying, "Okay, great, I wanted to test the waters. Maybe I got into some pilots and proofs of concept, but now what?" So what are you seeing from a market perspective with GenAI and how it's really being deployed?

Bob Pette: I'll take a stab at that first. We've seen a tremendous boost in the cloud in terms of building large language models, as you mentioned, the ChatGPT moment, the iPhone moment for AI. We've established this Enterprise Platforms group at NVIDIA specifically to address what we see as the biggest opportunity: more than half, 50 to 75%, of the AI in the world, we believe, will be done on premises, whether from a government regulation standpoint, a sovereign AI standpoint, or an IP standpoint. You want the AI running where your data is, and that's why something like the AIPod is so significant. People have done the POCs. They might be a little concerned about whether they need the same infrastructure as Google, and the answer to that, as you'll see, is no. But we definitely see the enterprise explosion beginning to happen now, and this effort, I think, is going to make it easy for people to get started and continue to grow in a graceful way.

Sandeep Singh: And I would just add to that: customers, enterprises, are absolutely looking at how they harness the power of GenAI and make it work for them in the context of their enterprise. So it becomes critically important for them how they take enterprise data and combine it with pre-trained LLMs, and this is just that opportunity of providing an AIPod that packages the best of breed across NVIDIA, across Lenovo and NetApp, and provides that in a pre-validated, pre-integrated form factor to customers, so that they can effectively just do GenAI, or RAG, in a box.

Kamran Amini: Yeah, absolutely. This is where customers that have tons of data want to get a business outcome out of that data, and this is where AI comes into play. The partnership between the three of us here is about how we make it easy for them and enable them to get outcome-driven value with the solution we can deliver.

Rob Strechay: I think that is the outcome. We partner with a company called ETR; they do quarterly tech spending intention surveys, and one of the things they do is look at AI use cases. Be it, "Hey, we want to do software development, but I don't want my code going out into the public cloud," or customer support and customer success, helping people be more responsive to customers, and then there's the marketing, or I guess you could say messaging-generation, type of work. Those are the three big ones. What are the attributes that end-user organizations will really find compelling out of the three of you coming together to build this AIPod? Why don't we start with Kamran?

Kamran Amini: I think it's about how we can bring the best of the technologies together. As Bob implied, the early adopters of AI were the cloud service providers that drove massive-scale large language model systems. Most enterprise customers did not have the skills or the knowledge of what to do around AI. It's about the partnership we can bring to really simplify that journey. In the old days, when we were thinking about cloud, it was very scary initially, what should I do? For an enterprise customer, AI is the same concept, and I think we're trying to bring the value to simplify that experience and the journey through the partnerships we have here.

Bob Pette: Yeah, absolutely. The work to do large language model training is very different from what we believe enterprises want to do, are doing, and will need to do. That's taking those models, fine-tuning them, and then adding additional data to them, call it retrieval-augmented generation, or RAG, workflows, so that they represent their semantics, their language, their industry semantics, their company metrics, and their data. NVIDIA AI Enterprise was founded to really help the enterprise get to the data, with things like NeMo Retriever, which NetApp has integrated into their storage platform, and to set up guardrails, because they care about privacy and security. We want them to take advantage of all the latest and greatest features, running on the Lenovo platform and accessing data on NetApp. So we have NIM microservices that are updated on a regular basis, so they can get the finely tuned AI for their respective use cases. It's just the culmination of a fantastic platform that Lenovo has with its management stack, NVIDIA GPUs and our OVX-certified reference platform with L40S GPUs, AI Enterprise, and the CUDA stack. The piece that I've been salivating over is the integration with NetApp, with NeMo Retriever, because half the enterprise data in the world is close by: the majority of enterprises are storing their unstructured data sets on NetApp. So if you're building a chatbot for customer service, that chatbot is going to access your customer service database; it's got to access your CRM, your sales database. You're not going to upload that to the cloud.

Rob Strechay: I guess what you're really getting at is the secret sauce that each of you is bringing to this. Why don't we jump into that, because you've started going down that path.

Sandeep Singh: Maybe I'll add at a higher level first. We've had a long, rich history of partnering with NVIDIA. At the same time, Lenovo and NetApp have had an amazing five-year partnership, and we're all together taking this to the next level. It's bringing that best of breed across computing, GPUs, and the right storage infrastructure, and then combining it with the necessary software stacks and overall GenAI capabilities that organizations need. That's what this AIPod ultimately represents: making it easy to use, bringing the power along to do retrieval-augmented generation, fine-tuning, or inferencing, and making it affordable for enterprise customers.

Bob Pette: And easy to build upon. What I love about the AIPod is that people may think, like I said, that they have to go build this massive infrastructure, and maybe at some point they will, but they can start with the right size, continue to add use cases, and as data grows and use cases grow and value grows, they've got these very easily deployable chunks of these AIPods that they can deploy throughout the enterprise.

Kamran Amini: I think the key thing we're trying to do as Lenovo, in the partnership we've established with both NetApp and NVIDIA, is bringing AI to all. Because AI is a journey, and it's a different size and deployment for every size of customer, from small and medium all the way to large enterprise. It's how you make it easy, how you take the best of breed of technologies, put it together for the application, GenAI and RAG and inferencing, and actually be able to consume it, deploy it, and get the outcome that you're looking for. That's the value the three of us are bringing together.

Rob Strechay: I think that's key, because we actually talked about it back in September. We came out with this power law of GenAI, where all the way on the left, that would be my left, their right, you really have all those large language models that are getting built, the ChatGPTs, the Titans, and every other one that the cloud guys have. But those guys are building those; most companies are not building huge LLMs. What they're doing is taking an open model, a Llama 2 or a Llama 3, bringing it on premises, and then fine-tuning it, say, for financial services, because I want to help my customer support people understand how to answer a question when it's complex, still keeping a human in the loop, to your point. And that's got to be a piece of it, the guardrails and how you bring it all together with the open models.

Bob Pette: Absolutely. We want ethical AI first and foremost, and we want to protect data privacy, and then there's the slew of regulations that we're going to have to adhere to. That's why the data, and the access to the data, in these RAG workflows is so important: who can access it, when can they access it, what information can they get to, and then what guardrails do you place on how far you let the AI imagine, so that you can avoid hallucinations and any unnecessary outcomes. I want to touch on one point in terms of bringing that infrastructure down: we collectively designed this to ensure that it could easily fit into the enterprise footprint. It's got our Spectrum-X Ethernet; it's not InfiniBand like you typically see at large scale. We're putting it into 10-kilowatt, 12-kilowatt racks, not the 100-kilowatt racks that you see at the large-scale builders. So it's really meant to be an easy drop into your existing infrastructure so you can get going.

Sandeep Singh: Those are fantastic points, just reinforcing that the design center here is for enterprises to be able to easily consume it, bring their data, do RAG with it, and leverage GenAI with it.

Rob Strechay: I think that's a great place to leave it, because I think it is about the use cases. But I will ask one last question. If we're sitting here a year from now, talking about this and saying, "Hey, listen, here's what success looked like for the last year," what are you hoping to see for next year?

Sandeep Singh: Look, I think it starts first and foremost with helping customers, being that strategic partner to customers in enabling them to truly harness the power of GenAI. That's the beginning point for what success looks like for the AIPod. Secondly, for our overall partners, this is a fantastic opportunity, because partners are the go-to-market motion here. They are the ones that now also get to take their customers and help them harness the power of GenAI, while delivering all of the pre-integration services at the same time. So there's an absolute metric here: making sure that this is a successful outcome for our go-to-market partners. And then for the three of us, this just continues to represent, in my view, the collaboration that customers are expecting from vendors, where we are the stewards of enabling enterprise and responsible AI for them.

Bob Pette: It's a great point, certainly. A year from now, I'd like no one to be fearful or afraid to implement. People should be starting now, and a year from now they should have demonstrable proof across a number of different use cases. And it's not just about making more money and more profit; it's about improving lives, improving productivity. To go all visionary: it's curing cancer, it's curing world hunger, it's enabling underserved communities by bringing this technology down and automating some of the things that allow people to focus on improving the quality of their lives. But first and foremost, I'd like people to accept this as the way of computing, because it is going to be the way of computing from here on out. Fear no more.

Kamran Amini: Echoing both of them, it's about being considered the trusted partner for our customers' journey in AI, and as Sandeep said, it's bringing along our channel partner community, as they're the other arms that are engaged with our end customers, to mutually enable delivering the power of AI to drive different business outcomes for our customers. So we're very excited.

Rob Strechay: I love leaving things on a very positive note. I'm very positive about GenAI and AI in general, and I think this is really exciting news that you're launching today. Just walking around last night and talking to partners, they're excited about all the news this week and are really leaning in on this. So thank you for being on with us, and I hope to have you back next year when we're talking about the successes.

Sandeep Singh: Sounds great, thank you.

Bob Pette: Thank you.

Kamran Amini: Thank you, appreciate it.

Rob Strechay: And thank you for watching theCUBE. We're talking about tech analysis and news, and we're the best at it, so stay tuned. We've got more to come from Converge 2024 with NetApp.
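The RAG workflow the panel describes, retrieving relevant enterprise documents and grounding the model's answer in them rather than uploading data to the cloud, can be sketched in a few lines. This is a minimal illustrative sketch, not the AIPod's actual pipeline: the "embedding" here is a toy bag-of-words vector, where a real deployment would use an embedding model (such as NVIDIA NeMo Retriever) and a vector store over the enterprise data.

```python
# Minimal RAG sketch: 1) embed documents, 2) retrieve the most
# relevant ones for a query, 3) assemble a grounded prompt for an LLM.
# Toy embedding only; real systems use a learned embedding model.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": lowercase bag-of-words term counts.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query, keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Ground the model's answer in retrieved enterprise data,
    # a simple guardrail against hallucination.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical enterprise documents for illustration.
docs = [
    "Refunds are processed within 5 business days.",
    "Support hours are 9am to 5pm Eastern, Monday through Friday.",
    "Our CRM stores every customer interaction record.",
]
print(build_prompt("When are support hours?", docs))
```

The point of the sketch is the shape of the flow: the documents never leave the environment where they are stored; only the retrieved snippets are placed into the prompt that goes to the model.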
Sandeep Singh of NetApp, Bob Pette of NVIDIA, and Kamran Amini of Lenovo join theCUBE host Rob Strechay during NetApp Converge 2024 to discuss their major announcement.