Join us live as we recap Supercomputing 2022 (SC22)!
- Quick SC22 Overview and Synopsis
- The Future of Supercomputing at SC22
- The Cray-1 Supercomputer and HPC at SC22
- Enterprise Linux, HPC, and Fuzzball at SC22
- Using Rocky Linux in the National Laboratories
- The Widespread Use of Apptainer in the HPC World
- How Enterprise Computing is Changing Innovation
- Why Innovation is Accelerating in the HPC Market
- How CIQ is Utilizing Both HPC and Enterprise
- New Innovations in RISC-V at SC22
- Increased Chip Density Booth at SC22
- Innovations in Immersion Liquid Cooling Solutions
- The Prevalence of Apptainer and Rocky Linux in HPC
- Exploring the Student Cluster Competition at SC22
- Future Plans for CIQ Webinars
- Zane Hamilton, Vice President Sales Engineering, CIQ
- Dave Godlove, Solutions Architect, CIQ
- David DeBonis, Senior Software Engineer, CIQ
- Brock Taylor, Vice President of High Performance Computing, CIQ
- Justin Burdine, Director of Solutions Engineering, CIQ
- Rose Stein, Sales Operations Administrator, CIQ
- Forrest Burt, High Performance Computing Systems Engineer, CIQ
- Dave LeDuke, Strategy, Marketing, and Operations, CIQ
- Jonathan Anderson, Solutions Architect Manager, CIQ
Note: This transcript was created using speech recognition software. While it has been reviewed by human transcribers, it may contain errors.
Full Webinar Transcript:
This is a little bit of a live stream in a live place. It's new for us, so we're trying to do the best we can. We appreciate you joining us. I'm Zane Hamilton with CIQ. I'm here with Dave LeDuke, and we are actually here live at SC22 in Dallas, Texas.
We've got lots of fantastic companies and organizations that came and stood behind what we were doing.
Quick SC22 Overview and Synopsis [5:27]
Well guys, we hope you enjoyed the time that we had. Just briefly showing you what we've been up to and showing you the booth. We ask that you come back after Thanksgiving and we will probably dive in more on what we learned and what we saw. And we really appreciate your time. It's been great. Thank you very much.
Good morning, good afternoon, good evening. Welcome, wherever you are. We appreciate you joining us for this webinar, again after SC22. And that was fun, going back and watching some of those different clips. It seemed like a long time ago, but it was actually the week before last. So today we're going to be recapping what we did at SC and talking with our crew about what we learned, what we saw, and going back through the experiences, from some of the people who have done this before and some who were new to it. So let's bring everybody in.
Thanks for joining, guys. I know there was a lot that happened during the week of SC22. I know that we saw a lot of people. We got to interact with a lot of different people and see a lot of different things. One of the things that I enjoyed the most, other than meeting you guys, was actually seeing some of the people we've had on this webinar before; we got to see Dr. Alan Sill and Misha and Stack. And it was fun to see everybody in person and put a real face with a real name, so it's awesome. We really appreciate you guys who have joined us before, and it was great to meet you. I'm going to start off with Mr. Godlove. What was your favorite part of SC22?
Well, I think that you probably hit the nail on the head when you said it was great to see everybody else. I think that was probably my favorite part of SC22 was just being able to meet a bunch of other people, including yourself, Zane, and including a bunch of other people here on the call in person for the first time. Or if not in person for the first time, to see them for the first time in a long time. So that was really great to get together with everybody and sort of begin those face to face relationships or renew them. And then also obviously it was just great to just get the message out and to start letting people know what the company is all about, what our products are, what our projects are, what we're working on and what we see for the future.
Great. I'm going to go to the next Dave. Dave, I think you had an interesting experience because you went and actually got to take part in sessions and see other things outside of the booth. So tell us about your time.
I've been going to SC since 2012. And usually I'm looking at it more from a technical track, so I attended a lot of the technical paper sessions and a lot of the keynotes, the awards, things like that. Less on the floor, but it was great to meet everybody in person. I ran into a lot of people I knew from the labs and from industry as well. So it was a really good experience.
So from that perspective, since you did get to go to all of that stuff and participate in more of that technical track, what was your favorite thing that you participated in?
The Future of Supercomputing at SC22 [9:14]
I was in a session on the future of supercomputing. Basically it was a panel discussion. It was Torsten and Dongarra, who is obviously one of the founders of the Top500 and HPL, or Linpack, and the metrics. Also my former PhD advisor, Dorian Arnold, was on that panel as well. And so it was interesting to see their perspectives on what is driving the future of HPC, from the many different facets they were discussing. Some of it is the current technology of the day: it's AI/ML now, as it was Cloud years before, and Heterogeneous Architectures years before that. Some of them were saying the future of HPC is the researchers, it's the industry; they are the drivers of the direction of where we take this, more than anything more fleeting, like the current phase. So that was interesting to me.
That's great. It's always great to catch up with people like that too. Brock, you were all over the place. You talked to a lot of people and were involved in different talks.
It was still surprising, though, for me, how much I was at the booth. I think that's the most time I've spent in a booth, maybe in all the shows. Because normally I can block out time; I get around the show floor and go see people. This year that was actually very hard to do. I planned to do a lot more than I did, but anytime you try to leave, there's somebody there to talk to, and the next thing you know you've been talking to them for 10 or 15 minutes. And it was just amazing to see, given our location, not on millionaire's row as I called it, how many people were actually coming to seek us out. And that made for a lot of traffic, a lot of busyness in the booth and around the booth. I was able to get to booths close by, but it was rare that I could actually get over to the other side of the show, which was a little disappointing personally. But as a company it was very exciting to see all the activity that was going on in our area and how many good conversations we had. The show, I wouldn't say it's back to full strength, but it was close. Next year you'll probably see everybody's back, and it could be another great year. So it was fun.
It was fun. It's good to see you. Justin, it was your first time as well, so what'd you get from it?
This was definitely my first time. I've been to big shows like this before, but this was really fun for me, to dive in and start talking to the people who were there. I didn't really know what to expect from the types of people who would be there. But it was fun to get to know some of the researchers, get to know the people who really make up this industry. And it was just interesting to hear how they're using compute, which is very different from the world I grew up in, which was Enterprise Computing. So it was a lot of eye-opening conversations, and it was also great to meet a lot of the people I've been talking with since I've been here for the last four months, and to actually meet people in person. I think this was my first show on the other side of COVID, so anyway, really energizing and exciting.
That's great. Thank you Justin. So Rose, I know you probably did the most talking to the most people. If we had an award, you would definitely get the award, so that was awesome. It was energizing watching you do that because you had a lot of passion and a lot of just willingness to talk to people. So we really appreciate that. But tell us about your experience.
So if any of you guys have been following along the last few weeks when we were talking about what to expect and what we're excited about, I said that I wanted to shake hands with a robot, like a robot man. And I was told that that isn't really this kind of a conference. So when we walked in there, I will tell you, it was massive. And I did it. In the corner of my eye, actually, I think somebody came up to me, I think maybe it was Justin, and he was like, Rose, Rose, over there. I'm like, what's happening? There was a little robot dog named Potato. And so the robot dog was doing its thing and I was like, oh my God, this is so amazing. And I turned, and there was a robot man. I think it was actually a man in a suit, but it was awesome.
It was so fun. So I definitely stayed in the booth most of the time. There were lots of people that were coming and talking to us. I will never forget, I took videos of the robot dog and the robot man doing robot things together. It was super cool. But probably the coolest thing for me was just talking to people and seeing the connection between what we do and the work that the scientists then do, right? Bridging that gap, so the scientists actually have more control over how their programs are working. That was really fun. It was a great time.
That's awesome. Thank you Rose. Forrest, I think one of my favorite parts of SC22 was seeing you see the Cray-1. That was probably a highlight for me, but I know you saw a lot of different things and you had a lot of fun. But tell us the one thing that really stuck out to you.
The Cray-1 Supercomputer and HPC at SC22 [15:13]
The Cray-1 was really, really cool. That came out of nowhere as we were walking around with Zane and Justin. It was, hey, let's go over here, check this out, see what this is. So that was really, really cool. I think the one biggest thing for me, I've talked a lot about the chips. The chips are really cool. I'm sure some of you have seen my posts on LinkedIn of the chips, stuff like that. But the coolest thing for me was the people, both internal and external to CIQ. I spent a lot of time prior to CIQ in the academic national lab sphere talking to researchers, getting really, really in depth with researchers about what they do. Like interviewing and writing articles at certain points with researchers, just about how HPC affected their work.
So to get back out there in person, just to have that free exchange of ideas with all of these incredible minds at this big confluence of so many different industries, was just an incredible experience. It was super cool to get to talk to, like I said, domain scientists from research institutions in all kinds of different fields; private researchers at all kinds of different chip tech firms and different places like that; and people from private industry, like us, getting to find out what products and trends they're seeing in the industry right now. It was really incredible to be in a place where everybody was together at once, from the people making the tech, to the people using the tech, to the people researching the next generation of the tech.
So it was an incredible exchange there. And of course, like I said, internal and external. After being at CIQ for a year and a half, and seeing this go from like 17 people when I joined to now almost 70 people, we have really done some incredible work here together. So it was really cool to get to see so many of the incredible minds we have here at CIQ in person, finally, after working with you all for so long. That was a lot of fun. It was really great to interact with everyone at the booth and just really have that unity of our company, and to get to see everyone in person and meet. So it was a fantastic event. I loved the people. It was a ton of fun.
That's great. Thank you Forrest. Mr. LeDuke, I know you put in a lot of time, a lot of effort, a lot of late days, late nights going through and getting this all set up, and we really appreciate it. So from your perspective, from a marketing perspective, what did you take away from SC22?
I would say not just me; we have a really great team that supported this event, and really everyone here did it. So it was such a team effort. And that was a huge part of it. I'd say probably the number one thing for me was finally getting some Rocky swag. I joined the company and I was like, hey, Greg's got this cool hat. When am I going to get a hat? And finally I got my Rocky t-shirt, got my Rocky hat, got my Rocky beanie. And so that was wonderful. I kid, I kid. I think actually, to echo what Forrest said, layer one for me was seeing my fellow CIQ'ers, if that's what we are, in 3D. Just seeing people attached to bodies and being able to have that connection with people, when we've all been working together in this Zoom mode and Slack mode. There's so much richness to getting together.
So, we sent 21 people to the event. That was one of the major expenses and investments of this event. And as somebody who's done a lot of events, the size of our company wouldn't normally justify anything like that, but I think this really paid off. I mean, number one, we made so many amazing connections with our customers and partners, but we also made those connections with each other, and that is so powerful. I think that was so worth sending such a big percentage of the company to this event. One of the things I was really impressed by was the interest in Rocky. So many people came up, and as soon as they saw Rocky, they lit up.
Enterprise Linux, HPC, and Fuzzball at SC22 [19:30]
They were happy to see us. Many people, due to changes in the industry that I think are well known, were looking for an Enterprise Linux that was the true successor to CentOS. People really depend on this, and in HPC more than anywhere, I think. So it was a pleasure to me that the first part of every conversation was, great, you guys are here, you're supporting Rocky. And then that opened up doors to start talking about the rest of our stack, Apptainer, Warewulf, and to give a little preview of Fuzzball. So when I first started working on this show, we talked a lot about CIQ's history at this show, and it's really been a trailblazer in many ways. They did all kinds of things like the treasure hunt, which they invented and which the show organizers are now doing.
And we thought a lot about what promotion, what razzamatazz we could whip up to get a lot of awareness and interest in CIQ. We didn't have to do any of that. We did it legit, no gimmicks. Rocky. Rocky was the welcome mat that just brought everybody in. And so that was amazing for me to see in person, to see the huge groundswell of interest and support around what we're doing around Rocky. And then to use that as a way to start talking about things like, Josh Nord did a back-of-the-napkin calculation, and we've got 500 person-years, at least, of deep expertise in Linux and Open Source technologies. And to show some of that talent, to bring that talent to Dallas and to have people meet and rub up against it, especially Greg, the rock star at the show.
Using Rocky Linux in the National Laboratories [21:25]
Then a personal highlight for me was the National Laboratories booth that was there, which highlighted all of the different laboratories around the country, many of whom, probably all of whom, are using Rocky Linux at this point. Some of whom are our customers. Also NASA. And to see the world-changing research that they're doing. Because, for me, I'm not the sharpest technical pencil in the box of crayons. Which makes no sense whatsoever. So for me, what's important is seeing the results of the work we do in the world-changing research that's being enabled by these people who are really going to save the planet. They're going to deal with disease, they're dealing with exploration, they're dealing with the frontier of science and knowledge. How does it get any better than that?
It made me proud to go to that booth as an American, and say look at all this innovation that's happening here. There's real hope here. This is a positive initiative that we are enabling. It filled me with optimism at a time when politics and all this stuff that's going on in the world is a pretty dark time. But then you see, look at what these guys are doing. Look at where they're taking us. Look at all the new opportunities that we're having. And I am super proud to be part of CIQ that's really unlocking that potential. So that's my highlights.
Yeah, that's fantastic, Dave. Thank you. Jonathan, I haven't forgotten about you this time, unlike at SC22 when we seem to have left you behind and went to dinner without you. So again, I apologize for that, but tell me what you thought, or what was your favorite thing about SC22, other than the people?
My favorite thing was that time that I was sitting alone in my hotel room while all my colleagues were off at dinner. So to really talk about what the highlights were for me would be to retread so much of what's already been said. This was the first time that I'd been at SC with such a booth presence that I was a part of, and there was a real sense of pride in that. That was cool. I wouldn't have anticipated that or predicted it for myself, but it was cool to be in the booth and be part of a team that was so large and so present there. Getting to do team building stuff, both on the show floor and then in little get-togethers afterwards or in the evening. That was great, to get to spend time with people and see how quickly we just gelled together.
And it didn't feel like a bunch of strangers that just got thrown in a room. It was immediately as though we had been working together for several months or years, which was great. One of the first things for me individually, though, when I arrived at the show, I went to the Dell HPC community event right off the bat. And I had like a half hour of not seeing anyone that I recognized, and I had a little mini panic moment of wondering if I'd been out of the community for too long, that I wouldn't actually know anyone anymore. And then very quickly after that, there were people recognizing me or that I recognized, and shaking hands, even hugs a few times. And then that culminated for me. We were booth neighbors with the guys from the University of North Carolina at Chapel Hill, and some guys that I've worked with over there before noticed me walking down the space between the booths and accosted me in the hall and pulled me into their booth to show that they were closing out a case that they had open against iRODS, to add Rocky support for iRODS in their build system and testing infrastructure.
And to see, not only that there were people in this community that knew that I was there and cared who I was, but were excited about the work that we were doing at CIQ was really cool. And it was great to be a part of that at the show.
That's fantastic, John. Thank you. It was amazing meeting everyone. Very cool. Yeah.
Zane, did you give your top highlight?
I think everybody's already pretty much touched on it. I think seeing all of you was probably the part I was really most interested in. I mean, I've worked with probably Forrest the longest on this, so being able to see Forrest after seeing his passion for all of HPC was probably one of my favorite things. And then I'm trying to think who started next, if it was Godlove or if it was Anderson, but it was just great to see everybody and actually be around instead of just being on Zoom. So I really appreciated that. And then Justin and I share a very similar experience, so most of it is around Enterprise, and this is a very different world than Enterprise Computing. So being able to actually see and to Mr. LeDuke's point, seeing the research that's being done.
And I'm a very visual person, so seeing the visualizations that they were doing, and actually showing not only, hey, we're running this piece of software, but what they are doing with it, is something that's important for me, to be able to understand what we are actually impacting and doing. So seeing those simulations of different things, from the wind tunnel testing for the Porsche that was sitting outside, that was really cool. The wind turbines, and seeing the way that the wind comes off the tip of the blade and where it creates turbulence; being able to visualize that stuff is just amazing to me. So that was probably my favorite part of the whole thing. Mr. Godlove, I know you had something else you wanted to add.
I mean, I've always pretty much got something else I want to add. So, when Dave started talking a little bit about our treasure hunt that was going on, I thought that that was actually something that we had originally invented, because I was looking at it and I was like, this is very similar to something we did several years ago, but I didn't know if it was something else. So it was really cool when you just said that. And it made me think, too: a really great thing about this meeting that I really, really enjoyed is, I've been around this community, specifically the Apptainer, what was previously Singularity, community, for a pretty long time now, going on about six years.
And I can remember when we went to Supercomputing and we didn't have a booth, we didn't have anything. And we referred to what we were doing there as a guerrilla marketing campaign. And so we did stuff like, we went around and we contacted other booths beforehand and asked them if they would put up a little sign talking about Singularity. And then we'd do these scavenger hunts and stuff to get people to go to these other booths and talk to them about Singularity. And it was cool then, because it was almost like this underground secret; it had this punk vibe to it, where it was really the hardcore people, the hardcore scientists, who knew about Singularity and knew what it was and were using it.
And that was really cool, because then as we went to more shows, we started giving booth talks and doing stuff like that. But still, it was sort of like this underground thing, where you would hear somebody mention it out of context and you would be like, oh, wow. That's our thing. That's the thing we're working on, you know? And it was like a rare sort of electric moment. Now things have progressed, and I spent most of my time at the booth, because we had a big beautiful booth with lots of people, and we had lots and lots of traffic, so it was awesome to be at the booth. But I went to a couple of birds-of-a-feather talks or meetings and other things.
The Widespread Use of Apptainer in the HPC World [29:33]
And I think at pretty much everything that I went to, Apptainer came up in conversation as just a given, as a fundamental thing: of course we're using this, everybody's using this. And it didn't have that underground, electric sort of secret feeling anymore. Now it's this established, fundamental piece of software that everybody relies on, and it just made me feel so great to be part of a community of people that have worked on this thing, which has fundamentally changed the way that HPC is done. And I was extremely gratified that I've had the opportunity to be a part of this. And it makes me think that's just the tip of the iceberg. That's just the start. Now we're actually starting to get some traction. Now we actually have a legitimate booth. We've already changed the way that HPC is done in this one way. I think next we're going to revolutionize it, right? We're going to completely not just change it, but remake it.
How Enterprise Computing is Changing Innovation [30:43]
I can add to that too, just from attending these over the years and also being over at Sandia for some time. The initiative of the labs was, at one point, to innovate: to create and to be the first. And it changed a little with the commodity market when PCs came out, right? And Enterprise Computing, because it was no longer you buying these specialized pieces of fabric or processors or interconnects or whatever it is. It changed to where the labs were needing to leverage the community and consume from the community. So their mission shifted a bit. Now they are focusing on making the industry stronger rather than making a computer for them to consume, which is a great ecosystem to be in. And you can see the shift happen over different sessions, different meetings of SC over time: again, commodity-based hardware, then all of a sudden heterogeneous computing came into the mix, and we had to leverage a lot of different components all at once.
Some FPGAs, some GPUs; even offloading to a NIC is a thing that you can do for optimizing in an HPC system. Then Cloud Computing came in, and we had to leverage that community more. AI/ML is in there now. However, at the very large supercomputing centers, we're still doing bulk synchronous processing. We're not doing multi-tenancy as much. And it showed, because one of the talks, the keynote by Dongarra, made the point that sure, we're getting these Linpack numbers, and we're getting flop rates that are in the Exascale, Exaflops, but our actual utilization of the total system is less than 1%. That's because we're not really using shared tenancy on each of the nodes. We're not really reducing the footprint in power and performance by right-sizing the application, depending on its workload, depending on if it's communication bound or not. So really, the efficiency gain of just going to multi-tenancy, or even what Singularity allows, is going to be the groundbreaker in the next 10 years.
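To make the utilization point concrete, here is a minimal back-of-the-envelope sketch in Python. All of the numbers are illustrative assumptions for demonstration, not figures from the keynote:

```python
# Illustrative sketch of the peak-vs-actual utilization gap described above.
# Every number here is a made-up assumption for demonstration, not a measurement.

peak_flops = 1.1e18            # nominal peak of a hypothetical exascale system
hpl_flops = 0.7 * peak_flops   # HPL/Linpack typically sustains a large fraction of peak
app_flops = 0.008 * peak_flops # a communication-bound app may sustain far less

def utilization(sustained, peak):
    """Fraction of nominal peak actually delivered."""
    return sustained / peak

print(f"HPL utilization: {utilization(hpl_flops, peak_flops):.1%}")
print(f"Application utilization: {utilization(app_flops, peak_flops):.1%}")
```

The gap between the two printed figures is the point being made: a machine can post impressive Linpack numbers while typical bulk-synchronous applications leave most of the hardware idle.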
Yeah. I would say that's a huge challenge as well, because you've got the programming models that are still dominant, MPI in particular. You're making the assumption that you have the full machine, and you're relying on a lot of different pieces to place processes and threads correctly and keep them there. And it can be very manually intensive. And I think the shift that's coming is how the ecosystem, the ecosystem of vendors, starts working together in a much better fashion to help solve that problem for the researchers. So that, as that panel said, the future is about the researchers and what they do, and the vendor ecosystem has to serve up the computing on a platter for them to consume, right? And consume in efficient ways. And it's going to be interesting to see. I know there's a lot of programming model research going on, and shifting the application space to something different usually takes time. But I think that's going to be a big part of the future.
Why Innovation is Accelerating in the HPC Market [34:47]
On that, Brock and David DeBonis, a question about that. I mean, it sounds like we're still doing some things in the old way, and we've started to see a little bit of a shift. Is it linear, or is it actually starting to accelerate? Are we getting there faster? Or is it something that's just going to continue dragging behind?
I think the driver here is really the large cloud providers, right? Because they're doing this, and it's in their interest to make sure they're getting the best efficiency over their large systems by shared tenancy, and also by making sure that, okay, this job, I'm monitoring it, and I see that it's really only using about 10% of the CPU, and adaptively addressing that. I think a lot of the new innovation is going to come out of some of those centers that house the large cloud providers, but eventually that is going to feed the HPC market. We've seen it in the past. And it's going to be consumed, maybe not driven, by the HPC community, but it will be consumed and it'll need to be adapted. In the talk that I mentioned on the future of HPC, or the future of supercomputing, one of the points was to embrace the trend.
So AI/ML. What I feel happened maybe five, seven years ago, when Cloud was becoming more dominant, was that HPC was looking at Cloud as, that's silly, because you can't control the hardware. You can't optimize it. You can't get the close interconnects. But you're going to get that soon, and you do, with specialized hardware or specialized service providers. The thing is, everybody's having the same problem right now. Everybody's having the problem of data movement and getting the compute close to the data that you need for doing the computation. That still hasn't been solved. That's always been an issue. I think that we're getting closer. But I think it's all about the whole ecosystem. It's not just the HPC ecosystem; it's the whole Enterprise, Cloud, and even desktop computing ecosystem that's going to be the game changer.
How CIQ is Utilizing Both HPC and Enterprise [37:22]
And I would say that a big part of what's exciting about CIQ is, as we've said, some of us have been in HPC for a long time, and some of us are new to HPC. We're doing that crossover. It's not just an Enterprise approach to HPC or an HPC approach to Enterprise. It's shedding the history and looking at how you move it forward and actually capitalize on all this innovation that's going on. And it can't be just us. We can't solve all the problems, and there's always another bottleneck; when you resolve one, the next one shows up, right? And so there's all kinds of things going on. It's a very interesting time to be in HPC because of this shift away from what used to be a very rigid system to one that is now more dynamic. It is starting to understand that what Enterprise has brought to the table is very valuable. We just have to figure out how to blend it and how to tap into that power, or power efficiency, as you brought up. Absolutely.
So, Mr. DeBonis brought up heterogeneous computing and how it's taking off and getting wider. I know Forrest, you and Joshua had spent some time talking about RISC-V and where that's at. You want to touch on that a little bit?
New Innovations in RISC-V at SC22 [38:46]
The biggest update I have on RISC-V is that the compactness achievable with those cores is allowing for incredible density on chip, and it is opening up really interesting applications in fringe HPC use cases. Things that are fundamentally HPC-like, for example, cryptocurrency mining, which is essentially just mass cryptographic calculation. Is there anybody else throwing that many GPUs or that much specialized hardware at a problem outside of HPC? Cryptocurrency is a little bit of a controversial world, but we'll ignore that for a moment. Some of the biggest updates around RISC-V that I've seen are on chip density and what's actually achievable with those cores. Suddenly you're able to put thousands of cores on one chip. So it's a very interesting new architecture that's starting to throw up a good challenge to x86-64, in the same way that ARM is starting to come up in computing a little bit more.
Increased Chip Density Booth at SC22 [40:08]
I'm going to add to that too, because of the booth that Forrest went to with the large chip, I can't remember. Amazing density of their stuff, and it's a proprietary processor of course, but still, to get 850,000 cores on a single piece of fabric is amazing. Early in the 2000s and 2010s, there were organizations like D. E. Shaw; I don't know if anybody knows who D. E. Shaw is. He used to be a Wall Street person who was forecasting for stocks, and he decided to start his own company doing molecular dynamics and drug discovery. And he built a system called Anton, which was a special-purpose computing platform. He put his own money into this. It was a three-dimensional gridded FPGA architecture that would do molecular dynamics as if it were in real space. Each processor had its nearest neighbors, which would do the halo exchange of the information it needed for its next step in the simulation. And it was very close. And that was key. So that's what you're seeing with some of this stuff, like that specialized 850,000-core chip: now you can almost make a chip that is spatially distributed appropriately to the simulation you're trying to do, to get the efficiencies of closeness and memory sharing. And I think that's amazing.
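The nearest-neighbor layout being described can be sketched as follows. This is a generic illustration of halo-exchange partners on a periodic 3D process grid, not Anton's actual design; the grid dimensions and row-major rank ordering are assumptions for the example:

```python
# Sketch of nearest-neighbor (halo) exchange partners on a 3D process grid.
# Generic illustration of the spatial layout described above, not Anton's design.

def face_neighbors(rank, dims):
    """Return the ranks of the six face neighbors of `rank` on a periodic
    dims = (nx, ny, nz) grid, using row-major rank ordering."""
    nx, ny, nz = dims
    x, y, z = rank // (ny * nz), (rank // nz) % ny, rank % nz
    neighbors = []
    for dx, dy, dz in [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]:
        xx, yy, zz = (x + dx) % nx, (y + dy) % ny, (z + dz) % nz
        neighbors.append(xx * ny * nz + yy * nz + zz)
    return neighbors

# On a 4*4*4 = 64-rank grid, each rank talks to exactly six nearby ranks per
# step, so communication stays local no matter how large the grid grows.
print(face_neighbors(0, (4, 4, 4)))
```

The design point is that each processor's communication partners are fixed and physically nearby, so simulation data never has to cross the whole machine between timesteps.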
Innovations in Immersion Liquid Cooling Solutions [41:55]
Absolutely. One of the other things that brings up: something I saw a lot of as I walked around, and maybe it isn't new to HPC, but it's a little newer, especially to Enterprise, is that everything is liquid-cooled racks, liquid immersion, liquid, liquid, liquid. I used to look at it as a bit gimmicky, but I know, Forrest, you and I talked about this quite a bit, that there are real requirements for it now. It's not just a cool thing to have; it's something you need. I don't know if you guys saw that as well as you were walking around, but I'd be interested to hear your thoughts on it.
I'll just really quickly share what I saw there. I too have always seen immersion cooling as a little bit of a gimmick. Liquid cooling in general is fairly well established in different places, like chill doors and that sort of thing, but immersion has been a little bit of a gimmick, a "look at us" thing. There are the famous immersion racks and so on. Yeah, which are now running Rocky.
What has happened now is that there's been a jump in the GPU power profile. If I'm recalling correctly, the A100 was something like a 400 watt GPU maximum, if you've got, I think, the 80 gigabyte SXM socket version. The H100 that Nvidia has just come out with is now a 700 watt power profile. And so you've got eight of those sitting in a 4U chassis, 700 watts' worth of heat in each one. What I heard from a company I talked to there that was building practical immersion systems is that it's necessary. They were already starting to see immersion becoming necessary in some places with the A100, but with the jump to the H100 and its 700 watt power profile, it's moving immersion out of being a gimmick and into something as standard as what fits in a regular 4U chassis.
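The heat-density arithmetic here is worth making concrete. A quick back-of-the-envelope sketch, using the wattages as recalled in the conversation (actual TDPs vary by SKU and configuration), counting only the accelerators:

```python
def chassis_heat_watts(gpu_count, watts_per_gpu):
    """Accelerator heat the cooling system must remove from one chassis.
    CPUs, NICs, and memory add more on top of this."""
    return gpu_count * watts_per_gpu

# 8x A100 SXM at ~400 W each, vs. 8x H100 SXM at ~700 W each (as
# recalled in the discussion -- check vendor datasheets for exact TDPs).
a100_chassis = chassis_heat_watts(8, 400)   # 3200 W
h100_chassis = chassis_heat_watts(8, 700)   # 5600 W
```

Going from roughly 3.2 kW to 5.6 kW of accelerator heat in a single 4U box is the kind of jump that pushes air cooling past its limits and makes immersion a practical requirement rather than a showpiece.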
If I can add to that, I think some of that technology actually came from military embedded systems, because of the need to cool them in very high temperature environments. So it's amazing that you see some of the embedded systems feeding into this ecosystem. Even some of the smaller RISC-based architectures have been around forever in embedded systems for special-purpose operations; when I was at Viasat, we used RISC-based architectures all the time. It is interesting: the convergence of HPC, Enterprise, Cloud, Desktop, and Embedded Systems is finally coming to an apex, it feels like, to me. I've been working in my field for 25 years or so, and it does seem like everything's getting closer and easier to do, and everything feeds off each other more than when it was all specialized.
I concur. And to Forrest's point, it's not just the accelerators that are cranking up in power. You look at the processors that are coming, and you're talking about versions that can only be liquid cooled. When you combine all of that, full immersion makes a lot of sense. We saw another one of our neighbors selling what we jokingly called the ice cream machines; they look like small freezers that would be in people's garages. But that form factor fits really well for businesses, which can put extremely powerful computing systems in that size and have them in their buildings, and immersion allows that cooling without the more formal data center, raised floor, and everything that goes along with it. So I think a lot of these technologies are all coming in, and that's why I think the vendor ecosystem is so important to pushing this forward. It's going to take a mountain of effort.
And the software to enable it all. It's not just the vendors of all the chips. I mean, that's where I think CIQ is in a very good place.
I was just going to say, I've considered CIQ for a long time to be the cutting-edge software innovation side that matches all this hardware. We fit perfectly. Exactly, just like that.
So, outside of the visualization pieces and the liquid immersion and liquid cooling, because that was everywhere: I'm curious, Jonathan, when you were walking around as an HPC guy, what other things did you see that were interesting to you, that you would point back to and say, that's something I'd like to talk about?
Honestly, I didn't spend that much time walking the show, so this show wasn't as much about seeing the new stuff on display for me. The things that stood out to me were the openness to collaboration and the interest in collaboration that I saw with the partners I did interact with. And that's true of the partners we were there to meet formally, but even beyond that: I gave a quick presentation on some work that we'd been doing, and that Yoshi had been doing, on OpenRadioss, and we presented that at the Intel booth and showed how we had gotten it working with Intel's oneAPI suite. First of all, it was the very last session on the last day and Intel still managed to pack out their little booth space, so that was cool in terms of people there watching a presentation. But there were multiple people who wanted to come up and talk to me afterwards about how to get involved with a project like that, including some people from Kitware wanting to talk about visualization and working together on ParaView. And that's what most excited me, since I didn't see as much of the individual technology at this show. But I also think that's part of the time we're in; that's what's been blocked by the goings-on of the last couple of years, and what I think people are excited to see returning.
The Prevalence of Apptainer and Rocky Linux in HPC [48:36]
Thank you. Mr. Godlove, I have to admit, I had a lot of conversations with a lot of people, and Apptainer wasn't as prevalent as Rocky, which I think was to be expected, but it was amazing how many people came up and wanted to talk about Apptainer and what they were doing with it. It was great to see Dr. Dave in the booth as well; I really appreciate all he does and him spending some time with us. But from that perspective, being able to see how it's grown and changed and become a mainstream platform, what do you see as next?
So one thing that kept coming up again and again at the show was orchestration and Kubernetes and things of that nature. I don't know if it's proper to say that we are trailing, but in HPC we probably have not yet gotten to the same types of needs around orchestrating containers that Enterprise and Cloud Native groups got to more quickly. But we are getting there, right? I saw a lot of people talking about Kubernetes, and, I hate to put it this way, but a lot of it was "Kubernetes is awesome, we want to use it, we just have to figure out what to use it for." There's a lot of that. But there were also a lot of folks who were trying to use it, starting to figure it out, and seeing that there's a lot that doesn't really fit well in HPC. So it was really great to be able to talk to people about the next big thing, which is being able to orchestrate Apptainer containers in a way that is job-centric instead of service-centric. That sounds like a small thing, but it's a completely different paradigm and really changes the way you think about orchestration and the way it's done. So that's definitely the next big step that we, and HPC as a whole, are going to take in the container space: figuring out how to orchestrate these containers from a job-centric perspective instead of a service-centric one, which is a totally different mindset and is going to require some different tools.
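To make the job-centric versus service-centric distinction concrete, here is a minimal sketch of the job-centric side (a hypothetical toy scheduler, not any real CIQ product or Kubernetes API): where a service-centric orchestrator keeps workloads running indefinitely and restarts them on failure, a job-centric one reserves all of a job's requested resources up front, holds them exclusively until the job runs to completion, and only then releases them.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cores: int   # resources that must be guaranteed for the whole run
    steps: int   # a job has a defined end, unlike a long-running service

def run_job_centric(jobs, total_cores):
    """Toy job-centric scheduler: a job starts only once *all* of its
    cores can be reserved, holds them exclusively until it finishes, and
    is never preempted by unrelated work -- the guarantee that tightly
    coupled MPI-style workloads need."""
    finished, queue, free = [], list(jobs), total_cores
    while queue:
        job = queue.pop(0)
        if job.cores > free:
            queue.append(job)        # wait; never start with partial resources
            continue
        free -= job.cores            # exclusive reservation for the whole job
        for _ in range(job.steps):
            pass                     # ...compute steps would run here...
        free += job.cores            # released only at completion
        finished.append(job.name)
    return finished

order = run_job_centric([Job("md-sim", 64, 3), Job("post", 16, 1)], total_cores=64)
```

The design choice being illustrated: "post" cannot steal cores while "md-sim" holds its reservation, even though packing them together would raise utilization, because in the MPI world one starved node slows the entire job.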
That's a really great way to say it, Dave. I think it goes back to, and I don't mean to sound harsh, the rigidity of the programming models in HPC and MPI. Again, if you put the wrong thing together on even one node, you're going to slow that entire job way down. And the service mindset is to pack in as much as you can to get that efficiency; if you only need a few cores, you only use a few cores. But it's understanding that containerization for an HPC workload, even AI, which I consider HPC, is more job-centric: that job needs guaranteed resources, and it needs its boundaries guaranteed, so that a totally unrelated task isn't going to come in and take valuable resources away at the moment it needs them. That's a big challenge. It is a different mindset.
That's a good point. And it's something that Dongarra brought up in his talk too: we shouldn't be rating the efficiency of a system by the number of flops it does; it should be the amount of science we do. So if you take a job-centric approach, an application-centric approach, you can optimize for the application rather than optimizing for the complete system, like most schedulers do.
And when you think about it, most clusters are not single-user clusters. It's multi-tenant, multi-job, jobs of all different sizes. We should be measuring the throughput of an entire set of team workloads over a given period of time: how much do they get done? That's what the industry really cares about, and it's what researchers ultimately care about: how fast they're getting results back and what it takes to get their work done, not the flops of the system. At the end of the day, that's not as meaningful to them.
And that's true. There are different models, right? Capability versus capacity systems. And I think you're seeing a shift away from what the DOE supercomputing centers need and more towards what the general user and industry need to consume on a daily basis, as a normal piece of equipment for their compute.
That's been interesting. I've had conversations with some centers that have several different HPC environments. The one I'm thinking of in particular has something like 40 different environments, and they said those typically sit about 70% idle, because nobody wants to use another group's resources, so nothing gets optimized. I find that fascinating, and I'm glad we're looking at a way to bring that back together so we don't just have a bunch of stuff sitting idle.
Especially when you have a supercomputer on the Top500; I mean, some of them are up to 20 megawatts of power, and to have that sitting idle is not very good. That's why they pack those systems: on the large systems, to get your job in, you're going to be waiting a while, and you're hoping it actually succeeds all the way through, or else you're waiting even longer at the end of the queue again. But that's the key point: the equipment alone is a lot of cost, so we should be utilizing it more effectively.
Exploring the Student Cluster Competition at SC22 [54:58]
Absolutely. As we're getting close on time, I know one of the things that we didn't talk about yet, and that we've been asked to talk about a little bit, Forrest, is the student competition. I walked back there at one point and they were all off doing something else, so I didn't get a chance to talk to them, but I know some people did, and I'm interested to hear a little bit about that. So, I don't know which one of you actually went back and spent some time.
I went back and checked out the student cluster competition; we even have a comment here from one of the participants, Drew from the impact team at Indiana University Bloomington and Purdue. The competition was really, really cool. The coolest thing about it was that, if everything I heard was correct, we had three out of the eight teams running Rocky on their systems. The impact team was very open about running Rocky, so we were really pleased to see that, and really excited to see them coming by the booth to get swag and stuff like that. They're legitimate sysadmins of a Rocky cluster that they're running, so that was really cool. Like I said, I heard a couple of the other teams were also running Rocky, which was awesome. And in general, it was super cool to see all these competing minds out there at SC22, getting the opportunity to come participate in an event like that as students. The fact that three out of the eight student teams were running Rocky on their systems was very, very exciting.
Very cool. Brock, do you get a chance to make it back there?
I did not. I remember, as I was running to a meeting in one of the sets of whisper suites, I saw it out of the corner of my eye, but I didn't get to go back there. I know Glen, who didn't join today, was able to get back there and talk to some of them, I think.
Jonathan, Dave, did you guys go back there?
I didn't get back there, unfortunately. I did see it out of the corner of my eye at one point.
I think I had the same experience; I walked past when no one was there. But I was really glad to hear people were able to use Rocky. I think the cluster competition is a really good example of the value of something very explicitly open like Rocky. Free licenses are one thing, but there's a flexibility in having a completely free operating system that is like the one you would want to run in production, indeed exactly the one you would want to run in production, that you can just download and use in any way, either on virtual machines on your desktop or on the cluster you're deploying. So, to me it speaks to the value of one of the things we're there to support, and it's great to see it starting out right there at the ground floor of HPC.
I walked by, and my reaction was, my God, these guys are young, but I guess that's just how it is. Some of what I saw there was amazingly international: there were several teams from out of the country. This is a really international phenomenon, and we haven't talked about that much, but the show does attract a worldwide audience. Even in the US, the university teams were international. Seeing young people from all over the world was a very cool thing; that's where the future is headed in this area.
Future Plans for CIQ Webinars [58:47]
Fantastic. And I know that we did a webinar during the show, or tried to do one, and it worked out okay; it wasn't great, but it was okay. For next year we have lots of plans and a lot of different ideas. We talked about maybe doing one in the morning about what we're looking forward to, and bringing in some of the people we talked to the day before. It'd be really cool to get the student cluster people to come over and talk to us: why did you choose Rocky, and what are you doing with it? And then maybe a wrap-up at the end of the day. So next year, look forward to a lot more live streaming content from us, and to us interacting and bringing some of you in to talk to us and tell us your experiences.
And while we're talking about next year, I just want to make the point that HPC is a critical market for CIQ, and it's a market where we have an amazing presence, but it's not the only one. We're making inroads into Enterprise markets. So in 2023 we're going to be in a lot more places. We'll obviously be at the crucial HPC events, and we're going to look at some Enterprise-oriented events and Cloud events as well, often with our partners, because that's a way we can show some of the collaboration we've engaged in with Google, Oracle, Amazon, et cetera. And then I think we'll start to pioneer some vertical market events, because we're seeing great success, for example, in telecom with Rakuten, and we definitely have deep expertise in bio and life sciences. So it's going to be a busy year, everybody. I look forward to that, and to us upping our game in terms of how we can do this at different scales, large, medium, and small, and being with partners at various events to connect with our customers and potential customers.
Absolutely. But we are right up on time guys. I appreciate you coming and sharing your experiences that you had. Go ahead, Brock.
I was going to ask Dave LeDuke if he happened to have one of the cards with our QR code handy, for anybody who's listening and wanted to go see some of the things we did. We have a page that QR code will take you to, because if you didn't get to go, there's a lot of great information there, and you can reach out to us and find out more about Rocky. And I think somebody's even asking how to get more CIQ swag; definitely come to a show we're at, that's a good way to do it.
Is that the only question, Dave? Did you see something else that we missed? Is that it? Excellent. Well, like you said, we've got your comments right here. If you don't mind, you can go to [email protected] and shoot us a note, and we'll follow up with you. We really appreciate you joining us. It was great to see everybody in person. We're looking forward to next year, to following up and talking to the people we've met and having more conversations. It's good to see you guys. Thank you for stopping by; we look forward to seeing you next week. Thank you.