Telemetry News Now.
Welcome to another episode of Telemetry News Now. We are recording the week of April first. It is not April first today, so I have no clever April Fools' joke for you, but I do have plenty of headlines about AI, networking, and some geopolitical stuff that we're going to learn about. So let's dive into the headlines for today.
So first off, from Doug Madory, our very own Doug Madory here at Kentik from the Kentik blog on March twenty seventh, just a few days ago, Starlink, which has been traditionally known for its direct to consumer satellite Internet service, has recently expanded its role by launching community gateways.
Community gateways, if you're not familiar and haven't been reading about it, are high capacity satellite Internet installations that deliver Internet transit to remote communities. That's the differentiator there. So ISPs and things like that. Doug discusses the island nation of Nauru in the Pacific Ocean as an example.
Now the way it works is that you can purchase the satellite infrastructure station, which is actually composed of four stations, each of them enclosed in like a spherical radome. It's about a million dollars, a little bit more. I saw one point two five million. After that, there's professional services to install and then about seventy five thousand dollars per gigabit per second per month as the recurring fee.
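To put those figures in perspective, here's a quick back-of-the-envelope cost sketch using the numbers mentioned above. The install fee is a placeholder, since the professional services charge is quoted separately, and actual pricing will vary by contract.

```python
# Rough first-year cost model for a Starlink community gateway.
# Upfront and recurring figures come from the discussion above;
# the install fee is a hypothetical placeholder.

def first_year_cost(upfront=1_250_000, install=100_000,
                    gbps=1, monthly_per_gbps=75_000):
    """Total cost for the first twelve months of service."""
    recurring = gbps * monthly_per_gbps * 12
    return upfront + install + recurring

# A single-gigabit deployment:
print(first_year_cost())        # 2,250,000
# A 5 Gbps deployment:
print(first_year_cost(gbps=5))  # 5,850,000
```

Even at several million dollars for year one, that can come in well under the cost of landing a new submarine cable spur.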
The gateway stations themselves use Starlink's Ka band antennas, I think that's how it's pronounced, on the low earth orbit satellites, which are often sitting idle. That particular antenna is not really doing anything if they're not transmitting to a fixed station on earth. There's an inefficiency there that's being taken advantage of.
Once the signal is in transit from Earth up into the low Earth orbit satellite system, they also make use of inter satellite laser links to transmit data, solving the issue of no service when not in range of an Earth based station. So there's a couple of things going on here. Now if you go and read the post, you're gonna see Doug comments that he believes these community gateways are probably not going to be deployed in heavily populated areas. So like imagine a city or, you know, any kind of densely populated area.
And they're gonna start showing up more in lower population or probably more remote areas that may not have connectivity by any other means. I'm thinking like a submarine cable system. So, you know, to me, this is an interesting new development for Internet access at the provider level. You know, we've always talked about Starlink and other, satellite providers as a consumer level, direct to consumer.
Right?
Yep.
This is a little different. And then considering the speeds and latencies that we're talking about, this is perfectly acceptable for a lot of people in an area connecting the Internet, you know, from, again, a a provider standpoint.
Well, a lot of these regions have had satellite Internet for years, but it was higher orbit. Right? So it was a lot higher latency and sometimes a lot lower bandwidth than what Starlink's able to offer. Right?
And that's why a lot of these remote island nations or remote locations have invested money or worked with other companies to invest money to build the undersea cables. Right? So they had other connectivity options. So, you know, it's interesting to see, I don't know, call it a flip flopping, but, you know, bringing in a different satellite technology that helps some of these remote locations where maybe financially it doesn't make sense to do an undersea cable.
The existing satellite technology was suboptimal.
So the other thing that I found interesting towards the end of Doug's blog was he talks about the impact on the IPv6 table. So in addition to Starlink now offering service to non residential end user subscribers, they also have started advertising deaggregated IPv6 blocks into the global table, which, Mhmm, I don't know what to make of that. I don't know why they're deaggregating them. Maybe it's because they have certain ways they want to route them. But, yeah, it'd be interesting to kind of keep an eye on how much of the global routing table becomes Starlink going forward.
Yeah. I mean, it could be for traffic engineering purposes or at least the influencing of traffic engineering.
I don't know if this is a stop gap though for some areas that can't get other, better Internet connectivity, including submarine cable systems, which may take years and be very, very expensive. You know, what we're talking about is probably getting a spur created and built out, you know, something like that.
But off of the undersea cable, you mean?
Yeah. Yeah. So we might look at this as a stopgap, these community gateways, but I'm not sure that's really necessarily gonna be the case, because, you know, if it's cost effective, it's a lower upfront CapEx. A million dollars sounds like a lot of money, but it's nothing. Right?
That's not a terrible permanent solution to me. And really, there's a couple of things going on. It's solving the problem of not having full connectivity at all times, because you need that direct satellite to Earth station, you know, connectivity. Right?
Mhmm. The Starlink mesh is big for sure, but it's not quite there, which is why you have the medium Earth orbit and then the geostationary orbit to cover larger swaths of the Earth's surface at one time, the trade off being much less bandwidth and not as good latency, making it not quite as usable. You know? So in this case, that's why I wonder, especially as this technology of the inter satellite laser communication is used to have this full mesh of satellite connectivity, so that you don't need that perfect line of sight from an Earth based station to a satellite.
And and then, of course, I'm assuming that we're gonna see more satellites being deployed anyway. Right?
Oh, yeah. I assume Starlink's gonna continue to send up more satellites so they have better coverage in some, you know, some of the areas. They've been doing that pretty consistently over the last few years.
Yeah. So with those advancements taken together, I do see a lot of folks considering it a permanent solution, and rightly so, I mean, if it meets your technical and financial requirements.
Well, the other areas where I think this is gonna be interesting, you know, I think Doug's article focused mostly on island nations and that's fine. But, like, think of, like, we I think we talked on the last podcast or the one before about remote areas in India. Right? A lot of India's population is concentrated in big cities that have decent Internet connectivity.
There's a lot of, like, mountainous areas and rural areas. They're not near the sea. Right? So it's not easy to get undersea cables there.
It's really difficult to bury anything underground because you're talking about rocky terrain. It's long distances. So there's a lot of areas around the globe that are still underserved by high speed Internet just because physical cable is pretty much a nonstarter. So I think there's a lot of places where this would be really interesting. If they can get, for some definition of cost effective, high speed Internet access, it's a game changer.
I mean, like I mentioned before on the podcast, my parents have it as residential in rural Missouri, and it's been a game changer for them. They went from one meg to a hundred, hundred and fifty meg overnight.
Yeah. So Yep. Okay. So from Reuters, dated April first this year, so yesterday, actually: OpenAI to raise forty billion dollars in SoftBank led round to boost AI efforts. OpenAI announced it will raise up to forty billion dollars in new funding led by SoftBank, which currently values the company, that's OpenAI, at three hundred billion dollars, with a b. So according to Reuters, SoftBank will initially invest ten billion dollars this April, so this month, and then an additional thirty billion dollars later this year in December, which we understand is conditional upon OpenAI restructuring into a for profit entity by year's end.
And according to OpenAI's website, the funding is to support AGI development, which is exciting, and I don't know what that means. Right? And, you know, it's something that we've discussed here and there, Justin, just the idea of how OpenAI will transition to a for profit entity at some point.
Yeah. And I read another article this morning related to this, where they were talking about a lot of this investment from SoftBank, which, if folks aren't familiar, is an investment firm based in Japan that invests in a lot of telecommunications and high-tech companies globally, but specifically in the US. The funds that they're investing, I think a lot of that's gonna go to help with the Stargate initiative that the Trump administration announced, the multibillion dollar, I forget what the number was, like, five hundred billion dollars or something, that the government of the United States is going to invest in building out infrastructure for AI workloads. And obviously, OpenAI is trying to capture some of that spend. And so this private investment into them by SoftBank is supposed to help get their models ready for being able to address some of the Stargate needs.
So Mhmm.
Yeah. And also from Reuters, other investors are gonna include Microsoft, which is no surprise. Mhmm. Coatue, I don't know how to pronounce it, but, Coatue Management, Altimeter Capital, and Thrive Capital. And the idea here is that this large investment is gonna expand OpenAI's research capabilities.
But like you just mentioned, Justin, computational infrastructure, so I'm thinking data center builds, expansion, upgrading, compute, storage, networking, all that stuff, and also I assume enhancing tools like ChatGPT, which, I mean, to me is a big driver of how they're going to be a for profit company, whether it's direct to consumer or at an enterprise scale. Mhmm. And then you mentioned Stargate is back in the picture. So, yeah, I agree. You know, this partnership with SoftBank and then Oracle on the Stargate project, that is all about, you know, the initiative to build data centers and support AI workloads. But from a business perspective, I think it goes without saying as well that this whole thing is gonna spike OpenAI's valuation and make it probably one of the most valuable private tech companies in the entire world.
Yeah. I think the investment now makes it the number two privately held company behind SpaceX, which is privately held by Elon Musk, obviously. Yeah. So it's, you know, already up there. And I think I also read that this forty billion dollar investment the SoftBank is making is one of the biggest funding rounds that any private tech company has ever raised. So just crazy the amount of money that's being invested in AI as we've talked about a number of times on the podcast.
And certainly from our perspective as networking folks, you know, network operations, data center, WAN, campus, wireless, all that, like the networking space and service provider space as well, Justin, we've been feeling the, not the burn, but, like, the marketing hype and just the talk tracks and the talking heads and pundits. We've been getting a little tired of that, and we're tracking along with some of the hype and waiting for, what is it called? The path of normalization. You know what I mean?
When we start to realize the real Come out of the trough of disillusionment.
Yep.
Yeah. So there we are, and yet we're still seeing this incredible investment. So certainly, there is still a bright future, marketing hype aside, which, you know, makes me wonder if a lot of the hype is around the cute little POCs that we see here and there online that are like, oh, that's neat, but, you know, it really isn't relevant to anything with my organization. It doesn't improve my operations or my life. But certainly, there is, this path toward AGI that OpenAI explicitly states on their website.
So what does that mean? You know? That's still one of those hypothetical speculative technologies, artificial general intelligence. Nobody even knows exactly what that means. I've seen enough TED Talks now, and I still can't, like, formulate a consensus definition based on all these things together. So, that is interesting and very exciting because we are talking about, like, they are outright saying that that's what they're working on. So I'm excited about what that means.
Yeah. I mean, I saw another article that was saying, I think the number is thirty two percent growth in subscribers paid subscribers for OpenAI. So, you know, a lot of people are signing up for the paid version of, ChatGPT and the other services that OpenAI provides. So they're seeing growth in their user base as well. Not profitable yet, obviously, but Mhmm. At this stage in a tech company, you rarely are.
So Alright.
Moving on, from Network World, and on the heels of NVIDIA's GTC conference, which was just a couple weeks ago: a silicon photonics startup called Lightmatter has unveiled new silicon photonics platforms, the Passage L two hundred and M one thousand. The goal is to address a bottleneck in AI data centers, and this is something that, you know, we've talked about on the show, Justin. You and I have been to events. I was just at DCD Connect in New York last week talking about this particular issue.
Right? And the issue is slow electrical interconnects between GPUs and accelerators. So the network is appearing to be a bottleneck in, you know, the advancement of AI workloads right now, networking, in the data center. And right now, we're talking about connectivity at the GPU level.
So not like front end data center to back end AI workloads and storage network and that kind of thing. We're not talking about that exactly, but it does still present a major bottleneck in the networking part. I don't know if this is perfectly accurate anymore, but, Justin, you probably remember this. For a while, there was this statistic going around that I read that thirty to fifty percent of an artificial intelligence job, right, in a data center Mhmm.
Was time spent in networking. Mhmm. Yeah. Now I don't know the exact proportion of networking at the GPU level and then networking in like switches and and that and how that's represented in that thirty to fifty percent.
But the idea is the same. Right? To move forward with bigger AI jobs and then making this whole thing more efficient, cost efficient, reducing that time and network, all that kind of stuff, we need to eliminate GPU idle time, which is, you know, basically, the GPU is waiting for the data to arrive so we can do something. We don't want idle time.
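That thirty to fifty percent figure maps nicely onto Amdahl's law: if only the network portion of a job gets faster, the overall gain is capped by everything else. A rough sketch, with illustrative numbers rather than anything from the article:

```python
# Back-of-the-envelope: if a fraction f of an AI job's wall-clock time
# is spent in networking, an N-times faster interconnect shrinks total
# job time per Amdahl's law. Numbers are illustrative only.

def speedup(network_fraction, network_speedup):
    """Overall job speedup when only the network portion gets faster."""
    remaining = (1 - network_fraction) + network_fraction / network_speedup
    return 1 / remaining

# 40% of job time in networking, interconnect 4x faster:
print(round(speedup(0.4, 4), 2))  # 1.43
```

Which is why cutting GPU idle time matters so much: even an infinitely fast interconnect can only recover the slice of the job the network was eating.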
So the way this works is that these photonic platforms, they'd replace traditional electrical connections with optical technology, integrated directly onto the silicon. And then that would significantly increase the data bandwidth, reduce latency, and speed up the data transfer between the individual processors.
So, again, it's primarily at the GPU level, and there is potential that, you know, if you think about it, AI data center architectures could start changing. We might be moving away from hierarchical architectures toward a flat design, which would be a lot more efficient and faster from a networking perspective at least. Yeah.
I think that might be a little premature, but, you know, we'll see where this goes. I mean, I remember, I don't know, it was probably ten, fifteen years ago, there was a company out of Israel that was building a photonic backplane for a router. So if you think of like a big Cisco CRS or Juniper PTX that has, you know, sixteen line cards in it, and they have to be able to talk to one another if you have packets coming in on one and going out on another.
Exactly.
Data does the same thing: it comes in optical on your, you know, on your ten gig interface, your hundred gig interface, and then gets converted to electrical on the backplane of your chassis and then back to optical to go out your outgoing line cards. We're talking about a similar concept there. Mhmm. And they had built a photonic backplane.
From everything I had heard from people who tested it in the lab, it worked really well.
It was just it was expensive, and it actually drew more power.
Right? Because you have to power all the optics and so forth, and yet the performance wasn't that much greater. Right? Right.
So it was kind of one of those things where, like, it was a solution in in search of a problem at the time. And I had not really thought about it until I saw this article applying this to GPU interconnects inside of a discrete server. Because we've talked about on the podcast before, a lot of times when you buy an NVIDIA server, it'll have eight chips, right? Eight GPU chips on a single, blade that are interconnected inside of the sheet metal of the server.
And then you would obviously interconnect those with your network to get to hundreds or thousands of GPUs in a cluster. But being able to have faster communication between the GPUs on a single blade by using photonics, I'd never really thought about applying that. So we have people smarter than myself in this industry helping us solve these hard problems.
And, one more that just came in literally this morning. I mean, it came out yesterday from Amazon's website, but I only read it this morning. Amazon is announcing the general availability of the Amazon VPC route server.
So I don't have this well summarized for you, but I did want to slide it in before Justin takes over with the remaining headlines. AWS announces the general availability of the VPC route server, which they say simplifies dynamic routing between virtual appliances in your Amazon VPC.
So basically what's happening here is you deploy endpoints in your VPC, and then you peer them with your virtual appliances and advertise routes using BGP. And it is BGP as we know it. So you're using standard BGP attributes, and then, you know, propagating the selected routes that you want out to your route tables, making it easy for you to dynamically update routes, however you wanna use that to mitigate failures and things like that. So it's currently available in US East Virginia, US East Ohio, US West Oregon, several regions in Europe, and in Asia Pacific as well. So it's not available in every region yet, but it is certainly in the larger markets and getting there.
Yeah. I'm really excited to see what this turns out to be. I'm gonna dig in a little more on this myself, because this is something I've long wanted to see from the cloud providers, a little more transparency on what's going on with the BGP. You know, it's not magic. Right? There's BGP routing going on, and, like, how do we enrich the traffic data with the BGP routing tables to figure out, like, what paths traffic is taking. So, yeah, it'd be really interesting to see where this all goes.
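For a sense of what "standard BGP attributes" means in practice, here's a toy sketch of the best path decision over the common tie-breakers: highest local preference, then shortest AS path, then lowest MED. This is a simplified illustration of ordinary BGP behavior, not AWS's actual implementation, and the addresses and ASNs are made up.

```python
# Toy BGP best-path selection over a few standard attributes.
# Simplified: real BGP has more tie-breakers (origin, eBGP vs iBGP,
# router ID, etc.) per the decision process in RFC 4271.

def best_path(routes):
    """Highest local-pref wins; then shortest AS path; then lowest MED."""
    return min(routes, key=lambda r: (-r["local_pref"],
                                      len(r["as_path"]),
                                      r["med"]))

routes = [
    {"next_hop": "10.0.1.1", "local_pref": 100,
     "as_path": [65001, 65002], "med": 0},
    {"next_hop": "10.0.2.1", "local_pref": 200,
     "as_path": [65003, 65004, 65005], "med": 0},
]
print(best_path(routes)["next_hop"])  # 10.0.2.1 (higher local-pref wins)
```

The point being, since the route server speaks plain BGP, you can influence which appliance gets the traffic with the same knobs you'd use on a physical router.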
Alright. Next up, we have an article from yesterday, April first, in CRN. The title of the article is Comcast Business closes Nitel buy for network as a service and a boost to their cybersecurity. So I had heard about Nitel.
It's been a while since I had heard about them, Phil. I don't know if you've followed them much, but, like, the article says they're a network as a service provider. So, you know, think something similar like a Zscaler or a PacketFabric. You know, some of those are security focused.
Some of those are more just networking service focused. But Comcast acquired them, just closed the acquisition yesterday, and is adding them into their broader portfolio. The terms of the agreement, how much Comcast actually paid for them, were not disclosed. So typically that means it was what they call not material to their financials.
In other words, low enough dollar amount, they don't have to disclose it to the to their shareholders and to the SEC. So, yeah, it'll be interesting to see how they integrate this into the broader services that Comcast offers presumably to, like, Comcast business subscribers, right, to the enterprise businesses that buy bandwidth from Comcast.
Yeah. And that's ultimately the goal here, to buy that channel distribution and expand there. So I don't know if it was necessarily any kind of play for any particular technology, but certainly, it's expanding into that market, which makes sense. I did read that Nitel, when it was acquired back in twenty twenty one, was valued at seven hundred million dollars. I don't know if that's relevant to today in twenty twenty five, but that's one clue.
Yeah. Yeah. They had sixty six hundred customers according, you know, to the article. So that's not a huge amount of new customers for, for Comcast. But, you know, for the business side of it, if those are large customers with multiple locations that they're able to service and then, you know, ultimately able to add additional Comcast services to to those existing subscribers, there could be a good synergy there. So Yeah.
And it is all about volume, isn't it, in that space? Of course. It really is. I mean, service providers don't make a huge amount of money on, like, every bit that transits their network.
It really is volume. And so, you know, being able to acquire another distribution or channel distribution ecosystem, not to mention that we're talking about not SMBs, but, like, medium sized businesses. I don't know how to define medium sized businesses, but I can say, in my mind, that's a great market to be in because you're talking about the vast majority of businesses. You know, rather than going after, like, interesting technology that operates only at the web scale level, or SMBs where there's a lot of churn, this is where you wanna be. So that makes a lot of sense.
Absolutely.
Alright. Moving on. The next article is from The Register, dated March thirty first. AI data centers want to go nuclear. Too bad they needed it yesterday.
So as we've been talking about a few times on the podcast here, Phil, these AI data centers that are being built by, it seems like, everybody these days, one of the biggest constraints is getting access to the power. You know, there's been a couple articles we've seen where some of the power companies, Dominion out on the East Coast comes to mind, are trying to bring nuclear reactors back online to get nuclear power going for some of these AI data centers. And, you know, that's not something you just do overnight. Right?
There's a lot of government regulations to keep that done safely as we would expect. Right? You don't want a core to leak. There's been some major human disasters with, nuclear cores leaking radiation and so forth.
So gotta do that safely, but there's definitely a lot more investment being made in bringing nuclear power back as another avenue towards trying to satisfy the power demands that these AI data centers are bringing online.
Yeah. Yeah. Not to mention that we're still seeing this transition from fossil fuel vehicles to electric vehicles. Now to what extent that's really gonna happen over the next generation, you know, we've been seeing that start and stop. Right?
But it's certainly the trend. So there's that. That's a new, I wanna call it a drain, but a new power requirement for our current system. Not to mention that it will likely cause an increase in just consumer residential power costs, won't it?
I mean, we're going to see that. Yeah. As the supply is decreased, because as you said, it takes years, a decade, or even multiple decades, depending on the scope of the nuclear facility, to be designed, built, you know, spun up and all that kind of stuff. So it's not like firing up a new coal plant.
That can be done very quickly. This takes much longer. So, it's an interesting premise, though, to say that it's too little too late, coming out of The Register. It's like, yeah, like we're able to tell the future?
I mean, come on.
Like we knew the transformer model was going to be invented in twenty seventeen or whatever it was, and then all of a sudden we were going to have large language models at the scales that we have today.
It's kind of a, yes, we want some foresight when we're planning these things, but certainly, I don't think it's too little too late. It's just that now we have a shift in technology.
Well, I think the point is that they're playing catch up. Yeah. Sure. Right? They're building the data centers and the demand is ahead of the supply when it comes to the power.
Right? And it's interesting some of the creative things that some of these data center operators are doing. I was talking to one of the execs at one of them here recently, and he was telling me in a lot of the cases, they'll bring their data center online and start selling the space to customers before the power grid's even ready. And I said, well, how do you do that?
He goes, well, we have a diesel generator and we just run the diesel generator twenty four hours a day until the power is available on the grid from the local power company, and then we hook into the grid. If we sat around and waited for the power company to be able to deliver the kind of power we need to open the data center, we'd have a bunch of money tied up in a data center that we built and customers that are ready to give us money to move into the data center, but we can't because we don't have power. So it's like, we're just, you know, we do what we gotta do. The other thing I thought was interesting in this article, kind of back to the point you were making, Phil, is their numbers from the IEA, which is the International Energy Agency, say that only one percent of the global electricity consumption right now is data centers, and they expect that to grow to two to four percent, but that's still dwarfed by electric vehicles, which they say already currently account for six to eight percent of the total electricity demand globally.
So that's, interesting to see. I don't know if I trust those numbers or not. They didn't quite line up with what I would have expected. I would have thought the data center draw was a lot higher than that, but, yeah, interesting.
Yep. Yep. There's also the argument here in my mind that we need a stop gap until these nuclear, nucular, depending on what part of the United States you're in, you say it differently. Right?
It takes time for these nuclear facilities to come online and we need that stop gap, whether it's literally running diesel generators or other kind of sources. You know, doesn't like the US military and other military organizations around the world like build nuclear submarines like in a few years? Mhmm. And and haven't we been discussing like the small I don't know.
I forgot what they're called, but, like, the small nuclear facilities that are, like, the capped domes. Remember we talked about that, like, a month or two ago or something like that? Yep. So there are disruptors to the industry, or to any industry really, that I think can, you know, figure out solutions to these problems, rather than these gigantic monolith nuclear facilities that might take twenty years to build.
That's why I'm like, alright. Yeah. I get it. But I don't think that we're gonna come to this crash and burn with our power grid anytime soon.
Alright. Last but not least, an article from Yahoo Finance from March thirty first, two days ago, talking about how Trump tariffs would push big tech AI data center costs higher.
As we're recording this on April second, president Trump is set to announce what tariffs are going to be put in place from the Rose Garden today. How that I think is interesting to our listeners, Phil, is how does this impact some of the stuff we've been talking about around investment in AI and investment in AI data centers. Right? And this is what this article is all about.
It's talking about how the tariffs again, we don't know exactly what they're gonna look like until after the announcement today, but from the news that's been out, some of the hints that have been made, it could potentially raise construction costs for commercial projects like a data center build from three to five percent. If you think about all the steel, aluminum, copper, some of these products that need to be made, which is part of what the tariffs have been hinted at least being applied to. So, you know, companies like Amazon, Microsoft, Google, and Meta, who are the big hyperscalers that are building out a lot of these data centers, have said that just in this calendar year alone, they're gonna spend three twenty five billion to build this infrastructure.
So if that were to go up, three to five percent, that's a huge, increase in the amount of money they're gonna have to spend to build these data centers.
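Running the numbers the article implies, a three to five percent bump on that three hundred twenty five billion dollars of planned spend works out to roughly ten to sixteen billion dollars:

```python
# A 3-5% construction cost increase applied to the reported $325B
# of planned hyperscaler capex this calendar year.

planned_spend = 325e9
low = planned_spend * 0.03
high = planned_spend * 0.05
print(f"${low / 1e9:.2f}B to ${high / 1e9:.2f}B")  # $9.75B to $16.25B
```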
So I mean, they're in a rush to do that too.
Of course. Yeah. They're not just gonna wait because they've got demand. So presumably, they'll just call it eat that cost or absorb that cost, but it will have a material impact on the amount of money that they were, planning on spending.
So And it's not just those webscale companies building the data centers necessarily, but the entire supply chain, isn't it?
I mean, we're talking about, you know, like raw materials, steel, copper, whatever.
We're talking about chip manufacturing, we're talking about racks and the cooling infrastructure.
We discussed power in the last one, but there's certainly components there that we're going to have to consider. This is a very all encompassing thing.
It's going to be impactful to a large extent. And by the time folks are listening to this podcast, they will know what those things are, since the president will have spoken. What's also interesting to me is that this is Yahoo Finance.
I'm being a little silly here, but I don't use Yahoo for anything, and I hardly ever even hear the name, but I do have to say that some of their news stuff is pretty good still, and I still read it, and I get it in my feed.
I would say Yahoo Finance is one of the few Yahoo pages I use as well. Yeah. I actually really like being a little bit of a finance nerd myself when I go and try and do a little research on a company and find out what their financials look like, their balance sheet, their income statement, that kind of stuff. Yahoo has a nice way of laying that out. And like you said, a lot of their reporting is usually pretty good when it comes to financial news.
Yeah. Yeah. Absolutely. Alright. So moving on to upcoming events. We have Google Cloud Next. That is April nine through eleven.
Kentik will be sponsoring, but I will not be attending. Justin, are you going?
No. I have PTO that overlaps with that one, so I'm gonna take a much needed break versus going and hanging out and learning more about Google. I'm a little bummed, my FOMO is kicking in, but I'll do the relaxation instead.
Very good. The Missouri Networking User Group is in Kansas City this April tenth, and, for those of you not familiar, the Missouri Networking User Group, along with the other networking user groups that we mention here on this show, are part of the USNUA. You can find an event near you at the USNUA website, USNUA dot com, then go to groups and events and find one near you. Justin, are you involved with the Missouri NUG?
I am. Interestingly enough, though, the Kansas City group has kinda split off from the broader MONUG. I know that's happened in your area in New York as well, where there's enough demand that they've split off and are running their own events in Kansas City. So while it's still under the MONUG banner, if you will, they're organizing their own, and I'm not involved in the Kansas City side of things. I'm one of the organizers for the Saint Louis side of MONUG.
Oh, okay. Great. Yeah. Yeah. We have four event locations in New York State now. There's one in New York City, which is really neat.
I went to that one just recently, right at Four World Trade Center, so some neat views on the seventieth floor.
Yeah. There's the one that I lead in the capital region of New York, and we hold that in Saratoga Springs. There's one in Rochester, and there's one in Buffalo.
Speaking of NUGs, the next Massachusetts Networking User Group is in Framingham, just outside Boston, on April seventeenth. I will personally be at that one on the panel, so I'm looking forward to that. It's just a couple hour drive for me, and I love that area, so it'll be great.
Hopefully the weather is nice. Come and bring your tomatoes.
Oh, great. Thanks a lot. And then we have AI Infrastructure Field Day, brought to you by the Futurum Group and Tech Field Day, on April twenty third to twenty fifth. That's in Silicon Valley and live streamed remote, of course.
And last but not least, the Virginia NUG on April twenty fourth. I forgot where, but it's somewhere in Northern Virginia outside of Washington. Reston. Thank you.
I will be attending that one, and Kentik will be sponsoring that. So that'll be a lot of fun, led by our good friend, Scott Robohn. So shout out to you, Scott. So on that note, those are the headlines for today.
Bye bye.