Telemetry Now  |  Season 2 - Episode 27  |  January 16, 2025

Telemetry News Now: Cisco's pace in 2024, AI datacenter investment, satellite-to-mobile tech, and Meta's recent copyright lawsuit

In this Telemetry News Now, Justin Ryburn and Phil Gervasi discuss Cisco’s innovation pace in 2024, the rise of AI data centers, and the evolving role of satellite-to-mobile technology. They look at a recent copyright lawsuit against Meta, regulatory shifts in AI chip exports, and strategic investments in AI infrastructure. And as always, they highlight important upcoming events in networking. For info on the Foundations newsletter mentioned in this episode, see https://www.foundations.email/

Transcript

Telemetry News Now.

Welcome to another Telemetry News Now. We are recording now in mid-January two thousand twenty five, and we hope sincerely that you had a wonderful holiday season and New Year a couple weeks ago. We are back in the swing of things, and we do have a little news for you. Leon Adato will not be joining us on the podcast anymore. He is stepping away. And for the foreseeable future, it will be just me and Justin bringing the headlines to you week by week. So on that note, let's get into the headlines for today.

Alright. First up this week is an article from Network World talking about what Cisco has released recently, kind of a retrospective looking back at twenty twenty four and looking into twenty twenty five. And the author's main point is that Cisco did not release a lot of major new technologies for the first time in a long time. The author kinda points to their most recent earnings, where they had a twenty three percent decline in their networking revenue, saying that maybe this is the reason.

To be fair, they have released some things. It's not that they haven't released anything. They updated their line of UCS servers and added NVIDIA GPUs to them. I would say if you're an enterprise and you're all in on UCS in twenty twenty five, you have to have GPUs because, presumably, you're gonna be doing some AI workload processing somewhere in your data center, so you're gonna need that.

So they have added that to that line, and then they have come out with eight hundred gig interface line cards for the Nexus 9K switches. But outside of that, there's not a lot of new stuff that Cisco and their networking division has released. And so the author was pointing out that they've slowed down a little bit on their innovation. Kinda curious what your thoughts are, Phil.

I mean, Cisco is such a large company that I'm okay with that, not that I have the purview to be okay or not okay with what they do. But in the sense that, like, I understand if there's a slow year in development because it takes a long time to pivot and to move and adjust. And we are seeing a lot of that right now in the emergence of AI data centers. As far as networking is concerned, it's AI data centers and less of a focus on things like, you know, SD-WAN and stuff. That's ubiquitous. That's a normal thing.

And so, you know, reevaluating where do we go next. I mean, we saw, and we see now even, that there is probably some headway being lost in the data center because of Arista and other companies that are probably a little bit more adept at these types of AI data center workloads. Right? Mhmm.

So it's not a surprise to me that Cisco is perhaps reevaluating. And I don't know if that's really the case right now. You know, and I also wonder about the idea of not being beholden to a specific single vendor for all of your technology stack, you know, Cisco for your campus, your WAN, your security, all that. Yeah.

You know, that's been something that we've been moving away from as an industry for a while. And I think Cisco recognizes that because they have been working on interoperability with a lot of their technology, so I get that. But maybe this is also, you know, a manifestation of that as well, that Cisco does need to reevaluate. Okay.

How do we fit into the ecosystem of networking, of technology as a whole? Certainly, we're gonna talk about Cisco's UCS platform in a minute as well. Right?

Well, and I think even having eight hundred gig line cards for the Nexus 9K is a little bit ahead of the curve. If we go back to one of our previous podcasts, we talked about the earnings from Arista, where Jayshree, the CEO over there, was saying that eight hundred gig line card shipments are like a minority of their revenue. Right?

They're still doing one, two, and four hundred gig. Right? Eight hundred is still so new, and there are so few customers that are moving to that speed. We like to talk about the cutting edge because it's exciting to talk about, but the bulk of the revenue is still in these lower speeds.

So, you know, I think revenue will catch up with that release for Cisco just like it did for Arista, presumably.

Do you think that the future still is, like, in moving boxes, like campus switches, whether they be very expensive, you know, huge big iron campus switches or just those IDF switches that are presumably a much higher volume? Do you think that's still the future, at least for Cisco from a business perspective?

No. I don't think so. And, I mean, if you look at some of their, Juniper's, and to some degree even Arista's more recent acquisitions, it kinda points to that. Right?

They've acquired a lot of software companies. Cisco acquired Isovalent and Splunk and ThousandEyes and some of these other products to become more of a software company that moves their big iron boxes. Same with Juniper. Right?

They acquired Apstra and Mist and some of these other products that are really focusing more on the software that drives the hardware.

Mhmm.

So, yeah, I don't think just continuing to move the next biggest, you know, speeds and feeds on their big iron is the business model for any of these companies going forward.

Yeah. And I do wonder if Cisco's, quote, unquote, slowdown in products and features and releases in twenty twenty four is a result of them sort of reevaluating how they fit into this new landscape of AI. And that's the thing. Everybody's focused on that right now, both in, like, the tech world and mainstream media and society at large.

So Cisco is sort of, like, looking at that. How do we fit into that? You know, you're gonna bring up those UCS pods for AI data centers I see in the show notes here. But that's just one niche of networking, of technology in general.

So how does Cisco capitalize on that? Is it just, you know, alright, while that's happening, we're just gonna keep pushing, you know, campus switches and our subscription model and our SDN controllers and things like that? You know, all still valuable things, but, certainly, a company, even a global powerhouse like Cisco, does need to always be adapting and changing and growing to a changing market as well.

Yeah.

Well, the other thing that the author doesn't really give them credit for is that a lot of times these companies will slow down a little bit on putting out innovation because they're going back and shoring up the existing product line. Right? Removing bugs and going back and fixing things to make sure customers have a good quality of experience.

Innovation is one thing, but, and I'm not saying this is the case here, just one thing to think about, if the quality is not high enough that a customer can adopt it and put it into a production environment, then you're not gonna get any revenue from it anyway. Right? So, yeah, a lot of times, innovation does slow down for a quarter or two while the vendors go and work on quality issues.

I mean, the focus has been so much on AI and AI networking and stuff. Do you think that Cisco is gonna play a big role in that? I mean, I know in the notes here I see the, what do they call them, the AI PODs or whatever?

AI PODs. Yeah. I mean, I think the AI PODs, by the way, are actually a really interesting thing because I thought it was just going to be a hardware stack.

Like, you buy this UCS server and you buy the switch and, you know, they work well together. They've been tested to work well together. They actually went a step further, and they actually ship it with some pre-trained models. So, presumably, you can roll this into your data center, fire it up, and go straight to inference, not have to train it.

Now how good that'll be if it's not trained on your data, that'll be an interesting question mark. But it was kind of an interesting sidebar that was in that article that I pulled out.

Yeah. You know, I know that the gist of this headline isn't about these AI pods, but I do wanna camp on it for a minute because, you know, I've been working in this for quite a bit now and at a pretty deep level. And I know for a fact that you can't necessarily just buy a piece of gear to do all your AI magic and then call it a day. I mean, certainly, you can purchase, you know, AI as a service, and you can purchase a packaged-up workflow that will help you get there.

But there is so much to do in ingesting and processing and cleaning your data, dealing with that, and then all of the different components of how things, you know, hook together and piece together and then need to be interchangeable as your needs change and as you find, like, there's too much latency with this component, so you swap it out for a different database, or you wanna add new data. So I think this idea that you could just buy a hardware pod that has everything built in and then you can just start running inference, I don't think that's really feasible to do at scale in the long term.

Maybe in the short term, if you're doing relatively simplistic things and you have a very well prepared, you know, dataset already. But it could just very well be that you, you know, you buy the pod, you put it in the rack, and then you gotta spend three years figuring out where all the data is that you can train your model on. And that's assuming that you wanna train a model. I don't think that in networking, we should be thinking about training, you know, fine tuning, I should say.

Fine tuning a model, per se, especially because we're dealing with real time telemetry and that sort of thing. And these are hard metrics. So I don't think we necessarily need to train a model other than maybe, like, you know, on our ticketing system, where we have historical text information, unstructured data for years and years, and then our KB and that kind of stuff. But for actual telemetry, you know, there are other ways to do it without having to go through the process of training them.
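
To make that last point concrete, here's a minimal sketch of one such approach: plain statistical anomaly detection over a stream of hard metrics, with no model training or fine tuning involved. The metric values, window size, and threshold are made-up illustrations, not anything from the episode or from any particular product.

```python
from collections import deque
from statistics import mean, stdev

# Rolling z-score check over a telemetry metric (say, interface utilization
# samples). The "baseline" is just recent history, so nothing is trained or
# fine-tuned. WINDOW and THRESHOLD are illustrative values only.
WINDOW = 60        # number of recent samples to keep
THRESHOLD = 3.0    # flag samples more than 3 standard deviations from the mean

history = deque(maxlen=WINDOW)

def check_sample(value: float) -> bool:
    """Return True if this sample looks anomalous versus recent history."""
    anomalous = False
    if len(history) >= 10:  # wait for a minimal baseline before judging
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(value - mu) / sigma > THRESHOLD:
            anomalous = True
    history.append(value)
    return anomalous

# Mostly steady utilization with one obvious spike
samples = [42, 44, 41, 43, 45, 42, 44, 43, 41, 44, 43, 98, 44]
for i, s in enumerate(samples):
    if check_sample(s):
        print(f"sample {i} ({s}) flagged as anomalous")
```

The point isn't that this is how any vendor does it, only that hard numeric telemetry often yields to straightforward statistics or classic time series methods, which is a very different problem from fine tuning a language model on unstructured text.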

Well, I agree with you. I think that's what this is probably there for, right, to try and get them into those environments. Because if you look at what NVIDIA is doing, they have something similar where, in their ideal world, in their ecosystem, you don't just buy a server and stick it in. They wanna sell you the switching. They wanna sell you the software that helps you bootstrap everything, gets all the GPUs talking to one another, and monitors everything.

So, you know, they've got a whole ecosystem, a somewhat closed ecosystem, on the NVIDIA side, and I just presume this is sort of Cisco's answer to that. Right? Trying to get more into these environments. Because I agree with you. When I'm out at events and talking to people, Cisco's name is not coming up in these environments. So it's presumably a way to get more into that.

Justin, I grew up on Cisco, man. You know what I mean? Yeah. Like, I collected the certifications like Pogs.

Mhmm. And, you know, I don't ever talk about them anymore. I don't think about them anymore. And, I mean, I literally grew up as a network engineer on Cisco, and there's a very warm place in my heart for all that.

And I am the most comfortable on the Cisco command line still, when I, you know, spin up containerlab and I have a bunch of virtual switches running. But, certainly, it is interesting to see that they don't come up in conversation. It could be the circles that you and I operate in now. That is true.

For sure. Alright. Well, moving right along, another article, actually a number of articles, from Network World talking about investment in AI data centers, specifically focused on the UAE, an investment company over there that has invested twenty billion dollars in data centers, as well as an article from Microsoft about their investment of another eighty billion dollars in AI data centers for their own use, and then eleven billion from AWS to build their own data centers. So, you know, it's clear from all of these articles that, continuing into twenty twenty five, there's a lot of money being invested in AI infrastructure.

You know, Microsoft and AWS, I think, probably make sense for anyone who's paying attention, because between the cloud offerings that they sell to their customers to run AI workloads in their public cloud, as well as the AI features they're adding into their own products that they sell as software-as-a-service offerings, you know, obviously, they need the data centers. They need the compute to be able to offer those offerings. You know, for our listeners, it might be less obvious why some investment company in the UAE is investing in this.

Really, kinda think of it as like a real estate investment. Right? They're putting money in, presumably, whether it's a debt vehicle where they're lending money to the company or it's taking a share in the companies that are building this. It wasn't really clear in the article. But either way, what they're doing is they're investing their money in the companies that are developing the land, building the buildings that these data centers are housed in, bringing in the power, bringing in the cooling, connecting into the fiber optics in those buildings.

There's a lot of money, a lot of infrastructure, a lot of cost to that infrastructure that has to be put in before all of those network engineers show up and the bits start flowing across the network. Right? And that's where this investment company in the UAE is investing their money to try and get their money back out.

In fact, the article mentions that data center developments are not short timelines and commonly take five years from an idea to the commissioning of a first building.

So there are long, long cycles that take place here, and a lot of money is invested before we ever see any returns from it.

Yeah. Do you think that there are any ulterior motives beyond just the investment in AI and the amazing things that AI can bring us, from a revenue perspective and from, like, a benefit-to-humanity perspective? I mean, I know, you know, data centers have a gigantic square footage footprint. And often, I know for me, from a personal perspective, having been a real estate investor for some time, there are significant tax advantages to owning real estate in the United States. Exactly. That's part of it.

Mhmm. Yeah. For sure. Yeah. You know, my wife and I do a little real estate investing on the side.

Same thing. Right? You get good tax breaks from it. So I'm sure there's some, you know, vested reason to do that.

You know, I think if you're investing in this, presumably, you have more of a front row seat on how it's done. There's also potential that there's some learning that they're wanting to do here, right, to learn how this is being done so that they can do more of this in the UAE too.

Right? Interesting.

Okay. So that could be an interesting angle as well. I don't know that that's what's going on here, but one thing a lot of companies that are trying to get into something, learn something, will do is partner with somebody who's already successful at it in another country to learn how they're doing it and then bring those learnings back to develop it in their own country.

So let me ask you this. We're talking about, like, hundreds of billions of dollars, billions with a b. Right? So, you know, there's a leap here, but, presumably, the folks that control these vast sums of money have to be at least somewhat capable and intelligent, especially if they're investing on behalf of entire nation states and huge companies and things like that.

So there's some trust there. And, you know, considering that money talks, right, do you think that, therefore, we can extrapolate that people look at AI as a long term future and not just as a short term speculative investment? Because certainly, you can see, like, okay, this is gonna make me money in the next few quarters or the next few years.

Right? And short term for some of these people might be ten years. Whatever. And then I have a quick exit strategy because we all know this is smoke and mirrors, but I'll make money in the short term.

Or do you think that this really does speak to this idea that folks agree that this is a very long term, you know, huge change to our society and culture, this technology, and, therefore, a very wise investment for both the short and long term?

I think it's a mixture of both. I mean, I would guess that the people who are making these decisions, who are controlling the money at, like, this investment company in the UAE, are probably looking at it more in financial terms. Right? Like, what is the risk of this investment?

What is the potential return on my money? What are the tax breaks I'm gonna get for it? Like, what's my ultimate ROI? They may not even know or care about the underlying technology that's gonna be running in these data centers.

Right? And whether, you know, the AI workloads and how it's actually being done ever come to fruition or not probably doesn't matter to them as long as somebody's actually buying the data center space and running the models. Whether what comes out of those models ever actually turns a profit for the person who's running the model or not, they actually don't even care. Right?

Because they get their money out of the actual building of the building and rent and all those other things. So... In the short term.

Short term. I mean, you know, but, like, when I was a residential real estate landlord, I had a vested interest in my tenants still having a job and being gainfully employed so they could continue to pay the rent.

And it was a longer term risk profile for that investment.

Yeah. And it was a longer term investment too. I wasn't looking at three to five years. I was buy and hold, which is ironic because I have sold everything.

But at the time, it was a buy and hold strategy. And the reason I say that is because, you know, if you're looking at a three to five year investment in a data center because of the real estate and the the property and all these other things, that's fine. But, you know, folks, even like OpenAI, aren't sure how this is a profit generating or revenue generating thing yet. A lot of that's a big question mark for the industry.

How is this going to make anybody money? You know? We're all enamored with some of the cool factor in a lot of these things, but how is it gonna generate revenue? And so that's why there's a question mark still, like, why are folks still investing, or why are they choosing to invest these enormous sums of money?

You know?

Well, I think, you know, the reality is somewhere in between. There's a lot of these things that are talked about. You know, we've joked about this on the podcast before. Is it AI washing, or is it real?

Right? And, I don't know. Some of it's AI washing, some of it's actual, real, tangible use cases. I mean, we have AI in our own product here at Kentik, right, and that's very real.

It exists. Right? There are companies like McDonald's that are investing heavily in AI and have been a little bit public about it, where they're gonna be able to do some ordering through AI. And, you know, it's very real and tangible for them.

Right? There are others where people are just experimenting. They don't know yet whether it's actually gonna pay off and wind up actually working the way they envision it's gonna work and whether it's gonna wind up being cheaper than however they're doing it today. So... Mhmm.

You know, there'll be a little of each. And the investments are always that way. Right? If everybody knew the future, then investing would be a very different art than it is today.

Right?

So Mhmm.

Yep. Yep. You're right.

Alright. Moving right along, Akamai is to quit its CDN in China, seemingly not due to trouble from Beijing, or at least not on the business side of things, but more of a risk profile type of thing. This article is from The Register, talking about how Akamai has announced customers will either have to migrate to offerings from another company like Tencent, or Akamai will serve their content from a CDN location outside of China.

So they have given their customers a deadline of June thirtieth of twenty twenty six. So from the time of this recording, they've got about a year and a half to make a decision on whether they wanna migrate to a competing product or they wanna be served from outside of China. But, you know, the article is a little bit vague on the exact reason they're doing this. They did keep saying that it wasn't, you know... the Chinese government can be difficult to do business with. You have to have an entity in there. A lot of times, you have to find a local partner to kind of do business on your behalf in that country.

Mhmm.

Sounds like it's not so much those reasons as it is security and data privacy reasons. A lot of the stuff that's been going on with TikTok. Right?

So Yeah.

The one quote that did kinda speak to that said, like any business, we constantly evaluate our offerings in each market to best address our customers' needs. So that kinda hints towards more of a security thing than a business thing to me, but that's just my read on it.

Well, that's been a recurring theme in twenty twenty four, and I assume in twenty twenty five as well. Right? Mhmm. We have a global economy. The nature of the Internet obviously is global and interconnected, so I think we're just gonna see more and more of this. And here we are with one of the leading CDNs in the entire world having to make some tough decisions about what they're gonna do. So yeah.

Well, we're about to have a change of leadership in the White House, which presumably is gonna bring some amount of change in how the United States and China interact with one another. Whether for better or for worse, I guess we'll find out. But, you know, either way, it's gonna be different. Right?

So Yep. Yep. And, you know, let's put this into context. We've talked about the relationship between the United States and China as kind of an adversarial relationship, certainly some tension there.

And it's usually in the context of trying to gather intel about one another. And that's been true for, you know, independent sovereign states throughout all of time, all of human history. You know? We're trying to understand where potential enemies are moving their troops and, you know, what they know about us and, you know, things like that.

And so, in this case, though, now we're talking about it in the digital age. And so being able to acquire information digitally and through those kinds of means is the primary focus for a lot of these nation states and, of course, you know, non-nation-state cybersecurity threats.

But here specifically, with a CDN that delivers data and that has the ability to ingest data, I mean, I do see where there could be an issue, and Akamai has to make some decisions. Again, we don't know if it really is truly a security related thing, but, I mean, I'm gonna read between the lines. We're talking about, you know, a content delivery network and a country that has a strained relationship with the United States. So, you know, you do the math. Right?

So moving on, from an article out of Computer Weekly on January tenth, Telstra, that's Australia's leading telecommunications provider, has partnered with SpaceX's Starlink to bring satellite to mobile text messaging capabilities to the country of Australia. And they're aiming to address connectivity gaps in remote and rural areas, and that's very interesting to me.

You know, despite Telstra's extensive mobile network covering, I think I read, ninety nine point seven percent of the population, you know, I'm sure many of you know Australia's vast landmass leaves significant regions without reliable mobile or fixed coverage. Most of the population of Australia is clustered in its largest coastal cities. Right? And so this fits into Telstra's T25 strategy, which offers satellite powered voice and broadband services to consumers and businesses. At least that's what the strategy purports to do in time. Now we also learned today, and today we're recording on January thirteenth, that T-Mobile and SpaceX have activated Starlink satellite coverage to provide critical communication services in areas of Southern California.

And those are the areas affected by the ongoing fires in the area. Exactly. Now this is an early stage direct to cell satellite service, and it has been previously tested during emergencies like Hurricane Helene, I think. And it's now delivering wireless emergency alerts, SMS, and nine one one text capability to those regions in the US specifically.

So in my opinion, this is transformative technology. I mean, satellite to mobile connectivity, right now for basic services like text messaging, I mean, that's amazing. But, eventually, voice and data as well, and I really think this is a big deal. And I get that the focus is remote locations.

You know, I understand that's where we're starting: emergency services, emergency scenarios like in California right now. But in time, I do believe that this is gonna be, you know, huge in the ability to provide high bandwidth connectivity for anyone pretty much anywhere. Right?

Mhmm.

Yeah. And I think we covered a similar announcement over in Europe a few months ago on the podcast, a similar concept where, you know, if you're in an area where a terrestrial cell phone signal, you know, five g, let's say, works for you, then you'll be on the five g network. If you roam outside of the reach of one of the towers, then you bounce off the Starlink satellite.

Yeah.

And it gives you the best of both worlds. Right? If you've got good coverage and you can get the lower latency right off the five g tower near you, then that's great. If not, if you go into a dead zone and you bounce off the Starlink, it helps kinda bridge those gaps in that coverage, whether you're in a remote area like, you know, some of the regions in Australia like you were talking about, or in an area where there's been devastation, unfortunately, like the LA wildfire area.

Yeah.

To me, the really interesting thing about all these partnerships that Starlink is doing with these various carriers is the ability to roam, more so than just that they can offer coverage there. They've been doing that for quite some time.

Yeah.

I mean, it is just Starlink. So I'd like to see more names in that list of satellite providers or satellite communications companies that are doing this as well. But, you know, they're leading the charge, and that's great. I'm glad to see it.

But this idea of having direct satellite to mobile connectivity, I think, is just phenomenal. Right now, it is, you know, predominantly just this low data communication, texting, emergency alerts, that kind of thing. Mhmm. But to see that directly to my phone as a receiver of high bandwidth, low latency data communication and voice communication is just phenomenal.

And I think that's very transformative. I mean, it's gonna change the nature of how we connect to the Internet, considering that we don't need certain points of presence or we don't need certain types of, you know, boxes and default gateways to get out to the Internet.

So you know?

It'll be interesting to see if this changes the way that landline and mobile communication companies invest in building out their own infrastructure. Mhmm.

I mean, hasn't that already kinda happened a little bit? I mean, not necessarily that you don't need an infrastructure to run, like, your five g network, but there have been countries, and there are countries right now, and geographic regions that have leapfrogged and skipped, like, landlines for your phone, right, thinking about old technology, and went right to cell. So in the same way, there may be entire regions of the world that do that here with Internet connectivity.

Well, I was thinking, you know, for a long time, it's been really expensive to, let's just say, bury fiber out into a rural remote location. Right? And even back, I don't know, I'm gonna show my age here, maybe, like, twenty years ago, I worked for a carrier. We had fixed wireless, which is line of sight, direct wireless, in places where it was really expensive, either cost prohibitive, or, politically, we couldn't connect to the building we wanted to connect to because it crossed a railroad track or whatever. Right?

There's a lot of... Yeah.

Reasons for that kind of thing. And the fixed wireless had a lot of its own challenges with degradation due to weather.

Ironically, a bird flying in between the two could actually cause cutouts. It sounds crazy, but it actually does happen.

So, you know, it's been, I'll say, a thing in Internet infrastructure for a long time that some of these places are really difficult to reach, and wireless in one way or another has been the answer for that, but it's never been a really good answer. And so if Starlink, and other low Earth orbit satellite operators like you said, can come in and help bridge that gap where it's expensive to extend the coverage, and give those people who happen to live in that area better coverage, then, yeah, I'm all for it.

I wonder how it's gonna play out, though, because we are talking about your local carriers. Right? We talked about T-Mobile just now, and Telstra, partnering with Starlink and, well, you know, SpaceX.

Right? Mhmm. And so that's interesting to me because now you have this, like, the borders are dissolving, where it's not like this physical infrastructure that's local, where there is FCC regulation involved and things like that. You have satellites that are going very fast in orbit, going over the United States, and then going to other parts of the world, providing Internet access in those parts of the world.

So we're gonna have this interesting, not necessarily dichotomous relationship between local providers that have certain regulatory bodies that govern what they do and then, you know, Starlink flying over the top of them. So I'm not sure how that's gonna play out, but I know that's gonna pose some problems as we move forward. I mean, we've already talked about, on this podcast and on the main show, how some governments can, you know, use technology to control information and what gets sent over the airwaves and over the wires. Right?

I'm not sure. I mean, it'll be interesting to see how much different that is, though, than, like, how do you enforce against bad behavior on the Internet, right, which crosses borders and boundaries. Yeah. We have a somewhat similar challenge, I think, with this, but, yeah, it'll be interesting to see.

Yeah. Okay. So from TechCrunch, surprise, a copyright lawsuit against Meta accuses the company of using pirated materials to train its Llama AI models.

Now we have newly unredacted court filings that allege that Meta CEO Mark Zuckerberg approved the use of LibGen, a database of pirated ebooks and articles, for training purposes despite internal concerns. What I mean by that is that Meta employees themselves reportedly flagged the dataset as pirated and warned it could damage the company's regulatory standing.

Now the filings also suggest Meta took steps to conceal its use of copyrighted materials, such as removing copyright information and torrenting the dataset, and that would be deliberate copyright infringement. That's a big deal in my opinion. Now, while the lawsuit focuses on Meta's earlier Llama models, if you're not familiar with Llama, they're much smaller than some of the big foundational models. Llama three point two and three point three I use myself.

This does allow you to run them on your laptop.

Right? Because they're lighter weight and don't require as much.

Yeah. Depending on the iteration, you know, you can run, there's, like, the seven billion parameter models on up to the four hundred five billion parameter model. So there's different iterations, but you're exactly right. They're very accessible.
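
As a rough illustration of just how accessible the smaller variants are, here's a minimal sketch of loading one locally with the Hugging Face transformers library. The specific model ID, and the assumption that you've installed the packages, accepted the model's license, and logged in to Hugging Face, are assumptions for the example, not details from the episode.

```python
# Minimal sketch: run a small Llama variant locally with Hugging Face
# transformers. Assumes `pip install transformers torch` and that you have
# accepted the model license and run `huggingface-cli login`. The model ID
# below is an assumption chosen for illustration.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",  # small enough for many laptops
)

prompt = "Explain in one sentence why BGP runs over TCP."
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```

Nothing special about the prompt or the parameters; the point is simply that a model at this scale runs on commodity hardware, which is exactly why the smaller Llama releases end up on laptops.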

This whole thing does speak to the broader debate about fair use, right, or fair use in AI training. So you gotta remember that LLMs need huge amounts of text data to be trained. And at this point, that's becoming more and more difficult. And the reason is the entire public Internet has been exhausted. Some of the largest models, like GPT, have been trained on the entire text of the publicly accessible Internet. And now that you're done with it, what else do you use?

Meta maintains that its actions are protected under fair use, and that's a defense that has succeeded in the past, including in some prior AI related cases.

These allegations have drawn criticism. The presiding judge suggested that Meta's attempts to redact details were aimed at avoiding bad publicity and not at protecting sensitive information. So, you know, for now, the case remains unresolved. There's a lot to think about here.

I mean, the question that arises in my mind is, is anything on the Internet that's publicly accessible therefore fair use? And if so, is it okay to train models that we all might benefit from, even if those models generate revenue for whatever company owns them? You know? So there's some interesting questions there.

Yeah. I mean, this one sounds bad, especially if they were trying to hide what they're doing. You know, the guilty are typically the ones who try to hide their behavior. Right? But, again, this is us speculating a bit.

It'll be interesting to follow this one and watch it, see what facts and details actually come out on it. But, yeah, it does raise a very interesting, not only philosophical but legal, question around, like, you know, if it's publicly available on the Internet but somebody put time, energy, and money into putting that together. You know, like an article, let's just say the New York Times or somebody put an article together, they put the time and energy into publishing it, and, yes, they put it on the Internet. That's not public domain. Right?

So Right.

You know, just because it's publicly accessible doesn't mean it's public domain. So Mhmm.

Yeah. The copyright laws on some of this AI model training stuff are gonna be interesting for sure.

But that's the interesting thing, because it's not that I'm taking that copyrighted information and disseminating it or saying it's my own or whatever. I'm training a model internally that will then benefit people. And, you know, maybe the problem here, the question mark, is that it also might generate revenue, but it kinda doesn't yet. A lot of these companies are not, you know... I mean, Facebook is really good at, like, generating money from advertisements, so I think they'll figure this part out. But that's the question.

So Number two behind Google.

Yeah. There you go.

Absolutely. Noticed that this morning. Mhmm.

But it does also help us to see more clearly the value of training data. Data is, well, it's always been the lifeblood of AI. The quality of the data, the volume, just the sheer amount of data that you have, and then, you know, the ability of your internal data scientists and whoever else you have working for you to clean and process that data. Otherwise, your AI workflows are completely worthless. You know? You can train a model, and it'll make mistakes because your data is poor or you don't have sufficient data.

So I think that this lawsuit, yes, it does speak to the nature of what fair use is for training. It also brings up, you know, that they broke the law, so there's just that. But it also speaks to this importance, this criticality, of just data moving forward. And, you know, I've been reading articles about the lack of it. We just don't have enough to keep doing what we're doing. And I've even seen, you know, people talking about how the progress in the development of large language models is slowing down, probably as a result of not having enough data to continue making, you know, more iterations.

Reminds me of that scene in the movie Short Circuit from, I think, the eighties, where he goes to the library and reads every book. I need more data. Alright? That's the situation we're in. We all need more data to train these models.

It's true. Okay. So last on our list today, the US government has announced new regulations to restrict exports of advanced AI chips and technology, presumably aiming to maintain its global dominance in AI and chip design and limit access for adversarial nations. So think of nations like China, Russia, Iran, North Korea.

This is out of Reuters from this morning. So these rules were introduced right here at the very end of President Biden's term, his administration, and they include export caps for most countries, but there are exemptions for close allies of the US, and stringent conditions for US companies like Amazon, Microsoft, and Google to then build data centers abroad as well. So from what we're seeing, the regulations are part of, I guess, a broader strategy to curb China's ability to enhance military capabilities through AI.

And, I mean, that's expected, right, at this point in this somewhat strained relationship between the US and China.

So we've seen talk about worldwide licensing requirements for advanced chips, those are chips used for AI processing and workflows, controls on model weights, that blows my mind, the weights that are used in machine learning, and tiered country access. So what that means is that a tier one ally of the US, that'd be countries like Japan and Britain, for example, they would face minimal restrictions.

Whereas countries like Saudi Arabia and Singapore, they would have caps on what we would export to them. And then, of course, arms embargoed countries, they'd be entirely barred.
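
A quick aside on the term model weights, since it comes up in the regulation: weights are just the learned numerical parameters of a neural network, big arrays of numbers that get saved to a file, which is what makes them something an export rule can even describe. Here's a toy sketch, purely illustrative and not tied to any real model or to the rule's actual thresholds.

```python
# Toy illustration of what "model weights" are: the learned parameters of a
# network are just tensors of numbers that get saved to, and shipped as, files.
# This tiny two-layer network is purely illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)

state = model.state_dict()  # the "weights" (and biases) as named tensors
for name, tensor in state.items():
    print(name, tuple(tensor.shape))  # e.g. 0.weight (16, 8)

torch.save(state, "toy_weights.pt")  # the file-shaped artifact a rule could cover
```

In practice, frontier-scale weights run to hundreds of gigabytes rather than a few kilobytes, but the idea is the same: they're a concrete artifact that can be copied, transferred, and therefore controlled.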

And what's interesting to me here as well is that NVIDIA criticized these regulations as overly restrictive. Right? So I get that part. Right?

They say that this would unintentionally benefit Chinese competitors. Now I personally understand that NVIDIA has a vested interest in the greater proliferation of AI, right, more specifically, the chips and GPUs, you know, that... They're a little biased.

That NVIDIA provides. Yeah. I get that part. But I don't quite understand this statement that it would unintentionally benefit Chinese competitors to put these restrictions on them. Maybe, Justin, you have some thoughts here?

Well, I think what they're trying to say there is, if Chinese companies can't get their hands on these chips but their competitors outside of China can, let's just use Alibaba as an example, right, if Alibaba can't get their hands on these chips but Amazon can, it unfairly gives Amazon a strategic business advantage. They're probably not wrong on that, but that's obviously not the reason that the United States, well... Yeah.

If we're thinking that the US government is altruistic in their reasons here, that's not the reason they're doing it. They're doing it more for national security reasons. Right? They don't want these chips to be used to process models that can be used for better cyber warfare against the United States.

Right? Yep. And this is, you know, somewhat new in that it's AI GPUs we're talking about, but controlling the flow of technology exported from the United States to nations that the United States doesn't have a good geopolitical relationship with, there's nothing new there.

I mean, encryption methods and all kinds of stuff have had export controls on them for a long time. Back in the days when I worked for Juniper, there was a whole list of countries we couldn't export to. Well, we had two versions of our OS. Mhmm.

One that could be exported anywhere and one that couldn't be used in, you know, embargoed countries, and the one that could be exported anywhere didn't have any of our cryptographic stuff built into it. So, you know, there's been a precedent for this on some level for a long time.

Oh, for sure. I mean, think about, like, the relationships that the United States government at the federal level has with companies like Boeing and Lockheed Martin, and then with, like, unions, like the United Auto Workers union and the car manufacturers and things like that. I mean, that's been part of our twentieth and twenty first century American history. So it is interesting to see, though, that that's now being translated into this technology space with specifically AI as the main focus here.

And so there's a recognition from the government, the American government, the United States at the federal level, that this is a profoundly powerful technology that has implications for, you know, like you were saying, geopolitical relationships. There are ramifications for security, there's cyber warfare, there's benefit to a country's GDP, or not, if there's no access to these technologies. So it is interesting to me to hear, you know, AI being the focus of these conversations instead of, like, you know, Lockheed Martin and other companies like that that we've traditionally heard about.

And I agree with you. It's always been part of the conversation in the United States that these relationships exist, and these relationships with other countries exist or don't exist because we prevent them. Right? So, really interesting.

And I am very interested to see what twenty twenty five will bring for a couple of reasons. One, we do have a new administration coming in with a different perspective on foreign policy. So that's gonna be interesting to see, alright, how do we relate to these other nations with regard to artificial intelligence? How seriously do we take this?

How closely do we work with tech companies, the big tech companies? Right? We're already seeing evidence that there are very close relationships there.

Yeah. Everybody seems to be wanting to buddy up with the Trump White House before he even shows up in the White House.

So... And, I mean, it sort of goes without saying.

Right? I mean, that's who's in power right now, and so you want to advance, you know, your company's position. So I get that. But also, with just the sheer advancement of the technology in general and the pace that things are advancing, that, I think, will also lend itself to a lot more interesting conversations in twenty twenty five for sure.

Yeah. Before we wrap up here, I have one more thing I wanna mention to our listeners.

If folks aren't familiar with the Foundations newsletter that Christian Koch puts out, it is back. We'll put a link in the show notes for those who might be interested in signing up for this. Christian does a fantastic job of going out and finding news articles and analyzing them. Things in Internet infrastructure is what he really focused on last time around. But with the relaunch, he's gonna be focusing more on a lot of the AI stuff we talked about today, investment in data centers and the networking and the infrastructure that enables all the AI workload processing. So I think this will be a good one for folks to have in their inbox to keep up with the latest.

Awesome. Thank you, Justin. So moving on to upcoming events.

First and foremost, we have PTC in Honolulu, January nineteenth through the twenty second. That's just coming up in a few days. And, Justin, I believe you're attending?

Yep. We will have a group from Kentik going to that for sure.

Excellent. Excellent. Now there are also a few NUGs coming up. That's part of the USNUA.

So we have the Ohio NUG in Cincinnati coming up on January twenty third, the Michigan NUG in Grand Rapids on January twenty eighth, and the Kansas City NUG in Wichita on January thirtieth.

Now we also have a new Tech Field Day event, AI Field Day, that's out in Silicon Valley but also live streamed in various places like LinkedIn and, I believe, their website, techfieldday.com. That's happening January twenty ninth and thirtieth. We, of course, have NANOG ninety three in Atlanta, February third through fifth. Of course, Kentik will have a big presence there. Justin, are you going to NANOG, this one?

I am. Yep. I'm part of the mentorship committee there, so I always look forward to NANOG.

Oh, excellent. Well, that's cool. We also have the Missouri NUG on February sixth, and I believe, Justin, you are part of the leadership of that one. Right?

Yep. I'm one of the organizers putting that one on, so we're looking forward to that. If you're in the Greater Saint Louis metro area, definitely come check that one out.

It's free to attendees, as all of these NUGs are.

So Excellent. Excellent. And last but not least, on our list at least, we have Cisco Live Amsterdam, February ninth through the fourteenth. I will not be there, but that should be an awesome event, and I always love to consume that content after the fact because it's free online.

And with that, those are the headlines for this week. See you next time.

About Telemetry Now

Do you dread forgetting to use the “add” command on a trunk port? Do you grit your teeth when the coffee maker isn't working, and everyone says, “It’s the network’s fault?” Do you like to blame DNS for everything because you know deep down, in the bottom of your heart, it probably is DNS? Well, you're in the right place! Telemetry Now is the podcast for you! Tune in and let the packets wash over you as host Phil Gervasi and his expert guests talk networking, network engineering and related careers, emerging technologies, and more.