Geoff Huston 0:00
Well, yes and no. You see, user behavior expectations are actually all important. It isn't what the system does, it's what users think the system does that's actually more important. And we had trained people for the last 100 years, using telephony, that the telephone company is honest, doesn't leak, preserves your secrets. And when we looked at the diverse environment of the Internet, we took that expectation and misapplied it. So you go and visit a website. Private act, right? Me, my screen, that website? No, no, no, no. Is the website operator bound by the Telecommunications Act of which country? Where's the user? Where's the website? What are the expectations of privacy here? And while the user might think that that's a private act, in reality it certainly wasn't. And in some ways that was a harmless thing, that information was leaking. But there was a second factor out there, and that was the inability to find models that paid for this Internet.

George Michaelson 1:17
You're listening to Ping, a podcast by APNIC discussing all things related to measuring the Internet. I'm your host, George Michaelson. This time I'm talking to Geoff Huston from APNIC Labs again, in his regular monthly spot on Ping. Geoff and I discuss the side effect of the relentless pursuit of better privacy across all stages of bringing up a connection on the modern Internet. Now, this is overwhelmingly a good thing, especially for people who want to remain private. Even just asking about a DNS name can expose your intent to other people, and many facets of how the Internet works invite others to look at your traffic and monetize or otherwise take advantage of what they can see. Advances in cryptography, and adoption of methods of querying and connecting to services which limit your exposure, are now increasingly codified into the web browser, and so apply to your connections by default.
But what does this mean for independent measurement of the network? If packets are occluded and no insights can be gained as to what is going on in the network, what do we do to measure? How do we cope when the network is going dark? Geoff, welcome back to Ping. What shall we talk about this time?

Geoff Huston 2:41
Hi George. Look, it's good to be back with you again. What should we talk about? Well, I do a fair amount of work on measurement, measuring all kinds of things, you know. How are we doing with IPv6? Where have all the users gone? What technology are they using? Do they care about privacy? All that kind of stuff. And there's a great deal that you can measure. But I'd like to sort of elevate it a bit more, take it up a level, and actually talk about: how long can this last? How long will the network stay in such a state that we can see what's going on? [George: wow] It's a bit like clouding your vision with cataracts. No, I don't have any, listeners, I'm still good, both eyeballs are working. But a bit like that failing vision, I think the Internet is suffering from failing clarity, [George: wow] in that we have to increase the power of the spotlight and look deeper [George: yeah] to find things that used to be trivially obvious. You know, things that were obvious a decade or two ago are now really, really hard to tell. And it's getting worse.

George Michaelson 3:44
You know, that could be very bad, Geoff, because there are all kinds of management-speak phrases like "that which you cannot measure, you cannot know", but in a real sense, not being able to describe the shape of a complex system like this is really not good for stable technology development and an understanding of cost and benefit. It would be really bad in so many different ways if our eyesight into the state of the network was going dark. So please do explain.

Geoff Huston 4:15
That which we cannot see, yet invest in, changes investment into random gambling.
And you go, oh yeah, that's fine, except when you consider that the amount of money that's being invested in communications technologies is well and truly above the billions of dollars a year, and in some cases considerably more so. And the less we understand and see of what the network is doing, the more that investment is, well, a gamble. No other way to describe it, and that's crazy, because we need this network. This is how we live, how we work, how we play. You know, everything we do is now digitally tainted, like it or not, and to cloud the investment in that infrastructure with such levels of uncertainty is deeply, deeply disturbing.

George Michaelson 5:06
So can you explain a little bit why you think we have a problem here? What is going on that leads you to say it's going dark?

Geoff Huston 5:14
Okay, so the issue is, a long time ago we were in this odd position where networks were uniquely able to see who was talking to whom and what they were saying. Because if you think about networks like the telephone network, the telephones were so dumb that the only way you gained an insight into who was saying what to whom, and where, was to tap the network and listen on the wire. And so the network was the sort of one-stop shop for measuring what people did with their telephones. And this is fine, except the other part, the flip side of the coin, is that people who used the telephone had this expectation that whatever they said was actually private to the person they were talking to. They didn't think they were on a microphone connected to a radio doing a global broadcast. No, this is between me and you.

George Michaelson 6:12
That could only have ever been true up to the limit of people realizing that it was possible to tap the phone. Once that became common knowledge, that wouldn't have been their automatic assumption. It would have been their earnest hope.
Geoff Huston 6:26
Well, I suppose if you're a dastardly evildoer and you heard about telephone taps, you'd probably be aware of it. But oddly enough, there were reassuring things happening in the telephone system that kind of reinforced that view. All of us who worked in the telephone business had to sign a comprehensive undertaking. In my case it was under the Australian Communications Act, and were I to divulge anything about what people did on the telephone network that I learned in the course of my job, or use that information in any way, the thing I signed said it's off to jail with you. You know, do not pass Go, do not collect $200. You are done for, you've just violated it. [George: Yeah] And I suppose to reinforce that, there were a couple of folk who, in various circumstances, did breach it and were punished to the full extent of the law. So there was that expectation that the people who worked in the phone business, who had access to this information, were constrained to preserve the privacy of users, [George: yeah] and even though the network was privileged, users had some trust that what they were saying was, on the whole, private. With the exception that if you and I were planning the bank heist, you know, the diamond robbery of the century, and we leaked a little bit of it, you could probably be assured that someone would get wind of it and be tapping the phone.

George Michaelson 7:49
But that's under a judicial process. It required [Geoff: absolutely] processes, obtaining a warrant under the Crown: lawful interception. But if we kind of go back into history, back to the days of Lady Diana, certainly for the telecommunications that became normal in that almost pre-Internet world, things like mobile phones were being listened to by randoms using radio ham equipment in all kinds of ways. People had basically tossed their right to privacy out the window.

Geoff Huston 8:24
Well, they tossed it when they bought a mobile phone.
This is the old analog mobile phone system, AMPS as it was called. I remember hearing a very inventive story from a lad in New Zealand who had figured out that if you had a Motorola handset and a paper clip, this thing would listen to any conversation that was happening at that time over the analog radio system. It turned into a massive piece of spyware by just shorting two available terminals on your phone. [George: blimey] You know, you bought these phones almost at your peril, because no one had figured out that this was interesting. It was only with the advent of digital mobile phone systems and GSM that encryption of the signal became possible. Because the signal was now bits, not just analog waveforms, the signal was now able to be encrypted, and casual interception was not possible. Don't forget, it wasn't as secret as you thought, because the law enforcement folk, the long arm of the law, insisted on access to keys. So, you know, when push came to shove, it was still out there, but only for certain folk with certain, you know, legal instruments to let them do so.

George Michaelson 9:30
In the normal course of events in the age of telephony, allowing for this oops in mobile phones before they went digital, people had an expectation: your telephone calls pass over services run by people who keep secrets and who are legally obliged to keep secrets. Now, the Internet wasn't exactly like that, Geoff, was it?

Geoff Huston 9:50
Well, yes and no. You see, user behavior expectations are actually all important. It isn't what the system does, it's what users think the system does that's actually more important. And we had trained people for the last 100 years, using telephony, that the telephone company is honest, doesn't leak, preserves your secrets. And when we looked at the diverse environment of the Internet, we took that expectation and misapplied it. So you go and visit a website. Private act, right? Me, my screen, that website?
No, no, no, no. Is the website operator bound by the Telecommunications Act of which country? Where's the user? Where's the website? What are the expectations of privacy here? And while the user might think that that's a private act, in reality it certainly wasn't. And in some ways that was a harmless thing, that information was leaking. But there was a second factor out there, and that was the inability to find models that paid for this Internet. You might recall, and some of the older listeners of this might recall, the hysteria of the late 80s and early 90s, under the bandwagon of "the Internet must remain free", that this whole thing had been a project sponsored by the almighty coffers of the US defense industry, dear old ARPA. And when it sort of started to turn over into a more public stance, ARPA was still picking up a lot of the tab. [George: right] There was an awful lot of service that cost very, very little, if anything. And there was this expectation that the Internet was this gigantic free resource.

George Michaelson 11:36
Yeah, but free is to the people using it. It's absolutely not, and never has been, a technology without a cost of being provided. There was a lot of cost.

Geoff Huston 11:48
I dig into the world's digital archives with my casual search term that I put into a Google search bar or whatever, and a huge amount of compute power, for a small few milliseconds, gets focused on my problem. It goes through the huge indices, lots of computing gets deployed, and out come some plausible answers. Cost? Oh, you know, we've got it down to a point where it's fractions of a cent, maybe a cent or two. But there are tens of millions of these search queries a second. A tenth of a cent adds up to a lot of money very quickly. I've never paid for search, and neither have you. How are they absorbing this cost? [George: Yeah] Well, that was the big conundrum.
It was a big conundrum. Well, we've connected up the Internet, and people should generate content to make it worthwhile. And the content people said, well, yeah, but who's going to pay me? And the service providers said, well, we've spent all of our money building the network, and users pay us to access it. You mean to say we've got to pay you to generate content? Let me get up off the floor, because I've just rolled off my chair laughing so hard. [George: Yeah] That didn't work, you know, and the content folk were kind of going, well, if I can't get compensated for this work, you can whistle. I'm not doing this as a charity. Somehow or other it's got to pay. [George: yeah] So the attention turned naturally to advertising. Naturally, because it worked for newspapers, it worked for free-to-air television. It works because, in essence, access to consumers is worth money, and if you can provide that access, it's worth something, even if it's not very well targeted. And don't forget, you didn't control who bought the newspaper, so everyone saw your message. But if the right person or people bought it, amongst all the millions that did, it was worth it. So, largely untargeted, largely unfocused, but we got used to it, and newspapers were rich.

George Michaelson 13:50
The idea was out there. When you consider things like television advertising and the intrusion of audience measurement techniques like the Nielsen Corporation, advertising agencies were already starting to think: knowing who's seeing our stuff, measuring at volume who sees it, and translating that into effectiveness and some sense of targeting is, in itself, immensely valuable. It's one thing to scatter an advert into a newspaper for low cents per reader and hope somebody is going to buy. And it's another to be able to say, I know on average this many eyeballs were watching your advertising slot, and you can start to tell me how effective the ad is for sales.
So the idea of measuring ads, getting some value, was definitely out there.

Geoff Huston 14:39
Oh yes, you know. And ads were certainly targeted, based on television programming, for example, or based on the page and the other articles on the page in that newspaper. Because, in some ways, the most effective advertising is when advertising ceases to be an intrusive distraction and starts to be a gentle, timely and helpful prompt. I think, George, you need a cup of water right now.

George Michaelson 15:03
He said, sipping at my cup of water. Indeed.

Geoff Huston 15:07
Now imagine if, instead of me saying it, it was an ad on your screen at that very moment. You would feel spooked, because it's almost the exact point of comprehensive information placed in front of you at exactly the right time, when you're either at your weakest, your most susceptible, or it's just good information. [George: Yeah] And you know, bad ads are a failure of information. Good ads, effective ads, are a triumph, and it's the information that really, really counts. So we started to do profiles based on who's visiting my website, based on this idea that the more you behave on the Internet, the more you betray your age, your interests, your gender, your location, your education level, almost everything about you. You know, health issues become obvious. If I saw your web profile, your profile of usage, George: (a) I'd be sick and disgusted, but (b), you know, I would know you possibly better than you know yourself.

George Michaelson 16:13
Yes, we've had stories in the business intelligence behind tracking people's viewing and purchasing decisions that are quite obviously exceptionally intrusive. Recommending to young women that they investigate choices about post-birth experiences before they've even gone out and confirmed that they're pregnant was, I think, the most notorious incident that cropped up.
But people hadn't appreciated the extent to which correlating data on use of the Internet, use of shopping, engagement in the digital sphere, exposes this kind of information. We're all tracked.

Geoff Huston 16:51
In the early days, Uber was having a fight with the Philadelphia City Council, and they had put investigators out there to use Uber and report on this dastardly scheme to, you know, undermine the operation of taxis. So, given the challenge, Uber said, well, can I detect if an Uber user is actually an inspector? Easy. Oh look, there are no cars available for you, sir. A triumph of information. You know, not only is it provision of the service, but in some cases it's the denial of the service that they wanted.

George Michaelson 17:24
But the funny thing here, Geoff, is that this is now a desire to understand the nature of interaction in the system that isn't really about the telcos and the people running the cables and the service, the routers and the switches. This is a desire for people on the edge, who make content and who sell advertising, to understand the remote-party eyeballs and have deep insight into them, and that is completely outside that legal constraint of don't look, don't tell.

Geoff Huston 17:53
Well, this is true, that the folk with an interest were the content providers and the content operators. But don't forget, that was the thing about the Internet. It wasn't people talking to people, it was consumers talking to goods and service platforms. And the goods and service platforms were interested in who was actually accessing them. So it wasn't that the network operator was interested. Yes, they were, but it was the folk who provided services that wanted to monetize that content. And the best way I can monetize that content is taking know-your-customer to the extreme edge. You know, be your customer, in so many ways. But there was another party who had an interest in you. [George: Yeah] You see, you might be planning evil deeds.
You might be an illegal alien in my country. You might be someone that the state is not looking at so pleasantly, wanting to know if you've overstayed your visa, or all other kinds of dastardly acts. And certainly in the US, with the traumatic episodes around 9/11, which was seen as a failure of information: if we'd known more, circulated more, understood more, we could have stopped it, was the mantra of the day. And whether it's true or not, that was certainly a position taken by a number of agencies whose role was this kind of protecting their society through massive surveillance.

George Michaelson 19:27
Yes, and we've seen the quite strong signal from social justice groups who are interested in aspects of society, problems involving exploitation of women, migrants, children, who, having seen that national security override of "it's in the public interest, we need to see this data", are putting their hands up and saying: wait, here are these things that are wrong on the Internet, that are bad and disgusting on the Internet. Why aren't we applying the same class of criteria to knowing who said something, where it was directed, who they are talking to? So it has widened out into a much bigger social agenda, that there are people who state: we need to know what's going on.

Geoff Huston 20:13
And when the NSA, the National Security Agency in the US, wants to do something, with its significant budget and access to resources and large workforce of contractors, they don't muck about. And as Edward Snowden leaked a little over 10 years ago, it was a large-scale effort to do domestic surveillance in the United States on the Internet, using the fact that most of the applications and services we use really did not think about how much information was being spewed, almost in clear text. For example, when you had a Kindle reader from our favorite book company, Amazon, it would feed back the data fumes of your reading.
How long did you spend on each page of that book, to the nearest fraction of a second? [George: Yeah] You know, where are the naughty bits that you enjoyed reading?

George Michaelson 21:09
George has been reading page 27 of The Anarchist Cookbook for a very, very long time. What's on page 27?

Geoff Huston 21:18
Exactly! He wants more of that, so give him books that have more of that. So in essence, we never took it that seriously, and agencies like the NSA just walked into this and found a complete open slather of phenomenal amounts of information. But for some reason, and this is this expectation that we thought these were staid old telephone companies, we never quite realized that they'd grown the same way as we had grown and become way more sophisticated in what they were doing. [George: yeah] And the implications of what they were doing were actually quite scary. And when it got flipped back to a bunch of lefty liberals who were also technocrats, otherwise known as the Internet Engineering Task Force, [George: huzzah!] huzzah, they've always been, on the whole, and that's a generalization rather than a, you know, prevailing rule, on the more liberal side of politics in so many ways. And when the IETF got to hear that their protocols, their dearly beloved technologies, were being exploited in ways that they'd never thought possible, at a scale that was undreamt of, they reacted. And the best of these was actually written by an Irish gentleman, Stephen Farrell, an academic in Dublin, and the text from RFC 7258: pervasive monitoring is an attack on the privacy of Internet users and organizations. Pervasive monitoring is a technical attack that needs to be mitigated, where possible, via the design of protocols that make pervasive monitoring significantly more expensive or infeasible.

George Michaelson 22:52
Wow, that's a big call.

Geoff Huston 22:55
Turn it on. Hide everything.
And oddly enough, and this is one of these sort of strange areas, I think, where motivations kind of matched in so many ways, some of the giants took it on. Now, there'd been a battle over the last 15 years about cryptography, about being able to create a secret where no one except the intended party can unlock it.

George Michaelson 23:22
We tend to think of cryptography in military and spy contexts, but you only have to move to one side, to finance, to enter a realm where, as soon as the banking and finance industry wanted to use digital communications, they very, very quickly became concerned that they needed an ability to constrain the data that they were sending. Even for things as benign as the ATM cash network, they wanted that cryptographically protected from day one. So alongside all these considerations about ubiquitous privacy, the finance sector had been saying: if you're going to deal with information about people's personal Visa card and transactions online, there is a fundamental level of privacy that needs to be maintained about that information, so we can actually offer the guarantees that go with having a service in money. Money wanted some of this.

Geoff Huston 24:22
There was a legitimate desire for good encryption, but equally there was a desire not to make it available, or to make it available in a way that was compromised. Under President Clinton there was the so-called Clipper chip, which was an encryption chip that was pre-broken: if you had the secret key, the back door, it was all exposed again. And the US at one point took encryption technologies and classified them as a strategic munition, a munition such that its export was controlled, and out here in Australia, encryption was not accessible to we mere colonials.
At the end of the Second World War, the British government generously gave its allies Enigma machines, proclaiming that no one had ever managed to break the massive secrecy of the use of Enigma, and that their use of this would be brilliant, fantastically protected, and no one could break it. Ahem. So, you know, everyone plays this game somewhere or other, sometime or other, and yes, there was a game around cryptography. But oddly enough, I think the demand from the market for good, secure cryptography overrode those political qualms about putting out pre-corrupted encryption technology. Now, I say this as a Westerner. If I was living in a different country that had a different view of the US, I think I would be claiming right here and now that that's not true: even the encryption today is being broken, and you're just being told a lie to make you use it. I can't tell, I'm not a cryptographer. But nevertheless, the common mythology we tell ourselves these days is that the encryption we have is pretty damn good, and if you use big keys, using RSA with 2048-bit keys, or elliptic curve cryptography, it is infeasible to break that encryption unless you have a huge amount of resources and a huge amount of time, [George: yeah] and by huge, it might even be centuries.

George Michaelson 26:29
So these protocols, these mechanisms like RSA, are built on a model of cryptography that has two components, the public part and the private part, and they've become bound into things that we now take for granted: trusting the name of the website you go to visit, trusting that when you connect to that website, after you've said you're going there, and possibly even before that, very few and hopefully no people know who you are and that you're going there. And if the bank needs to know who you are, aspects of how you prove who you are are being protected using this technology. So it's not only bound up in privacy over that session, it's also bound up in identity.
Who is using things? Who are you, across the divide? We're depending on this technology.

Geoff Huston 27:21
That is true. Authenticity and encryption go hand in hand, and I need to know I'm talking to you and nobody else, not even someone who's trying to fake you. I need to know I'm talking to you, my bank, my shop, you know, you and no one else. And secondly, I would like what we say to be encrypted, to be private, to be a secret known only to you and I. And the cryptography that we're using is public-private key cryptography, where, in essence, one half of the key is public information. This is Geoff's key. If you encrypt stuff using Geoff's public key, oddly enough, Geoff's public key cannot decrypt it. Nothing can decrypt it except just one thing: Geoff's private key, that only Geoff knows. So you can have a message to Geoff encrypted with the public key, put it on broadcast radio, you know, sky writing, who cares? Put this encrypted message everywhere you like, but the only one who can recreate the original message from the sender, because it's addressed to me, encrypted with my public key, is me, the only person who has knowledge of the private key. And to encrypt it, you didn't need to know that, you just needed to know my public key.

George Michaelson 28:41
Good technique, isn't it?

Geoff Huston 28:42
Well, it's even better than that. What about if I encrypt something using, not my public key, but my private key? So I encrypt something that, oddly enough, nobody else could have encrypted in this way, because my private key is known only to me, but everybody knows, because my public key is public, that it was me. [George: Yeah] I can't repudiate it. I can't say, no, it was George, he did it, it wasn't me. If I generated this using my private key, I did it. No going back, no repudiation, no "whoopsies, wasn't me". It's kind of forged in stone. It's there. [George: Yeah] These are pretty classic, tough technologies. They work.
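The two operations Geoff describes here, encrypting to someone's public key and signing with a private key, can be illustrated with textbook RSA. This is a toy sketch with tiny, well-known example primes chosen for readability, not real cryptography (a real deployment uses 2048-bit keys and proper padding):

```python
# Toy textbook RSA: illustrative only, NOT secure (tiny primes, no padding).
p, q = 61, 53                 # two secret primes
n = p * q                     # modulus, part of the public key
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent: (n, e) is Geoff's public key
d = pow(e, -1, phi)           # private exponent: d is Geoff's private key

# Anyone can encrypt to Geoff with the public key...
message = 65
ciphertext = pow(message, e, n)
# ...but only the holder of the private key can recover the message.
assert pow(ciphertext, d, n) == message

# Flipped around: a "signature" made with the private key can be
# checked by anyone holding the public key, so it can't be repudiated.
signature = pow(message, d, n)
assert pow(signature, e, n) == message   # only Geoff could have made this
```

The symmetry is exactly the point Geoff makes: the same pair of keys gives you secrecy in one direction and non-repudiation in the other.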
And the web folk, in this rise after 2014 of encryption in the web, took on the technology of Transport Layer Security, which incorporates that strong form of public-private key cryptography. And they said, look, if you're not using encryption, I'm going to bring up a little warning bar, and in fact I'm even going to paint the screen white and say: you're about to head into some weird places, I don't think you should go there, are you really sure? Not just a pop-up, not just a minor annoying box, it was kind of like the whoop-whoop fire alarm: don't go here. And if you were sure you knew what you were doing and clicked through three times, you would get there. But the browser vendors were kind of hoping that you didn't bother.

George Michaelson 30:10
It was quite a remarkable cut-over, because in the time before, you were routinely just speaking in the clear to places, reading things and sending information. Then there was a transitional period where, if you went somewhere and the cryptography looked very slightly odd, you'd get a red flashing warning: not sure you're fully protected here, but hey, I told you. And then it flipped over to: you really should be protected all the time, and I'm telling you that you're not protected. And now it's flipped over to: I'm not even going to let you go there. I'm stopping you from going there.

Geoff Huston 30:43
Well, I might let you go there, but ten mouse clicks later, with a whole bunch of cost to the user going, look, I know what I'm doing, I'm going to waste a whole bunch of minutes trying to get there, before you actually get there.

George Michaelson 30:54
And you said a thing in talking about this. You said TLS, Transport Layer Security.
So we need to be quite clear at this point: this is not about you and me maintaining privacy right up at the top of the stack, using encoding over a system that otherwise is just sending plain old packets anyone can look at. Anyone can look at the packets, but once you talk about what is encoded in that transport stream, you have zero visibility into what is being said.

Geoff Huston 31:25
None. Completely opaque. All you can tell is that I went to that site; that's the only bit left in the clear, and it's all sealed up. And the transformation is stunning. Cloudflare published, in a thing called Radar, Cloudflare Radar, some stats about the use of this kind of encryption technology. 91% of users...

George Michaelson 31:45
91%?

Geoff Huston 31:47
91% of their stuff is now delivered over TLS. It's complete, it's effective. So the web has largely now moved dark. You can listen on the wire; it won't help, it will not help you. You have to sort of try and catch the starting conversations, where we say: hey, I want to talk to the Bank of Geoff, I want to talk to my favorite weather site. And that's all you get to see. And there's work afoot to change even that and encrypt that too. [George: Yeah] And that's not all, because the other thing that was very quickly noticed was that the DNS is the starter of everything. Looking up the Domain Name System and resolving a name to an IP address happens as a precursor to every single transaction. And the DNS is not just a little bit chatty, not just a tiny bit chatty, it's a menace. Queries get sprayed in places you never thought possible, all of it in the clear: hey, George is going to this site, tell everybody, as a broadcast shout.
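Just how readable a classic DNS query is becomes obvious if you build one by hand. A minimal sketch of the RFC 1035 wire format (the transaction ID and the queried name are arbitrary examples): the name you're asking about sits in the packet as plain, length-prefixed ASCII labels, visible to every hop between you and the resolver when sent over traditional UDP port 53.

```python
import struct

def build_dns_query(name: str) -> bytes:
    """Build a classic RFC 1035 DNS query for an A record, as it
    would travel over UDP port 53 -- entirely unencrypted."""
    header = struct.pack(
        ">HHHHHH",
        0x1234,   # transaction ID (arbitrary)
        0x0100,   # flags: standard query, recursion desired
        1,        # one question
        0, 0, 0,  # no answer / authority / additional records
    )
    # The queried name is encoded as length-prefixed labels, in plain text.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in name.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

packet = build_dns_query("www.example.com")
# Anyone on the wire can read the name straight out of the packet bytes.
assert b"www" in packet and b"example" in packet
```

This is exactly the leak that DNS-over-TLS and DNS-over-HTTPS were designed to close: the same question, wrapped in the same encryption the web now uses.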
George Michaelson 32:52
So while we're talking about protecting the session that I actually make between you and me, me and my bank, and having privacy, other than "I went there", from the people along the pathway, opening the book to find out how to get to the bank necessarily told any number of unknown, unconstrained, unrelated people that I was asking about the bank, or any other website.

Geoff Huston 33:17
Any other. And you go, well, I'm going to use certificates to enforce this encryption. How do I know that certificate hasn't been revoked? Well, ask the certificate service if that certificate has been revoked. Oh, so you mean to say the certificate service that talks about the revocation of certificates knows that it's me and knows the site I was trying to get to? Who are those people, and why are they in on my secrets? These days a whole bunch of things never check for revocation anymore, because the privacy leak is so mind-bendingly huge that it's sort of a trade-off between: do I compromise privacy by asking you if the certificate's valid, or do I compromise myself by using a potentially bad certificate? There's no winners there.

George Michaelson 34:03
So the more we drive to understanding a need for pervasive security, and the more we take Stephen Farrell's goal that ubiquitous privacy should be the norm and should be encouraged, the more we find corners in how we use the technology and say: wow, we're leaking information. Just the bare act of doing this pedestrian thing associated with going private [Geoff: Yes] told people I was going to do a thing that I wanted to be private. How can we make that first bit a bit more private? We're shutting down the visibility of all these things. That's what it sounds like.

Geoff Huston 34:41
Ah, so along comes obfuscation. And it's sort of a compromise, where you let a number of other people into part of your secrets, but you gain amazing amounts of privacy. Now, I'm going to try and do this in English.
And I'm probably gonna lose people, but let's give it a try. George Michaelson 35:03 Well, it's better than Latin. Geoff Huston 35:04 It's better than Latin, that's right. So I take my transaction, my request, and encrypt it with key two, and then around that I wrap an envelope carrying my identity, encrypted with key one. So I've got double-layer encryption: the outer envelope is encrypted with one key, and the contents of the letter, the request, is encrypted with a second key. I don't send it to the destination, that would be silly. I send my letter, with my identity encrypted, to helper number one, post office number one, who only has the key to the envelope, and it says, oh, Geoff's making a question. I will put this in a different envelope, as me, as relay number one, and send the inner letter, which I can't decrypt, onward. George Michaelson 35:57 I'm assuming that means the encrypted question you're asking didn't include things that identified you when it was encrypted. So they can ask that question without passing on "Geoff wants to know". They're just asking the question: what is the address of the bank? [Geoff: Right] Geoff isn't in that question. Geoff Huston 36:17 So the second part goes to another relay that deals with the second key. And so actually, what you get as a result of this is that no single person other than me knows both who you are and what you asked. George Michaelson 36:32 Somebody knows what you asked, Geoff Huston 36:35 yes, George Michaelson 36:35 and somebody knows who you are. Geoff Huston 36:37 You asked something. George Michaelson 36:39 But they don't know the join over both pieces of information. Geoff Huston 36:43 Correct. George Michaelson 36:43 And a third party might know a question was asked, but they wouldn't necessarily know who asked it; or they might know you're asking questions, but they have no idea what they're about. Geoff Huston 36:55 Right?
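Geoff's two-envelope description can be sketched in a few lines of Python. This is a toy illustration only: XOR with a random shared key stands in for the real authenticated public-key encryption (HPKE, as used in Oblivious HTTP and similar relay designs), and the names are hypothetical. The point it demonstrates is the separation of knowledge: the party holding key one learns who, the party holding key two learns what, and only the client sees both.

```python
# Toy sketch of two-relay oblivious forwarding. XOR with a random key is a
# stand-in for real AEAD/HPKE encryption; do not use XOR for actual secrecy.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key_1 = secrets.token_bytes(32)   # shared with relay 1 only (the "envelope" key)
key_2 = secrets.token_bytes(32)   # shared with relay 2 only (the "letter" key)

query = b"A? www.bankofgeoff.example"            # the inner secret: WHAT
inner = xor(query, key_2)                        # relay 1 cannot read this
outer = xor(b"from: Geoff|" + inner, key_1)      # relay 2 never sees this layer

# Relay 1 peels the outer envelope: it learns WHO is asking, not WHAT.
sender, _, blob = xor(outer, key_1).partition(b"|")
assert sender == b"from: Geoff"
assert blob != query                             # still opaque to relay 1

# Relay 2 peels the inner letter: it learns WHAT was asked, not WHO asked.
assert xor(blob, key_2) == query
```

Only collusion between the two relays, joining "who" to "what", recovers the full picture, which is exactly the property George and Geoff walk through next.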
You actually need those two parties in the middle to collude, and as long as they don't, it's really, really hard to tell. So this got productized. This got formed into a product, and it's called the Apple Private Relay service. George Michaelson 37:10 That's a very specific name there, Geoff. Geoff Huston 37:13 I give all my secrets to Apple? You don't, actually. You give your identity to Apple, but with the two-part encryption Apple don't know what you're asking for, and Apple have actually contracted three other players, Akamai, Fastly and Cloudflare, who act as the second party. And they take the goop that Apple sent them from the first relay, and they actually know what's being asked, but they don't know who's asking. Fascinating. So here is really solid encryption. It makes everything invisible. It's great, but, George Michaelson 37:48 oh, there's always a but, Geoff Huston 37:49 you do have to pay for an Apple iCloud subscription, you know, some amount of zlotys in your local currency. What's the uptake rate for the most astonishingly good encryption service we know about? It is good, it is really good. 16% of the world's users use Apple infrastructure and equipment one way or another. If everyone thought this was a good deal, we'd see uptake rates of 16%. Now, using our ad measurement system, we can't tell who's using this stuff. We don't know what they're doing, we really don't, [George: yeah], but Apple very kindly say: if you see a query coming from this address, it's from our relay service, and we will be generous, we will tell you which country they're from. George Michaelson 38:32 They're using a structural method to flag geolocation, at least to the country level. Geoff Huston 38:37 Yes. So content folk can do content restrictions by country, and people like me can say, well, what percentage of the folk we trigger to ask us for material come from addresses that are using Apple's Private Relay service? George Michaelson 38:54 Drum roll, Geoff Huston 38:55 yeah, right.
It ain't 16%, it ain't even 1%. It's 0.2%, two in 1000. George Michaelson 39:03 So at this point, you actually stand out like a sore thumb if you're using this obfuscation technique. There's no hiding in the crowds; you're visibly trying to make what you're doing more private. But nonetheless, in a more systemic way, there's a technology afoot here that could make it significantly harder to know what's going on, and it's not yet being widely used. Geoff Huston 39:30 Even that bastion of the protection of privacy, North America: seven in 1000. Australia, six in 1000. Everywhere else, a lot, lot worse: Europe, three in 1000, and so on. No one cares. And it sort of leads to this observation that we've always told each other that in today's digital world privacy is dead, but no one cares. And this kind of measurement actually says: if you think privacy is protecting you, you're deluded. Users don't care. They will happily use Gmail. They will happily use Google search, even knowing that Google looks at every search term and produces a report called Google Trends about what people are asking for. Google Chat, Google AI, all the rest: these machines are looking at you with even more concentration than you are looking at them. [George: Wow]. It's a very one-sided relationship, and you're not in charge. So why is this a thing? Why do folk care about privacy? Well, it's not that they're looking after my interests, I now know that. But quite frankly, Google spent a fortune getting to know me: Geoff, age, address, you name it. They know it before I know it. And what they don't want to do, because it took them some amount of money to do that, is give it to Apple incidentally, give it to Meta incidentally, or Amazon or anyone else. It's their data. And even though you'd say, well, actually, Google, you're wrong, the data about me is my data, Google go: no, no, you don't understand.
[George: Yeah], I own the data about you that I've collected at great, great cost to me, and I'm going to protect that secret from everybody else stealing it. And it's not just Google. George Michaelson 41:17 But I'm not protecting it from people buying it. Geoff Huston 41:22 Oh, god, no. But, you know, I'm protecting it from Meta, and Meta is doing the same thing, protecting it from thieves like Google, etc. And so this ecosystem is actually all about deep, and I think valid, suspicions of each other, each trying desperately hard to protect their data from the other party. George Michaelson 41:44 The interest in protection has moved from your concern about your privacy in communicating with your websites, to their concerns about their privacy of insight into what you're doing using technology, because that's their value proposition. Geoff Huston 42:01 So to a user of the Chrome browser, an Apple iPhone is hostile territory. To a user of an Apple app, an Android handset is hostile territory, because the app is giving away information not only to the host system it's running on, but also potentially to other applications that are active at the same time. And protecting against that is paramount; it's a real consideration here. So it's not about me anymore. It's about each application being totally paranoid about every other application. How do you fix it? George Michaelson 42:41 Do you want to fix it? Geoff Huston 42:42 I'm Google. I want to fix this. George Michaelson 42:44 Ah, Geoff Huston 42:44 "I've got your interests at heart, George"? Not anymore. That facade disappeared, you know, by 2016. The excuse for putting massive, high-grade cryptography into all of this, the "I'm protecting you" excuse, I think disappeared years ago, and it really is all about the interests of the folk who are creating these applications.
George Michaelson 43:04 So "fix" in this context means: how can we make sure we have a good moat around our intellectual property achievement, and how do we also gain insights into other people's technology to the extent we can, because that's valuable. That's what "fix" means. Geoff Huston 43:21 So fix means QUIC, the QUIC protocol. Fix means that now everything I say is encrypted into bundles of UDP that have nothing to show. And if I want, because QUIC doesn't operate in the kernel space, it operates as a user-level application, I can send 1200-octet packets even if the payload is only three octets big, because no one can tell I'm doing it. I'm encrypting everything, and you can't tell, even if you look at packets and timing, what's actually happening. From the outside, you can't see what is happening in applications if you just look at the network. [George: right] This is the ultimate sort of end-play of the old ATM networks. Everything was 53-octet cells. But imagine if every single cell was encrypted. All you'd see is a bunch of anonymous little 53-octet buckets, meaning nothing. George Michaelson 44:16 Yeah, I've been to meetings where people have discussed machine learning models looking at inter-packet arrival time, packet size, distribution of packets sent and packets received. Geoff Huston 44:27 Oh yes, George Michaelson 44:27 In classic IP, classic TCP, TLS modeling, we can tell this flow is a video stream, we can tell this flow is an identity exchange. And what you're saying is, if that was the last loophole, where someone on the wire could know about what you're doing and potentially get measurement at scale about the nature of use of the network, a protocol like QUIC is closing off even that loophole, because it may just look like a constant stream without timing information. Geoff Huston 44:58 All of the control information: this is a real packet, this is a dummy filler, this packet's only got three bytes of payload,
this packet's got all 1200, and you make sure it never fragments. So every packet is 1200 octets, every packet is encrypted, and it looks like the last one and the next one: a bundle of bits. You can apply all the traffic analysis tricks in the book you like; you will learn nothing. Because, you see, Google's adversary is an entity who has equal capability to Google. Meta's adversary is, if you will, Meta, for the same reasons. In other words, they're in their nightmare scenario, where the adversary they're working against is indeed themselves, or exactly equivalent, and the demands of trying to hide things are now extreme, absolutely extreme. And the financial motivation is: my entire asset is the collected data of the people who use my service. That's my only asset. If that gets compromised, I'm in a deep, deep survival struggle, because that's what I'm selling. George Michaelson 45:59 We have to remember here that the "we", the "my" in this, is the Google, the Meta, the Cloudflare, the Comcast, the AT&T, the Telstra. They are protecting their access to market knowledge about customers' use of a network from other agents worldwide. It's not about you, and it's not about your bank. It's about the hyperscalers. Geoff Huston 46:23 It's about themselves. Self-interest is the most powerful motivator we know. What does this mean when I look at a network and capture all the packets? I remember going to a networking meeting, god, I'm betraying my age, but I think it was in 1990, where a researcher, and I'm going to name him, Vern Paxson, had sat on the edge of a university campus and captured, for three months, all of the Internet packets that entered and left that campus. He didn't bother telling anyone. He just did it because he could.
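The "every packet is 1200 octets" idea can be sketched in a few lines. The framing below (a 2-byte length prefix followed by zero filler) is an illustrative stand-in, not QUIC's actual mechanism; QUIC pads with PADDING frames inside the encrypted payload. The observable effect is the same: a 3-byte ack and a 1100-byte data chunk are indistinguishable on the wire by size.

```python
# Sketch of fixed-size padding: every datagram goes out at 1200 octets,
# so packet sizes reveal nothing about the payload they carry.
import struct

PACKET_SIZE = 1200

def pad(payload: bytes) -> bytes:
    # 2-byte big-endian length prefix, then payload, then zero filler.
    if len(payload) > PACKET_SIZE - 2:
        raise ValueError("payload too big for one packet")
    frame = struct.pack(">H", len(payload)) + payload
    return frame + b"\x00" * (PACKET_SIZE - len(frame))

def unpad(packet: bytes) -> bytes:
    (n,) = struct.unpack(">H", packet[:2])
    return packet[2 : 2 + n]

for payload in (b"ack", b"x" * 1100):
    packet = pad(payload)
    assert len(packet) == PACKET_SIZE    # on the wire, every packet looks alike
    assert unpad(packet) == payload      # the receiver still recovers the data
```

In the real protocol the length prefix and filler sit under the encryption, so an observer cannot even tell padding from payload; combined with randomized timing, the size and timing features George mentions stop being usable classifiers.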
George Michaelson 46:54 Yeah, I remember going to meetings much later than that, Geoff, where the Japanese government said: despite having made you sign secrecy provisions that said you wouldn't do this, we would like you to do precisely this for exactly 24 hours, and then give the packets to us so we can conduct research on them. And you WILL do this. So there is both doing it without asking for permission, which is kind of weird, and doing it because you're made to do it. Geoff Huston 47:22 Now, a couple of things came out of that study and a few others like it, which I thought were really good. 25% of all packets were for the DNS. And jaws dropped to the floor going, oh my god, is the DNS that bad a protocol? Yes, it is. It was astonishingly bad at the time. At the time, you know, it sort of spoke to the efficiency of the network, which we had all thought was tremendously efficient; that packet capture proved that's not really true. [George: Yeah] So, huge insights which were actually valuable. You could never, ever do it today. You couldn't do a full-blown packet capture: privacy, yeah? And even if you did: here's a bunch of 1200-octet packets that are randomly spaced in time, have no discernible pattern, with source and destination addresses that constantly change. Go figure. And the answer is, you have nothing, nothing. So how do we do measurement, to try and get over the fact that the network is now completely opaque and you can't see anything? George Michaelson 48:20 To a measurement entity, to anyone interested in measuring, to an independent measurer, the network is no longer available to you to understand very much about it. I do notice, however, that agencies like Cloudflare continue to publish stats about elephant and mice flows in TCP, about the CDF, the cumulative distribution, about the timings. They are making some data available through services like Cloudflare Radar, but again, it's within the limits of what they think is interesting for us to know about,
Geoff Huston 48:53 And the limits of what they can actually see, you know, and that's the issue. So how do we do it in APNIC with our measurements? Well, we're the other end. We're in on the secret. We're the other end that you're encrypting to; we hold the key that encrypts those packets. We might not know what's going on in the network, because you can't, because there's this kind of blindness, and it's mutual blindness: apps don't know the network, the network doesn't know app behavior. But we're the other end of the app behavior, and that's how we got around this problem, by actually becoming part of, if you will, the secret that the user wants me to be part of. I'm the other end of the conversation. And it's kind of getting to the point, and this is going to be the interesting tension, that spy agencies will continue to do spy-agency things. Folk will continuously want to know what the world is chatting about, and who it's chatting to, and where and why. But asking the network to tell you that is over; it's been over for a decade or more. Now you've got to ask the end users' devices, or the servers, or both, [George: yeah] and the shrouds of secrecy around both give us little hope. Even if it were happening today, we have no idea how to confirm or deny it; we just don't. [George: Yeah] So we can mildly suspect that efforts are underway, absolutely, but beyond vague suspicions we really can't tell. And I suppose the other part of this is this whole thing about the uptake of Apple's Private Relay service, which, I stress, is really, really good, and you should be using it if you're at all concerned about this kind of massive surveillance. Only seven folk in 1000, even in North America, agree with that proposition, and globally the number is two in 1000. It's tiny.
[George: Yeah] Nobody cares. George Michaelson 50:49 But it's kind of baby-and-bathwater stuff here. What we've arrived at is a network surface that we actually do have a quite high degree of trust in, in terms of correctness of intent and how things work. I'm not saying it's perfect, but we've managed to deploy cryptography in a way that gives us certain assurances. But if you want to measure it, and if you want to understand qualities about its systems behavior, it's a lot harder now to get inside the machine and understand what's going on. Geoff Huston 51:20 And that very complexity becomes its own weakness, because these days the whole, you'd call it cyberness, and it's not cyber hysteria, it really is a tangible risk problem: there are adversaries who are actually very skilled at this job, as skilled as anyone else, who are actually cracking, finding chinks in the way this complex framework has been assembled, and exploiting them for their own purposes. So oddly enough, the result is not a better form of security, not a better form of privacy. We've gone to all this effort to kind of move one step to the left or right, depending on your political persuasion, but it's certainly not a step forward. I don't think it's a step backward, but it's not a step forward either, and that's a lot of time, effort and energy that's been spent in this distinct step to one side, without necessarily making progress. So yes, it's a darker world, for all kinds of values of dark, and it's much harder also for measurement folk like me to try and sort of peer inside these networks to understand what they do. And I don't think that picture is going to change. I really don't. This is a one-way path, and we're down it. George Michaelson 52:38 Yeah, we've gone down a direction that's going to have consequences like this, and we now have to work in an environment where this is the normal behavior. That's been really interesting,
Geoff, in a dark way. Geoff Huston 52:53 I, for one, welcome our new overlords, and we'll work for them in the same way as the old overlords. Yes. George Michaelson 52:58 You've written about this online; we can find stuff about this. Geoff Huston 53:03 I think The Who sang about it in the song Won't Get Fooled Again, for all those boomers out there. Same story. George Michaelson 53:09 Your challenge today, listener, is to find an independent point where Geoff has written about this problem. Geoff Huston 53:15 Absolutely, thanks. George Michaelson: Thanks, and that's great, Geoff. Cheers. Geoff Huston: Thanks all. Thank you. George Michaelson 53:22 That was the last Ping recording for 2025. We're going on a short break; see you again in January. If you've got a story or research to share here on Ping, why not get in contact by email to ping@apnic.net, or via the APNIC social media channels. Also remember the measurement@apnic.net mailing list on Orbit is there to discuss and share relevant collaborative opportunities, grants and funding opportunities, jobs and graduate placings, or to seek feedback from the community on your own measurement projects. Be sure to check out the APNIC website for all your resource and community needs. Until next time.