Our interview is with Kim Zetter, author of the best analysis to date of the weird messaging from NSA and Cyber Command about the domestic "blind spot" or "gap" in their cybersecurity surveillance. I ask Kim whether this is a prelude to new NSA domestic surveillance authorities (definitely not, at least under this administration), why the gap can't be filled with the broad emergency authorities for FISA and criminal intercepts (they don't fit, quite), and how the gap is being exploited by Russian (and soon other) cyberattackers. My most creative contribution: maybe AWS, where most of the domestic machines are being spun up, would trade faster cooperation in targeting such machines for a break on the know-your-customer rules it may otherwise have to comply with. And if you haven't subscribed to Kim's (still free for now) Substack newsletter, you're missing out.
In the news roundup, we give a lick and a promise to today's Supreme Court decision in the fight between Oracle and Google over API copyrights, but Mark MacCarthy takes us deep on the Supreme Court's decision cutting the heart out of most class actions for robocalling. Echoing Congressional Dems, Mark thinks the Court's decision is too narrow. I think it's exactly right. We both expect Congress to revisit the law soon.
Nick Weaver and I explore the fuss over vaccination passports and how Silicon Valley can help. Considering what a debacle the Google and Apple contact-tracing effort turned into, with a lot of help from privacy zealots, Nick and I agree that this is a tempest in a teapot. Paper vax records are likely to be just fine most of the time. That won't prevent privacy advocates from trying to set unrealistic and unnecessary standards for any electronic vax records system, more or less guaranteeing that it will collapse under its own weight.
Speaking of unrealistic privacy advocates, the much-touted GDPR privacy regime is grinding to a near halt as it moves from theory to practice. Charles-Albert Helleputte explains why. Needless to say, I am not surprised.
Mark and I scratch the surface of Facebook's Fairness Flow tool for policing AI bias. Like anything Facebook does, it's attracting heavy criticism from the left, but Mark thinks it's a useful, if limited, tool for spotting bias in machine learning algorithms. I'm half inclined to agree, but I am deeply suspicious of what seems to be a confession, made in one "model card," that the designers of an algorithm for identifying toxic speech juiced their real-life data with what they call "synthetic data" because "real data often has disproportionate amounts of toxicity directed at specific groups." That sure sounds as though the algorithm that used real-life data produced results that weren't politically correct, so the researchers just made up data that fit their ideology and pretended it was real – an appalling step for scientists to take with so little explanation. I welcome informed explanations and contradictions.
Nick explains why there's no serious privacy problem with the IRS subpoena to Circle, asking for the names of everyone who conducted more than $20,000 in cryptocurrency transactions. Short answer: everybody who doesn't deal in cryptocurrency already has their transactions reported to the IRS without a subpoena.
Charles-Albert and I note that the EU is on the verge of finding that South Korea's data protection standards are "adequate" by EU standards. The lesson for the US and China is simple: The Europeans aren't looking for compliance; they're looking for assurances of compliance. As Fleetwood Mac once sang, "Tell me lies, tell me sweet little lies."
Mark and I note the extreme enthusiasm with which the FBI has used every possible high-tech tool to identify even people who simply trespassed in the Capitol on January 6. The tech is impressive, but we both suspect a backlash is coming. Nick weighs in to tell me I'm wrong when I argue that we didn't see these tools used this way against antifa's 2020 rioters.
Nick also thinks we haven't paid enough attention to the Accellion breach, which leads to a discussion of whether companies are getting a little too comfortable with aggressive lawyering of their public messages after a breach. One result is likely to be a new executive order mandating broader breach notification (and other cybersecurity obligations) for government contractors, I predict.
And Charles-Albert and I talk about the UK's plan to take another bite out of end-to-end encryption services, essentially requiring them to show they can still protect kids from sexual exploitation even though they can't actually read the texts and pictures they carry. Good luck with that!
Download the 356th Episode (mp3)
You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to [email protected]. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!
The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.