A Storm of Asterisks
Casey:
I don't know.
Casey:
I feel like 2021 has been such a whirlwind, which I'll take over the complete shitshow that was 2020.
Marco:
But sure.
Marco:
But that's setting a pretty low bar.
Marco:
I mean, that's not really like this.
Marco:
This is better than like actual hell, like in a fire.
Casey:
Literally.
Casey:
Yes.
Marco:
Like the actual world of actual hell, where you are just burning in, you know, a hellscape forever, like an actual fire. Your body's being torn apart, you're being tortured, you have to listen to Dave Matthews Band, all of that.
Marco:
Oh, can you imagine?
Marco:
So 2020 was indeed not as bad as that, but yeah, definitely not a ton better.
Marco:
So I have to issue a plea.
Marco:
Can somebody at Apple please fix the wonderful Crash Test Dummies album God Shuffled His Feet in Apple Music?
Marco:
I know we've been down on Apple Music recently.
Marco:
Oh, I have so many thoughts, which I'm not going to get into now; we have too much to talk about. But God, what a piece of crap Apple Music is. I know. Normally I don't have too many problems with it, because I'm a pretty light user of it, really. But I asked my HomePods this morning to play the Crash Test Dummies album God Shuffled His Feet, the one with, you know, that one big song from the 90s. Well done. Very well done, Marco. Thank you. If I had more time, I would do it at the right speed. But anyway, you're all listening at 2x anyway.
Marco:
This is one of my favorite albums.
Marco:
Liking Crash Test Dummies is one of the weirdest things you can be if you're not Canadian.
Marco:
They were really big in Canada.
Marco:
They were not at all big in the US except for that one song.
Marco:
And if you like Crash Test Dummies, it's a very weird band to like, because every album is radically different from the other albums.
Marco:
And what they did after this, they had one album I liked, A Worm's Life, and then everything after that, I'm like, I'm out.
Marco:
It got really weird. But anyway, this album, again, is one of my favorite albums. And I asked Siri, you know, hey, play this album, and that worked. You don't have to do anything weird to have it play a whole album in order; you can just say, play the album named blah blah blah, and it says okay. So, great: it plays the first two tracks. Great.
Marco:
The third track, it switches to a live version of it.
Marco:
And then a few tracks in a row were the live version from some live album I've never heard of and don't own.
Marco:
Then after that, it switched back to the studio version for a track or two, then back to a live version, and then back to the studio version for the last track.
Marco:
Cool.
Marco:
Now, the really funny part of this was when I looked, on my iPad, at the Now Playing in Control Center for what's going on on your HomePods. I love this integration.
Marco:
I've talked about it before.
Marco:
This is one of the best reasons to use Apple Music and AirPlay 2, because you get this integration that's wonderful, where you can interact with what's playing on your HomePods or whatever from your phone, and from any iPhone or iPad on your network, which is great.
Marco:
Anyway, so I checked that, and it's showing the studio album as the now playing.
Marco:
So Apple Music doesn't think it's playing a live version, but it totally is.
Marco:
And then just before the show, I'm like, let me just double check.
Marco:
Maybe this was something weird today.
Marco:
So just before the show, I went on my phone to see, like, what does my phone version of Apple Music think that it's playing?
Marco:
And it had the same problem where it was playing a mix of live versions and studio versions, but it was a different set of tracks that was wrong on the phone versus what was wrong from the HomePod earlier today.
Marco:
So please, Apple. I know there's got to be someone who works on Apple Music who either likes this really weird band the way I do, or maybe is just Canadian and therefore more likely to care about this band. But please fix the Crash Test Dummies album, because it's a really good album, and this is a really weird thing to be broken.
Marco:
I did also, I even checked Spotify, to see, like, maybe they did some kind of weird reissue of it for, you know, weird contractual reasons. Nope, the Spotify version is perfect. Of course it is. And I own this CD; I ripped the CD into iTunes forever ago; I have it on my computer. And because I have it in, you know, iTunes-slash-Music on the Mac, I have my version there, so it plays correctly, because it's local files. And I have iTunes Match, and I have Apple Music.
Marco:
But apparently there's no way for me to play my copy of it on my phone anymore.
Marco:
I can only play the Apple Music copy, which is broken.
Marco:
Oh, my gosh.
Marco:
So please, Apple, fix the crash test dummies.
John:
You know, we sometimes make fun of the fact that our App Store apps have an artist and album field because it was the repurposed iTunes Music Store to make the App Store, right?
John:
And the underlying database schema dates back to iTunes and all that stuff.
John:
And it's kind of weird and awkward.
John:
Sometimes I think about when looking at Apple Music or hearing complaints about it, just dealing with my own thing, that the iTunes Music Store was purpose-built to be a music store.
John:
So it can't use the excuse of like, well, we were just retrofitting onto an existing system we had for e-commerce, essentially, right?
John:
And I don't know about you, but I've been in the position many, many times across my career when I'm called upon to essentially create a data model for a thing that doesn't exist.
John:
And if I was making...
John:
Apple's music store, granted, nobody can see the future and know whether it's going to be big or not or whatever.
John:
But if I was given that task: hey, we're going to sell music over the internet.
John:
We need a data model for this.
John:
It's kind of like the USB connector when I, you know, complain so much about, like, if you're tasked with making a connector, like, spend five minutes with a whiteboard thinking about what are the attributes of a good connector and write them down and see if you can hit some of those.
John:
I don't think it's overengineering or overdesigning to think about when making the iTunes Music Store at the year that it was made.
John:
Concepts that could potentially lead to the problem you have here.
John:
Like, for example, albums.
John:
are released, and then sometimes there is a remaster or a re-release or an anniversary edition. Also, sometimes artists have best-of collections, which include songs from various albums, right? And I feel like one brainstorming session with anybody who has any interaction with music will lead you to those things.
John:
And it's not a huge schema.
John:
It's not thousands of columns and dozens of tables that are interrelated.
John:
You could fit it on a whiteboard, but concepts like that are super important.
John:
I run into this all the time because I have lots of versions of U2 albums.
John:
And maybe the iTunes Store knows this, but if the iTunes Store understands that my three different copies of The Joshua Tree are in fact different versions of the original Joshua Tree album from 1987, it is not apparent that it does.
John:
But it's a really important concept because then not only can you display that information and understand it, but then you can avoid mistakes like this by saying, okay, you're playing this album.
John:
If you don't give me any other information, I'll play the 1987 Joshua Tree, right?
John:
If you ask for the other ones, I'll play that.
John:
But if I'm playing the 1987... I hope I'm getting this year right.
John:
Sorry, U2 fans.
John:
If I'm playing the 1987 Joshua Tree, just play the tracks from the 1987 version.
John:
Don't get confused and switch to the remaster or the 30th anniversary edition.
John:
That's how you can tell.
John:
Don't try to match them up by like...
John:
track name or title, especially if the remasters are also just called The Joshua Tree. Like, again, when people sat down to make this, that's got to have come up in the first brainstorming session, because it's a concept that exists. And if you build that into the data model from day one, it makes writing the app so much easier. Say someone's trying to debug this from Apple Music or whatever: it can be confusing, because the track names are the same, and maybe the album name is the same. And especially with iTunes Match, where it's trying to look at your files and match them up with the ones they have
John:
records of, and it's hard to know which one they're matching against, this kind of metadata really helps. And so I do actually wonder: what is the underlying data model, and how limited and dumb is it, that errors like this come up all the time and there's apparently no recourse for us to, like, fix it by changing the metadata?
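The kind of schema John is arguing for can be sketched in a few tables: a logical album, its distinct versions, and tracks that belong to a specific version, so playback stays within one version instead of matching tracks by title. This is a toy illustration; the table and column names are hypothetical and have nothing to do with Apple's actual data model.

```python
import sqlite3

# Hypothetical schema sketch: album -> versions -> tracks.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE albums (
    id INTEGER PRIMARY KEY,
    title TEXT NOT NULL              -- the logical album, e.g. "The Joshua Tree"
);
CREATE TABLE album_versions (
    id INTEGER PRIMARY KEY,
    album_id INTEGER NOT NULL REFERENCES albums(id),
    label TEXT NOT NULL,             -- "Original", "Remaster", "30th Anniversary"
    year INTEGER NOT NULL
);
CREATE TABLE tracks (
    id INTEGER PRIMARY KEY,
    version_id INTEGER NOT NULL REFERENCES album_versions(id),
    position INTEGER NOT NULL,
    title TEXT NOT NULL
);
""")
db.execute("INSERT INTO albums VALUES (1, 'The Joshua Tree')")
db.execute("INSERT INTO album_versions VALUES (1, 1, 'Original', 1987)")
db.execute("INSERT INTO album_versions VALUES (2, 1, '30th Anniversary', 2017)")
# The same track title can exist under multiple versions without ambiguity.
db.executemany(
    "INSERT INTO tracks (version_id, position, title) VALUES (?, ?, ?)",
    [
        (1, 1, "Where the Streets Have No Name"),
        (1, 2, "With or Without You"),
        (2, 1, "Where the Streets Have No Name"),  # remastered copy
    ],
)

# Playback resolves tracks by version id, never by fuzzy title matching,
# so a 1987 playback can never drift into the anniversary edition.
rows = db.execute(
    "SELECT title FROM tracks WHERE version_id = 1 ORDER BY position"
).fetchall()
print([r[0] for r in rows])
```

With the version as a first-class key, the "switched to a live album mid-playback" bug Marco describes becomes structurally impossible, which is the point of getting this into the data model on day one.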
Casey:
That's very true.
Casey:
And then the audio stopped.
Casey:
It's still playing, allegedly, but the audio stopped.
Casey:
The timer is still, or the counter, the play counter, whatever, the time is still ticking up.
Casey:
And no music is coming out of my computer speakers.
Casey:
I advanced to the next track immediately.
Casey:
Music is coming from my speakers again.
Casey:
And then it plays until about 45 seconds before the track ends, and then it all stops.
Casey:
I'm wired Ethernet on a symmetric gigabit connection.
Casey:
There is no reason that this should not be working, but here we are.
Casey:
So yeah, Apple Music not going well for Casey right now.
Casey:
I'm just going to say that and I will try to leave it at that because we have a lot to talk about starting with some follow-up.
Casey:
John, this first piece of follow-up is for you.
John:
Are you just trying to avoid pronouncing Tatsuhiko Miyagawa's first name and last name?
Casey:
That is exactly correct because I did not have the time to practice and I thought, you know what?
Casey:
This is what you put in.
Casey:
You can do it.
John:
Thanks, bud.
John:
I know this person from the internet and Perl.
John:
So I had practice.
John:
All right.
John:
So the last show, I was trying to think of some kind of interview where some Apple executive tried to give an explanation of why there is no weather or calculator app on the iPad.
John:
And apparently it was an interview with Craig Federighi by MKBHD.
John:
We will have a link in the show notes to the timestamp offset where you can hear his answer.
John:
And I had said last show that it wasn't a very good answer.
John:
It's not.
John:
I mean, it's a...
John:
You know, public relations answer, where you have to try to give a reason that makes you seem good.
John:
And C-Fed's answer was like, well, we don't want to do those apps unless we can do, like, something really special.
John:
Like we have a really good idea.
John:
We really want to do them right and well.
John:
On the one hand, it makes you think: a savvy Apple executive wouldn't say that unless there was actually some kind of internal project to make a really good, fancy iPad weather app and calculator app.
John:
Because otherwise it sounds like, oh, we didn't want to do it unless we could do something really special.
John:
You're setting yourself up for criticism if you ever release one and it's just an enlarged version because what do you say then?
John:
So it makes me think that maybe there actually is a very low priority project or two inside Apple to make these versions of the apps.
John:
But the second problem with the answer, of course, is
John:
People don't care if it's something special for the iPad.
John:
Just make the app so it exists.
John:
Like, just make the iPhone app bigger.
John:
It's fine.
John:
Like, people just want it to be there.
John:
Especially calculator.
John:
Like, we really want to do something special.
John:
Oh, really?
John:
With a calculator?
John:
How about having buttons you can press to add numbers together?
John:
Like, it's not... It starts there.
Marco:
Well, and I feel like that's kind of a BS excuse, too, because you look at something like the clock app.
Marco:
Originally, there was no clock app on the iPad.
Marco:
That came later.
John:
They did something really special with it, I think.
Marco:
Yeah, they just blew up the iPhone version.
Marco:
It's fine.
John:
Which is fine, right?
Marco:
That's what we needed.
Marco:
Yes, it's like...
John:
You don't need to do it. Like, that, to me, was a BS excuse. And the funniest thing was, they just redid their Weather app for iOS 15, and there isn't an iPad version of that. And they made it really cool, and I think if you took the iOS 15 Weather app and just made it bigger, it would still be a really cool weather app. It's not like it gets worse. I understand the idea; I mean, especially back in the early days, it was like:
John:
If you can't think of some way to add a sidebar to your app on the iPad, you're not really going iPad native.
John:
Like don't just take your phone app and stretch it.
John:
Like it was a criticism of a lot of the Android tablet apps.
John:
It was like, they were just the phone app, but bigger.
John:
And that's true.
John:
You shouldn't just take your phone app and make it bigger.
John:
But it's also true that people come to expect a certain baseline set of functionality.
John:
Apple has trained them to expect this because it's available on the phone.
John:
And at a certain point, it's better to have a calculator app than to have a really fancy one that takes advantage of the screen space and has scientific calculations and reverse Polish notation and ten memories and a graphing function.
John:
That's great if you want to make that app, but you can also just make the calculator and have it be a little bit bigger, and people will be fine with that.
John:
Again, they're getting it for free with the iPad.
John:
If you can't think of some way to put in a sidebar or a persistent tape in your calculator, it's okay just to make...
John:
For the 1.0, a big calculator app.
John:
And the weather app, like I said, I think the graphics and fidelity and layout lend themselves well to an iPad-sized screen.
Marco:
Yeah, look at WeatherSpaceDot.
Marco:
It looks just like Apple's, but bigger.
Marco:
Right.
Marco:
Apple can make theirs rotate to landscape and just blow it out of the water.
Casey:
You know, I think I've made this joke already, but, you know, if only Apple had some sort of cross-platform framework that they already wrote the weather app refresh in...
Marco:
in order to put it on the iPad. Like, imagine if they used, you know, some sort of Swift thing that was built for user interfaces. I don't know why you're trying to make this joke; you realize the iPad and the iPhone both use UIKit? That already is the cross-platform framework. Like, they have three different... they could use UIKit, they can use UIKit plus Catalyst on the Mac, and they can use SwiftUI. They have so many options. Yeah, exactly. Yeah, they can even use Electron; that's getting popular. We'll get to that later.
Casey:
Hey-o!
Casey:
All right, moving right along.
Casey:
God, we are way behind already, and we're only 20 minutes in.
Casey:
All right, the AMD W6, whatever, video cards or workstation cards, nobody cares because it's Mac Pro stuff.
Casey:
Moving right along!
John:
Oh, he so cares.
John:
There's just one item of Mac Pro stuff.
John:
Come on.
John:
I tried, everybody.
John:
There was some debate last time about whether Apple's graphics cards, these new AMD fancy ones, are the quote-unquote workstation cards, and that's why they're so expensive.
John:
So Hishnash says, the AMD Pro W whatever, the W apparently stands for workstation, cards get the Pro drivers.
John:
This unlocks some driver features and pathways.
John:
When running Windows on a Mac Pro with a W card, you have access to these pathways as well.
John:
So my question is like, okay, that's great.
John:
I understand that you get access to more features in the Windows drivers, but
John:
Is it actually a different card?
John:
Are there any hardware differences?
John:
And so Guillaume Lowell says it is indeed the same GPUs used in the gaming cards with the same performance.
John:
So there's not a hard, it's not like an entirely different GPU.
John:
It's the same GPU.
John:
And Hishnash says there's possibly some binning and he's not sure if the memory controllers are validated for 32 gigs on the cheaper version of this on the non-workstation one.
John:
But I think it's mostly segmentation by AMD.
John:
Apple will be paying AMD a lot more for these GPUs than a gaming OEM would, due to the Pro W driver support in Windows.
John:
So it seems like, OK, these are the quote unquote workstation cards.
John:
But the only thing that's workstation about them is that when you run Windows, you get to use the workstation drivers, which expose new functionality.
John:
When you're running macOS, is there literally any difference?
John:
Because if the hardware is the same and the driver is the same, it's very confusing.
John:
And again, you can put the non-workstation AMD 6800 or 6900 into a Mac Pro, and it will use, I think, the same drivers as the workstation one.
John:
That's the open question of whether Apple has special workstation drivers or whatever.
John:
So a little bit more on this.
John:
Comparing the W6800 to the W6800X, like the PC workstation and the Mac workstation one, they seem identical except for a slight clock drop.
John:
The W6800 is advertised as 17 teraflops versus Apple's being just 16.
John:
The W6800 on the PC is $2,100, and so Apple's price of $2,800 is not that extreme given Thunderbolt, etc.
John:
Again, they're charging you more both on Mac and PC for the W card.
John:
For the exact same hardware, as far as we've been able to determine, except that on Windows, you get to use better drivers, which expose more of that hardware to Windows.
Marco:
And I think, you know, more memory and possibly a higher grade of memory.
Marco:
I don't know.
John:
But you can get, I think you can get the gaming 6900 with 32 gigs of RAM.
John:
I'm not entirely sure.
John:
That's the question of...
John:
what mix of hardware you get. Maybe you can get a cheaper memory controller. But like, the fact is, the GPU itself... It used to be that you'd get an entirely different GPU; it would be a different chip that had different features in it, often worse in games and better in workstation-type stuff. But this is the same GPU; it's just that features are hidden behind a software thing, on Windows only, and who knows what it's like on a Mac. So it doesn't make me feel that much better, but...
John:
Anyway, these are expensive cards.
Marco:
Yeah, and the moral of the story is that Apple is not marking up a $600 card to $3,000.
Marco:
They're marking up a $2,200 card to $3,000.
Marco:
And that first markup was happening at AMD's level, not Apple's level.
John:
Yeah, AMD is marking up a $600 card to a $2,000 card or whatever.
John:
Although it's not the card.
John:
Again, AMD has things like reference designs, but I think you can just buy the GPU from AMD and then build your own card.
John:
Anyway.
John:
The GPU market is confusing and scary.
Casey:
Oh, speaking of confusing and scary, is that Daisy I hear, I assume?
Marco:
I don't think Hops can make that sound.
Marco:
She's a terrifying beast.
Marco:
She does not like the GPU market.
Marco:
No, no.
Casey:
Or maybe she just doesn't like talking about the Mac Pro.
Casey:
Maybe that's the problem.
Marco:
We are sponsored this week by Memberful.
Marco:
Monetize your passion with membership and build sustainable recurring revenue from your audience.
Marco:
Memberful quite simply lets you sell subscriptions to your audience.
Marco:
Whatever you want to deliver to them, whether it's custom perks, maybe custom podcast episodes, stuff like that, Memberful is the platform to do that with.
Marco:
It's super easy for you to set up.
Marco:
It integrates with all the tools you probably already use, and they let you control everything about it.
Marco:
You have full control of the branding, you have full control over your audience, and you have full control over your membership.
Marco:
Payments even go directly to your own Stripe account.
Marco:
And of course they have everything you might need beyond that.
Marco:
So things like dashboard analytics, member management features, free trials, gift subscriptions,
Marco:
All of this is available on Memberful.
Marco:
It's used by some of the biggest creators on the web for good reason.
Marco:
They align their goals with your goals.
Marco:
They don't want to lock you in.
Marco:
They don't want to play tricks or anything.
Marco:
They want to help you make money from your audience.
Marco:
That way you can sustain your revenue.
Marco:
You can have really lasting audience income, and it's really, really nice to have this.
Marco:
We have this here, and frankly, I kind of wish I built on Memberful some days.
Marco:
There's a lot of times that I wish I didn't have to, you know, build and maintain it myself.
Marco:
And were I building a new system today, I would definitely give Memberful a very strong look because it's, you know, you can get started for free.
Marco:
There's no credit card required.
Marco:
Again, it integrates with everything.
Marco:
They have great support if you need it, although you probably won't because it's super easy.
Marco:
See for yourself at memberful.com slash ATP.
Marco:
Get started.
Marco:
Again, there's no credit card required to get started.
Marco:
Memberful.com slash ATP.
Marco:
Sell memberships to your audience to build sustainable recurring revenue with Memberful.
Marco:
Thank you so much to Memberful for sponsoring our show.
Casey:
Adrian writes, with regard to bug bounties, and I think we theorized on the show, or I don't remember how it came up, but why doesn't Apple pay bigger bug bounties?
Casey:
They have more money than God.
Casey:
Why not just pay all the money for really good bug bounties?
Casey:
And Adrian writes, Apple can't just pay bananas bug bounties, because if they did, all the internal bug hunters would quit and make more doing the same job from the outside.
Casey:
It's a delicate balance, and bug hunters have to want to do the right thing for it to work.
Casey:
I do agree with this, and this does make sense.
Casey:
But then again, Apple, like developers and employees, get a lot of tools and a lot of information that an external person wouldn't get.
Casey:
And I know nothing about hunting for bugs, but it seems to me like that would still be attractive if money is not your only driving force in the world, which for most people it probably is.
John:
I mean, they get health insurance and a salary.
John:
And even if they don't find any bugs, they keep getting paychecks.
John:
I don't think it's an apples to apples comparison here.
John:
The people who are finding them in the outside world, it's kind of like trying to win the lottery.
John:
Whereas getting a job on a security team at Apple is a much different financial and life arrangement that is much more attractive to some people than being outside Apple and competing with the rest of the world in the hopes that you'll find a bug bounty that then you can convince Apple.
John:
Yeah, I also don't like this argument.
Marco:
And first of all, I think we heard this argument from inside, because we heard this from a number of different people, with a number of different names, through a number of different avenues of contacting us.
Marco:
And so this kind of feels like we actually like hit the right people.
Marco:
with our rant last time. But to me, they're saying, like, well, if Apple paid higher bug bounties, then the internal people would quit because they'd make more on the outside. Well, pay the internal people more, you know? That's not the only option here. If the market value of finding these is so high that
Marco:
Some company in some other country wants to sell it to Saudi Arabia or whatever for a million dollars.
Marco:
If the value is so high, then you kind of have to pay it, whatever it takes.
Marco:
And so if it takes paying the internal bug hunters enough that they aren't tempted to quit and play the lottery, as John was just saying, if they can just make good money internally, well, that's the market for that.
Marco:
Apple is in a very high-profile position in the world, and they...
Marco:
They have created, through their success, and good for them, they've created a very high value for exploits of their system.
Marco:
And so if the value of an exploit is a million dollars or two million dollars or whatever it is, who cares how they have to pay for it, who they have to pay, they should still be the ones paying for it, not some random exploit company that's going to sell it to a creepy government.
John:
I mean, you can do what they do with salespeople, right?
John:
So you give them a decent salary, but you say, hey, if you find one of these bugs, we just pay the bounty to you.
John:
Like, it happens for salespeople all the time.
John:
Or I don't even know if there's a base salary half the time for salespeople.
John:
It's like, if you make lots of big sales, you get lots of money.
John:
Like, it's like, the question is, how valuable is this to Apple?
John:
And whatever that number is, pay it to whoever finds the bug.
John:
And I think the internal people, you can adjust and say, okay, well, the internal people get health insurance and benefits and a regular salary.
John:
But also, if an internal person hits the jackpot and finds some kernel bug, or even maybe the whole team does it, like...
John:
Give them the money that you would have given the external person. Like, this is a solvable problem.
John:
You know, this is one of the few cases where Apple having tons of money actually does help solve this problem.
John:
It's not so easy in other cases.
John:
Apple should just hire all the people, especially if Apple's being stupid about remote, which they're still kind of being stupid about.
John:
It's not that easy to turn money into talent.
John:
But in this case, money actually does solve this problem.
John:
And Apple has a lot of it.
John:
And so, like, you know, I don't... Again, I don't think you have to... You don't have to make them exactly the same because I think there are real tangible benefits to be a salaried Apple employee.
John:
Like, say, stock benefits.
John:
Like, things that the bug bounty people don't get.
John:
But you just have to make them competitive and comparable.
John:
That's all.
John:
And then for the external people, like we said last week...
Marco:
Make it easy for them to get paid. Make it so that everybody says, hey, if you find a bug, totally go to Apple, because you get paid quickly and conveniently. Because that's the way you get people to send you bugs. Exactly. The reputation Apple should have amongst the security community is that if you find something broken about iOS, you can go to Apple and get paid well and easily and correctly. Like, that should be the reputation that they develop.
Marco:
They don't have it now, and that's a bad thing, but that's what they should be developing.
Marco:
And if they have to end up paying their internal bug hunters more, fine.
Marco:
That's part of how you get to that end state.
Marco:
They can do it.
Marco:
It's fine.
Marco:
No one has ever said Apple pays way too much money to its employees.
Marco:
I've never heard anybody ever say that.
Marco:
So I think they can afford to raise the salary of this department if they have to and raise the bug bounties if they have to.
Marco:
They can totally do that.
Marco:
And the fact is, that's what the market values these at.
Marco:
And so whatever the market values them at, Apple should be willing to outbid anybody else in the market.
Casey:
Yep, definitely agree.
Casey:
Some quick iCloud Photo Library follow-up.
Casey:
It's funny, unlike Apple Music, which I feel like is nails on a chalkboard every time I use it, I still am mostly enjoying iCloud Photo Library, but...
Casey:
It's not perfect because guess what?
Casey:
It's Apple Web Services.
Casey:
So my laptop, I tried to do an import of some photos into iCloud Photo Library on my laptop and it hung.
Casey:
By that, I mean like the Photos app is still working.
Casey:
It's just they never got uploaded after days, after reboots, after Ethernet, after Wi-Fi.
Casey:
It didn't matter.
Casey:
They never got uploaded.
Casey:
So I thought, OK, fine.
Casey:
On the laptop, anyway, I have the photos repository in its own, not partition, but you know what I'm saying, like volume or whatever the technical term is.
Casey:
Sorry, John.
Casey:
And so I just tossed the volume, rebuilt it, and created a new iCloud photo library.
Casey:
This time, or excuse me, created a new local photo library.
Casey:
This time, it actually synced very quickly, which I was quite happy about.
Casey:
But now I have not gotten any new pictures since the 4th.
Casey:
And as we record this, it's the 11th.
Casey:
It's just frozen in time on August 4th.
Casey:
Wonderful.
Casey:
Great.
Casey:
Thanks.
Casey:
Thanks so much, guys.
Casey:
And then secondly, I went to start fiddling around with smart albums, which the concept of smart albums I really like.
Casey:
In fact, I keep meaning to, I haven't done it yet, but I loved your idea, John, of setting a smart album for the person is Declan, but the time the picture was taken was before he was born.
Casey:
I haven't done this yet, but I love that idea.
Casey:
I think it was a great idea.
Casey:
And I started, for example...
Casey:
um, trying to have a smart album for pictures taken by my drone.
Casey:
And there were a couple other things I was trying to do.
Casey:
And I feel like there are just not that many smart album, like filtering options.
Casey:
And yes, I think I could have handled the drone or I may have already done that or whatever, but I forget what it was off the top of my head.
Casey:
And I want to kind of try to keep this short.
Casey:
So I'm just going to move on, but I really wish there were more options for smart albums, for things you could filter by.
Casey:
And maybe that's just me, but please.
Casey:
And thank you.
John:
Yeah, one way you can help to work around that is to use the criteria that are there to search for photos, then apply keywords to them, and then use those keywords for filtering. You know what I mean? Like, you can essentially make your own criteria, because you can make any number of keywords. So, in the little keywords interface (Command-K), add as many keywords as you want, use the existing smart album features to find the photos that you want to apply those keywords to, and then use those keywords in your smart albums.
John:
It's a little bit of a workaround, but I'm really a big fan of keywords since you can make them up at any time and apply them to any photos you want.
John:
They really help organize things.
John:
And of course, you can apply multiple to the same photo.
John:
So it's a little bit tedious sometimes to apply them.
John:
But like I said, finding them in big batches and applying them usually goes a long way.
John:
And you can always amend them later by removing and adding.
John:
I fully endorse Casey for doing this, assigning keyboard shortcuts to the keywords.
John:
So you can press a single letter to assign a keyword or remove it, like an unmodified keystroke.
John:
So you can just type, like I type D, which is for Daisy, my dog.
John:
That is also for dog, huh?
John:
And I can go through photos and really quickly, like, you know, select a range and hit D, these are all Daisy, or select a photo and hit D to remove the Daisy tag because it misidentified, you know what I mean?
John:
Obviously you run out of keys, but it's kind of like using VI.
John:
Like these are not Command-D, not Control-D, not Option-D, just plain D.
John:
And by the way, another thing, I don't know how you'd figure this out.
John:
I just assume everyone knows because I use it all the time, but people probably don't.
John:
Those shortcuts only work when the keywords floating palette is visible.
John:
So you won't be accidentally hitting the keyboard to like, oh, I just labeled all my photos accidentally because my elbow hit the keyboard, right?
John:
Those key shortcuts only work after you've hit Command-K and made the floating keywords palette visible. So you can make it visible, shove it off to the side, and then just select photos and hit the key, and it's actually pretty quick. It's all just doing a SQLite update under the covers, I'm pretty sure, so it's actually pretty fast to remove them. It has some visual feedback: you can see it turning red when it removes the Daisy keyword, and showing Daisy in white or whatever when it adds it. Give it a try.
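[Editor's note: John's guess that keyword toggling is just a fast SQLite update underneath can be illustrated with a toy model. The schema and queries below are purely illustrative, not Photos' actual database layout; they just show how a toggle can be one delete-or-insert and a keyword "smart album" a simple join.]

```python
import sqlite3

# Toy model of keyword tagging and a keyword-based "smart album" query.
# Illustrative only: this is NOT Photos' real schema.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE photos (id INTEGER PRIMARY KEY, filename TEXT);
    CREATE TABLE keywords (photo_id INTEGER, keyword TEXT,
                           UNIQUE (photo_id, keyword));
""")
db.executemany("INSERT INTO photos VALUES (?, ?)",
               [(1, "beach.jpg"), (2, "daisy1.jpg"), (3, "daisy2.jpg")])

def toggle_keyword(photo_id: int, keyword: str) -> None:
    """Add the keyword if absent, remove it if present (like pressing 'D')."""
    cur = db.execute("DELETE FROM keywords WHERE photo_id = ? AND keyword = ?",
                     (photo_id, keyword))
    if cur.rowcount == 0:  # nothing deleted, so the keyword wasn't there yet
        db.execute("INSERT INTO keywords VALUES (?, ?)", (photo_id, keyword))

def smart_album(keyword: str) -> list[str]:
    """A keyword-filtered 'smart album' is just a join on the keyword table."""
    rows = db.execute("""SELECT p.filename FROM photos p
                         JOIN keywords k ON k.photo_id = p.id
                         WHERE k.keyword = ? ORDER BY p.id""", (keyword,))
    return [r[0] for r in rows]

toggle_keyword(2, "daisy")   # tag daisy1.jpg
toggle_keyword(3, "daisy")   # tag daisy2.jpg
toggle_keyword(3, "daisy")   # pressing 'D' again removes the tag
```

Either direction of the toggle is a single indexed statement, which is consistent with John's observation that adding and removing keywords feels instant.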
Marco:
We are sponsored this week by ExpressVPN.
Marco:
Every time you connect to an unencrypted network, cafes, hotels, airports, anybody on that same network can gain access to anything you send or receive on the network that's unencrypted.
Marco:
And, you know, we all use HTTPS wherever we can and lots of things are secure, but it's not necessarily always everything.
Marco:
It doesn't take much knowledge to intercept that traffic for anybody, any hacker, anybody with bad intentions.
Marco:
And even ISPs are doing stuff like injecting ads into unencrypted pages and everything.
Marco:
It's kind of a mess out there.
Marco:
So ExpressVPN is a great way to secure your connection, to use a wonderful, trustworthy VPN in places where you have to use someone else's network or an untrusted network.
Marco:
It's kind of like encryption insurance for you.
Marco:
It ensures that all traffic coming out of your devices is encrypted.
Marco:
And that way, the people operating your network can't intercept it.
Marco:
They can't see it.
Marco:
They can't inject ads into it, whatever they want to do.
Marco:
ExpressVPN is very simple to use. If you're going to use a VPN, this is a very, very well-rated one by lots of people. You don't have to take my word for it; look up the reviews and see for yourself. ExpressVPN is very highly rated. If you're going to use a VPN, for whatever reasons you might have to use one, ExpressVPN is a great choice. It's simple to use, it works on all your devices, it's a single app you install, you click one button to get protected, and that's it. It's super easy.
Marco:
Secure your online data today at expressvpn.com slash ATP.
Marco:
That's expressvpn.com slash ATP.
Marco:
There you can get three months free with a one-year package.
Marco:
Once again, expressvpn.com slash ATP.
Marco:
Thank you to ExpressVPN for sponsoring our show.
Casey:
Buckle up.
Casey:
Here we go.
Casey:
Let me start by saying, if you are the kind of person that listens to this in front of your children, that's awesome.
Casey:
And hi, kids.
Casey:
We're so happy that you listen to us.
Casey:
But not this time.
Casey:
This time, I strongly encourage you to use your chapter skip functionality in Overcast or whatever, not as good as Overcast podcast client that you're using, and maybe skip this chapter until after the kids are in bed.
Casey:
You probably know where this is going, but we'd like to talk about Apple's new child safety features.
Casey:
So there's not going to be like swear words or anything like that, but obviously the content from here on out, we're going to assume only adults are listening.
Casey:
So please be careful.
Casey:
That being said, so Apple announced sometime, I think around the time we recorded last week or maybe shortly thereafter.
Marco:
It was like an hour after we released the show.
Casey:
Okay, there you go.
Casey:
Apple announced some new child safety features.
Casey:
And there's a whole landing page at apple.com slash child hyphen safety.
Casey:
And there are basically three major features.
Casey:
And I think in part because they were all announced simultaneously, there's a lot of confusion, including for me, as to what happens where and when and what all these are about.
Casey:
So we're going to try as much for ourselves as for all of you to try to break this down and make sense of it.
Casey:
So let me start with the 50,000-foot view.
Casey:
And so here again, there are three major components that Apple has announced.
Casey:
Number one, the Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple.
Casey:
And we'll dive a little deeper into this in a moment.
Casey:
Number two, iOS and iPadOS will use new applications of cryptography to help limit the spread of child, help me with this child, sexual abuse material, CSAM,
Casey:
Yep, but that was the first time we said it.
Marco:
Yeah, it's what used to be called child pornography, and this is now the new, modern, more inclusive... or more accurate, I think, term for it: child sexual abuse material.
Casey:
Right.
Casey:
So let me start from the top.
Casey:
iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online while designing for user privacy.
Casey:
CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM and iCloud photos.
Casey:
Here again, there's a lot to dive into on that one, which is probably where we're going to spend most of our time here in a moment.
Casey:
Then finally, the third one.
Casey:
Updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations.
Casey:
Siri and Search will also intervene when users try to search for CSAM-related topics.
Casey:
So that's the broad overview, three things, some stuff on device with messages, some stuff that's working in concert between what's on your device and what's on Apple servers for photos.
Casey:
And then finally, presumably almost entirely server-side updates to Siri and Search.
Casey:
So that is the broad overview.
Casey:
Gentlemen, I can keep going deeper, but do you want to jump in now with any tidbits?
John:
I think we should start with the messages one.
John:
I know you said you thought we'd spend more time on the photos one, but the more I read up on this, the more I think the messages one is actually a little bit of a more difficult situation.
John:
And by the way, no one seems to talk about the Siri and Search thing, but I think that is also related to this.
John:
Maybe I'll try to fold it into this discussion.
John:
So the messages one, that description is vague.
John:
Like, oh, on-device machine learning to warn about sensitive content, what is it actually doing, right?
John:
So what it's doing is...
John:
It's trying to see if kids send or receive sexually explicit material by detecting that on device, and then when it detects it, depending on what the situation is, it pops up some kind of dialog to the person who is sending or receiving and gives them a bunch of options. Now, Gruber had a good explanation of these features with more detail on his website, and we'll link to that.
John:
So the first thing to know about the messages thing is this only applies for children in an iCloud family account.
John:
So if you are not a child in an iCloud family account, I think Apple defines child as like, I don't know when it stops.
Marco:
For this feature, I believe it's only up to 13.
John:
Well, there's there's caveats.
John:
But anyway, so if you're not a child in an iCloud family, this feature doesn't exist for you, whatever.
John:
And even if it does apply to you, you need to explicitly opt in.
John:
So your kids won't be opted into this without you doing it.
John:
It's an opt in type of thing, right?
John:
Um, so how does it work?
John:
If you send or receive an explicit image, you get a warning about the image.
John:
I don't know what the warning says.
John:
I think there's been some screenshots of it, but like, you know, this is aimed at like, you know, younger kids and you have two options at that point.
John:
You can ignore the warning.
John:
And if you are, if you are under 12 years old, according to what Apple knows of your age, because you're in the iCloud family account,
John:
It says basically to the under 12 year old, if you choose to either continue to send or continue to receive this image that we're not yet showing you and you're under 12, we want you to know that we're going to notify your parents.
John:
So the kids in theory are told like you can continue and you can do what you're doing, but just so you know, we're going to send your parents a notification about it.
John:
Right.
John:
If you're older than 12, there's no parental notification thing at all.
John:
It just says, hey, you sure you want to do this?
John:
And the kids can just say yes.
John:
Right.
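[Editor's note: the flow John walks through above amounts to a small decision table; the sketch below restates it in code. The function and field names, and the exact age cutoff, are illustrative stand-ins, not Apple's implementation.]

```python
from dataclasses import dataclass

# Hypothetical sketch of the Messages flow as described: opt-in per child,
# a warning the child can heed, and parental notification only for the
# youngest age bracket. Names and the cutoff are illustrative.

@dataclass
class Child:
    age: int
    opted_in: bool  # parent explicitly enabled the feature for this account

def handle_explicit_image(child: Child, proceed_anyway: bool) -> dict:
    """Return what the described flow would do for one flagged image."""
    if not child.opted_in:
        # Feature is opt-in; without it, the image is shown with no warning.
        return {"warned": False, "shown": True, "parents_notified": False}
    if not proceed_anyway:
        # Child heeds the warning: image stays hidden, parents never notified.
        return {"warned": True, "shown": False, "parents_notified": False}
    if child.age <= 12:
        # Under-13s are told up front that proceeding notifies their parents.
        return {"warned": True, "shown": True, "parents_notified": True}
    # Older children get the warning only, with no parental notification.
    return {"warned": True, "shown": True, "parents_notified": False}
```

Note that in this reading, the only path to a parental notification is a child under 13 explicitly tapping through the warning, which is exactly the "errant tap" risk John discusses below.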
Casey:
For what it's worth, I actually thought the verbiage that Apple cited on their child safety page is very good and worth reading.
Casey:
Now, obviously, I'm no expert in this, but I thought it was good.
Casey:
So if you if you were receiving an image that has sensitive content, it says, you know, huge thinking emoji.
Casey:
This could be sensitive to view.
Casey:
Are you sure?
Casey:
And then it has like three basically bullets after that.
Casey:
Sensitive photos and videos show the private body parts that you would cover with bathing suits.
Casey:
It's not your fault, but sensitive photos and videos can be used to hurt you.
Casey:
The person in this might not want it seen.
Casey:
It could have been shared without them knowing.
Casey:
And it says,
Casey:
And then there's a second dialogue.
Casey:
You know, it's your choice, but your parents want to know you're safe.
Casey:
Again, three bullets.
Casey:
If you decide to view this, your parents will get a notification to make sure you're okay.
Casey:
Don't share anything you don't want to.
Casey:
Talk to someone you trust if you feel pressured.
Casey:
You're not alone; you can always get help here.
Casey:
And it appears that here's a hyperlink.
Casey:
And then the two options are don't view photo, which is the default, and view photo.
John:
So when you read this, you can kind of see the target audience in your mind.
John:
A kid under 12 who's involved in either sending or receiving these things, there's lots of dangerous situations in which it would be good if there was some intervention of someone warning or, you know, like when you're picturing the ideal scenario, like these are all good things.
John:
But of course, when you're designing any feature,
John:
like this, any feature between parents and children, it is always fraught, because not all parents are good parents and not all children are in a safe situation. Like, this feature... I'm not going to say this feature assumes that all kids are in a safe situation, because it doesn't; Apple does a bunch of stuff to mitigate this. For example, Apple doesn't immediately notify the parents without telling the kids. Because if you just assumed, oh, all parents are good and all children are in a safe situation,
John:
Why this whole dance with letting the kid opt out of the warning?
John:
What kid is going to read that and choose to notify their parents?
John:
That warning undercuts the whole feature, doesn't it?
John:
That choice to bail out and avoid the notification to the parents exists, at least in part, because Apple knows that not all parents are great parents and not all kids are in safe situations, right?
John:
The difficult balance of this feature and the reason why I think it's actually trickier to think about is...
John:
How do you... does this increase the chance that a child reveals something, in an unsafe parent-child relationship, that makes that situation worse?
John:
There are many parents that will have a bad reaction to knowing that their kids are viewing any kind of sexually explicit images, especially if they're sexually explicit images that are not aligned with the sexuality that the parent thinks the kid should have, let's say.
John:
Right.
John:
You can't just assume that all parents are there to save and protect their children or that all parents' idea of protection matches what Apple's idea of protection is.
John:
And you would say, okay, well, those kids just can do the thing where they don't notify the parents.
John:
Everything's fine, right?
John:
These are kids under 12.
John:
How many kids have you seen tap through dialogue boxes without reading the text?
John:
How many adults?
John:
Right.
John:
And I will add, on top of that,
John:
You know, even an 11- or 12-year-old can be, depending on the situation... if it's two 12-year-olds swapping naked pictures of each other, who are, like, in a relationship or whatever, those kids may be highly motivated to see that picture. And kids don't always make the best choices.
John:
Right.
John:
You know, a 12 year old kid may not necessarily make the quote unquote best choices as in, I know my parents are going to be notified, but I'm going to take the risk.
John:
You know, there's there's a reason children who are 12 years old aren't allowed to vote or drive cars and stuff like this.
John:
They're still they're still growing.
John:
They're still learning.
John:
Right.
John:
So even in the best of situations, this feature can lead to harms that would otherwise not happen.
John:
Now.
John:
This is why it's so difficult to think about this.
John:
You say, well, should we just do nothing?
John:
Should there be no features that help a healthy parent-child relationship?
John:
Think of Marco putting his Apple Watch on his son so he knows where he is.
John:
Features like that can be abused by parents who are not good parents to their children, to kids who are not in a safe situation.
John:
Location tracking can be used as a form of oppression.
John:
It's not how Marco's using it, not how most parents are using it, but should that feature not exist because it can be abused?
John:
Every time Apple adds a feature like this, you can see some thought and some part of the design going into the notion that we have to mitigate against the worst case scenario.
John:
But it's difficult to argue that none of these features should ever exist because there is a benefit to them and you're trying to balance the potential harm with the potential benefit.
John:
In a case like this, where we're trying to deal with child sexual abuse, the harm is so terrible that to do nothing, to me, feels worse than to try to do something.
John:
But when you try to do something, you do have to, A, try to mitigate against harms that you can imagine might happen, which I think Apple's doing, and B, accept feedback from the world and your customers about how you might be able to improve the situation by mitigating that harm in a better way.
John:
I'm not full of great ideas for this.
John:
That's why I think a lot of people have difficulty talking about this topic because if anyone is talking about this topic and they're like, there is an obvious solution that Apple should have done that is so much better than what they did and they should just do it, I'm suspicious of that.
John:
Because unless they're extremists and they say, well, Apple should never include any features that have anything to do with parents and children because any harm is worse than nothing.
John:
Like, you know, the extremists sort of, and we'll get to that with the photos thing of just like,
John:
Freedom over everything, kind of the EFF thing, where like if you are a lobbying organization where you are staking out one end of a spectrum, there is a place for organizations like that.
John:
I mean, I like the EFF, I donate to them, but I always know that the position they're going to stake out is the most extreme in favor of freedom.
John:
Doesn't mean I always agree with them, but I feel like that force needs to be there to counteract the other force, which is, you know, naked authoritarianism.
John:
We have plenty of that in the world, right?
John:
So those two extremes need to fight it out.
John:
And I'm way more towards the EFF side of the spectrum, to be sure.
John:
Way, way, way closer.
John:
But...
John:
They're always going to say this feature shouldn't exist at all.
John:
I don't agree with that, but I also agree that it's super hard to do this feature in a way that doesn't accidentally end up harming a bunch of kids that would otherwise not be harmed, either on purpose or by accident, because now this feature gives parents a, you know, gives, you know...
John:
parents... I don't want to say bad parents, but, like, children who are in an unsafe situation are now in more danger because of the danger posed by this. Previously, there was no way to accidentally hit a button and notify your parents that you're doing something you know is going to make your life worse, right? And now there is. But the reason this exists is because there is other harm that we're trying to stop as well. So I have real trouble figuring out
John:
how to feel about this feature right now.
John:
I kind of feel like trying to do something is better than doing nothing.
John:
But I do hope Apple iterates on this, and I do believe that there can be a better way to implement this with even more safety for kids in bad situations.
Marco:
I mean, first of all, this giant disclaimer from at least me here and probably you two as well.
Marco:
It's hard for me to talk about stuff like this because this is a – like the horrible dark world of child sexual abuse and all this stuff that this is trying to prevent or find at least –
Marco:
We are not experts in this world.
Marco:
We are fortunate enough that we haven't had to be.
Marco:
It's such a terrible set of things that happens here.
Marco:
Again, we're lucky that we're not experts, but because we have a tech podcast...
Marco:
We... and because tech is so big and it encompasses so much of the world, stuff like this lands on our feet, of like, well, this is what our audience expects us to be talking about this week; it's very relevant. And so here we are. And I feel like many of you out there are kind of put in the same position, as consumers of tech news and Apple news, or just being Apple fans and enthusiasts of this stuff, like...
Marco:
This stuff comes up and all of a sudden we all have to take a crash course in what all this stuff means, what is going on in the world out there, what problems and solutions already exist, what have people already been doing, what have companies already been doing.
Marco:
So we're in unfamiliar territory here, fortunately, to a large degree.
Marco:
So please forgive us if we miss some aspect of this or stumble over parts of this because it's very uncomfortable to even be thinking about this stuff because it's so –
Marco:
Actual sexual abuse is so horrific.
Marco:
As we'll get to in a minute when we talk about the CSAM scanning feature, it has special treatment in society because it is so horrific.
Marco:
It's such a special case in so many ways of how we treat things.
Marco:
So anyway, all of that being said, and we'll get back to that other part in a minute.
Marco:
All that being said, the Messages nudity censors, basically...
Marco:
It seems like they've done a pretty decent job of avoiding most of the problems with the parameters they've put in place with this feature.
Marco:
If the feature went up to 18, I think that would be much more problematic because I think everyone can agree that you don't really want 9-year-olds sharing nude photos with each other.
Marco:
But people have different definitions of things like age of consent and everything as you get closer to 18.
Marco:
You could argue – many people do argue – if a 17-year-old girl takes a picture of herself on her phone, should she be arrested for possession of underage nudes?
Marco:
Like that's –
Marco:
And that has happened.
Marco:
And there's all sorts of weird ways in which that can be overly oppressive to women or to queer youth.
Marco:
And so obviously any feature involving people's ability to take and share pictures of themselves runs into serious problems in practice if it covers older teenagers.
Marco:
So by keeping it to younger children, you avoid a lot of those murky areas.
John:
Well, the flip side of that, though, is that young kids are also the most likely to misunderstand or not really get the consequences of what the dialog box is trying to tell them.
John:
And that's why the dialogue is worded to try to like the bathing suit area thing.
John:
It's worded and aimed at younger kids, but they're exactly the ones that are the least equipped to really, truly understand the consequences and also probably the most likely to tap through them really quick.
John:
And the second side of that is.
John:
You know, abuse and sort of grooming by older predators happens to 16 to 17 year olds all the time, too.
John:
So there are some people who are more expert in this field who have criticized Apple's targeting, saying most of this sort of
John:
sex trafficking and grooming that is happening is not happening to nine-year-olds; it's actually more of a problem in the older teens. And so the situation... because it's so horrific, we all tend to think of, like, oh, what are the normal situations: a 17-year-old couple are, like, sending each other nude pictures, and we don't want to get in the way of that because it's just normal growing-up stuff, right? But what about the, you know,
John:
the much, much older sexual predator, either posing as a teen or not even posing as a teen, but, you know, grooming a 16- or 17-year-old? It's just as bad as the child situation.
John:
There are so many ways that these things can be abused.
John:
And the tool that we have to deal with this... you know, we should get into the tech aspect of this for a second.
John:
This is sort of just machine learning.
John:
Hey, this picture that either is being about to be received or about to be sent, does it look sexually explicit?
Yeah.
John:
And that's just kind of a best guess.
John:
And that's the only tool we have.
John:
We don't have any other context.
John:
There is no machine learning trying to suss out, is this conversation between a predator and prey?
John:
Is this a conversation between two kids who are a couple?
John:
As far as Apple has told us, there's none of that.
John:
It is literally only this one thing.
John:
Photo coming in, photo coming out.
John:
ML model, look at photo, tell me yes, no.
John:
Is it sexually explicit?
John:
Such a blunt instrument that has no awareness of this other stuff.
John:
And it's hard enough to solve this problem because, you know, someone's baking cookies and they take a picture of the bowl and it gets flagged as sexually explicit, because, like... machine learning is not perfect.
John:
This is straight up, hey, machine learning, here's an arbitrary picture.
John:
Tell me whether it's sexually explicit.
John:
And it's not super accurate.
John:
So we also have to account for all the cases where some poor kid or teenager is going to be faced with this dialogue, especially on an incoming picture, and go...
John:
what did this person send me? And it's just, like, a picture of their dog, right? Because their dog is determined to be sexually explicit. So the tech involved in this thing also makes it somewhat fraught. And I think, like, you know, Marco, from your perspective, like, oh, it's easier for the older kids and harder for the younger in some aspects, but also in some aspects it's the reverse. And, like, you really just have to go through all the different scenarios. And it probably also helps to have experts in this field... like, I read a few things from experts saying, like,
John:
Here's where the bulk of the problem is.
John:
And even though this is scarier, this happens.
John:
It's kind of like the whole thing of like, you're probably not going to be like murdered by a stranger.
John:
Most likely, especially if you're a woman, you're going to be murdered by, you know, this person you're in a relationship with or someone you know or your family member.
John:
It's depressing to think about.
John:
But like the fear of murder from a stranger.
John:
You know, or a shark attack or whatever is so much out of proportion to what's actually going to kill you, which is usually much more mundane.
John:
Right.
John:
And so I'm sure something like that also applies to all the child sexual abuse stuff, and experts in the field could probably help Apple better target this.
John:
But when your only tool, in this particular feature, is this machine learning model, your options are limited.
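[Editor's note: the blunt instrument John describes, a model that looks at one photo and answers yes/no, boils down to thresholding a confidence score, and any threshold trades missed detections against dog-photo false positives. A toy sketch; the scores, filenames, and threshold are all invented for illustration.]

```python
# Toy illustration of a binary "is this explicit?" gate: a model emits a
# confidence score per image, and a threshold turns it into yes/no.
# All numbers here are made up to show the trade-off, nothing more.

def is_flagged(score: float, threshold: float = 0.8) -> bool:
    return score >= threshold

# Hypothetical model outputs: (image, score, actually_explicit)
images = [
    ("selfie.jpg",       0.95, True),
    ("cookie_dough.jpg", 0.85, False),  # false positive, as in John's example
    ("dog.jpg",          0.10, False),
    ("borderline.jpg",   0.70, True),   # false negative at this threshold
]

false_positives = [n for n, s, truth in images if is_flagged(s) and not truth]
false_negatives = [n for n, s, truth in images if not is_flagged(s) and truth]
```

Raising the threshold shrinks the false-positive list but grows the false-negative one, and vice versa; with no conversational context available, that single dial is the whole design space John is pointing at.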
Casey:
Yeah, yeah, very much so.
Casey:
I mean, it's such a tough thing.
Casey:
Like you guys said, you know, you want to prevent this.
Casey:
And in Apple's case, not only do you want to prevent it, but you want to do it with some semblance of privacy.
Casey:
You know, you don't want to be beaming these images to some server like Google probably would.
Casey:
I honestly don't know how they handle it.
Casey:
But, you know, you don't want to be beaming every image that you receive via iMessage to some server to verify whether or not it has CSAM in it.
Casey:
It's a very difficult problem to solve, and Apple's made it more difficult by insisting on having it be as private as they possibly can, which is, in my opinion, it's something they should be applauded for, but it's challenging.
John:
This gets us into the next feature.
John:
It's like, oh, this is privacy preserving.
John:
It doesn't break end to end encryption on messages.
John:
Right.
John:
Because it's only like obviously when a message arrives on your phone, something has to decrypt it.
John:
Otherwise, you can't read it.
John:
Right.
John:
So if we do it on device, if we run the machine learning model on your device... like, it was encrypted and then stayed encrypted across the whole way, and only right before it gets to your eyeballs, when we have to decrypt it anyway, at that point we'll do the machine learning thing.
John:
So it's privacy preserving.
John:
Right.
John:
Yeah.
John:
Right.
John:
And so, yes, granted, the dialog tells you it's going to do that and so on and so forth.
John:
But you're putting a lot of weight on people being able to correctly read and understand dialogues.
John:
And by the way, tap the right button.
John:
Right.
John:
Previously, before this feature, there was no feature in messages that could potentially rat you out to your parents.
John:
Right.
John:
With an errant click on a dialog box.
John:
Right.
John:
And now there is.
John:
And from a child's perspective, right.
John:
that's not privacy preserving at all, right?
John:
From an abstract kind of like, oh, random people snooping on internet routers can't see your picture.
John:
Great, that's great.
John:
But what I care about is my parents finding out.
John:
And now suddenly there's a possibility where that didn't happen before.
John:
And of course, on the parent side is,
John:
Oh, if a predator is trying to, you know, groom my 12-year-old, I really want to know about that, Apple.
John:
And so there's just so many conflicting stakeholders in this soup that it's very difficult to come up with a, you know, other than, again, the extreme position of like, you should just never do anything about this, right?
John:
And that seems like a clean solution until you think, oh, right, so we should do nothing about child sexual abuse?
John:
It's like, well...
John:
Don't do this.
John:
Well, what should we do?
John:
Oh, now it's a hard question.
John:
I don't like it.
John:
So Apple's trying to do something, and we'll get to why probably in a little bit, but anything you do is fraught in some way.
Marco:
Well, and I think, I mean, let's get to that now.
Marco:
I think one of the things that I've learned listening to other people's podcasts about this, by the way, I can strongly recommend this week's episode of Decoder with Nilay Patel.
Marco:
He had a couple of experts on in this area.
Marco:
So this podcast, Decoder,
Marco:
Every episode of it is some CEO or chief officer of some company, and they mostly sound like they're going to be really boring, but then when I listen to them, it's like I learn something cool or it's much more interesting than I thought every single episode.
Marco:
Literally every episode I've heard, which is most of them, it always ends up being worth it, even if it sounds, from the title and description, like it might not be very exciting or it might be a company you don't care about.
Marco:
Anyway, so this week's episode of Decoder with Nilay Patel...
Marco:
Very, very good because he had two experts on in this area, and I learned a lot from that that I didn't hear in a lot of other places.
Marco:
So I can strongly recommend listening to that.
Marco:
If you want to hear from people who actually know what they're talking about in this area, you will learn a lot, I promise.
Marco:
But yeah.
Marco:
Anyway, one of the big things that we've all learned, at least I sure have.
Marco:
I didn't know this before this.
Marco:
is that almost all of the major tech cloud slash service companies are doing various forms of CSAM scanning and reporting and everything.
Marco:
Every big company you can imagine, like Dropbox, Facebook, Microsoft, everyone's doing this.
Marco:
And one of the reasons why they're doing this is because they have to, by law, in many countries, including in the U.S.,
Marco:
And so I think part of the reason why Apple is doing this is that they have been facing increasing pressure from the law and from law enforcement agencies.
Marco:
And there's a big history here of Apple trying to make their devices very private and trying to give users lots of strong encryption tools to use for their data and for their content while sometimes being at odds with what law enforcement wants to be available to them.
Marco:
There was obviously the famous San Bernardino shooter case where the government wanted Apple to unlock a phone, and Apple basically said no, and Tim Cook made some terrible analogies about cancer, but for the most part, his argument once you got past those terrible analogies was fairly sound in why they shouldn't do that.
Marco:
But anyway, part of what makes this complicated is that
Marco:
We, in the tech business, we operated for so long kind of skating by under the radar of most governments and legislatures.
Marco:
They couldn't keep up with us.
Marco:
They didn't understand what we were doing, and they kind of left us alone to a large degree for a very long time as we developed the tech business.
Marco:
And
Marco:
I think those days are long over now.
Marco:
Now, governments have gotten a clue of how powerful tech is.
Marco:
They don't like parts of it, and they intervene now to a much larger degree with legislation and pressure and legal threats or actions than they did in the past.
Marco:
So we as computer people are accustomed to
Marco:
tech companies being able to do whatever they wanted and us being able to have these devices that we could do whatever we wanted on.
Marco:
And largely the law was not enforced, or didn't expand to cover tech stuff.
Marco:
And so we got used to this freedom of like, my device is mine.
Marco:
The government can't tell me what my phone can and can't do or whatever.
Marco:
Um,
Marco:
That era has been chipped away over the last several years at least, and now all the tech companies are under much greater pressure from the governments that they either operate directly in or at least have to sell their products to for healthy financial reasons.
Marco:
So –
Marco:
There's going to be an increasing amount of government intrusion into tech.
Marco:
Some of that, like some of the antitrust proposals, which I know there was a big one today.
Marco:
We're probably not going to get to it today because it just happened and we have a lot of stuff to talk about today.
Marco:
But some of that stuff will be good.
Marco:
But a lot of this stuff will be, well, we have to comply with this law now.
Marco:
And some of those are going to be good laws that we agree with, and some of them are not.
Marco:
And it's going to get messy.
Marco:
It's already getting messy, and it's going to get messier as the tech companies have to bow to pressure or just comply with the laws in their jurisdiction that they operate in.
John:
Marco, one correction.
John:
Are you sure about the thing where they have to scan?
John:
I'm pretty sure they don't have to scan in the US.
John:
What they have to do is report it if they find it, but they don't have to go looking for it.
John:
I believe that's right.
John:
But there are UK and EU laws that are coming down the pike that potentially will say you have to scan for it.
John:
So in some ways, well, let's just finish with the motivation for this thing.
John:
Some of the motivation might be that those laws are coming and you might have to comply with it anyway, so we should do it.
John:
Another part of the motivation is... And by the way, all these features we're talking about are US-only at this point.
John:
So even with those EU and UK laws, you could say, oh, those aren't relevant yet, but Apple will potentially expand this to other countries on a country-by-country basis, according to them.
John:
The other thing is that...
John:
If the U.S.
John:
ever has a law like this... and this is what Apple says in their interviews; Matt Panzarino had a good interview with Apple's head of privacy about this, which we'll link.
John:
The Apple answer is: the reason we're doing this now is because we figured out a way to do it that is, quote unquote, privacy-preserving.
John:
Right, and we'll talk about the photo scanning and what their meaning of that is. But what they're saying is, these other companies that are doing it, like Facebook and Microsoft and so on and so forth, they do it the, you know, the brute-force way. Like, hey, we have access to all your photos that are stored on our servers. It's our servers, they're on our cloud services, we're just going to scan them there. And if we find anything, we're going to report it, because the law in the U.S. is, if you find it, you have to report it. But they're actively looking for it; they're scanning all your photos on the server side, because they have them, right?
John:
Apple could do that too, but Apple apparently considers that not privacy-preserving.
John:
And the Apple side of privacy really hits on this thing of saying, like, oh, it's much worse when you scan on the server side because it's more opaque and you can't tell what we're doing.
John:
And we could decide to just scan one person's thing because they're under scrutiny and all these sorts of other things.
John:
Apple is very big in their messaging to say that is not, from Apple's perspective, privacy-preserving.
John:
What is more privacy-preserving is if we do it on device, and we'll talk about that feature later.
John:
in a second or whatever.
John:
But Apple's story is, hey, the reason we're doing this now is not because we're afraid of regulations coming down or whatever.
John:
It's because we found a way to do it that is privacy-preserving, according to our definition of privacy-preserving.
John:
But surely part of Apple's motivation is that Apple knows that whenever there is sort of an attack on Apple's privacy-preserving features, like the San Bernardino thing of the FBI or whatever, saying, Apple, this is a terrible terrorist, and you need to let us have a backdoor on all your iPhones because terrorism is bad, right?
Right.
John:
That's not a good situation to be in, and Apple has to make the difficult argument that we're not in favor of terrorism, but we also don't want to put a backdoor on all our devices because there's no such thing as a backdoor that can only be used by the good guys, right?
John:
It's an argument that tech people understand, but it's hard to understand when emotions are high and terrorism is involved.
John:
Same exact thing with child sexual abuse.
John:
If there's a child sexual abuse situation, you can say, Apple, I know you said you don't want to include a backdoor for some reason, but child sexual abuse, you have to do it for the children, right?
John:
So features like this where you can say we found a way to do this without backdooring every single iPhone is a great defense when the time comes when someone says, oh, you know, just like in the movies, this kid has been kidnapped by the boogeyman and like some scenario that like never happens in real life.
John:
A stranger has kidnapped a beautiful, innocent child, and you need to unlock this phone to get at them, Apple.
John:
You need to let this happen or whatever.
John:
Features like this that
John:
hopefully catch the boogeyman before they kidnap a kid by detecting the fact that they're downloading CSAM and stuff.
John:
Done in a way that doesn't require putting in a backdoor that quote-unquote only the good guys can use or some other technical fantasy that doesn't actually exist is a great way for Apple to be prepared when, like Marco said,
John:
those regulations start coming in the U.S.
John:
of like it's not a free-for-all anymore, right?
John:
It's probably part of the same reason that Facebook and Microsoft and Google and all those things do their own CSAM scanning server side.
John:
Just say, look, we're already doing a thing that will help with this terrible situation.
John:
So please don't ask us to backdoor our encryption or please don't outlaw end-to-end encryption or all sorts of other much worse policies that will actually make everyone less safe but that are politically appealing to people who don't understand the tech.
Casey:
Yeah.
Casey:
So let's talk about iCloud Photo Library.
Casey:
Yep.
Casey:
So like I said, again, the summary is that iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online.
Casey:
While designing for user privacy, CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
Casey:
So...
Casey:
Let's start off.
Casey:
If you are not using iCloud Photos, this does not apply to you.
Casey:
That's as simple as that.
John:
Now, before moving on from that point, that's another thing that a lot of people will bring up, which is, oh, well, then there's no point in this feature because all the nefarious child sex abuse predators will just read that and say, aha, I'm safe from Apple.
John:
I just won't use iCloud Photo Library, right?
John:
Yeah.
John:
Why would Apple announce the way to avoid this feature?
John:
It's totally pointless.
John:
All we'll ever do is catch innocent people because no guilty person will ever use it.
John:
If you look at the CSAM scanning that Facebook and all these other big companies do, and you see how many instances of it they catch every year, I think the Facebook number was 20 million reported last year.
John:
Oh, my God.
John:
And it's not like it's a secret information that Facebook does this scanning, right?
John:
So you would think, well, if Facebook announces to the world that they do this scanning,
John:
Why would anyone who's, you know, a child sexual predator use Facebook?
John:
People do things that don't make a lot of sense. But, you know, we'll get to this in a little bit, this idea of saying, I just won't use Facebook.
John:
I just won't use Google.
John:
I just won't use Apple.
John:
I just won't use iCloud Photo Library.
John:
We're like, yes, in theory, if you were the world's biggest mass criminal mastermind, you could like avoid all these things.
John:
Right.
John:
But practically speaking,
John:
It's very difficult to essentially avoid using the internet and the major players on the internet.
John:
And practically speaking, 20 million cases caught by Facebook shows that they don't avoid it.
John:
They do it.
John:
And we catch them.
John:
And that's why features like this, even though there's a way to work around them, still have value in catching criminals.
John:
If you caught zero of them per year, we would have to rethink this.
John:
But 20 million per year at Facebook...
John:
is a big number. And by the way, Apple, which prior to these features was not actively doing anything to catch this stuff, reported something like 200 last year. And who knows how they found those 200; maybe they were reported or something like that. But when Facebook is doing 20 million and Apple is doing 200,
John:
I feel like that shows that Apple needs to do more.
John:
And so here is thus these features that we're talking about.
John:
So here's this next feature.
John:
So yes, it's only if you use iCloud Photo Library.
John:
If you don't use iCloud Photo Library, none of this stuff ever happens.
John:
But that doesn't mean that no one will ever be caught by this.
Casey:
Right.
Casey:
So I tried to do some deeper reading into the mechanics of how this works, and I did some, but my eyes glazed over for some of it.
Casey:
I didn't get through it all.
Casey:
So I have tried to do some research and I have failed.
Casey:
So call me John Siracusa.
Casey:
We will try to cite what we can from people who have done a little more research than us.
Casey:
And certainly, like Marco's disclaimer earlier, I am not a cryptographic expert.
Casey:
In fact, a lot of it is way above my head.
Casey:
So I'm trying my darndest to understand this a little better, but I need a little more time to get it 100% right.
Casey:
But with that said, mostly quoting Gruber's really good summary.
Casey:
So for iCloud Photos, the CSAM detection for iCloud Photos only applies to images that are being sent to iCloud Photo Library.
Casey:
Like I said earlier, if you don't use iCloud photo library, no images on your devices are fingerprinted.
Casey:
Photos are compared on device to a list of known CSAM from NCMEC, which is the National Center for Missing and Exploited Children.
Casey:
So let me unpack that sentence.
Casey:
So NCMEC, the National Center for Missing and Exploited Children, they do keep a database or repository of some sort, if I understand correctly, of CSAM.
Casey:
And they are the only organization, they're the only people that are legally allowed to do that here in the United States.
Casey:
And that's because they're the people in charge of trying to prevent it and fight it.
Casey:
And so my understanding is...
Casey:
Apple, and I'm filling in a couple of blanks here, but Apple will provide some sort of tool to NCMEC to scan all the files in their database.
Casey:
These are things that they know are bad.
Casey:
This is known sexually explicit material, child sexual abuse material, whatever.
Casey:
They will scan all that, and that will generate a bunch of hashes.
Casey:
So basically a bunch of numbers.
Casey:
And they'll be post-processed a little bit by Apple, the hashes, that is, not the photos.
Casey:
And that generates a whole bunch of hashes, again, so these are numbers, that Apple can then compare
Casey:
your photos to.
Casey:
So the idea is, and I'm dramatically oversimplifying, but let's say there's a CSAM picture of whatever, doesn't matter what the specifics are, and it yields a number of 42.
Casey:
Now, obviously, these numbers are way longer than that, but let's just say it yields the number 42.
Casey:
Well, if I had a picture on my phone that also yielded 42 as the hash, as that unique number,
Casey:
And it should do this, by the way, even if I make it grayscale, even if I twist it upside down or whatever the case may be, because it's doing some semantic processing and some other things.
Casey:
But one way or another, if I end up with a photo that ends up with a hash of 42, and NCMEC has provided a photo and scanned it using Apple's tool and provided the hash of 42 to Apple, then uh-oh, we've got a match.
Casey:
And things will carry on from there.
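The matching Casey is describing can be sketched in code. Apple's real system uses NeuralHash, an ML-based perceptual hash; the "average hash" below is a far simpler stand-in, and the tiny images and values are invented purely for illustration:

```python
# Toy sketch only: Apple's real NeuralHash is an ML-based perceptual hash.
# This much simpler "average hash" shows the same core idea: visually
# similar images produce identical (or nearby) hashes, so detection is a
# hash comparison against a fixed list, not a byte-for-byte file match.

def average_hash(pixels):
    """pixels: 2D list of grayscale values 0-255. Each output bit is 1 iff
    that pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (0 means 'same picture')."""
    return bin(a ^ b).count("1")

# A tiny 4x4 "image", a uniformly brightened copy, and an unrelated image.
original = [[10, 200, 30, 220], [15, 210, 25, 230],
            [12, 205, 35, 225], [18, 215, 28, 235]]
brightened = [[p + 5 for p in row] for row in original]
unrelated = [[255 - p for p in row] for row in original]

# The brightened copy hashes identically; the unrelated image does not.
assert hamming(average_hash(original), average_hash(brightened)) == 0
assert hamming(average_hash(original), average_hash(unrelated)) > 0
```

A hash collision here is Casey's "42 equals 42" case: the altered copy still lands on the same number, while a genuinely different picture lands somewhere else.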
Casey:
But before I go on any further...
John:
And when you say a match, by the way, you're not saying this is a similar picture.
John:
This is a picture of a similar thing.
John:
It is literally the same picture.
John:
Plus or minus, like you said, zooming, cropping, grayscale, blurring.
John:
But basically what it's trying to say is this is literally the same picture.
John:
It's not like saying, oh, this is a picture of an apple.
John:
It's like, no, this is the exact picture of an apple that's in the CSAM database, right?
John:
It is the exact picture.
John:
Right.
John:
There are a finite number of pictures that this is trying to detect.
John:
It is the database provided by NCMEC.
John:
I don't know how many pictures it is.
John:
But that's it.
John:
Those are all the pictures that it's ever going to find.
John:
It's not going to find a picture that's not in that database.
John:
And if it finds one, it's not saying this is a similar picture or a picture of the same thing or even a picture of the same person or anything like that.
John:
It is saying this is literally that picture.
John:
So it is...
John:
extremely limited, in that if it's not in the NCMEC database, it will never be detected, if this system is working correctly, right?
John:
Which, that disclaimer we'll get to in a little bit.
John:
But that's what this thing is attempting to do.
Casey:
Right, this is in contrast, mind you, to the messages stuff, the iMessage stuff we were talking about earlier, where that is trying to say, oh, that looks like a body part covered up by a bathing suit.
Casey:
That is something we should, you know, alert you about.
Casey:
This is different.
Casey:
This is exactly what John said.
Casey:
This is not, oh, that looks like a body part covered by a bathing suit.
Casey:
No, no, no.
Casey:
It's this picture matches whatever picture is in that CSAM database.
Casey:
And Apple doesn't get the CSAM database because not only do they not want it, I'm quite sure, but it is illegal for them to have it.
Casey:
All they are getting is the list of hashes generated by it, presumably by some tool that Apple provides.
Casey:
So the thing is, though, just one match isn't enough.
Casey:
Nothing happens if there's one match.
Casey:
There is some threshold.
Casey:
Nobody knows what that threshold is.
Casey:
That's probably for several different reasons.
Casey:
Probably.
Casey:
So, you know, like if we all knew that the threshold was 20, then some nefarious individual could keep 19 photos on their phone and they'd be fine.
Casey:
But we don't know if the threshold is 20 or 2 or 2 million or whatever.
Casey:
So one way or another, one match isn't enough to trigger any action.
Casey:
There is this threshold, and we don't know what that threshold is.
Casey:
But eventually, that threshold will be reached.
Casey:
And again, I'm totally making this up, but just to make discussion easier, let's say it's 20.
Casey:
And so...
Casey:
Once 20 pictures are hit, then at that point, the cryptographic protections that are built around these, I forget what they call them off the top of my head now.
John:
Safety vouchers.
John:
Thank you.
John:
That's actually, before we even get to the threshold part, that's an important point.
John:
When one of these matches is found, one of these safety vouchers is sent to Apple, but Apple itself can't decrypt that to do anything with it until the threshold is met.
John:
There's a bunch of cryptographic stuff, which like Casey said, is probably over all of our heads.
John:
That makes that possible is using cryptography to say, OK, when we find a hit, we'll send the safety voucher to Apple.
John:
But Apple cannot do anything with that safety voucher.
John:
They can't tell what the original picture was.
John:
They can't tell which picture it matched.
John:
They can't do anything with it until the threshold is reached.
John:
And when the threshold is reached, then at that point, Apple has 20 safety vouchers from this person's phone.
John:
And at that point, then because of the cryptographic stuff, then they can actually decode them and say, now we need to actually look at these pictures.
John:
And so that brings us to the next step.
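The property John describes, nothing decryptable below the threshold and everything decryptable at it, is the defining feature of threshold secret sharing. Apple's actual construction is considerably more elaborate, but a toy Shamir-style sketch (every parameter below is invented) shows the mechanic:

```python
# Toy Shamir-style threshold secret sharing: fewer than `threshold` shares
# reveal nothing about the secret; any `threshold` shares recover it.
# This illustrates the property being described, not Apple's actual scheme.
import random

PRIME = 2**61 - 1  # all arithmetic happens in a prime field

def make_shares(secret, threshold, count):
    """Embed `secret` as the constant term of a random degree-(threshold-1)
    polynomial; each share is one point on that polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 recovers the constant term (the secret),
    but only if enough points pin down the polynomial."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789          # stand-in for a per-account voucher decryption key
shares = make_shares(key, threshold=20, count=30)  # one share per match

assert recover(shares[:20]) == key   # threshold reached: key recovered
assert recover(shares[:19]) != key   # one short: no usable information
```

Until the 20th voucher arrives, the 19 shares the server holds are consistent with every possible key, which is what makes the "Apple can't decrypt it, even under subpoena" claim a mathematical one rather than a policy one.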
Marco:
It's kind of like the world's most complicated and worst RAID array.
Marco:
We need a certain number of these before we can decrypt any of them.
Marco:
Which honestly, from a technical point of view, that's a really cool idea.
Marco:
It's very clever.
John:
They do a bunch of other privacy-preserving stuff, again, if you can understand the cryptographic stuff, where they will intentionally seed it with false information, so there's no way to sort of pick out people who are potentially bad actors until the threshold is reached, just because you see some...
John:
They do a bunch of stuff to try to be privacy-preserving, because as we've learned, even just metadata, just knowing that, like, safety vouchers are arriving, could be information that could be used to determine something.
John:
So they intentionally seed in some information to add noise to the thing.
John:
But the whole point is even Apple, even under like threat of law.
John:
Again, if someone subpoenaed them and said, we demand that you decrypt these safety vouchers and show what these pictures are.
John:
Apple literally can't do it because of math until the threshold is reached.
Casey:
Right, which is very cool.
Casey:
And again, that 20 number that we're using, that's made up.
Casey:
We have no idea what the threshold is.
Casey:
But the threshold is designed such that, and now this is a quote from Apple, it provides an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
Casey:
Now, mind you, that's not incorrectly flagging a photo, incorrectly flagging an entire account.
Casey:
So if you're to believe Apple, whatever that threshold is, be it 20 or 200 or 2 million or whatever, there is less than a one in one trillion chance that any one of the three of us or anyone else for that matter will have an oops and get our account flagged even if it shouldn't be.
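A back-of-the-envelope binomial sketch shows why stacking a threshold on top of an imperfect per-photo match drives the account-level error rate down so fast. Both numbers below, the library size and the per-photo false-match rate, are invented for illustration and are not Apple's figures:

```python
# Toy calculation: if each photo independently has a tiny false-match
# probability p, the chance an innocent account accumulates at least t
# false matches drops off extremely fast as t grows -- which is the whole
# point of having a threshold at all.
from math import comb

def prob_at_least(n, p, t, terms=60):
    """Binomial upper tail P(X >= t), truncated: with these tiny p values,
    terms beyond t + 60 are vanishingly small."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(t, min(t + terms, n) + 1))

n = 100_000   # hypothetical photo-library size
p = 1e-6      # hypothetical per-photo false-match rate

one_match = prob_at_least(n, p, 1)    # a single false match: quite plausible
twenty = prob_at_least(n, p, 20)      # twenty false matches: astronomically rare
assert one_match > 0.01
assert twenty < 1e-12
```

So even if individual false matches happen at some noticeable rate, requiring many independent ones before anything becomes decryptable pushes the per-account probability into the one-in-a-trillion regime Apple is claiming.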
John:
So this thing reveals some information about this, because we just got done saying, like, the whole point of this algorithm is to try to tell, is this the same picture as that, accounting for things like zooming, cropping, rotating, color changing, stuff like that.
John:
So when I say, oh, accounting for those changes, it's clear that it's not a byte-for-byte comparison, because that wouldn't survive any of those things, right?
John:
Obviously, there is some amount of, I don't know if you call it machine learning, but some amount of processing that is done to try to determine if this picture is the quote unquote same picture as that one, even if it's been converted to black and white, even if it's been zoomed a little bit, even if the crop is slightly different, even if a new section of it was blurred out, even if it has some words stamped on it, you know what I mean?
John:
Like,
John:
A human could tell if they're the same picture, but for computers it's harder to tell. Like, a human can tell, oh, this is the same picture, it's just rotated a little bit and zoomed; we can do that pretty easily, but computers have a harder time with it, right? So this one-in-a-trillion-chance thing, and the fact that there's a threshold at all, is telling us this algorithm is not 100% accurate when it comes to determining if this picture is the same as the other one. Because if it was, you wouldn't need a threshold.
John:
Right.
John:
It's not like they're trying to say you're allowed to have some number of CSAM on your computer.
John:
That's not what they're saying with this threshold.
John:
Like, oh, it's OK if you have a little bit.
John:
If you have a lot, we're going to report you to the law.
John:
It's because this algorithm is not 100 percent accurate.
John:
Right.
John:
And, you know, obviously having a false positive is really bad.
John:
So to try to avoid a false positive, Apple has done the math and said, we're going to make this threshold and we're going to make it.
John:
So it's really, really hard to have a false positive.
John:
And there's two strategies in that.
John:
One, the consequences of a false positive could be devastating to the person involved in it.
John:
You do not want to be reported to law enforcement for having CSAM when you actually have none, because of some stupid algorithm, right?
John:
That is super harmful, and Apple would never want to do that, right?
John:
And the second thing is, since the algorithm is not...
John:
100% accurate, it's in Apple's interest to try to make sure that they get the most egregious offenders.
John:
The whole point of this is to catch the people doing the bad thing.
John:
I'm going to, I don't know much about the field, but I'm going to say it's probably unlikely that people who are doing this have one picture, right?
John:
They probably have more than one.
John:
So again, we don't know what the threshold is, but by putting the threshold like this, hopefully they can avoid any false positives and also pretty much catch everybody who's doing this.
John:
Again, it depends on the threshold.
John:
If the threshold is a million photos, maybe this is not a great feature.
John:
But if the threshold is 10, you're probably going to catch all the people, right?
John:
Like, again, why don't they just keep nine?
John:
Or if we found out the secret threshold, people could just keep one under.
John:
See also Facebook catching 20 million people.
John:
Like, that's not the way criminality works, and there is no system that can catch only the master criminals, right? Because they just won't use the internet and they'll be safe; they'll live in a shack in the woods. Like, there's always some out, right? We're just trying to do something, which is better than nothing in this case. So, yeah, the unreliability of this needs to be a factor, like the threshold; that's the way to think about this, right? And Apple's calculations presumably are well-founded.
John:
But the reason a lot of people are – well, there's lots of reasons people are nervous about this, which we'll start enumerating now, I think.
John:
But one of them is that this is not – that it is an algorithm.
John:
And despite the fact that Apple says it's one in a trillion, it's not necessarily reassuring.
John:
Now, the next backstop on that is when you hit the threshold and Apple can finally decrypt the safety vouchers, it doesn't report you to the police at that point.
John:
What happens at that point is Apple has –
John:
someone whose job is terrible, actual human beings then have to look at the photos and do a final human-powered confirmation that, yes, these really are the same photos, right?
John:
That these really are, I mean, not the same, but, you know, that they are CSAM, not a picture of someone's dog, right?
John:
Human being has to make that determination.
John:
That's not a fun job.
John:
But that is the backstop in saying, okay, at that point, after a human looks at it, after it's passed the threshold, it's done all the things, once it passes the threshold, they get, I think they get like a lower resolution version of it.
John:
They don't even get the full version of it, but they get enough of it so a human can look at it because Apple can finally decode it now because it passed the threshold.
John:
They look at it.
John:
They make a determination.
John:
This is, by the way, after the one in a trillion.
John:
After the one in a trillion, then a human looks at it.
John:
So even if you fall into the one in a trillion thing, if it turns out not being one in a trillion, but one in a hundred million, then a human has to look at it.
John:
They make the determination.
John:
If it turns out it's CSAM, they report you to the authorities, because it's U.S.
John:
law that they have to do that anyway, right?
John:
Because now they have found it, a human has confirmed it, and they report it.
John:
Unfortunately for Apple, even this is not particularly reassuring to a lot of people, because anyone who's gone through App Review knows that ostensibly a human looks at every app in App Review.
John:
And we've all seen rejections from App Review that prove that having a human look at something is not necessarily a guarantee that something sane or logical will happen.
John:
Now, you would hope that the people doing this job have...
John:
a higher threshold for reporting someone to the police for child sexual abuse material than for rejecting your app because they think you didn't put a button in the right place.
John:
I would also hope they don't have such a volume to deal with.
John:
Well, we'll get to that in a little bit because that actually might not be true.
John:
Given how little they detected so far and what might be lurking in there, it may actually be a terrible situation.
John:
But...
John:
Like, this is not Apple, well, it's not entirely Apple's fault, but there is a perception that, you know, especially within tech community that's thinking about this from a tech and privacy perspective, that that doesn't actually make me feel that much better because my experience with humans at Apple is not reassuring in this regard.
John:
Now, I think that's probably just a sort of
John:
gut reaction to past experiences that I hope has almost no bearing on this situation, because, it seems like, App Review is complicated; a human being looking at a picture and determining whether it's child sexual abuse material seems less complicated to me. It seems more an open-and-shut type of thing. I don't think a picture of your dog is going to be accidentally flagged as CSAM by an inattentive reviewer. I really hope not, right?
John:
But so why does this feature make people upset?
John:
Why was this feature getting most of the press and complaints aside from the messages feature above and beyond the messages one?
John:
Why is this the one that bugs everybody?
John:
I think part of it is that it applies to adults.
John:
It's not just kids because, you know, who's on the internet arguing about this?
John:
Probably not 12 year olds, but it's a bunch of adults.
John:
And this one does apply to adults if you use iCloud photo library.
John:
Right, so that's one aspect. The other ones I just talked about are like, well, Apple says it's one in a trillion, but who knows what it really is? It's not a deterministic algorithm, or, it's not an algorithm that anyone really understands. So it's some form of machine learning, and it's kind of fuzzy, and it's not 100% accurate, thus the thresholds. That makes me nervous, and the humans as the backstop don't make me feel better. So there's some worry about being flagged unjustly, despite all of the backstops that Apple's put into it.
John:
One of the more fundamental underlying discomforts with this entire system is that it feels, I'm going to say, unjust, un-American, not in keeping with the American justice system because people have some expectation and part of the Constitution, the Fourth Amendment or whatever, that in the U.S.
John:
anyway, there is a sense that
John:
If you are looking into something in my life, there has to be some reason.
John:
I'm suspected of a crime, so you look at my bank records to see if I've been laundering money.
John:
You think I have stolen merchandise because there's someone who matches my description caught on a security camera stealing something from a store, so you have a warrant to search my house.
John:
Right?
John:
That is generally the way our criminal justice system works, that if there is some suspicion that you have done a thing, you have to convince a judge that we think this person might do this thing, therefore we need to search something.
John:
And you get a search warrant and you look into it, right?
John:
The other side of that is where you just watch everybody all the time.
John:
And that way, if anyone does anything wrong, you'll catch them.
John:
And that's what we call surveillance.
John:
And this feature does not have the concept of probable cause or any of these type of things.
John:
It's surveillance.
John:
It is watching every single picture on every single person's phone all the time.
John:
Now, Apple isn't the U.S.
John:
government.
John:
It's not the same situation at all.
John:
But from a sort of emotional-feel and justice perspective, it feels like I am now being surveilled, that everybody is being surveilled,
John:
that everything we're doing is being watched just in case we ever do something criminal.
John:
Again, the messages feature is exactly the same, but it's like, oh, that's kids.
John:
It only applies to kids.
John:
It doesn't apply to me.
John:
I don't have to worry about that.
John:
But every photo that is sent and received by people who are under the age limit of that messages feature, every single photo has that ML thing run against it if you've opted in, right?
John:
And same thing with this thing.
John:
If you're using iCloud Photo Library, every single one of your photos...
John:
going into iCloud Photo Library has this applied to it.
John:
And for some people, especially people who are sort of security conscious and looking at, or privacy conscious and looking at this through the broader lens of, you know, what seems fair and just in the technological world, this doesn't feel good.
John:
It doesn't feel good to know that you are constantly being surveilled just in case you do something wrong.
John:
And everyone trots out all this stuff.
John:
Well, if you're not doing child sexual abuse stuff, you have nothing to worry about.
John:
If you have nothing to hide, it's okay for the East Germans to listen in on your phone calls, right?
John:
Like, again, Apple is not the government.
John:
It is not a direct comparison, but it feels similar.
John:
People don't like the idea that you're being surveilled.
John:
Setting aside the fact that, like, you know, the CSAM scanning is going on for every photo uploaded to Facebook, every photo put into Google Drive, every photo put into Microsoft OneDrive.
John:
Like, that's also surveillance because they're not discriminating.
John:
They're not saying, oh, this person might be a criminal.
John:
We're going to scan.
John:
They just scan everybody's because that's what computers do.
John:
but it feels like surveillance to people.
John:
And this gets back to the Apple argument of, like, oh, we didn't want to do this until we could do it in a privacy-preserving way.
John:
But by doing it in this quote unquote privacy preserving way, it still feels like surveillance.
John:
No, they're not scanning it on the server.
John:
So they're still scanning every picture for everybody.
John:
They're just doing it on the client.
John:
And Apple can't even make the argument of like, oh, we can't even see your photos because they can.
John:
Because Apple doesn't do end-to-end encryption on their, like iCloud backups and iCloud photo library.
John:
Like, in the end, if you backup to iCloud, Apple can get at those backups.
John:
And so some people are making the argument that this feature is a precursor to Apple finally providing end-to-end encryption for iCloud photo backups.
John:
Again, more arguments about like, well, the criminals just won't do iCloud backups.
John:
Like...
John:
You know they will, and they do, because they're people. And some of them won't, but most of them will, right? Because it's just the law of averages. Anyway, if this is a precursor to end-to-end encrypting iCloud backups, great. But if it's not, it doesn't feel any more privacy-preserving, and I say feel specifically here, than scanning on the server side. Apple's argument is that
John:
It is more privacy preserving because the scanning happens on your device and everyone gets the same OS with the same database of NCMEC images.
John:
And you can prove that cryptographically and you can look at the bytes and you can be clear that we're not targeting an individual and so on and so forth.
John:
But in the end, Apple is still saying, hey, every single person who has an iPhone and uses iCloud Photo Library: your device that you hold in your hand is looking at every single one of your photos.
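The matching scheme John is describing (every device ships the same fixed database, and each photo headed for iCloud Photo Library is checked against it on the device) can be sketched roughly as follows. This is a loose illustration only: the function name and the SHA-256 stand-in are invented for the example; the real system uses NeuralHash perceptual hashes, a blinded database, and private set intersection with a match threshold, none of which are reproduced here.

```python
import hashlib

# Invented stand-in for the NCMEC-derived database: every device ships
# with the exact same set of hashes, so no per-user targeting is
# possible at this layer. (The real system uses NeuralHash perceptual
# hashes and a blinded database, not plain SHA-256 digests.)
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Check one photo against the fixed database before upload.

    The check runs client-side; in the real design a match alone
    reveals nothing to the server, since matches are wrapped in
    cryptographic vouchers that only decrypt past a threshold.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_database(b"example-known-image"))      # database hit
print(matches_known_database(b"ordinary-vacation-photo"))  # no match
```

The point of the sketch is the one John makes: the check is uniform across all users, so targeting an individual would require changing the database that every device receives.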
John:
As Ben Thompson pointed out in his piece today, a lot of people feel like their own phone, quote-unquote, spying on them somehow feels worse than, you know, sending it to a server where it gets scanned by Apple.
John:
Right.
John:
Because you feel like you're being betrayed by the physical thing you hold in your hand.
John:
Like even though it's not actually worse, it's the same thing.
John:
Right.
John:
And in many ways, it is more secure for it to be happening on device and not, you know, not sending it unencrypted across the wire or letting Apple see it or all those other things.
John:
So that inside the iCloud backup issue.
John:
But it feels worse.
John:
And this is a lot of Apple's problem with this feature, that every part of it, Apple is making the argument that this preserves your privacy better.
John:
But to the average person, when you explain it to them, it feels worse than the alternatives that Apple says are worse.
John:
But in the end, all of them are essentially some form of surveillance by your device, which is common practice and is the way that we use computers to try to catch criminals in this particular situation, which I don't know.
John:
Again, people who don't like this, okay, what should we do instead?
John:
Well, there's always the people who say, let's do nothing.
John:
I'm not in favor of that solution.
John:
And neither is pretty much anyone else in the tech industry.
John:
And if those laws come through that says Apple has to scan, they need a solution.
John:
And if you needed a solution, this one is in keeping with Apple's values, which is we'd rather do it on device.
John:
We don't want to, you know, we don't want to compromise our end to end encryption where it exists.
John:
In theory, this leaves Apple free to do end-to-end encrypted iCloud backups at any point in the future while still being able to say to government regulators, hey, we're still scanning for CSAM, right?
John:
You can't make us unencrypt our end-to-end encrypted backups because that's not stopping us from doing the thing you want us to do, you know, save the children and all that stuff.
John:
But from a feel perspective, I don't think this feels great to a lot of people.
Marco:
I mean, for me, the more I sit with this and the more we learn about some of the details...
Marco:
I'm put a little bit more at ease about it.
Marco:
The more I think about a lot of it, it's tricky because as you mentioned earlier, CSAM is a special case both morally for most people but also legally in most jurisdictions.
Marco:
Normally, you can take a picture of, in most cases, whatever you want, and it's generally not going to be illegal for you to even possess that picture. You know, there's things like copyright infringement that could be a problem, or other issues. But for the most part, most types of data are not themselves totally illegal to even possess, whereas this is. And
Marco:
I think the more life experience or perspective you have, the more reasonable you think that is.
Marco:
If you know at all how horrific this kind of stuff can be, then, yeah, you kind of realize why it should be illegal to even possess it.
Marco:
So...
Marco:
Apple then is in a tough position because they sell a billion devices that have really good encryption built in and really good privacy built in.
Marco:
And it gives their customers the ability to do a lot of illegal things to great effect.
Marco:
A lot of that you can look the other way and say, well, it's out of our hands, it's not our problem, and the good outweighs the bad.
Marco:
But there's always this big exception that what if you enable child abuse?
Marco:
That's a pretty horrible thing.
Marco:
And in this case, if you look at the way they designed this feature, we'll talk about potential future motives in a minute, but if you look at the way they designed this feature,
Marco:
They didn't design it to prevent the iPhone camera from capturing CSAM images.
Marco:
They didn't prevent other apps from transmitting them back and forth.
Marco:
We mentioned earlier about the iMessage feature.
Marco:
I think a lot of kids are going to be using other services to do that kind of thing, not just iMessage.
Marco:
Anyway, they did roughly the bare minimum they could do to keep themselves out of hot water with the CSAM scanning feature.
Marco:
The iMessage thing, that's a little bit different.
Marco:
What they did was keep themselves out of possession of data that's illegal to possess.
Marco:
So in that way, they clearly did a very narrow thing here.
John:
Well, they're not keeping themselves out of possession, because surely they possess tons of it now, and they're not going to find it unless it is uploaded to or downloaded from iCloud Photos.
John:
Do we know if they're going to be retroactively scanning existing libraries?
John:
I would assume they are.
John:
They won't be because that's the whole point.
John:
They're not doing server-side scanning.
John:
Now, they will scan a photo if it comes to or from a phone. But, I mean, say it goes off the device because you're optimizing storage, and then you pull it back in.
John:
But Apple has been very explicit that they are not scanning it server-side.
John:
Eventually, they'll probably get it all because if it's in an iCloud photo library and you load up a new phone or even just scroll a whole bunch or, you know, like things go back and forth from iCloud photo library all the time to the phone.
John:
And every time something goes back and forth to any device, a Mac, a phone... I don't know if it's on the Mac; this might be just iPadOS and iOS.
John:
But anyway, anytime it transfers, it is then scanned on that device.
John:
But they're explicitly not saying, oh, and by the way, we're going to go through a back catalog of all these iCloud backups that we have access to, because we have the keys, and scan all the photos.
John:
Right.
John:
So Apple will undoubtedly continue to be in possession of CSAM just as they are at this moment.
John:
But going forward, they are trying to catch any collection of it that starts to exist or that is newly downloaded to a new phone or a new iPad or whatever.
Marco:
Yeah, that makes sense.
Marco:
All right.
Marco:
But anyway, I think they are clearly trying to, for the most part, in most ways, still let your device be your device.
Marco:
In this case, they are basically mostly protecting themselves from being in possession of data that's illegal to possess.
Marco:
And so I'm a little bit...
Marco:
heartened. Did we figure out if that's a word or not? It is. I don't know why you doubt this. It is. All right. I'm a little bit heartened that they've done this in a relatively narrow way. You know, there's lots of ways that governments have applied pressure to tech companies that I think are a little bit more overreaching. Like, for instance, try scanning a picture of a hundred-dollar bill, or any euro banknote, you know, any modern banknote. Try scanning it and opening it up in Photoshop. See how far you get.
John:
Oh, that actually brings up the other big objection to this, which is the slippery slope thing having to do with governments.
John:
So we just described the feature.
John:
It's the NCMEC database.
John:
It's the comparison against things.
John:
One of the things people jumped on really early was like, first of all, how does stuff get into the NCMEC database?
John:
Because if it's totally opaque and Apple doesn't even get to know what's in there, you're just trusting NCMEC?
John:
What if someone...
John:
You know, someone from some company says, here, put this picture of our, you know, copyrighted image into your NCMEC database.
John:
So then we'll know if anyone shares our copyrighted image or whatever.
John:
And the second thing is, that's just one database.
John:
Next, it's going to be a database of anything, you know, movie companies are putting in databases of...
John:
movies and trailers, and we're just gonna find every... you know, it's gonna be copyright infringement and patents and all sorts of stuff. Apple will just take anyone's database and just compare against it, and all this stuff. There are lots of slippery-slope arguments there. Apple, for what it's worth, has explicitly said no: Apple itself is not adding stuff to the database, and it's not letting anyone else add stuff to the database. NCMEC's entire purpose in life is not to allow random companies like Disney to add pictures of Iron Man to the database because they don't want people sharing pictures of Iron Man. Like...
John:
It is very, very narrowly defined.
John:
Right.
John:
And Apple says, you know, that's the intended function of this feature.
John:
Right.
John:
Second part is, OK, well, but the government can make Apple do all sorts of things.
John:
And in fact, the government can make Apple not tell people about it.
John:
So what if the government makes Apple add pictures of secret like Pentagon documents that they don't want to be leaked or whatever?
John:
And we want them to be leaked, because they show, like, you know, abuses at Abu Ghraib or whatever.
John:
Right.
John:
The government can make Apple do that and the government can make Apple not say anything about it.
John:
All right.
John:
So the solution to the government being able to force companies to do things that we don't like, when you live in ostensibly a democracy, plus or minus voter suppression and gerrymandering and all the other terrible things that afflict this country, is that we change the government, and the government changes the laws. And we have things in the Constitution that prevent... you know, like there was a whole big argument about how the Fourth Amendment would prevent any sort of evidence gathered in this way from being admissible in court or whatever.
John:
But anyway, in the US, in theory... I'm being buried under a storm of asterisks here.
Marco:
They're just falling from the sky, just burying me under a pile of asterisks.
John:
Yeah, I know.
John:
But, but anyway,
John:
So in the US, in theory, we have a mechanism to stop that from happening.
John:
But what it comes down to is, yes, companies are subject to the government that runs the country in which they operate.
John:
And Apple, in the US, is subject to the US government.
John:
And the US government has a bunch of terrible laws.
John:
And it's very difficult to change those terrible laws.
John:
And we know all that.
John:
But that is that situation.
John:
But then one step up from that is, OK, let's say you're OK with the US and you think they're not going to do anything too terrible.
John:
What about in China?
John:
Well, I have some bad news, as we've discussed in the past.
John:
Apple has a China problem and the world has a China problem.
John:
And part of that problem is that China already has access to everything that Apple does in China, because China has made Apple put all their stuff in Chinese data centers where China holds the keys.
John:
Right.
John:
That's not a problem Apple can solve.
John:
The only way they can solve it is say we're either going to be in China and do what Chinese law dictates, which is essentially give China access to everything, which is what the situation currently is, or we don't do business in China, which is what some other companies have chosen to do.
John:
So that's the conversation you need to have there, which is like,
John:
First of all, China doesn't need to stick things in the NCMEC database.
John:
They have access to everything because they're an oppressive authoritarian regime, right?
John:
They've already done that.
John:
They probably have way better systems than this for, you know, keeping track of the dissidents and doing all terrible things that they do, right?
John:
That's terrible.
John:
That's also not a problem Apple can solve, and it's not made worse by this feature.
John:
So, like so many things...
John:
If you don't trust your government to not do oppressive authoritarian things, nothing the technology company that operates in your country can do will fix that.
John:
Like, Apple can't fix the U.S. government, except through lobbying and all the other ways they can fix it.
John:
But again, as all the asterisks falling down from the sky, from Marco, remind us: government problems need to have, unfortunately, government solutions.
John:
So...
John:
The reason technology is so difficult to regulate is because the issues are complicated and nuanced and there's lots of, you know, we have to do this because terrorism or save the children or whatever.
John:
So, "we need backdoors in all our encryption", and we continue to fight that as tech-savvy voters and consumers.
John:
But I think the most salient point here is that regardless of your dim view of the US government, and I think we all share that, we can say that in the US, our ability to change what the government can and can't do is way better than in China.
John:
And as we said at the top of this program, this policy is only in effect in the U.S.
John:
So if you see this and you think this is terrible, the government can make Apple do all sorts of sneaky things.
John:
A, I would say, yeah, the government could already make Apple do all sorts of things and force them not to tell you about it.
John:
This has already happened and will continue to happen.
John:
And if you don't like that, vote for people who want to change that.
John:
That's the only stupid tool we have to change that.
John:
You know, there is no complaining on Twitter about Apple policy that is going to change that.
John:
Because, believe me, Apple does not like being told to do something by the government and also being told that they can't tell anyone about it. Apple doesn't like that either, right? So if you don't like that, and if you feel bad about that, let's change the laws related to that. And again, in theory, the Constitution is some form of a backstop against the most egregious offenses, because certain rights are very difficult to change without a constitutional amendment, and yada yada yada, right?
John:
And then if you're worried that Apple is going to let China do whatever they want, they already are.
John:
Sorry.
John:
Right.
John:
And if you're worried that Apple is going to let some other country do whatever they want, this eventually comes down to the foundation of trust that we've talked about when discussing many features in the past, which is: in the end, you have to trust your OS vendor or your platform vendor with something.
John:
Because no matter what they do, like, oh, we have end-to-end encryption.
John:
Somebody writes the app that implements end-to-end encryption.
John:
And if you don't trust the person who's writing the app, even if it's open source, oh, I trust them because I can see the source code.
John:
Oh, really?
John:
You audited all those lines of the source code?
John:
If that were true, Heartbleed wouldn't have happened, right?
John:
In the end, you have to have some baseline level of trust of the person who is implementing your encrypted system, even if you agree with all of the, you know, the way it's supposed to work.
John:
That's what it always comes down to. Do you trust Apple to not secretly put pictures of Mickey Mouse and Iron Man into the database and find people who are illegally copying, like, movie trailers or something stupid, right? You either do or you don't. And if you don't trust them, who do you trust? Buy your phone from them instead, right? That's what it comes down to. Because, yes...
John:
Apple, whatever their encryption scheme is, in the end, the Messages app eventually has access to all of your messages.
John:
The mail app eventually, because it shows you them on the screen.
John:
Like they're in memory on the phone.
John:
Like the phone could be doing whatever it wants.
John:
Like it doesn't matter about all this encryption, provable security or whatever.
John:
Something has to decrypt them and send the information to your eyeballs.
John:
Right.
Casey:
Right.
John:
Right.
John:
That's true.
John:
But at each one of those stages, it's like, what can I do to change the government in China?
John:
What can I do to change the government in the US?
John:
And do I trust Apple to do something that's in my interest?
John:
On the plus column for Apple, they have proven in the past that they will resist US government pressure to do a thing that would be essentially a PR win.
John:
Oh, Apple was so great.
John:
They unlocked that terrorist phone for the FBI.
John:
Apple refused to do that, despite the fact that to many people it made them look bad.
John:
Oh, does Apple side with the terrorist?
John:
Do you enjoy, you know, the San Bernardino shooter?
John:
Is that your number one customer?
John:
You want to protect that person?
John:
Because there is a higher principle.
John:
So if you're worried that Apple would never do that, they have at least once, and probably more times, proven that they will do that.
John:
If you're worried that Apple is being forced by the government to do things and not say anything about them, yeah, that's probably happening.
John:
But nothing about what Apple implements can really prevent that.
John:
You could say, oh, if Apple didn't implement this feature, then they wouldn't have to bow to government pressure.
John:
No, because once the government can make you do stuff and not say anything about it, there are very few limits on that.
John:
And again, iCloud backups are not end-to-end encrypted.
John:
So already the government can probably force Apple to give them that information and not say anything about it.
John:
So I kind of understand the argument that tech companies shouldn't implement features like these because it makes it easier for the government to demand that they do things.
John:
But I don't really buy into it too much, because if your problem is that the government can make a company do a thing, the solution is not that tech companies should never implement features because the government can make them use the features for nefarious purposes.
John:
The solution is it shouldn't be legal for the government to use these to make companies do these things for nefarious purposes.
John:
And in general, it's not, except for in these quote unquote extreme circumstances, 9-11, never forget, where these laws can be used and abused to make companies do things because of terrorism, because of child sexual abuse and so on and so forth.
John:
And then finally, as we've been discussing the whole time, sometimes, as they say at the beginning of the show, certain crimes are particularly heinous.
John:
Right.
John:
Like I'm not getting the quote right.
John:
That's Law & Order: SVU, for people who are not getting the reference I'm trying to make.
John:
Sometimes there is what is the worst of the worst of the worst thing that society treats differently for reasons we all agree on.
John:
And in those particular cases, I think it is worth it to try to do something rather than doing nothing because you think the nothing will somehow protect you against an oppressive government.
John:
And I think it's way, way, way better than what governments might eventually force them to do if they hadn't done this.
John:
Yes, exactly.
Casey:
Yeah, my first reaction to this was, this is garbage.
Casey:
And the more I read on it, the more my reaction and my thoughts on it are calmed down.
Casey:
I think maybe, John, you're slightly, I don't know, underselling is the best word I can come up with.
Casey:
But I understand, I really, really, really understand, certainly after 2016 through 2020, I understand better than I ever have, that it is easy for us to lose control.
Casey:
This already sounds bad, but it's easy for us to lose control of our government.
Casey:
And by that I mean rational humans.
Casey:
And so...
Casey:
When one cannot fundamentally trust one's own government, which has probably been true my entire life, but it's only felt true in the last five-ish years, particularly 2016 through 2020.
Casey:
When one can't trust their own government, then it makes it hard to trust that they won't compel Apple to do this.
Casey:
And ultimately, as much as Apple will say, no, we will refuse, we will not capitulate, we will never allow this to happen.
Casey:
Even with that said,
Casey:
Ultimately, when it comes down to it, the government has guns and bombs.
Casey:
And not that they would literally bomb Apple, but if the government really went that haywire and really wanted to win this argument, they will win the argument.
Casey:
There is no ifs, ands, or buts about it.
Casey:
And the reason I think everyone's worried, including me, although by and large I'm not too upset about this anymore...
Casey:
But the reason everyone is worried is that before, there was no real mechanism that we knew of to scan your photos for content justified or not that someone has deemed inappropriate.
John:
There was, though, because Apple has access to all your iCloud backups.
John:
If the government came to you and said, hey, we want you to scan all of Casey Liss's photos...
John:
they could totally do it right now without this feature.
John:
Like, that's what I'm saying.
John:
This doesn't add any, you know what I mean?
John:
And that's, that's where we get to the end to end, uh, backup thing of like, of closing that door.
John:
But right now that door is not closed.
John:
Like, so like, I understand the argument, like if you add a feature, it makes it easier for the government to make you do a thing.
John:
But the thing that the government would make you do, they can already make Apple do.
John:
And they have been able to, and in fact they have actually done it.
John:
I'm pretty sure the government has used the law to get access to people's iCloud backups, right, with or without letting Apple tell you that it's happening. They do it all the time. That's already technically possible, right? The unlocking of the phone is like, oh, we just want to see what's on that phone. But if it was in the iCloud backup, we would have had access to it already. So, like, I know what people are saying, like, if you implement this feature, the government can force you to do it. But I don't think this strategy of we'll-just-never-implement-features-that-could-be-abused-by-the-government
John:
is a good one because almost any feature can be abused by the government and lots of useful features can be abused by the government.
John:
The solution to government abuse is government.
John:
Like, you know, part of the reason the Constitution exists, and the whole argument that I saw in some article or whatever, of, like, would the Fourth Amendment allow you to submit as evidence in any kind of criminal trial information gained by forcing Apple to scan things, like, you know, secretly or whatever.
John:
And like, you know, that's the reason we have courts and the Constitution and our laws and the Fourth Amendment to try to protect against those kind of abuses, to try to protect against the government saying, oh, we're just going to, the government's going to listen to everyone's phone calls.
John:
Oh, yeah.
John:
Does that sound familiar to anybody?
John:
Yep.
John:
9-11, never forget.
John:
Like this is a problem.
John:
But when I see this problem, I don't think this is a problem that needs to be solved by tech companies.
John:
It's not.
John:
It's a problem that tech companies live with.
John:
And I get that argument.
John:
But it really just sort of makes me even more hardened to fight against stupid laws, stupid politicians that appoint stupid judges through stupid processes that don't respect the will of the people.
John:
There's plenty of problems here, but the way I feel like attacking them is not through the tech stack.
John:
And living within those limits, I feel like this specific feature of the NCMEC database and...
John:
Scanning for CSAM on devices, against a collection of data that the government already has access to, is not a feature that worsens the situation.
John:
Like, I feel like, even acknowledging that, yes, our government is bad, it doesn't give the government access to anything they didn't already have access to.
Casey:
I do see what you're saying.
Casey:
I don't think I entirely agree.
Casey:
I think the rub for me is that, yes, the government could say, scan Casey's photos for such and such imagery.
Casey:
And presumably right now, because you two jerks made me join iCloud Photo Library, then it is hypothetically possible, sure.
John:
Your photos were all over Google before.
John:
They're scanning everything.
John:
Well, that's even worse.
John:
Your photos have already been completely scanned by Google.
Casey:
Oh, absolutely.
Casey:
Well, hopefully not anymore.
Casey:
Well, I guess it never really dies.
Casey:
We were getting sidetracked.
Casey:
Did you empty your trash yet?
Casey:
Yes, I did.
Casey:
Oh, all right.
Casey:
So the thing is that there was... you could argue, and nobody really knows, but you could argue that while it is easy for the government to say, scan all of Casey Liss's photos and look for such and such,
Casey:
I would assume, maybe ignorantly, maybe naively, that it is, or was until iOS 15, less easy for the government to say, hey, I would like to know, across all Apple users in the United States, who has a picture of Trump or whatever.
Casey:
And...
Casey:
Now, there is clearly a mechanism that Apple claims would never be used for this; that, you know, NCMEC, whoever they are, would never give us an image like that.
Casey:
But the technical solution to "show me all of the iPhones with a picture of Trump on it": they could hypothetically do that now in a far easier way than they ever could before.
Casey:
And what you'd said earlier.
John:
Do you not remember when the U.S. government was listening in on every single phone call in the entire United States?
John:
Does that not ring a bell?
John:
Do not underestimate the government's ability to do like, you know, well, they could target, they could tap my phone, but they're not going to listen to all the phone calls in the United States.
John:
No, they will.
John:
The government can absolutely look at every photo and every iCloud backup if they wanted to.
John:
They can look at every photo going across the entire industry.
John:
That's the power of our – that's our tax dollars at work.
John:
Are we making our own little oppressive regime under the guise of fear-mongering for terrorism?
John:
Those are all terrible things that have happened in our country.
John:
And are probably still happening.
John:
Exactly right. And, you know, that's again the difference with surveillance: technology enables surveillance. Like, there's plenty of sci-fi on this, right? Without technology, you have to look at just the one person. But technology is like, you know what, we can just look at everything, all the time.
John:
Why don't we try that?
John:
Like, that's why so many sci-fi stories have to do with the techno dystopia where, you know, the panopticon where you're being watched all the time.
John:
That's not possible with humans.
John:
It's very possible with computers.
John:
And so, you know, again, with the discomfort: Apple's solution is essentially surveillance, private surveillance by a private company of private people's stuff.
John:
Right.
John:
But government also does surveillance.
John:
And thanks to technology, they can also do it on a mass level.
John:
Right.
John:
And, you know, for all we know, the government is already doing this without Apple's knowledge, because that's another thing that our wonderful government does sometimes; see the phone tapping or whatever. And again, it's not a human listening, it's machines processing. That's always the way it is, the magic of computers. But that's why I think you have to look at these in terms of capabilities. If you are tasked with searching all photos for every U.S. citizen,
John:
your go-to is not, let's get something into the NCMEC database, right?
John:
Your go-to is not, aha, finally Apple invented this feature.
John:
We'll finally have our opening.
John:
No, you've long since implemented your own solution to this that is not Apple specific, that is not Google specific, that is not Microsoft specific, that spans the entire internet and has nothing to do with any specific feature a tech company had to build for you, right?
John:
And there's all sorts of conspiracy theories.
John:
You can think about how that might be done, but like,
John:
That's where I get to: you really need to look at the specific feature and ask, does this specific feature make it more likely that this bad thing is going to happen?
John:
And this specific feature in this specific case, I think doesn't because it doesn't provide any new capabilities and it doesn't even make it any easier.
John:
In fact, it's harder because of the limitations of this database and exact matches and so on and so forth.
John:
It's easier to just scan everything for, you know, anything you want, with your own scanning technique, not being as strict as saying it has to be in this fixed database or whatever, and doing it client-side, or scanning them all server-side, using whatever logic you want, right? Look for whatever you want. You're not limited by this feature. This feature is too limiting to be useful as a government tool. The government has much better tools already at its disposal that I feel like it would prefer, which is why this specific feature doesn't bother me. The broader question of, like, why is Apple implementing essentially surveillance features, is slightly bothersome, but I think that is mostly
John:
explained by the fact that they're essentially trying to be narrowly targeted, as Marco was saying before: narrowly targeted to their own apps, in the worst-case scenario, on a thing everyone agrees is awful and that has special laws already written for it.
John:
And so if you're going to be comforted by any of the narrowness, this has all the narrowness you could possibly imagine.
Marco:
Yeah.
Marco:
And to be clear, and Casey, I agree with your concerns for the most part.
Marco:
I think we all saw how big the mountain of asterisks on our government was, and not even just from 2016 to 2020; I would even say a lot of that happened from 2000 to 2016 as well.
John:
It happened much longer than that; it just started affecting white men recently, so now we all know.
John:
I mean, that's the truth of like, if you think you have distrust of government or distrust that government's going to do things that are in your best interest, you're very lucky if you just had that realization in the last decade or so.
John:
Most Americans have had that realization for way longer.
Marco:
Yeah, exactly.
Marco:
And so, you know, I have a slightly more, I guess, defeatist view on this, but, you know, I think that it enables a more clever solution, which is, you know, I think people keep saying like, well, this is okay in the US, but what happens if China gets a hold of this?
Marco:
No, it's not okay in the U.S. either.
Marco:
It's not okay for the government to have widespread surveillance powers in the U.S. either.
Marco:
And we have seen over and over again how the freedoms that we claim to have, the justice system that we claim works and is reasonably impartial, the oversight that government agencies are supposed to have over each other, the checks and balances — we've seen how all of that
Marco:
can just be forgotten and made exceptions for at the drop of a hat.
Marco:
And it isn't even just one or two bad presidents that get us there.
Marco:
We have so many exceptions on all those freedoms and protections that we think we allegedly have.
Marco:
And the reality is we have a government that largely does whatever it wants and that when bad actors come in –
Marco:
they are able to get a lot of bad stuff through our system.
Marco:
I mean, geez.
Marco:
I'm always reminded, imagine how dark things would have gotten in the last four years if they were actually competent.
Marco:
They were incredibly cruel and mean-spirited, but they weren't very competent.
Marco:
Imagine if they were also competent.
Marco:
How much damage could have been done?
John:
It would be like the Reagan years.
Marco:
So my point is, you know, if you desire — or if you need, for whatever you are trying to do — super privacy, if you want to have private conversations, say, about the government,
Marco:
You are not doing yourself any favors by having those conversations on public cloud services that are not end-to-end encrypted.
Marco:
There are lots of arguments about whether iCloud should have end-to-end encryption for everything.
Marco:
iMessage is, by default.
Marco:
Obviously, there's the issue of what happens when it backs up itself to iCloud, which is –
Marco:
I forget if that's the default now, but it can be turned off, and for a long time it didn't exist at all.
Marco:
Anyway, the point is: governments can be ill-intentioned, and over an infinite timescale, that's going to be every government at some point.
Marco:
If you want your data...
Marco:
to be above government surveillance, you have to take a purely technical approach to that, and use things like strong encryption — and even then hope that, you know, the NSA hasn't broken that encryption very easily in ways that you don't know about yet, or intentionally weakened it from the beginning without you even knowing. There's lots of — I love those conspiracy theories, and some of those you look at and you're like...
John:
It doesn't make me feel good.
Marco:
Yeah, exactly.
Marco:
But yeah, so the point is like if you want to get out of the potential for governments to abuse their power and for Apple to abuse its power and to work together to try to get you, the way you do that is –
Marco:
Using technological measures — using encryption and stuff where you are protected, again, assuming that it's good encryption and it hasn't been cracked or broken or sabotaged, by math and logic and provable things, not just policies.
John:
But the law outweighs math, though. I remember when there were export restrictions on heavy encryption, and, like, the PlayStation 2 couldn't be exported, or whatever it was. Yeah, that's what we keep getting at: if you don't do anything, the government will make some terrible law, like outlawing end-to-end encryption. So yeah, math is the protection against that. That's why Apple can refuse the FBI's requests, like, "We literally can't do that."
John:
It's physically impossible because of math.
John:
Right.
John:
But the government can always come back and say, oh, yeah, guess what?
John:
End-to-end encryption is illegal now.
John:
And that's super bad.
John:
So in the end, the solution to all this has to be a government-powered solution.
John:
In the interim, while we are protected by whatever crumbling foundation remains of our Constitution that protects our supposedly inalienable rights, as upheld by
John:
a bunch of lifetime-appointed judges who got there by an incredibly corrupt, terrible process.
John:
And many of them are themselves terrible people.
John:
Hopefully we protect enough of our foundational rights so that let's say if the government makes a terrible law that makes it impossible to provide any kind of secure communication, that that would be shown to be unconstitutional by someone who isn't an originalist.
John:
The founders never knew about encryption.
John:
This must be legal.
John:
What Marco's point, though, about talking about it on public clouds or whatever, gets at is a really good...
John:
aspect of this whole discussion that was brought up by the person who runs the Pinboard service and Twitter account.
John:
I forget this person's name.
Marco:
Yeah, there you go.
John:
I'm just going to read these three tweets because it basically summarizes it better than I could.
John:
This is directly from the pinboard tweets.
John:
The governance problem here is that we have six or seven giant companies that can make unilateral decisions with enormous social impact and no way of influencing those decisions beyond asking nicely for them to come talk to the affected parties before they act.
John:
So this is the problem of like, oh, so if you don't trust Apple, maybe you should try Google.
John:
Oh, if you don't trust Google, maybe you should try Microsoft.
John:
Oh, if you don't trust Microsoft, I'm running out of places to get my phone real quick.
John:
Didn't Stallman try to make a phone or something?
John:
Whatever.
John:
Yeah.
John:
Right.
John:
Large tech companies at this point can do things with their policy.
John:
Like let's let's say Apple implemented a bunch of these policies for child safety and they were much worse.
John:
They were, like, super harmful, and they did a much worse job of trying to balance the concerns — like, the chance of false positives was really high, and it was just going to be a disaster.
John:
You don't have a lot of recourse as a consumer.
John:
Because these companies get so big and so powerful and they all tend to do similar things.
John:
See all the other companies that are doing the server-side scanning. If you really don't like what they're doing — because they're not government entities — you can't vote them out.
John:
And quote-unquote voting with your wallet has a limited effect on them, unless you can get millions and millions of other people to do the same thing.
John:
And in the end,
John:
People tend to need in the modern era cell phones to just live their life.
John:
And if there are only a few sources of those cell phones, and all those sources agree that they're all going to do a thing you don't like, the idea of "well, I just won't have a cell phone" is very difficult to convince millions and millions of other people to adopt, to the degree that it affects the companies.
John:
So the general, we've talked about this before of like, why is it bad to have a small number of giant companies that control important aspects of our life?
John:
In general, it's bad.
John:
So continuing the pinboard tweets.
John:
The way we find out about these technology impacts is by rolling them out worldwide and then seeing what social political changes result.
John:
See also social networking, Facebook, so on and so forth.
John:
Sorry, I'm adding some commentary.
John:
I hope you can see which parts are mine.
John:
It's certainly a bracing way to run experiments with no institutional review board to bog everything down with pessimism and bureaucracy.
John:
So it's important to note like, yeah, private companies can do things more efficiently and
John:
in these regards, and sometimes it is better to not — like, this shouldn't be done through one governmental agency. Innovation is the reason we have all these great things from Apple and Microsoft and Google and all that good stuff, right? So, I'm continuing from Pinboard: "But the problem is there's no way to close the loop right now to make it so that if Apple or Facebook or Google inflicts huge social harm, their bottom line suffers, or their execs go to jail, or they lose all their customers."
John:
Profits accrue while social impacts are externalized.
John:
So say you start a social network originally to try to rate which girls in your school are hot or not, and eventually you end up fomenting genocide halfway across the earth.
John:
Does that affect your bottom line?
John:
Are you harmed by that?
John:
I guess it's a press relations issue.
John:
We can probably smooth that over.
John:
But when you get to the size of Facebook, if you accidentally foment genocide –
John:
But the loop is not closed.
John:
Those are the externalized harms.
John:
But your stock doesn't suddenly drop in half.
John:
You don't get fired.
John:
Nobody goes to jail.
John:
Maybe you get brought in front of Congress and they yell at you a little bit while you say that you can't remember or are just trying to do the right thing.
John:
But this is yet like, you know, I know we just got done talking about how Apple we think is mostly trying to do the right thing here.
John:
It's important for technology companies to do something.
John:
But let's not lose sight of the fact that
John:
Having a small number of gigantic, incredibly powerful tech companies is itself its own problem, independent of the problem of trying to have a government.
John:
Because as bad as the government system is, we have even less control collectively over what these companies do.
John:
In some ways, you may think we have more, because, like, oh, the citizens can make or break these companies.
John:
But...
John:
Practically speaking, especially in areas that have technical nuance, it has proven very difficult for consumer sentiment to close the loop — to say, hey, company, if you do a bad thing, you will be punished in a way that makes you motivated to not do bad things in the future. That loop tends to only work in terms of, like, products that explode in your hands, or, you know,
John:
supporting the worst of the worst possible politicians with your donations. But in general, if you do something and there's, like, a third-order effect — again, if you make Facebook and it accidentally foments genocide — most people are like, yeah, but that wasn't really Facebook's fault, and the genocide people were going to do a genocide anyway, and Facebook's trying to stop it. Like, the loop is not closed there, right? And so if there's something that all these big phone companies are doing with their phones that you don't like,
John:
It's not actually that easy to change that, especially if you don't like it, but no one else cares.
John:
You may be listening to this and saying, I'm never going to buy an Apple phone.
John:
They're spying on me.
John:
And so is Google and so is Facebook or whatever.
John:
But just try getting one of your friends who's not into the tech world to listen this far into this podcast.
John:
In general, what we've seen from technology stuff like this is that people just don't care.
John:
Like they just want their phone to work.
John:
They just want it to do their things as long as it doesn't bother them, as long as they're not falsely flagged for child sexual abuse material.
John:
They mostly don't care.
John:
So trying to affect the policies of these companies by rallying the people to refuse to buy Apple phones or Google phones or Microsoft phones or Android phones of any maker is really, really difficult, because, to paraphrase Singles, people love their phones.
Marco:
We are sponsored this week by Burrow, the company setting a new standard in furniture.
Marco:
Burrow has timeless American mid-century and contemporary Scandinavian styles for their furniture.
Marco:
It's easy to move and comes in modular designs.
Marco:
You can get it upstairs.
Marco:
You can mix and match different pieces and stuff.
Marco:
And it's all made from premium durable materials, including responsibly forested hardwood, top grain Italian leather, reinforced metal hardware, and everything you might expect out of high quality furniture.
Marco:
Their in-house design takes a research-driven approach to make sure their furniture fits your lifestyle.
Marco:
This translates to things like a simple mounting guide for the index wall shelves they sell, a tool-free assembly process, and of course, a modern, convenient shopping experience.
Marco:
They got rid of the faraway warehouse stores, the high-pressure showrooms, and they replaced them with modern, easy-to-use online shopping.
Marco:
Of course, that's what you want these days.
Marco:
You get to create and customize your own furniture without leaving your house.
Marco:
And there's free shipping for all.
Marco:
Every order, no matter how big or how small, is delivered directly to your door for free.
Marco:
This can save easily up to $100 or more when it comes to big stuff like couches.
Marco:
And all this is backed with Burrow's world-class service.
Marco:
You know, everyone needs a little help sometimes.
Marco:
The Burrow team is always available to lend a hand from custom orders to delivery scheduling, whatever you might need.
Marco:
So listeners can get $75 off your first order at burrow.com slash ATP.
Marco:
That's B-U-R-R-O-W, burrow.com slash ATP for $75 off your first order.
Casey:
Burrow.com slash ATP. Thank you so much to Burrow for sponsoring our show. Lalo Vargas writes: "Hello, friends. What's your current thinking on Bitcoin and crypto in general? I think I never heard you talking about nerd money. Do you hold any, without disclosing any amount? Any project in particular that you like? Thanks, friends." So, a couple things. First of all, let me try to be brief, which we never successfully do on this show.
Casey:
My thoughts on crypto are: you know, I think the heat death of the universe is coming fast enough without crypto; let's not accelerate it. But with that said, one of you — probably John — added two delightful links to the show notes, which are en.wikipedia.org slash wiki slash Ponzi scheme and slash pyramid scheme, which made me laugh more than I am comfortable admitting when I saw them in the show notes a day or two back. So...
John:
John, would you like to explain the relevance here? Yeah, John, do you hold any Bitcoin? So, we did actually talk about this on a past show, not that long ago, too. I put these links in there because it's fun — like, if you read the little summary on pyramid scheme, you would read it and say, okay:
John:
Technically, Bitcoin isn't a pyramid scheme, because a pyramid scheme is a business model that recruits members via promises of payments or services for enrolling other members into the scheme.
John:
It's like that's not how Bitcoin works.
John:
You don't get bitcoins for recruiting other people into Bitcoin.
John:
So that's it's not really a pyramid scheme.
John:
So let's look at Ponzi scheme.
John:
Is that what it is?
John:
A Ponzi scheme is a form of fraud that lures investors and pays profits to early investors with funds from more recent investors.
John:
It's like, well, that's not how Bitcoin works.
John:
When new people invest in Bitcoin, their money doesn't go to the early investors, like directly like it does in a Ponzi scheme.
John:
The reason I put these links in here, though...
John:
is that although Bitcoin technically isn't exactly the technical definition of a pyramid scheme, and technically isn't exactly the definition of a Ponzi scheme, it operates very much like them, in that thus far the only value inherent in Bitcoin is based on the speculation that the price of everyone's Bitcoin will go up. And so
John:
Getting more people to invest in Bitcoin and therefore making Bitcoin look more desirable does in fact benefit the early investors and quote unquote recruiting people into getting Bitcoin, just like in a pyramid scheme, does in fact raise the value of the people who already have Bitcoin and were in earlier.
John:
Right. Setting that aside, we've talked in the past about the mathematical foundations — like, oh, isn't it cool that you can have two people who don't trust each other exchange money without a central party mediating it? That technology is interesting. Unfortunately, it uses a lot of energy, and is really slow, and doesn't have good concurrency, and has all sorts of other problems, which makes it not that interesting for many problems — except for buying heroin.
John:
It's a great way for criminals that don't trust each other to exchange money in a way that's not observable by governments.
John:
So there is a use case for Bitcoin.
John:
It just happens to be a terrible one.
John:
If you are a criminal and don't want to use the banking system because you're doing something criminal, Bitcoin is a great way to do that.
John:
So what Bitcoin has enabled is a huge explosion in ransomware.
John:
Because guess what?
John:
You can get paid for ransomware anonymously through Bitcoin.
John:
It's way easier than trying to get the money any other way.
John:
Because think of what you have to do with ransomware without Bitcoin.
John:
You have to get someone to transfer money into like a numbered account in like Switzerland or something.
John:
It's like way more complicated.
John:
Bitcoin is so much easier.
John:
So that's why there is a huge explosion in ransomware.
John:
So what do I think about cryptocurrencies that use proof of work — or even the ones that don't, like... Yeah, proof of stake is the new one.
John:
Yeah, proof of stake is slightly better for the environment.
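[The energy cost comes from the proof-of-work mechanism itself: the only way to "win" is to try hashes until one meets a difficulty target, so every guess is burned electricity. A toy sketch — nothing like Bitcoin's real difficulty, block format, or double-SHA-256, just the core idea:]

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce such that the SHA-256 hex digest
    of block_data + nonce starts with `difficulty` zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # every failed guess is pure burned CPU work

# Each extra zero of difficulty multiplies the expected work by 16.
nonce = mine("example block", 4)
print(nonce)
```

[Raising `difficulty` by one makes the search roughly 16 times more expensive, which is the knob that turns "verify a ledger" into "burn measurable amounts of electricity."]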
John:
But the bottom line is like...
John:
Lots of bad uses are enabled. Most of the people who are into it — and the reason you see so much evangelism — is because the more people they can get into Bitcoin, the higher the value of Bitcoin goes, and that helps them with their investment. And those are all the earmarks of a pyramid scheme or a Ponzi scheme, even if it's not technically exactly the same thing. So:
John:
Are people getting rich off of Bitcoin?
John:
Yeah, people get rich off of pyramid schemes and Ponzi schemes all the time.
John:
That's why they exist because they make people rich.
John:
But they're not a great thing to get into.
John:
And the whole thing about Bitcoin is like, well — people thought it was about to reach the tipping point five years ago, but if you had...
John:
heeded that advice and not invested, you wouldn't be rich like I am now. That's true. That's true of Ponzi schemes and pyramid schemes, too. But it doesn't make me excited to get into them, because I am not planning on ransomwaring anything, I'm not trying to buy heroin, and I do not have confidence that, were I to put my life savings into some kind of cryptocurrency, I would not be the last person left holding the bag rather than one of those early investors who gets rich off of it. So...
John:
If you have gotten rich off it, congratulations, good job.
John:
But if you have not invested in Bitcoin, I would suggest that it is not a particularly safe place to put your life savings, given that no one really knows how and when this story will end.
John:
But most people are pretty confident that it will end in some way.
John:
And when it does end...
John:
You don't want to be the one, you know, left holding the bag. You don't want to be the one playing musical chairs with no place to sit down when all the other people cash out — the early people, if they haven't already — and you're left with a bunch of Bitcoin that becomes not worth all that much. And if you were wondering whether Bitcoin really has great utility and worth in the world, look at what people do with it: they like to exchange it for what I would call real nerd money, which is actual money that you can use to buy things.
John:
Wow.
Casey:
A couple of quick thoughts here.
Casey:
First of all, I think it was ATP 424 from April where we discussed this.
Casey:
I put a link to that in the show notes.
Casey:
And additionally, I do think as much as I snark on Bitcoin and crypto, I do think, and John, you alluded to this earlier.
Casey:
The mathematics behind it, or the principle of the mathematics behind it, I think are fascinating and very clever and very cool.
Casey:
And I've talked about this a couple of times, but there's a really good video by Blue31Brown or something like that.
Casey:
I forget the name of this person.
Casey:
Three blue, one brown.
Casey:
I was close.
Casey:
It's like a 25-minute video or something like that.
Casey:
But it is extremely well done and builds up from like, hey, how do you and a couple of roommates figure out how to settle up bills if you don't trust each other?
Casey:
And it basically ends up with Bitcoin.
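[The starting point of that buildup — roommates settling bills from a shared ledger instead of cash — can be sketched as simple netting. A toy illustration only (the names and bills are made up; the trustless, cryptographic part is what the video layers on afterward):]

```python
from collections import defaultdict

def net_balances(payments):
    """Each payment is (payer, amount, participants): the payer covered
    the bill, and everyone in participants owes an equal share.
    Positive balance = that person is owed money overall."""
    balance = defaultdict(float)
    for payer, amount, participants in payments:
        share = amount / len(participants)
        balance[payer] += amount          # payer fronted the whole bill
        for person in participants:
            balance[person] -= share      # everyone owes their share
    return dict(balance)

# Hypothetical roommate ledger.
bills = [
    ("Alice", 90.0, ["Alice", "Bob", "Carol"]),  # rent split three ways
    ("Bob",   30.0, ["Bob", "Carol"]),           # groceries split two ways
]
print(net_balances(bills))
```

[As long as everyone trusts the ledger, no money needs to move until settle-up; the whole point of the Bitcoin construction is making that ledger work when nobody trusts anybody.]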
Casey:
So as a...
Casey:
solution to a problem, I think it's very clever and very interesting. But as something that is using incredible amounts of power, is extraordinarily inefficient by and large, and is surely going to create a lot of email for us that we don't want —
Casey:
Not a fan.
John:
I mean, this is an example of externalities, right?
John:
So the technology is there and it's like, oh, and the externality that we just essentially made it profitable to burn energy, right?
John:
Because as long as you make more in Bitcoin than you spend in electricity, it is a profitable endeavor.
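[That profitability condition is just daily revenue versus daily electricity cost. A hedged back-of-envelope sketch — every number here (hash rate, reward per hash, price, power draw) is an illustrative assumption, not a real figure:]

```python
def mining_profit_per_day(hashes_per_sec, btc_per_hash, btc_price_usd,
                          watts, usd_per_kwh):
    """Daily mining revenue minus daily electricity cost, in USD.
    btc_per_hash is the expected BTC reward per hash attempted."""
    seconds_per_day = 24 * 3600
    revenue = hashes_per_sec * seconds_per_day * btc_per_hash * btc_price_usd
    energy_kwh = (watts / 1000) * 24
    cost = energy_kwh * usd_per_kwh
    return revenue - cost

# Illustrative rig: 100 TH/s drawing 3,250 W, with made-up market numbers.
profit = mining_profit_per_day(
    hashes_per_sec=100e12, btc_per_hash=1e-22,
    btc_price_usd=40_000.0, watts=3250, usd_per_kwh=0.10)
print(f"${profit:.2f} per day")
```

[Whenever that number is positive, burning the electricity is the rational move — which is exactly the externality being described: cheap power plus expensive coins makes energy consumption itself the product.]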
John:
So the
John:
unintended externality of these cool systems — for having zero-trust, no-middle-party relationships where you can exchange things — the externality is: wait a second, I think you just made it profitable to burn electricity. And they did, and people do, and it makes sense from a financial perspective. But from an Earth perspective, it's like:
John:
So what value are you creating?
John:
Well, it's kind of like a pyramid scheme and some people get rich.
John:
OK.
John:
And the cost is what?
John:
How much CO2 emissions?
John:
Oh, we only use solar power, spare energy.
John:
I'm not quite sure about that.
John:
Like the real question for all that is like, OK, look, if Bitcoin didn't exist, would that coal have been burnt?
John:
Like, where would that energy go?
John:
Like, you know, obviously the silly ones are like, there was a shutdown power plant and Bitcoin miners bought the power plant and turned it on.
John:
And all it does is run Bitcoin stuff all day.
John:
And by the way, it's making everyone's gaming cards more expensive.
John:
Can we at least, you know, agree on that?
John:
Well, even nerds should be able to say, it's not good that we can't get GPUs to play games.
John:
Games produce value in the form of happiness in the world, right?
John:
Actual and people get paid to make games like it's an actual economy.
John:
Bitcoin, for the most part, does not produce any value except for speculative investors making money at the expense of later investors and maybe some cool technical papers that help someone get their PhD.
John:
Wow, that sounds like a Ponzi scheme.
Marco:
Yeah, and I think, to me, multiple parts of this are offensive to me.
Marco:
First of all, I am in agreement with Casey and I think, John, that the technological concepts of Bitcoin
Marco:
Shared work like this, the idea of blockchain verification of transactions, that's a really cool set of technologies and approaches, and it's very clever and it's fascinating.
Marco:
But I think what that has enabled, if you look at the total, the net good and bad that have been enabled by cryptocurrency, I think the bad dramatically outweighs the good.
John:
It's not even close.
John:
The killer app of Bitcoin is literally ransomware, right?
John:
Exactly.
John:
And possibly also drugs.
John:
But I mostly just hear about the ransomware and the circles I travel like the net net is not good for the world.
John:
For individuals, it might be great.
John:
Some people who got rich, it's great for them.
John:
But for the world, it's super negative at this point.
John:
It's not even close.
Marco:
Yeah.
Marco:
And then my second major problem with Bitcoin is the people.
Marco:
Now, I know we're going to hear from a few of them, and I'm going to tell you right now, I don't like you.
Marco:
And if you write to me and say bad stuff, I won't care.
Marco:
I don't like you, because what you most likely are, if you're into Bitcoin, so A, you are very likely to be a person who is willing to make the world a slightly worse place.
Marco:
whether it's through carbon emissions or through participating in a system that enables a lot of illegal and damaging activity, whatever it is, you're willing to make the world a little bit worse place to make a buck.
Marco:
And that tends to attract not the best people to that area.
Marco:
Now, when you combine that factor with, okay, I know I am a privileged white man in tech, but can I use the word tech bros?
John:
I hope so.
John:
I think so.
John:
You're old enough now that I think you're allowed... Now you're an old white man in tech, so you can say tech bros.
Marco:
And I don't think I ever was a tech bro, necessarily.
Marco:
I was near... You totally... You 100% were.
Marco:
No, I... No.
Marco:
I was near that area, but I don't think I would... I was never... I wasn't, like, you know, one of those people who would, like, go on stage at TechCrunch Disrupt and pitch my startup that's going to change the world.
Marco:
That was never me.
John:
Yeah, but you were in a startup that changed the world, so...
Marco:
No, I was going to start.
Marco:
Yeah, I don't know if it changed the world.
Marco:
Anyway, I wouldn't.
John:
I would never change the world of porn.
Marco:
Okay, I would never claim that.
Marco:
And that actually mostly happened after I was gone for the record.
Marco:
Anyway, so anyway, so I the world of tech bros is a world that I don't usually get along with very well.
Marco:
It's all those people who the Silicon Valley TV show makes fun of and they don't think it's funny.
Marco:
It's that crowd, right?
Marco:
So what Bitcoin and cryptocurrency in general?
Marco:
The kind of people that attracts.
Marco:
It's the combination of tech bros, which are largely a pretty terrible group of people in some relatively minor ways, but mostly still terrible people.
Marco:
The intersection of tech bros with a far worse group of people, finance bros.
Marco:
Oh, and living in the New York metro area, I see a lot of these people. Oh my God, they're the worst. So when you combine tech bros with finance bros and libertarians — yeah, and libertarians, that's a whole other thing — when you combine these groups of people, and especially when the prospect of making money pulls in the finance bros and converts some of them into wannabe tech bros,
Marco:
And so the intersection of this produces just the worst group of people ever.
Marco:
Like you do not – oh, and of course all the profiteering people who will burn carbon to make a little bit of money.
Marco:
So like this is – the collection of these people is just the worst people.
Marco:
And so cryptocurrency as a thing, while I think it's an interesting concept, the realities of the kinds of people who are mostly into it and the kinds of people it attracts and the kind of usage it attracts are so terrible.
Marco:
Yeah.
Marco:
And both from an annoying point of view and from a world-damaged point of view.
Marco:
So it's a terrible thing that it has actually produced.
Marco:
These are all the worst people that have invaded our industry and taken over all the GPUs and everything.
Marco:
They're making the tech industry worse.
Marco:
And they're a burden on us.
Marco:
They're a burden on the world.
Marco:
I don't see any benefit to it.
Marco:
So to answer the question...
Marco:
I don't hold any cryptocurrency.
Marco:
And I'm not a big fan.
John:
I'm not as harsh as Marco.
John:
I think I talked about this before when we first talked about Bitcoin.
John:
When a new technology comes out, it's natural for nerds to be curious in it.
John:
So if you got a bunch of Bitcoin, especially because you thought it was a cool technical thing or whatever, and play with it, and hey, especially if you made a bunch of money off of it because you mined Bitcoin back when it was easy and they became worth a lot of money, great, more power to you.
John:
Yeah.
John:
Especially in the beginning, it wasn't clear how this was going to turn out. It's like any new technology, and as tech enthusiasts, we're interested in new technologies, right? I mean, when Bitcoin first came out, I downloaded the software and tried mining for it. I never actually got any Bitcoin, so I don't have any — I have never owned any — but it's a technical curiosity. And so if you became rich off Bitcoin,
John:
I say more power to you.
John:
You found a way, hopefully use that money for something good.
John:
You use it to have a happy life and to support your community and your family.
John:
Like, kudos, right?
John:
But what Marco is talking about is like at this point, today, 2021.
John:
Correct.
John:
The footprint of cryptocurrency and understanding what it is, what it's good for, what it's not good for, and what you have to do to make money off of it is much more clear now than it was.
John:
And so I would say if you have Bitcoin...
John:
I'd be looking to make the most advantageous exit possible.
John:
And I would say that if you're super enthusiastic about the sort of utopian possibilities of cryptocurrency, try to come up with one that's better than the ones we have now — which, to the credit of a lot of people involved in this, they do try.
John:
That's why proof of stake exists instead of proof of work, right?
John:
People are trying to improve it.
John:
But Bitcoin gets all the press because it's like the one that sort of broke through is the most popular.
John:
It has a lot of mystique around it, and when a lot of people say cryptocurrency, what they really mean is Bitcoin. And Bitcoin has a lot of bad externalities, and I would not suggest anyone get into it. If you got rich off it, great. If you're trying to improve it or do something better, that's good. But at this point, like...
John:
At this point, it's like, you know: I'm going to make a cryptocurrency, and I'm going to convince a celebrity to endorse it, because they don't understand the tech, but I'll just tell them that it'll make them money. And it actually will, because a celebrity will give it publicity.
John:
And it actually will because a celebrity will give it publicity.
John:
And then the early people who have most of the coins will make money.
John:
And like, it's just another way to scam people out of money.
John:
to scam investors out of money.
John:
It's a tale as old as time.
John:
This is not new — what you see happening with Bitcoin always happens with financial instruments. Like, look at the various financial crashes caused by those tech bros that Marco doesn't like, right?
John:
Like look at the various financial crashes caused by those tech bros that Marco doesn't like, right?
John:
It's just that now there's a different angle on it.
John:
And that is just generally distasteful and bad.
John:
I will say though, I do have some cryptocurrency.
John:
Back in the early days of crypto, some cryptocurrency company was giving out free cryptocurrency for signing up to the website.
John:
And I did that and I got free cryptocurrency, which I still have.
John:
And it just sits there as a number, and it's not a very big number, but I never do anything with it or look at it, because it's not worth enough money to cash out. And if it is someday worth enough money to cash out, I'll cash out and be like one of those people who says, oh great, you got rich off cryptocurrency. But it's probably never going to be worth any money, so I just ignore it. I do have some of it, though. I did actually have to declare it on my taxes as an asset or whatever, because it's above, like, whatever the
John:
200 dollar limit or something, so we had to have our accountant go through all this or whatever. So it's an official thing that I own, and if it ever becomes worth millions of dollars, you can bet your butt I'm going to cash out of it and take that millions of dollars. But I got it for free, and it's not a thing that I use as an investment instrument. I do not use it to do any transactions. I don't do anything having anything to do with crypto.
Casey:
Richie Hironian writes, I know the clever trick to limit iCloud photo library disk usage by creating a separate APFS volume or disk image.
Casey:
Recently, I noticed that Messages was using almost 100 gigs on my 256 gig SSD.
Casey:
That is not desirable.
Casey:
I did a bit of research but couldn't find a similar trick to limit Messages disk usage.
Casey:
I think it's a little more complicated since message attachments are somewhere under the library folder.
Casey:
Any insight here?
Casey:
Yeah.
Casey:
Yeah.
Marco:
You could probably find the folder that they are being stored in, deep within Library, whatever.
Marco:
You could probably use a symlink trick to symlink that into a disk image that is limited, or an APFS volume, however you want to do it.
Marco:
So that's the first thing I would try.
Marco:
But also, I would also ask...
Marco:
Does Richie not use iCloud for message attachments?
Marco:
Because message attachments can be very big.
Marco:
It is, in some ways, a photo library.
Marco:
Actually, I don't think we even heard whether those are being scanned for the CSAM.
Marco:
Oh, interesting.
Marco:
I'm kind of surprised we didn't hear that, actually.
Marco:
Anyway...
Marco:
But if you store your messages attachments in iCloud, I bet it offloads them pretty soon when you're low on space.
Marco:
And so it probably doesn't keep that big of a cache.
Marco:
Because I use iCloud for iMessage attachments.
Marco:
And as I scroll up through messages, it has to page them in, loading them off the network after a while.
Marco:
Because it's not keeping all those attachments locally.
Marco:
Whereas before iCloud Photo Library, I remember that was always a big chunk of my iPhone storage space.
Marco:
like you'd see messages and it would be like, you know, five gigs, 10 gigs, whatever on your phone because it was all those attachments like historically over time.
Marco:
So if Richie does not use iCloud storage to store message attachments, I would suggest trying that, or considering that if this is going to be a big problem, like if you can't just get rid of these attachments because they actually are, like,
Marco:
the only copies of them.
Marco:
But if this is just, you know, some disk quota not being enforced well, and they aren't being paged off as they get older, I would attempt some kind of symlink trick into a disk image.
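As a concrete sketch of what Marco is suggesting here, and a sketch only: the attachments path, the 20 GB cap, the image location, and the volume name are all my assumptions, not anything from the show, and Messages may not tolerate this. Quit Messages and have a backup before trying anything like it.

```shell
# macOS-only tools (hdiutil), so bail out gracefully anywhere else.
[ "$(uname)" = "Darwin" ] || { echo "macOS only, skipping"; exit 0; }

# 1) Create a sparse APFS disk image that can never grow past 20 GB.
hdiutil create -type SPARSE -fs APFS -size 20g -volname MessagesData \
  "$HOME/MessagesData.sparseimage"

# 2) Mount it; it shows up at /Volumes/MessagesData.
hdiutil attach "$HOME/MessagesData.sparseimage"

# 3) Move the attachments folder onto the capped volume...
mv "$HOME/Library/Messages/Attachments" /Volumes/MessagesData/Attachments

# 4) ...and leave a symlink behind so Messages still finds its files.
ln -s /Volumes/MessagesData/Attachments "$HOME/Library/Messages/Attachments"
```

The separate-APFS-volume variant from the question would replace steps 1 and 2 with something like `diskutil apfs addVolume disk1 APFS MessagesData -quota 20g` (container name assumed), which enforces the cap at the filesystem level instead of via a disk image.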
John:
I think it's pretty brave to try the symlink trick.
John:
I generally don't want to mess with, especially now with the containerization and the various container folders, don't want to mess with the internal structures of these iCloud powered apps just because it's not as straightforward as it used to be.
John:
The library folder used to be much more tractable, but now with the advent of the container system with iCloud stuff, it's a little bit...
John:
sketchier. The problem with Messages, the problem with a lot of Apple apps, is that it's quote-unquote supposed to manage your disk space in an intelligent manner, by purging things that are safely ensconced on the server. Which, to Marco's point, means you basically should enable the iCloud Messages sync thing, which means the government will be able to look at all your messages in your iCloud backups too, which is nice.
John:
But yeah, that's the consequence of that.
John:
And Apple's solution to this in recent years has been, one, the iCloud Messages thing, which helps solve this problem if it operates correctly.
John:
If it doesn't, there's nothing you can really do except cross your fingers and hope the crap gets purged.
John:
But two, they added, with a surprising amount of fanfare a couple of years ago,
John:
the ability to tell messages to trim off attachments older than some date, right?
John:
Because this was a big problem on a lot of people's phones.
John:
They were filling their phones with message attachments.
John:
Eventually you just fill it, right?
John:
So your choices are either get that stuff into the cloud so you can purge it from your phone and not lose it or delete it from your phone.
John:
And Apple did both.
John:
They came up with a, you know, messages in the cloud feature.
John:
That's the cloud version.
John:
And they also came up with features in the messages app that will let you delete that crap.
John:
They're doing that in Reminders now, too.
John:
It used to be the reminders would just pile up forever, which reminders are obviously tiny.
John:
They're not like photos, but eventually after, you know, 10, 15 years of the iPhone, people have a lot of reminders too.
John:
So the features to delete them will clean it up.
John:
Uh, ironically, the thing that deletes your data, like, Oh, delete all attachments older than a year.
John:
That will probably actually clean your space up as soon as you activate it.
John:
Whereas the iMessage and the cloud thing, you activate it and then you just wait, I guess, and hope that something eventually purges crap from your phone.
John:
But yeah, the solutions are not great.
John:
But I think those are the solutions for you.
Casey:
And finally, Andrew Nelson writes, what camera or lens should I rent for a Disney World trip?
Casey:
I want good at low light, great and fast autofocus, some water resistance in case of unexpected rain, better sharpness and bokeh than the iPhone 11 Pro, easy to use, and a good battery.
Casey:
Andrew does not care about long zoom raw or touching up pictures.
Casey:
I will answer this because I can be very quick.
Casey:
What you want is your iPhone 11 Pro, because I just went to Disney World a couple of years ago; we went in late 2019.
Casey:
I did bring my big camera and on a couple of occasions, it was very useful because
Casey:
But by and large, and maybe it's just because my big camera, which is an Olympus OM-D E-M10 Mark III, I think I have that right.
Casey:
Maybe it's just my particular big camera, and that's the crux of the issue, so maybe I'm being unfair.
Casey:
But in my opinion, the iPhone, particularly with HDR, which I'm waiting for John to jump in and tell me that his big camera does all these things.
Casey:
But
Casey:
The HDR on the iPhone is really impressive, particularly for outdoor shots where you're trying to get a decent sky that's not blown to smithereens, as well as your subject matter.
Casey:
Plus, the low light on my iPhone is actually quite a bit better than it is on my Olympus; here especially is where John is going to say, oh, not so fast.
Casey:
But in so many ways, it was just a pain in the butt to carry anything bigger than an iPhone onto rides or anywhere else.
Casey:
So even though I did have my big camera with me pretty much always, I should have actually gone and looked to see how many pictures I took with each.
Casey:
But my gut tells me 80% of the pictures I took on the most recent Disney World trip I had were with my iPhone.
Casey:
And in fact, it was either 10 or 11, whatever was current at the time, in late 2019.
Casey:
In fact, probably 20 percent at most were taken with the big camera, and I think even that is optimistic; I think it was probably like 90/10. I'll come back to you, John, since you have more Disney experience than Marco. Marco, do you have any thoughts on this real quick?
Marco:
Yeah, it was funny. Because you look at Andrew's list of wants, and it's everything iPhones are great at: low light, good autofocus, water resistance, ease of use, battery. And then Andrew says: don't care about long zoom, raw, and touching up pics.
Marco:
So initially, I'm like, well, okay, just iPhone, really.
Marco:
But unfortunately, in the middle of the want list, Andrew says, better sharpness and bokeh than iPhone 11 Pro.
Marco:
Okay, so first, I mean, the smart-ass answer is, go rent or buy an iPhone 12 Pro Max, which is honestly...
Marco:
probably the best answer if you actually just don't want to use your iPhone. So I have a couple of alternatives here. Because, see, the not caring about long zoom and raw, that to me really says: all right, you want pictures that are just great right out of the camera without a lot of effort, you want an iPhone. But if you actually want significantly more resolution and better, like, actual optical background blur,
Marco:
better than what an iPhone can do with its weird simulated background thing that blurs your ears off.
Marco:
Here are some good options.
Marco:
So from cheapest to most expensive.
Marco:
And also, Andrew says rent, which is good.
Marco:
So from cheapest to most expensive, the cheapest option is still just use your iPhone, but get a nice lens that you can clip onto it.
Marco:
A decent telephoto lens that gives you like a 2 to 4x kind of zoom range.
Marco:
I don't really know what's out there in this area, but that will give you better background blur because that's the principle of how those optics work.
Marco:
you get really good background blur if you use a very, like, you know, the longest telephoto lens you can get, and you get as close to the subject as possible, then you will get really good background blur.
Marco:
And there's other factors, of course, but that's what's going to be relevant here.
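The principle Marco is describing can be written down. In the standard thin-lens approximation (my gloss, not anything from the show), a point far behind the subject is rendered as a blur disc of roughly

```latex
% f = focal length, N = f-number, s = subject distance,
% m = f / (s - f) = subject magnification.
% Blur disc (on the sensor) of a background point at infinity:
b \;\approx\; \frac{f}{N}\, m \;\approx\; \frac{f^{2}}{N\, s} \qquad (s \gg f)
```

so a longer focal length (f up) and a closer subject (s down) both grow the blur, which is exactly the "longest telephoto, get as close as possible" advice above.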
Marco:
And that's, you know, those lenses are like, you know, $30 to $50 for the various clip-on things.
Marco:
I know that the Moment case and lens
Marco:
assembly together is a little more expensive, but tends to be pretty good quality. We have, um, Tiff wanted a macro lens to photograph our butterfly caterpillars that we are raising here. Don't worry about it.
Marco:
And we tried different options, and I went on Amazon and just found one that was well-reviewed, and it was like $30, and it's like a clip-on thing.
Marco:
So you just clip it onto the phone.
Marco:
You align it on top of the main camera of the cluster, and it just works.
Marco:
And that was great, and it was inexpensive.
Marco:
So $30, you get something like that, but get like a telephoto lens, and that'll give you what you want.
Marco:
Otherwise, use your iPhone.
Marco:
Now, the next most expensive option is to actually do what Andrew asked for and actually rent a camera or lens.
Marco:
I would say, because zoom is not one of Andrew's priorities, get a fixed-lens compact camera.
Marco:
And again, rentals make this easier.
Marco:
Now, the water resistance thing makes some of this a little trickier.
Marco:
So I will say rent a camera that is not water resistant, hope it doesn't rain, and get the insurance plan.
Marco:
Because, you know, LensRentals is where I've rented from before, and they, and pretty much anywhere else you can rent a camera, will have some kind of somewhat pricey insurance plan you can add on that will cover all risk. So you can drop it in the ocean and you won't be responsible for all of it, or some of it, or whatever. So I would say rent whatever you want and get the insurance, and then water resistance is kind of checked off the list. Okay, so as for what you want,
Marco:
What I would suggest, having never used either of these cameras, is, at the low end, the Fuji X100F, because it's a fixed-lens camera. Fuji, I've found, is very, very good at getting really good pictures right out of the camera with no post-processing whatsoever. They usually have really good JPEG rendering,
Marco:
really good color rendering.
Marco:
It's just, it's very, very good for low effort, good shots.
Marco:
And I, while I've never owned this particular camera, Fuji cameras tend to have very good reviews for things like basic usability, ergonomics, stuff like that.
Marco:
It's also reasonably compact.
Marco:
But yet, it is going to give you a significantly better optical setup than you can get from an iPhone for things like total resolution and background blurability optically.
Marco:
And then for the high-end option: LensRentals has the X100F for about $83 a week.
Marco:
So the high-end option for about three times that is the Leica Q2.
Marco:
I have never owned a Leica camera.
Marco:
I have rented Leica cameras before.
Marco:
I have briefly used a Q1.
Marco:
I have not used a Q2, but the Q series of Leica cameras is delightful to use.
Marco:
They're extraordinarily expensive to buy, but if you're going to be renting one for a short trip, it's $250 plus whatever they want for the insurance, so...
Marco:
You're probably looking at $350.
Marco:
So again, not cheap.
Marco:
And you're really getting close to just the buy an iPhone 12 Pro Max territory.
Marco:
But what you get with the Leica cameras in my experience is, again, really good JPEGs right out of the box with not a lot of messing around.
Marco:
you do have amazing optics, amazing resolution.
Marco:
You have, you know, great ability to get good blur, even with this relatively wide-angle lens.
Marco:
Um, and they're just fun to use.
Marco:
They're very fast and responsive.
Marco:
And that's something that's really hard to find in full frame cameras.
Marco:
But here it is: the Leica Q2 has that.
Marco:
Um, so anyway, that's my recommendation.
Marco:
But again, I would go with Casey and suggest
Marco:
just getting maybe a fun little clip-on lens for your iPhone and a battery pack might be the better approach.
Casey:
For what it's worth, at lensrentals.com, which both Marco and I have used in the past, and although they've never sponsored, I definitely recommend them.
Casey:
They're excellent.
Casey:
The LensCap Plus coverage, which is the most expensive, I don't know exactly what it covers, for the Leica Q2, it's $60.
Casey:
So that brings the rental price from $257 for a week to $317 per week.
Casey:
which is not cheap.
Casey:
And like you said, we're talking about at this point, you know, why not just buy yourself a new iPhone?
Casey:
But I do understand what you're saying, and I do like the idea of what you're saying there, Marco.
Casey:
But I stand by iPhone's the way to go.
Casey:
John, what do you think?
John:
So this list of criteria is a little bit odd because it doesn't have any kind of weighting.
John:
So both of you said like, oh, the iPhone's good at low light.
John:
That's true as long as your subject is not moving.
John:
The way the iPhone gets good at low light is by taking 100 pictures and combining them together into one picture.
John:
If you're trying to take a picture of a kid running through some dimly lit ride, you're going to get nothing because the sensor on the iPhone is tiny.
John:
It does not gather a lot of light.
John:
Computational photography is doing all the heavy lifting on low light.
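The arithmetic behind John's point, as a quick aside: averaging N aligned frames whose noise is independent, with standard deviation sigma per frame, cuts the noise by the square root of N,

```latex
% sigma = per-frame noise, N = number of aligned exposures combined
\sigma_{\text{stacked}} \;=\; \frac{\sigma}{\sqrt{N}}
```

so combining 100 exposures buys roughly a 10x noise reduction, but only when the frames line up, i.e. when the subject (and your hands) hold still.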
John:
Now, that said, maybe you think your subject won't be moving and all you want to take is pictures of people standing and smiling in front of things.
John:
then the iPhone is good at low light again.
John:
Congratulations, right?
John:
But good at low light, it's listed first, but if that is your number one priority, to actually get a camera that is good at low light, you need a much bigger sensor.
John:
And because this list isn't prioritized, it's like, okay, but how good at low light?
John:
Like, do you need a full frame camera?
John:
Do you want medium format?
John:
And when I get into stuff like this, there's one thing that wasn't listed, which is like,
John:
Reasonable size is listed nowhere. So it really opens the door to, like, do you want to carry a gigantic 50-pound camera? I don't think you do, but you didn't list it in your wants, so it's hard for me to say what I should recommend. Because what this list says to me is: I don't actually mind if it's kind of a big camera.
John:
Like size and portability and convenience, like that wasn't listed, right?
John:
The next item, great fast autofocus.
John:
This is where I start to get into the cameras I have the most experience with.
John:
Sony has one of the best, if not generally agreed upon to be the best great fast autofocuses in the entire industry across almost their entire camera line.
John:
It's really, really good about finding the thing you want to focus on and latching onto it really, really quickly and not letting go.
John:
That's, like, the major selling point of the software side of the Sony cameras; it's really, really good. Lots of Sony cameras are water resistant. And then, better sharpness and bokeh than the iPhone 11 Pro: yes, a newer iPhone is the snarky answer, but really, it's not that much better. That makes me think you want a real camera, because if you want actual optical depth of field, you need actual optics, which means you need an actual camera.
John:
And so given that, given that you didn't say, like, it's not super important to have the smallest, lightest thing, I'm setting aside the cameras that Marco recommended, which is like the little compact, all-in-one, non-interchangeable lens cameras, because that wasn't listed in the criteria.
John:
So why would you pick that camera unless compactness is one of your priorities?
John:
Which leads me to considering the Sonys that I have the most experience with.
John:
And especially if you're willing to rent, I would say that makes it even easier.
John:
Now, you say long zoom is not important, and I know from experience of taking pictures at Disney World that it's not, but having a prime lens can be limiting.
John:
Because you won't know what focal length to pick. Maybe you want a big picture of, like, the big ride: oh, here's Space Mountain or the Matterhorn or whatever. And then in another situation, maybe you want a picture of just your kid, right? You probably need some kind of zoom range to say, this is a wide shot versus this is a tighter shot, right? So it's going to be really difficult to pick a single focal length. So what I'm saying is,
John:
get an interchangeable lens camera with a pretty big sensor and a decent lens that has a reasonable zoom range, not a long zoom.
John:
It's not going to zoom in probably any farther than, you know, the average camera, but you really want that range.
John:
Maybe you'll even find yourself in a cramped situation where you want to take a picture of your family, and they're all in front of you, and you're like a foot away, and you want to get the whole family in.
John:
Now you need a wider angle.
John:
Right. So the range that the iPhone does is a reasonable range, but I think the iPhone falls a little bit short on getting a picture of your kid on the Dumbo ride, because they might be far away from you, like behind the little barriers where you have to get the picture from. So you need some kind of zoom range. So my main recommendation, and this is based both on my experience, my very limited experience, and on the reason I bought this camera, is:
John:
If you're going to rent, get the Sony a6600. There are better, cheaper options if you're going to buy.
John:
But if you're going to rent, it's probably not that much more expensive to rent the a6600 than the a6500 or a6400 or a6100.
John:
So get the a6600.
John:
It comes with the amazing fast autofocus system.
John:
It is weather resistant slash...
John:
It's water resistant, right?
John:
So it actually is kind of weather sealed.
John:
And so is the lens I'm going to recommend you get for it and get the Tamron 17 to 70 lens, which has a great zoom range, is an amazing lens and is weather sealed.
John:
And it's not that big, but the sensor is way bigger than an iPhone.
John:
It has way better low light performance than the iPhone with any subject that moves in any way, including your hands shaking, right?
John:
Because the sensor is so much bigger.
John:
And the step up from that, I would say, is the a7C, which has a full-frame sensor, the same exact size body, same great autofocus system, same weather resistance.
John:
And you can get the same exact Tamron 17 to 70.
John:
Actually, no, it's not full frame.
John:
You can get the full frame equivalent of that lens from a different manufacturer.
John:
Um,
John:
And use that on the full frame 7C.
John:
But the camera that is literally sitting on my desk here right now, the a6600 with the Tamron 17-70 will absolutely cover all of your actual photography needs.
John:
And it will take way better pictures than any of the cameras recommended so far at a similar price.
Marco:
Thanks to our sponsors this week, ExpressVPN, Memberful, and Burrow.
Marco:
And thanks to our members who support us directly.
Marco:
You can join at atp.fm slash join.
Marco:
Thanks, everybody.
Marco:
We will talk to you next week.
Marco:
Now the show is over.
Marco:
They didn't even mean to begin.
Marco:
Because it was accidental.
Marco:
Oh, it was accidental.
Casey:
Accidental.
Marco:
John didn't do any research.
Marco:
Marco and Casey wouldn't let him because it was accidental.
Marco:
It was accidental.
John:
And you can find the show notes at ATP.FM.
Marco:
And if you're into Twitter, you can follow them at C-A-S-E-Y-L-I-S-S.
Marco:
So that's Casey Liss, M-A-R-C-O-A-R-M-E-N-T, Marco Arment, S-I-R-A-C-U-S-A, Syracusa.
Casey:
Yeah.
Casey:
T-A-M-R-O-N?
Casey:
Tamron?
Casey:
Is that what you said?
Casey:
I've not heard of that.
Casey:
1770.
Casey:
Alright, I'm trying to get some
John:
Yeah, that's... Sigma and Tamron are, like, the two big, kind of third-party lens makers for most of the SLRs and stuff. (Gotcha.) Yeah, the 17 to 70. I should have recommended, I mean, I'm still in the show so it's fine: the Sony 16 to 55 is actually a better lens, but it costs twice as much. But again, if you're renting, maybe that doesn't make a difference, so consider that as well. I mentioned the Tamron just because it has a slightly bigger range and it's cheaper, and if that factors into the rental at all, then do that. But the
John:
Sony 16 to 55 is actually slightly better, a tiny bit more compact, and if you're renting, it's probably, like, only five bucks more or something. (See, I don't know how you would want to lug around a full-frame, like, Sony interchangeable setup.) Well, they didn't list compact size.
John:
Like they didn't say it has to be small enough to fit in my thing or whatever.
John:
And having lugged around a camera of this exact size on a extended Disney vacation, I can say it wasn't that bad.
John:
Like these are compact cameras.
John:
They're small.
John:
Like the a7C is the same size body.
John:
They're small for interchangeable lens cameras, but they're not small compared to an iPhone.
John:
But I don't think they're that bad to lug around.
John:
Even in the million-degree heat, even with, like, a backpack on and everything.
John:
I did it.
John:
I was fine.
John:
I survived.
John:
And so if you're not going to list compact size, then you're going to get recommended larger cameras.
John:
It's not like I'm recommending a gigantic, you know, full-frame Canon SLR that, like, weighs seven times as much, right?
Marco:
Yeah, that's fair.
Marco:
But I don't know.
Marco:
I mean, this is why I ultimately like I think Casey's experience of just mostly using the iPhone is worth heeding.
Marco:
Like, the iPhone is so good as a vacation camera for most people's priorities.
Marco:
Like John, you are, I think, much more willing to lug around a camera.
John:
Than most people are. But Andrew specifically is trying to say not an iPhone with these criteria. They want better sharpness, better bokeh; they want actual optical depth of field, and they know they're not going to get that with an iPhone. And they're talking about renting, right? So they might as well have just said, don't recommend me an iPhone, because they know what the iPhone is. It's a known quantity, it has its qualities, and even though for most people it probably does everything you need it to do, Andrew is specifically asking: I want better pictures than I would get with an iPhone. And
John:
if you want real optical depth of field, you get a real camera; that's what you'll get.
Casey:
I totally get you.
Casey:
And I'm glad both of you made those recommendations.
Casey:
But ultimately, I feel like it is worth hearing someone say, it might be worth just saving your money and sticking with the thing that's most convenient.
John:
Well, what I said last time is, look, they're going to have their iPhone with them anyway.
John:
So if there is a situation in which you don't want to have the big camera or you think the iPhone would take a better picture, just use the iPhone.
John:
Like, I did not take iPhone pictures on my Disney vacation.
John:
Of course I had my iPhone with me.
John:
I had both, right?
John:
You're going to have your phone with you anyway.
John:
Like, it's not like you're going to say, I got a real camera, so I don't need to bring my phone.
John:
Of course you're going to bring your phone.
John:
Everybody brings their phones.
John:
It's so the government can surveil you.
John:
No, that's not why.
John:
People like me.
John:
People love their phones. So you're going to have the phone anyway; you're not giving up the phone, right? You're just adding to it. And again, this question specifically looks to me like someone who says, I want pictures of my vacation that don't look like they were taken on a phone. So you've got to have a camera for that.
Marco:
Yeah, we recently had some friends visit, and one of our friends uses a small, I believe it's a Fuji, like a small, I think it's a micro four-thirds camera.
Marco:
And the photos she was able to take on it were...
Marco:
Noticeably better than the iPhone photos, but not necessarily in the, like, you know, massive amounts of sharpness.
Marco:
Like, that's not what I noticed about them.
Marco:
What I noticed about them was that they just had a different color tone.
Marco:
Like, just, like, the way that the camera rendered tones and colors and skin tone and...
John:
The color science is what they call it in the biz.
John:
Right.
John:
There's better color science, which is so weird when I read it, but that's what they use in reviews.
Marco:
And I wouldn't necessarily even say better.
Marco:
It was just different, and that was refreshing.
Marco:
After seeing mostly only iPhone pictures myself for a very long time, to have a few pictures in this group photo library we had from the trip
Marco:
that were taken by a, quote, real camera, they looked noticeably different, and it was just a refreshing thing to see.
Marco:
And I think in some ways, they were better, technically.
Marco:
In some ways, the iPhone pictures are easier to take good pictures with.
Marco:
But it was interesting seeing what another camera could do.
Marco:
And it was nice.
Marco:
The way the iPhone renders colors and contrast and stuff...
Marco:
It's very, you know, scientifically optimized. It's like when you eat at, like, a fast food place: this has been optimized by flavor scientists to maximally taste exactly the way it's supposed to. You know, eating, like, a Dorito, it's like, all of flavor science has gone in here. But then you have different food sometimes that has different priorities, and it's refreshing, and it's good, and it's different. That's how the color rendering of this camera was.
John:
I think it's the same as those analogies in another way.
John:
And that the iPhone photos are processed, just like processed food.
John:
Like the reason they look the way they do is you're starting with garbage and you really have to process it to make it appealing.
John:
Whereas the ones that kind of look different are starting with a better product as in a less noisy image from
John:
the sensors, like the iPhone is doing a lot of work.
John:
And so the iPhone pictures look the way they do because the raw material they're starting with is just total garbage.
John:
And the computational stuff is working overtime to combine them, denoise them, contour them.
John:
It does the HDR stuff, and it's amazing.
John:
Don't get me wrong.
John:
That's amazing.
John:
That's why we like iPhone pictures, why they come out so good.
John:
Cause the phone does all that stuff, but the regular camera can do so much less and just say, look, our raw material off the sensor is 10 times better.
John:
We don't have to do that much processing.
John:
And, you know, even for things like the colors, a lot of the colors, I'm not saying they're synthesized because all the colors are synthesized from various sensor readings.
John:
But, like, you're getting more raw material to work with from a camera with a big sensor and big glass and all that.
John:
So you don't have to grind over it as much.
John:
You can allow it to sort of come through as is more.
John:
And that lets you have...
John:
I'm sure different kinds of quote unquote color science, whereas the phone has to do tons of heavy lifting and multiple exposure and exposure bracketing and combining to get what it thinks is a representation of what's in front of it.
John:
Like, I'm not going to say that the big camera looks quote unquote more natural, but like you said, it can look different to you because bottom line is it has been through a very different pipeline to get to the final form.
Casey:
It's funny, coming off a beach vacation a couple of weeks ago and actually a day trip to the beach literally today, I brought the big camera with me today.
Casey:
I brought my GoPro with me and I brought, of course, my phone with me.
Casey:
And the only thing I really took pictures on today happened to be the GoPro, which is a terrible still camera.
Casey:
Like it's truly bad.
Casey:
But I was in the water.
Casey:
And I certainly don't want to bring my big camera in there.
Casey:
I do, John, the same thing you do.
Casey:
So I'm not absolutely opposed to it.
Casey:
But generally speaking, I try to avoid it if I can.
Casey:
I have lightly cracked the back of my iPhone.
Casey:
And so I don't want to get that wet, because I am never again going caseless.
Casey:
And so I was left with the GoPro because I was in the water a lot today.
Casey:
When I was on the beach trip, I did use the big camera a lot.
Casey:
And it's so frustrating.
Casey:
I've probably said this before.
Casey:
It's so frustrating.
Casey:
Because I'll look at the pictures that the big camera took.
Casey:
And in terms of like having a proper telephoto lens so I can get close to my subject without actually being close to them.
Casey:
And in terms of the bokeh, even on a, I think my zoom is an f2.8 and my prime is like an f1.4 or something like that.
Casey:
And I almost never put the prime on anymore because I'm trying to get close to like a moving child or whatever the case may be.
Casey:
or just a far away child more than anything else.
Casey:
And I look at these photos, and the bokeh is great, and I think the color is pretty good, although I'm not a particularly astute critic of these things.
Casey:
But then I'll look at the sky, and the sky is completely blown out, and so I miss the HDR of the iPhone.
Casey:
And then I think about how I have to go and post-process all of these to put geotags in because I'm not a monster like you, John, and I wish I had the iPhone.
John:
Are you shooting on auto?
Casey:
No, I'm shooting aperture priority.
John:
What do you have your aperture set for?
Casey:
Usually like between two and four, generally speaking.
John:
I don't know why your sky is blown out as much as it is when people are outdoors on a sunny day.
John:
I feel like it's not a challenging HDR situation where you should be able to get reasonable balance.
Casey:
Well, because I don't have HDR.
Casey:
I don't have HDR at all on this camera.
John:
I know, but I'm saying even without HDR, like it's not, it doesn't seem like it would be a super challenging situation to have a reasonably good exposure on the person's face that's in sunlight and the sky that's behind them.
Casey:
Well, and also, I'm firing these, you know, from the hip, so to speak, in the sense that I'm not doing hours and hours of... well, that's exaggerating, you know what I'm saying, like a lot of calibration. No, no, no post-processing, just right off the camera.
John:
I mean, I just don't know enough about photography to know what you might need to change, other than it seems like you're overexposing a little bit. But if the faces... I don't know, you'd have to look at a specific picture. All I can say is, I take a lot of pictures of people at the beach, and having the sky blown out behind people is not usually a problem for me, and I am not doing anything particularly fancy with my camera.
Casey:
And I think you have a better camera than I by a fairly large margin. But I dropped in the chat, in our super-secret private text channel, or private Slack channel, some pictures. I don't want to put these in the show notes, and I apologize for that, because it has pictures of the kids, which I mostly try to keep off the internet now. But if you look at the first couple of pictures, they are shot on the big camera, and you can tell because the subject's super close. And then you look at the next couple of pictures and...
Casey:
Maybe you wouldn't.
Casey:
I would say the sky is blown out in the ones in the big camera, and maybe you wouldn't.
Casey:
But certainly, without question, the sky on the pictures taken with my phone is far better exposed than the ones taken with the big camera.
Casey:
And perhaps that's user error on my part.
John:
Well, that's not the same sky.
John:
It's framed totally differently.
Yeah.
Marco:
No, I agree with Casey. Where the iPhone really excels is, first of all, an area I forgot to mention: video.
Casey:
Also true, yeah.
Marco:
It's nearly impossible for lay people who are not really good video shooters to get better video from any other camera than you get out of an iPhone with no effort whatsoever. So that's part number one. But I would even say a lot of that actually applies to photos now, too.
Marco:
What you get photo-wise out of an iPhone, especially in regards to dynamic range, whether it's the built-in HDR stuff or just various other ways that it processes dynamic range, it's so far ahead.
Marco:
of what any standalone camera does.
Marco:
Now, there's reasons for that.
Marco:
People who really know what they're doing with standalone cameras can capture the much better data from the much better optics and much better sensor, and can typically do a good amount of post-processing on it to do things like expose to the left or expose to the right, whichever one it is, where you...
Marco:
you basically expose for the highlights not to be blown out, and the result is your shadows are super dark right out of the camera, but then in post, you raise up the shadow detail with all these amazing sensor dynamic ranges that we have nowadays with standalone cameras.
Marco:
But...
Marco:
That all takes work and skills and talent that many of us don't have or don't have time for.
Marco:
Right.
Marco:
And so what you get out of an iPhone for dynamic range is so much better and more pleasing and more usable.
Marco:
And typically you get more dynamic range detail, because it's so hard for most people to use standalone cameras to capture things like a bright sunny sky with anything else in the frame.
Casey:
Yeah, and so what I'm driving at in a roundabout way is, and the pictures I've shown, Marco and John, are not the greatest representations of, you know, like really excellent pictures that my phone has taken.
Casey:
Actually, John, I forgot to show you, I did take a single bird picture for you since you were apparently spamming all of Instagram with 300 bird pictures while you were on your beach vacation.
John:
The reason I posted all those pictures is that they're not pictures of people who might not want to have their pictures shown; birds don't complain.
Casey:
That's true.
Casey:
But anyways, there are examples of pictures I took of my big camera and also the day I've stumbled on just now is a relatively overcast day.
Casey:
So in many ways, I'm not giving you a great example.
Casey:
But I would get out the big camera, particularly in Zoom situations, and I would think to myself, man, I'm so glad I brought the big camera.
Casey:
But then about half the time, I would think, wow, the sky is blown out, man.
Casey:
I kind of wish I had the iPhone for this.
Casey:
Oh, I got to geotag everything now.
Casey:
I kind of wish I had the iPhone for this.
Casey:
So the big camera definitely has space in my life, and that's why I still bring it.
Casey:
But as I've said many times over the last couple of years, as the iPhone gets better and better, if it wasn't for just having such better glass on this camera...
Casey:
I don't think I would ever use it. But to get that really decent bokeh, to get, I would argue in some cases, some much better color, I really do need to get out the big camera.
Casey:
And that's not really a complaint, but it's just, it's wild to me how in just a few years, again, we've said this many times on the show, in just a few years, we've gone from, yeah, we'll use the iPhone in a pinch, to...
Casey:
Yeah, I'll use the big camera when I really want to get a really good picture.
Casey:
And God, what a pain in the ass it is.
Casey:
It's just such an unbelievable change from the way it used to be.
Casey:
And that's a good thing in the grand scheme of things.
Casey:
But as someone who...
Casey:
wants to be an ever better amateur photographer, I feel like it is limiting for me to only use my iPhone. Which is also not really true, because you can get incredible shots from an iPhone if you work at it.
Casey:
But I don't know.
Casey:
It's just a very odd place to be that here it was.
Casey:
I had the big camera with me and I had people that I wanted to take pictures of with my big camera, including not only my family, but the family that we were visiting with.
Casey:
But I ended up just using a frigging GoPro because that was the most convenient tool for that particular work.