Ten Blue Links, “it’s just me, my MacBook, and Jesus” edition

AI, Apple at its best, Apple at its worst

Profile picture of Ian Betteridge
Ian Betteridge
Dec 27, 2024

If you’re reading this, my move from WordPress to Ghost for posts and emails succeeded – woo hoo!

1. You don’t need fancy systems to get things done

There is an entire industry devoted to making you more “productive”. In fact, if you want to make money, writing a book on how to be productive isn’t a bad idea. As is often the case, it’s more profitable to teach something than to do it. If you can develop a fancy productivity app, that works too.

Joan Westenberg decided just to use the tools that came free with her iPad – Reminders, Calendar, Notes, and Pages – and it’s worked really well. You don’t need complicated tools and systems to get things done. Just your brain, and something simple to help you remember things.

2. Fingerprinting you for fun and profit

Fingerprinting never sounds fun. And the digital version is just as not-fun as the physical one.

What’s fingerprinting? Imagine your digital footprint as a trail of unique breadcrumbs scattered across the internet. Browser fingerprinting is the art of collecting these seemingly innocuous details—your screen resolution, installed fonts, browser plugins, and even how your device renders text—to create a distinctive profile that's remarkably effective at identifying you across websites. Companies then piece together these technical characteristics, which might appear random in isolation but collectively form a surprisingly precise identifier.

What makes this tracking method particularly odious is its lack of control for users: unlike cookies, which we can clear with a click, our device's fingerprint quietly persists, helping advertisers build detailed portraits of our online behaviours without relying on traditional tracking methods.
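As a rough illustration of why those scattered details add up to an identifier, here is a minimal sketch in Python. The attribute names and values are hypothetical stand-ins for what a real tracking script would read from the browser (real scripts collect far more, including canvas and audio-rendering quirks); the point is that hashing the combination yields a stable ID without any cookie being set.

```python
import hashlib

# Hypothetical attributes a tracker might read from a browser.
# No single value identifies you, but the combination often does.
attributes = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 14_2)",
    "screen": "2560x1600",
    "fonts": "Arial,Helvetica,SF Pro,Menlo",
    "plugins": "PDF Viewer",
    "timezone": "Europe/London",
}

def fingerprint(attrs: dict) -> str:
    """Join attribute values in a stable order and hash them.

    The same browser produces the same hash on every site it visits,
    so the hash works as a persistent identifier."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()

print(fingerprint(attributes))  # identical attributes -> identical ID
```

Note that clearing cookies changes nothing here: the hash is derived from the device itself, which is exactly why this technique is so hard for users to opt out of.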

Google, a company which built its entire business on tracking you around the web, used to think fingerprinting was a bad thing: “We think this subverts user choice and is wrong.”

Google, a company which built its entire business on tracking you around the web, now thinks it’s a just-splendid idea and is supporting its customers who want to do just that.

3. Eat that goose with a pinch of salt

Michael Tsai notes what should be obvious to anyone: Apple’s statements about what Meta is demanding through its DMA interoperability requests need to be taken with a giant pinch of salt.

In particular, this:

Except that with the OCSP preference that Apple reneged on, Apple does get to track the apps that you use.

Sauce, goose, gander, etc.

And, as Steve Troughton-Smith notes:

Also, just to acknowledge the spin Apple is taking on this, which I have no interest in linking to: they just threw Meta under the bus for interoperability requests, something that is forbidden under the EC's proposal, triple-underlining why the EC needs to legislate all of this in writing in the first place.

Oh, dear. It’s clear that Apple hasn’t learned that waging a PR war doesn’t let you get away with doing things that are illegal. The “court of public opinion” isn’t actually a real court, guys.

4. Big changes for Apple in 2025

The European Commission is systematically examining Apple’s operating systems, working with interested parties and the industry to determine the appropriate boundaries for when APIs should be closed off to third parties.

Before you get too sympathetic to poor little Apple, as Steve Troughton-Smith puts it, “Apple brought all this on itself”. It has spent the last decade building private APIs and capabilities which it refuses to make available to third parties. And – in the case of the iPhone and iPad – if you try to use them, Apple will stop you selling your app, as it says in section 2.5.1 of the App Store Review Guidelines:

Apps may only use public APIs and must run on the currently shipping OS. Learn more about public APIs. Keep your apps up-to-date and make sure you phase out any deprecated features, frameworks or technologies that will no longer be supported in future versions of an OS. Apps should use APIs and frameworks for their intended purposes and indicate that integration in their app description.

And of course, Apple being Apple, this “guideline” also applies to any apps in third-party stores because the company will not notarise your application if you break it. Which means it just won’t run at all, regardless of where you download it from.

As Steve puts it:

My takeaways from the proposal: the EC is prepared to go into detail on specific features, mandate various avenues of interoperability and APIs required, ensure that Apple can't make them burdensome in implementation or by policy, set a concrete timeframe for the changes to be made (i.e. by next release of iOS), and ensure that Apple can't pull the rug out from under these APIs in the future or self-preference for new or unannounced devices. All the proposals are great, necessary changes

Apple wanted to be the state when it came to the “law” of iOS development. Now it’s being shown that it isn’t a state, and it doesn’t get to act like one.

5. LLMs have a scaling problem

Token limits present a big challenge for the future of large language models. Think of tokens as the individual puzzle pieces that constitute an LLM's “understanding” – each word, subword, or character that it processes. The computational resources required to handle these tokens grow quadratically with the context length, creating a mathematical perfect storm.

When we double the context window – how many tokens an LLM can work with in a single query – we're not just doubling the computational needs, we're quadrupling them. This relationship becomes particularly bad when we consider the practical implications: training a model with a 1-million-token context window would require not just vast amounts of memory, but also extraordinary computational power to handle all the potential relationships between those tokens.
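The doubling-quadrupling relationship is just the arithmetic of pairwise comparison: self-attention relates every token to every other token. A toy cost model in Python makes the numbers concrete (the context sizes are illustrative, not those of any particular model):

```python
# Toy cost model: self-attention compares every token with every other,
# so compute and memory grow with the square of the context length.
def attention_cost(context_len: int) -> int:
    """Number of pairwise token interactions for one attention pass."""
    return context_len * context_len

base = attention_cost(4_096)
doubled = attention_cost(8_192)
print(doubled // base)            # doubling the context quadruples the cost
print(attention_cost(1_000_000))  # a 1M-token window: 10^12 interactions
```

That trillion-interaction figure for a million-token window is why the memory and energy demands balloon so quickly, and why so much current research goes into approximations that avoid the full quadratic comparison.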

It's as though we're asking these models to maintain a conversation while simultaneously remembering and considering every single word that's ever been spoken in that conversation, all at once. Unsurprisingly, this is not the way that human brains work, despite their vast processing power.

The energy requirements, the hardware constraints, and the sheer engineering challenges create a formidable barrier that even the most ambitious technological advances must reckon with.

There’s a lot of debate now about whether LLMs are, in fact, a dead end. My gut feeling is that the scalability issue probably makes the question moot: we will never get LLMs capable of the kind of complexity required to move much beyond what they can do now.

6. Mo money, mo Elon

There is absolutely no way on the planet that this represents any kind of value for money.

7. The VC takeover of America

Who owns America? Not its citizens.

8. Tom Stoppard and Indiana Jones

This, via Phil, is great: Tom Stoppard is, apparently, “responsible for every line of dialogue” in Indiana Jones and the Last Crusade (AKA the best Indiana Jones film).

9. Microsoft to stop using OpenAI models exclusively

I’ve been playing around with all the big LLMs recently, and it’s interesting how different they are. Claude is fantastic at getting tone of voice right if you train it, but doesn’t have up-to-date data. GPT-4o is quite precise, but simply ignores instructions on how to write something about half the time. It does, though, have pretty good reasoning skills, which makes it better for things like sorting through data. Google has some great tools (NotebookLM is the kind of thing which would have made my PhD so much better) but otherwise trails.

Against this backdrop, it’s not that surprising Microsoft is going to stop using just OpenAI in favour of incorporating some of its own work. The question is whether they are actually any good or not. I know how my bet would go…

10. Retiring

Terence Eden has retired. Well, sort of. He’s telling people he’s taking a year off, or soft-launching retirement. But either way, I can heartily recommend doing this if you ever get the chance. I had seven months off after leaving Bauer, and it was one of the best things I have ever done. After working for nearly 30 years without any substantial break (and being habitually bad at taking my leave entitlement), I really needed it.

It’s also one of the reasons that I decided I didn’t want to go back to five days a week working. I aim to stick with four days, leaving some time to do other stuff. I’m lucky enough to be able to do that currently (although who knows what the future holds) and it’s great. Four-day weeks for everyone would be fantastic.