Ten Blue Links, “Sunday Sunday here again” edition

Ian Betteridge
Jun 08, 2025

1. Apple’s retort to the promise of AI scaling is a dagger through the heart of the boosters

I’ve been around the world of artificial intelligence since I was working on my PhD in the early 1990s, which means I have seen how, historically, AI research comes and goes.

The pattern is this: a new technique comes along which promises to be the big leap towards what is now usually termed artificial general intelligence (AGI), but back then was just called AI. A huge amount of effort is spent in researching it, and progress is made. Then, the promised leap to AGI just never appears. What we’re left with is often useful tech, but not the breakthrough everyone believed in.

I suspect that LLMs will follow the same pattern. The biggest difference between LLMs and earlier AI tech isn’t their potential for reaching AGI, but the fact that they exist in a business environment which rewards the hype cycle. We have oligarchies and monopolies seeking to use their incredible financial muscle to “own” markets. We have venture capitalists with an interest in hyping a startup so it gets sold at the highest possible price. These are the perfect conditions for a hype cycle around a particular technology, and this time it’s LLMs.

The cracks, though, will inevitably appear – and the first one has come from the somewhat unlikely source of Apple’s own research. It’s not a hard paper to read, but the short version is that reasoning models – the adjuncts to LLMs which give them the ability to do multi-step reasoning – collapse badly on complex questions (as do unaugmented LLMs).

Broadly, I think LLMs are useful tech, particularly for some kinds of textual analysis and also for converting human language into machine queries. But that’s all: they’re not the solution which gets us to AGI. And they’re not worth the current hype.

2. The depressing state of AI in education

If you want to raise your own stress and anxiety levels, then I would heartily suggest reading 404Media’s article compiling teachers’ on-the-ground views of the use of AI in schools.

Education is going to be changed dramatically by these tools, whether we like it or not. My biggest concern is that we’re letting this happen, rather than making it happen in a way which is beneficial to us all as human beings.

3. More from the frontline of the war against publishers

Barry Adams – one of the best SEO people in the world – recently posted on the impact of AI Overviews and AI Mode on traffic from Google. Predictably, if you’ve been reading what I’ve written over the past few years, it isn’t pretty: as Barry puts it, “publishers need to focus on audience strategies that exclude Google as a reliable source… In the next few years, many publishers will be unable to survive.”

Honestly, it’s grim out there, and I still think some publishers are burying their heads in the sand. Yes, you can get traffic from Discover – but once Google has worked out how to make all the money in the world without sending publishers any traffic, that’s it – the long summer is over.

4. Bill Atkinson

Some really sad news: Bill Atkinson, one of the early developers of the Mac, has passed away. I remember Bill mostly from his work on HyperCard, which was an amazing product. I built a lot of HyperCard stacks in the pre-web era, mostly to explore academic concepts, but also using it as a complete programming environment. When it gained the ability to read from the Mac’s serial ports, for example, you could start using it to interface with equipment – I wrote a piece of HyperCard software which logged calls from a phone switch. Fun times. Steven Levy has a great retrospective on Atkinson’s life and career.

5. Notes on Notes (and Markdown)

John Gruber has a long article detailing his thoughts on the rumour that Apple Notes will be able to export Markdown files in its next release. Like John, I think of Markdown as a format for creating content which will ultimately live on the web (most of these posts are written first in Markdown, because I could never be bothered with HTML). But I would love to see the ability to export everything into Markdown – I’ve done this in the past using a variety of tricks, but it is always quite painful.
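
For what it’s worth, one of those tricks looks roughly like the sketch below: ask Notes for each note’s HTML body via AppleScript, then push it through an HTML-to-Markdown converter. This is a minimal sketch built on my own assumptions – the `markdownify` dependency, the note separator and the output layout are mine, not anything Apple documents as a Markdown export – and it will be exactly as painful as I remember on a big Notes library.

```python
# Rough sketch: export Apple Notes to Markdown files on macOS.
# Assumes Python 3, `pip install markdownify`, and permission for the
# script to drive the Notes app via AppleScript (osascript).

import subprocess
from pathlib import Path

from markdownify import markdownify  # converts HTML to Markdown

APPLESCRIPT = '''
tell application "Notes"
    set output to ""
    repeat with n in notes
        -- separate each note with a marker we can split on later
        set output to output & "=====NOTE=====" & linefeed & (name of n) & linefeed & (body of n) & linefeed
    end repeat
    return output
end tell
'''

def export_notes(dest: Path) -> None:
    dest.mkdir(parents=True, exist_ok=True)
    raw = subprocess.run(
        ["osascript", "-e", APPLESCRIPT],
        capture_output=True, text=True, check=True,
    ).stdout
    for chunk in raw.split("=====NOTE====="):
        chunk = chunk.strip()
        if not chunk:
            continue
        title, _, html_body = chunk.partition("\n")
        # Notes stores note bodies as HTML; markdownify does the conversion
        markdown = markdownify(html_body)
        safe_name = "".join(c for c in title if c.isalnum() or c in " -_").strip() or "untitled"
        (dest / f"{safe_name}.md").write_text(markdown, encoding="utf-8")

if __name__ == "__main__":
    export_notes(Path("~/NotesExport").expanduser())
```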

6. Reddit, which claims that anything you create on Reddit belongs to it, sues Anthropic

I find it really hard to have much sympathy for Reddit in its legal battle to stop Anthropic from scraping its content, given that the site only exists because of the millions of contributions of individual users.

Bear in mind that what Reddit is doing here is selling its users’ content without any real consent. It’s claiming the right to sell something that, really, it does not own.

7. This Starship will never fly

I’ve written before about how Starship, SpaceX’s heavy lifter, is a bit of a science fiction fantasy of how rockets should work. If you’re a nerd of a certain age, then you will remember pictures from covers of science fiction magazines showing tail-first landings on planets and back on Earth.

Starship is driven by that vision, but there’s only one problem: physics. To land on a planet, you need to lose momentum – as much of it as possible – before you hit the dirt. There are two ways of doing that: using an atmosphere to aerobrake or parachute (or both), or using retrorockets to slow you down.

If you have an atmosphere, then the former is preferable for one simple reason: weight. If you use retrorockets, the fuel they need to land you safely has to be carried up with you, reducing the amount of useful weight you can carry to orbit.

And of course the more weight you’re carrying, the stronger everything has to be. Which, in turn, increases the weight of the vehicle.
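
The arithmetic behind that spiral is the Tsiolkovsky rocket equation. Below is a rough, back-of-the-envelope sketch – the delta-v budgets and the exhaust velocity are illustrative assumptions of mine, not SpaceX figures – just to show how quickly a propulsive landing eats into the mass you could otherwise use for payload.

```python
# Back-of-the-envelope: the mass penalty of propulsive landing.
# Tsiolkovsky rocket equation: delta_v = v_e * ln(m_start / m_end),
# so the propellant fraction for a burn is 1 - exp(-delta_v / v_e).
# All numbers here are illustrative assumptions, not SpaceX figures.

import math

def propellant_fraction(delta_v_m_s: float, exhaust_velocity_m_s: float) -> float:
    """Fraction of the vehicle's mass at burn start that must be propellant."""
    return 1.0 - math.exp(-delta_v_m_s / exhaust_velocity_m_s)

EXHAUST_VELOCITY = 3_300.0  # m/s, roughly a modern methane/oxygen engine (assumed)

# A few hypothetical landing-burn budgets, in m/s
for delta_v in (500.0, 1_000.0, 2_000.0):
    frac = propellant_fraction(delta_v, EXHAUST_VELOCITY)
    print(f"delta-v {delta_v:>6.0f} m/s -> {frac:.0%} of landing mass must be propellant")

# Every kilogram of that landing propellant rides to orbit first,
# displacing payload - and needs stronger (heavier) structure to carry it.
```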

This is the problem that SpaceX is running into, and, as Will Lockett explains, there is no way round it. Physics can’t be avoided, even if you are the richest person in the world, and no amount of software engineering-inspired iteration will dig you out of the hole that gravity puts you in. It’s like Musk’s fantasy of going to Mars, a ferociously difficult journey which would yield little more than a bunch of photo opportunities covered in red dust.

8. What, no English?

The wonderful Mic Wright – who has a book out which you should pre-order – has written a very telling article about the way that stories move from small local newspapers to bigger news networks. And, as Mic rightly points out, the majority of those stories are not even worth reporting in the smallest of local papers.

Instead, they are there simply because they deliver clicks. There’s no other merit at all – just how important is it that a random woman was upset by a lack of “English” food on holiday? And yet this ended up in the Daily Mail, one of the country’s largest newspapers.

I don’t think I like this world of publishing much. And I always end up thinking it’s partly my fault. A lot of us early Internet people have these thoughts.

9. Ballmer speaks. A lot

Ever wondered why Steve Ballmer did the “developers, developers, developers” chant that time? You can find out in this interview, and also hear Ballmer talk about a lot of really interesting Microsoft-related history.

10. Torment me

Some lighter shade of darkness: this is a wonderful look at the genesis of Marc Almond’s magnificent Torment and Toreros – not only a great album but one of the greatest, if you love drama.