George’s ability to find the warmest spot in the house is really quite spectacular.
23 books read in a year isn’t a bad effort, especially given that remarkably few of them fell into the category of trash science fiction. This year, I want to have read 20 non-fiction books that I haven’t read before – got to keep that mind moving.
Chaffinches are funny little birds – like something drawn by a three-year-old, all triangular body and spindle legs. They like to hunker down too, which makes them even more squat and fat-looking.
I suspect that the overlap in the Venn diagram of people who own both an M1 Mac and a Surface Pro X is small. I fall into that section in the middle, so I thought it was worth summarising how the two compare.
I should say from the start that the Surface Pro X is far and away my favourite Windows device. I also have a Surface Book 3, which is a pretty powerful laptop in its own right, but which I just don’t love as much.
The hardware design of the Surface Pro X is exceptional, and it is one of the few non-Apple devices I’ve owned that matches Apple’s level of industrial design. Using an ARM processor, with far lighter thermal requirements than Intel chips, has freed Microsoft’s hardware designers to make the Surface they have clearly always had in their heads. It’s thin and light, with an exceptionally lovely screen, and in my experience it just never gets hot, something I can’t say of any Intel-based computer I’ve ever used.
For some categories of user, myself included, the performance is actually good. Performance is a relative thing: what’s acceptable for someone who lives in the browser would be glacial for a designer who spends their life in Photoshop.
My work life is solidly in Office 365 and a browser, and for this kind of work (which, we should remember, covers a huge chunk of users across the globe) the Surface Pro X is perfect. Office and Edge have never felt slow, no matter how many tabs or documents I have open. And the ability to use it anywhere thanks to built-in LTE makes it more useful than a conventional laptop.
My M1 Mac mini is… well, it’s a Mac. It does everything I have ever used my Macs for, including audio and video editing, and subjectively does it as well and as fast as anything I’ve ever used. It feels faster than my year-old 16in MacBook Pro, which has double the memory and on paper at least ought to be easily quicker. And it does all this while remaining silent, cool and snappy.
Using it with third-party hardware has also been a very Mac-like experience. Whatever I have plugged in has worked, from an old Logitech webcam to a Blue Yeti USB mic. Even the software which allows me to program my Corsair gaming mouse works perfectly. I don’t even know if it’s running ARM or Intel code.
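Should curiosity strike, you can check without leaving a script. A minimal sketch using Python’s standard `platform` and `subprocess` modules (the `sysctl` key is the documented Rosetta check on Apple Silicon; on other platforms it simply won’t exist):

```python
import platform
import subprocess

def describe_architecture():
    """Report the machine architecture, and on macOS whether this
    process is running natively or translated under Rosetta 2."""
    machine = platform.machine()  # e.g. 'arm64', 'x86_64'
    translated = None
    if platform.system() == "Darwin":
        try:
            # sysctl.proc_translated is 1 for a Rosetta-translated
            # process, 0 for a native one; absent on Intel Macs
            out = subprocess.run(
                ["sysctl", "-n", "sysctl.proc_translated"],
                capture_output=True, text=True,
            )
            if out.returncode == 0:
                translated = out.stdout.strip() == "1"
        except FileNotFoundError:
            pass
    return machine, translated

machine, translated = describe_architecture()
print(machine, translated)
```

Of course, the nicest thing about the M1 is that for day-to-day use you never need to run anything like this; Activity Monitor’s Kind column will tell you if you really want to know.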
That’s the crucial difference between the two devices. You don’t have to think about the Mac mini as anything other than a Mac. With the Surface Pro X you need to remember that it’s not a Windows device, it’s an ARM Windows device, and limit what you can do accordingly. If your requirements sit within those limitations, it’s a great machine. If they don’t, you won’t be able to use it at all.
This is why Microsoft’s marketing language about Windows on ARM devices focuses on how they are “a new category of PC” and why it talks about the Surface Pro X as being for “mobile professionals”. The company isn’t confident (rightly) that ARM devices can replace an Intel PC except in those specific circumstances. Apple, by contrast, thinks of the M1 as making the Mac just a better Mac: first for low-end customers, where it can deliver performance that’s already close to the top end, and in 2021 and 2022 for its most demanding users.
What can Microsoft do? I honestly don’t think there’s much it can, at the moment. To get to where Apple is, Microsoft needs to persuade Qualcomm that it should devote time and effort to build chips optimised for higher thermal envelope devices, such as laptops and desktops. That’s not as easy as it sounds. It also needs to persuade Qualcomm that building in features to make it easier to emulate or translate Intel code is worth the effort.
On the software side, it also needs to sort out the mess that is Windows development. Windows Presentation Foundation, Universal Windows Platform, Progressive Web Apps… the entire system is a mess that makes it harder, not easier, to choose how to develop for Windows. This is something that Apple has been very good at managing in the past.
Can Intel or AMD keep up? Apple is already a larger processor company than either of them in terms of units shipped, thanks to its wholly owned designs for the A- and now M-series. Combine iPhone and iPad and Apple ships more than 250m devices per year. Most of those devices share a great deal of processor design with the M1, meaning the Mac gains from economies of scale and design-cost amortisation that Intel can only dream of. There are of course big differences between an M1 and an A14, but they share the same cores, the same Neural Engine, and the same image-processing and secure-enclave designs. Perhaps over time the high-performance cores in iPhone and Mac will diverge, but at the moment Apple doesn’t need to do that, so it can benefit from designing one processor core which ships in 250m devices a year, more than the number of PCs shipping over the same timescale.
I think all this adds up to Microsoft and Intel being in a bigger boatload of trouble than most people think. The shift to ARM feels like a classic Clayton Christensen disruptive-technology moment, with the M1 the tipping point at which a technology moves from being cheaper but less powerful than the incumbent to beating it. The path Microsoft and Intel must take is a radical rethinking of their own businesses, and I am not clear that the internal forces in either company have accepted that yet.
I lasted a day without preordering a new Mac running Apple’s M1 processor. That’s an improvement, right? Better than I would have done a few years ago?
Well… not by much. My chronic case of former-tech-journalist disease means that, even though I don’t need to keep abreast of the latest technology for work, whenever a new and interesting change in computers comes along, I end up having to buy it.
That’s why I own a Surface Pro X, and it’s now why I own a Mac mini with Apple’s M1 ARM-based processor.
I bought the lowest-end model, with 8GB of memory and 256GB of storage. Having paid out £1,800 for a new 16in MacBook Pro earlier in the year, I didn’t really want to pay another £1,000-plus for a MacBook Air or 13in Pro. I have too many laptops anyway.
The timing was right, though, for a desktop Mac, the first one I have owned since another Mac mini about 12 years ago. The shift to working from home during the COVID-19 pandemic has made me rethink my home working setup, alongside how I work. I have another post in the works about this, but having a fantastic, clear working space has become very important to me, and utilising a small desktop computer with a screen makes a lot more sense than it did even a few months ago.
The relatively limited storage doesn’t worry me. The cloud storage services I use all do on-demand downloading and manage how much of your local storage they take up (this is the future, people).
What about that 8GB of memory, though? The consensus seems to be that 8GB is paltry, 16GB is the bare minimum for proper work, and anyone serious has 32GB.
I think this misunderstands the advantages of moving memory from discrete components into the system on a chip (SoC). Apple’s unified memory architecture is fast and efficient, and swapping between fast, efficient memory and fast SSDs is nothing like paging RAM in and out from spinning disks back in the day. Remember, too, that Apple has a lot of experience here: while flagship phones in the Android world now come with 12GB of RAM, the iPhone 12 Pro Max has just 6GB, and the iPhone 12 gets by with just 4GB. And no one would say those devices are slow or lacking in power.
Sure, the requirements of a Mac aren’t the same as a phone’s, but neither does the Mac have the same design constraints.
The new Mac arrives tomorrow, and I’ll be setting it up that evening, so I’ll know more then.
This week was the deadline for the latest piece of coursework on the master’s I’m working towards (senior leadership, which is fun). That meant a scramble: with COVID tiredness affecting how much focus I have, and plenty of other work to do, I have to juggle and ration my time to keep everything in balance.
Doing the master’s has given me a different perspective on many of the pronouncements from the government on work. It makes it obvious how outdated a view they have of management and leadership.
Consider, for example, the “everyone needs to get back into the office to get back to work” approach that Johnson and friends are so fond of talking about. Offices are Victorian, as much an invention of the Fordist Industrial Age as the production line. They are production lines for information: work passed (physically) from desk to desk in paper form. As the Information Age took over, we retained the same models because the documents we worked with had become digital, but they were confined to the physical office network.
Now, the documents and workflow are freed from this constraint. You can work from anywhere, and the information is where you are.
All this means that the role of the office is now about social space more than work space. The serried ranks of desks are no longer required; what is needed is to retain the high-bandwidth emotional connections with colleagues that turn a collection of workers into a team. And none of this requires you to commute in every day and sit at the same desk, eating the same Pret sandwich.1
One thing I’ve been thinking about a lot this week is how much folklore of the early internet age we continue to lose.
Take Apple. Such a lot of the history of Apple has never been documented properly. It may exist in the company’s archives, but knowing how company archiving works when you’re talking about events of 25 years ago, I’m not hopeful.
For example: I don’t know how well known it is that Apple considered using Windows NT as the basis for its “next generation” operating system, what became Mac OS X.
Everyone knows the story of how the company ended up with a two-horse race between NeXTStep and BeOS, and how BeOS lost. But they weren’t the only contenders: both Windows NT and Sun Solaris were considered. The idea was to take one of these and use its multitasking foundations to build an OS which would have a Mac “personality” running on top. It would look like a Mac, and, at least in theory, they could find a way to run Mac applications, but it would run on NT or Solaris foundations.
Even after the NeXT acquisition, inside Apple some agitated not to use the NeXT kernel:
But insiders say Jobs and Hancock have argued bitterly over the “kernel,” the code that will become the core of the operating system. Jobs has pushed Hancock to swiftly adopt Next’s kernel and move to other pieces of the software, especially the modification needed to make Next’s software compatible with that of the Macintosh. Apple has promised that new machines would be able to run older Macintosh applications in a window on the computer’s screen.
But Hancock is conducting a study of Next’s kernel, comparing it with Copland’s as well as Sun Microsystems Inc.’s Solaris. She had promised that a decision on a kernel would be made by the end of January, but nothing has been announced. Should Hancock choose a kernel other than Next’s, Apple’s engineers would face the difficult job of merging software from two different sources.
And meetings with Microsoft took place with this on the agenda, even with Steve Jobs there:
One item of discussion was the possibility of Apple’s licensing Windows NT, Microsoft’s industrial-strength operating system for the corporate market, sources told the Times. No further details on the topic were disclosed.
Stories like this exist online, but unless you know to search for them, you’ll never find them. Digital archaeology in action.
Apple should have stood its ground on ad tracking, but when you’re faced with the likes of Instagram’s CEO crying bitter tears because they will no longer be able to spy on you, it’s easy to see why they didn’t.
I feel sorry for Pret: they have become a symbol of dull commuter life over lockdown, through no fault of their own. ↩︎
Of course, a couple of weeks ago I should have said that I was on holiday and wouldn’t be posting much for a couple of weeks. I should have said that, but I didn’t.
Normal service will be resumed this weekend.
I hadn’t heard of Tot before I read MacStories’ article about its new share extension in iOS, but when I did I was intrigued. And when I used the Mac version I knew it was something I really wanted on my Mac.
At its heart, Tot is a scratch pad. It’s just a place to jot down little snippets of text, often that you will use elsewhere.
There are only seven documents, called dots, represented by – you guessed it – a series of dots along the top of the window. If a dot has text in it, it gets a colour fill (you can change the colours for accessibility purposes – a nice touch).
This conceit of seven and only seven possible “documents” is what makes Tot so good. It places a limitation on what the user can do, nudging you towards a particular kind of behaviour. Applications like Drafts or Apple Notes let you keep making more and more new documents, and that encourages you never to actually look back on what you’ve written.
The seven-dot limitation of Tot means you can’t do that: if you keep taking notes, as soon as you hit that limit you’re going to have to go back through what you’ve written and either delete something or, if it’s still valuable, move it elsewhere.
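For the programmers in the audience, the design is essentially a fixed array of seven slots rather than an ever-growing list. A toy sketch of the idea (the class and method names are mine, not anything from the app):

```python
class ScratchPad:
    """A toy model of a seven-slot scratch pad: adding an eighth
    note fails until you clear or move one."""
    SLOTS = 7

    def __init__(self):
        self.dots = [""] * self.SLOTS

    def free_slot(self):
        """Return the index of the first empty dot, or None if full."""
        for i, text in enumerate(self.dots):
            if not text:
                return i
        return None

    def jot(self, text):
        """Put text in the first empty dot; refuse if all are full."""
        i = self.free_slot()
        if i is None:
            raise RuntimeError("All seven dots are full: delete or move one first")
        self.dots[i] = text
        return i

pad = ScratchPad()
pad.jot("call the vet")  # goes into the first empty dot
```

The interesting part is the failure mode: where an append-only notes app silently accumulates, a full pad refuses the eighth note and forces exactly the review Tot is nudging you towards.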
There are some other cute little interface touches, all of which remind me quite why I love the Iconfactory’s software. You can have Tot set up either as a menu bar icon or a dock icon. If you have it in the dock, the icon changes to match the colour of the front-most dot.
You can set a keystroke to invoke it on the Mac and there’s a smart set of keyboard shortcuts which let you move forward and back through dots without taking your hands off the keys. You’d be surprised how many text applications don’t have proper navigation like this. There’s also, I’m pleased to say, Touch Bar support.
The Mac version is free; the iOS version is $20. That sounds like a lot for an iOS app, but in the great history of what you can charge for software it’s peanuts. I paid more than that for ridiculous shareware games in the past. And as Mike Schmitt at The Sweet Setup points out, for an app this simple a subscription model just doesn’t make any sense.
And the iOS version is excellent, working exactly how you would expect. If you’re using an iPad with a keyboard, you’ll find all the keyboard shortcuts from the Mac version. To switch between dots, you just swipe across the screen with a single finger. Again, it’s simple, but you can see and touch (literally) the thought that has gone into making it easy.
The iPadOS version really comes into its own when used in a Slide Over window. It’s ideal in this kind of scenario. Of course you can use it full screen, or split view, but when you have it in Slide Over you can see the screen and take notes easily from what you’re working on, or just drag and drop text or links from your “main” view.
The key question with any new software, though, is “what can you actually use it for?” For me, it’s all about jotting down random thoughts and ideas that I’ll take and do something with later – this blog post started life as a set of jottings in a single dot, and then moved to Ulysses once I thought I had enough to start writing a full blog post on it. And the nice thing is that when I exported to Ulysses, all my links and formatting just dropped right in.
Computer viruses are called viruses because like biological viruses, they spread by themselves. What Malwarebytes is talking about are scam apps — things that trick or otherwise convince the user to install voluntarily. Dan Goodin had a piece at Ars Technica last month about the scourge of fake Adobe Flash installers — which work because unsophisticated Mac users had been truthfully told they needed to upgrade their version of Flash for a decade. It’s a real problem — but third-party antivirus software is not the answer. As usual, Tsai has a wonderful compilation of links to commentary on the matter.
Sigh. I can’t believe John is still making this distinction as if it matters. The vast majority of malware on Windows and pretty much any platform is scamware, not viruses. This has been true on Windows for probably a decade, maybe longer.
What matters is, as I argued 12 years ago, that the Mac is now a large enough target to bother creating malware for. There’s money to be made out of those Mac users, particularly the ones who bought the line that the Mac is immune from malware.