Dear Apple,

Please can we use third-party Bluetooth mice during pre-boot on MacBooks and MacBook Pros encrypted with FileVault? I’m not happy that only the Magic Mouse works during this part of the boot process.

Last year I posted a blog entry expressing my frustrations with the system recovery process and non-Apple Bluetooth mice. Even with the Logitech MX Master 2S hooked up via USB, System Recovery would have none of it. That’s when I had to fork out £55 for the Apple Magic Mouse.

Performing system recovery circa 2019 (pre-macOS Catalina)

Speaking of pre-boot problems, the 2019 MacBook Pro 16″ appears to have issues with Apple’s own (external Bluetooth) Magic Keyboard during pre-boot. When I go to enter my account password to unlock FileVault, more often than not some key presses aren’t registered in the input form. I have to press them multiple times to get them to register. Sometimes it’s so bad that I have to open the laptop lid to fully expose the built-in keyboard and enter it there. That works okay – but I’d rather use the external keyboard wherever possible if I’m hooking the machine up to my external monitor. I’m pretty sure my 2018 15″ MacBook Pro didn’t suffer from this problem. Maybe macOS 10.15.4 will resolve it?

I generally love the Apple Magic Mouse 2 with its gesture based touch surface (scrolling sideways has never been easier), but when it runs out of battery and you need to charge it – well, you’d better have a backup.

Thankfully I do, because about a year or so ago I bought a mechanical keyboard and gaming mouse – similar to what I was using at home – for work, back when I was using a Dell laptop rather than the Mac Mini that I use now. I still have that mouse, and it’s being put to good use while the Apple mouse has a little rest.

Apple mouse on its back while the Corsair takes over

Since Apple started producing its own silicon for the iPhone, the A-series of ARM-based chips has gone from strength to strength. There’s no better device to demonstrate the muscle of Apple-designed SoCs than the iPad Pro. It’s a very capable multitasker – great for media consumption and even pretty decent when it comes to media creation too. But are iOS and iPadOS limiting the series’ potential?

If Apple intends to put their own silicon at the heart of the Mac, it needs to be able to run all current software at the same or better performance than that offered on the current Intel platform. Apple has only just released the Mac Pro, a full-on Intel-based Mac with up to 28 cores (56 threads). It’s a beast (and possibly one of the best-designed PCs in existence, according to several reviews I’ve seen – it’s so clean and manageable inside!). It seems they intend to release a 14″ 10th-generation Intel-based MacBook Pro at some point this year too.

Yet in the PC world, AMD has the Ryzen Threadripper 3990X processor with a stupidly insane number of cores – 64 (128 threads) – which is overkill for all but the most intensive applications. For those that need the performance, I don’t think the ARM64 architecture will get to the point where it can compete in that space for a good number of years. Certainly, if Apple were to release a Mac in 2021 with an Apple-designed SoC – even if it’s the standard MacBook (not the Pro) model – it means introducing tools to convert existing x86 code to ARM64 and vice versa. Nobody is ever going to run Logic Pro or Final Cut Pro on a simple MacBook, but how powerful does the Apple processor have to be in order to perform the code translation? Or how much work will the developer have to put in to create a universal binary that runs on both platforms? It’s Rosetta all over again.

Then there is the question of Windows. Intel Macs can run Windows natively via Boot Camp, or via virtualisation within macOS. But if Apple starts moving to ARM processors, this will obviously break that feature – which is very useful for those who develop for both the macOS and Windows platforms.

Microsoft has done some legwork porting Windows to ARM. They’ve even released a Surface device (the Surface Pro X) which runs Windows on ARM. But there are so many limitations with the platform which make adoption pretty terrible (and expensive) right now. Apple could potentially update Boot Camp for use with ARM Windows, but until the Windows on ARM platform is sufficiently mature, I don’t think it’s worth it. Even with x86 emulation, it’s not going to be good enough.

Then there are the Thunderbolt 3 ports on current Macs. Dongle city. Thunderbolt is very much an Intel thing, so Apple would still have to continue licensing it from them as well as perform extensive testing to ensure existing peripherals continue to work.

The important thing for me is that Apple doesn’t try to force an iPad-like experience on macOS. If macOS is going to go ARM, I want the macOS experience and to see application performance around the same mark as the Core i9 and AMD Radeon Pro 5500M in my MacBook Pro (which has got to last me 4-5 years before I can afford another major Apple purchase).

So at what point do you release an ARM-based Mac (if at all)? Difficult to say, but I’d say 2021 would still be far too early. It’s not as though we’ve reached a power/performance plateau, which was certainly the case with the G4 and G5 processors. IBM pretty much forced Apple’s hand, because it just wasn’t possible to put a G5 processor into a laptop.

So maybe Apple should keep ARM to the mobile devices, and switch to AMD for its processors instead. They’ve leapfrogged Intel at an important milestone when it comes to die shrinkage – and, after all, they devised the whole x64 architecture anyway. And AMD must be pretty decent given that both Microsoft and Sony will be using their CPU and GPU technology in the Xbox Series X and PS5. So an all-AMD Mac/MacBook Pro would be a decent all-rounder.

It’s been argued that the x86 architecture is old and out of date – and it has been around for a very long time, this is true. But ultimately it’s allowed those of us with feet in both the Windows/PC world and the Mac world the ability to co-operate with each other like never before and do stuff that just wasn’t possible back in the days when Macs were running 68000 or G4/G5 processors.

Let me start off by saying that I’ve been working with electronic communications since I was about 8 or 9 years old. I’m now 43, nearly 44. I started off being allowed to “play” (read: carefully dictate messages) with a major international re-insurance broker’s telex system whenever my dad took me to work with him. I found it absolutely fascinating. And the beautiful mechanical DEC keyboards were a joy to type on. Much later on I’d spend a few summers on work experience at the same company, telexing and working with spreadsheets and FileMaker Pro databases on expensive early Macintosh kit.

At secondary school I wrote an internal email system in BASIC which made use of the school’s Windows network. It was super simple and merely demonstrated how a typical email system would work. But work it did. I did all this for my Computer Studies GCSE.

The combination of work experience and the GCSE project very much defined my career, because for the past 24 years I’ve been a systems administrator. My first ever gig after leaving university was setting up a local Norwich ISP. This included providing email services for both dial-up (remember that?) and web hosting customers. Things back then were much simpler than they are now. We had very few spammers, phishing was a mere twinkle in scammers’ eyes, and anti-virus was something that only naughty people caught. Generally any filtering was performed client-side. We didn’t have much in the way of webmail – everything was POP3. IMAP was considered a novelty.

But even then, it took some effort to manage and maintain an email system. As the years went on, Sendmail (for it was the de facto standard at the time) filtering started, and both commercial and free anti-virus scanning became essential. Then SpamAssassin integration. When I was working for The Moving Picture Company (MPC) in London, I looked after the main mail servers. I split off anti-spam filtering (powered by SpamAssassin) onto its own server – which cost us nothing, as I used an old render farm machine, whereas Barracuda’s anti-spam system, which we were evaluating at the time (circa 2002-2003), was merely an expensive pimped-up SpamAssassin box with a fancy user interface. I also replaced the ageing 486 that was powering the entire company’s email with something beefier, performed a major upgrade of Exim, and had to rewrite its filtering/configuration files. Oh, and integrating Mailman for internal mailing lists and writing some PHP code to make it all look prettier and easier to use for the VFX producers. All that took a LOT of work. But when the business/non-production side wanted Microsoft’s bloated and super-expensive Exchange, I explained that I could implement a cheaper system at a third of the price which could do everything Exchange could – and I was ignored. They went with Exchange, and old muggins here had to implement a split email system (which worked well enough).

During all this time, my personal email was hosted by myself. I usually had some kind of Linux virtual machine running Exim and some IMAP/POP3 daemon, or in some cases a Windows virtual machine and MDaemon. MDaemon was (and still is) a lovely Windows-based mail system. Very comprehensive. But for a single user or household it’s flipping expensive, and I eventually had to give it up. There were other times when I hosted my email on a cPanel account or server. But the point is that I’ve been managing my own email with my own domain (the one you’re reading this post on, in fact) since 1997. I went through more ISPs than people have had hot dinners. But my email address always remained the same.

When Google’s Gmail came on the scene around 2004, I thought it was one of the best web-based email systems around. It beat the living daylights out of Yahoo! and Hotmail. It had genuinely useful features. But you couldn’t attach a domain to it. You had to make do with an @gmail.com address, and you had to put up with advertising. This also meant no easy support from Google. But in 2006, Google started trialling Google Apps For Your Domain. It was initially free, and allowed you to attach a domain to Gmail and use it alongside other Google applications too. I started getting involved in the Google forums helping to support it, as I had liked it enough to move my email over to it. Being able to use just a web browser for email was wonderful, as I was very disappointed in every dedicated email client at the time. It meant that I could use different web browsers, but the functionality would remain the same. My bugbears with dedicated email clients included fixed-width fonts and word wrapping, bad email filtering, and insane quoting methodology. I kind of liked Outlook for a while, and there were some workarounds for the quoting system, but when Microsoft updated Outlook, everything broke and I gave up.

When Google started offering paid versions of Google Apps For Your Domain (which had subsequently become just Google Apps), I started to pay for it. £5/month (well, less – but you need to factor in VAT). It enabled me to turn off advertising, so I had email privacy (with the usual caveats – you need to allow an email system to scan incoming email for spam, phishing and viruses, but all this is automated and no human sees it) and a much bigger email quota to boot. And I had official support.

When I worked for Imagineer Systems (now Boris FX), I migrated the email system from a single virtual machine running on a dedicated box to Google Apps for Business. The cost saving alone was worth it. It just made everything easier and simpler to maintain.

Meanwhile back in workland (during my time at Memset), I was supporting customers who were also rolling their own email – albeit mainly via cPanel/WHM, which does provide a very comprehensive set of tools. Some customers rolled their own separate Exim or Postfix configs, but mainly it was cPanel. Many a time I discussed us becoming a Google partner and supporting Google Apps, but Memset rolls its own cloud services and it was never going to happen.

Where I work now, we use G Suite. And it makes light work of maintaining a corporate email system. It’s survived a company rebranding easily enough, and the tools and services it provides are having a very positive impact on the business. We’re looking to extend that too, so it’s all good. My experience of working with G Suite over the years is paying off!

Phew. After that long introduction, I’d like to get around to the whole reason I’m posting this thing in the first place: the cost of ISP-branded email and its use after you move ISPs.

As you can appreciate from the above, a lot of work goes into managing email systems, and it’s also not cheap, depending on the system you go for. Many ISPs used to roll their own email systems using open-source applications like Sendmail, Exim, Postfix, Amavis, etc., but when Google opened up Google Apps for Domains to ISPs, a number of them switched to that.

Then Google ended Google Apps for ISPs, and they had to move to yet another system. Some moved to Yahoo!, some moved back to hosting email internally again, others to other externally hosted services.

And many did move to externally hosted services because maintaining a functioning email system for thousands of customers – even using open source tools – takes considerable time, effort and money. Scaling such a system is expensive. Keeping it functioning when many hundreds or thousands of people keep hitting the POP3 or IMAP server every few minutes requires monitoring and maintenance.

So why should ISPs be expected to maintain your email, for free, and for life, when you leave their broadband service? It doesn’t make any kind of sense to me, either economically or practically.

Apparently Sky do let you keep your email address when you leave them, but as they’re using Yahoo!, I doubt it costs them much to do so. Yahoo! does the heavy lifting, and you probably get adverts within your mail which recoup the cost of maintaining your account.

The key thing here is: how much do you want to pay for your email? It’s never free. There is always some cost attached to it. One option may be using an ad-blocker – but this deprives the provider of the income that pays for your hosting. Another option is to move to a dedicated email provider such as Yahoo!, Gmail or Outlook.com – but you’ll end up having to cope with adverts. Yahoo! offers a paid upgrade option which gets rid of them. Outlook.com too (Outlook Premium).

You could keep using your old ISP mail when you move ISPs, and while the likes of BT and TalkTalk charge for this, it’s not an unreasonable charge in comparison to other hosting options out there. But here’s what I would do:

  • Buy a domain. Any domain. It doesn’t need to be expensive, but you want to find something that’s going to last a bloody long time.
  • Find a dedicated email hosting provider – consider the likes of G Suite, Office 365, or even Rackspace – and follow any instructions they give on how to set things up (or get your family IT tech support to help out!).
  • Depending on how you have your email set up at the moment, you’ll either need to migrate all existing mail into the new hosting provider, or you’ll need to move your email to the new account within your email client (see the sketch below for one way to do this). With Outlook, it may be easier to export your old mailboxes as PST files and import them into the new account, then delete the old ISP mail account afterwards. Again, your new email hosting provider can help you with this (or your friendly family IT manager can!).
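If both the old and the new provider offer IMAP access, the migration boils down to copying messages from one server to another. Here’s a minimal sketch of that idea using Python’s standard imaplib module – the hostnames, credentials and folder names are placeholders, and your new provider’s own import tool will usually do a more thorough job:

```python
# A rough sketch of copying mail between two IMAP accounts using only
# Python's standard library. Hostnames, usernames and passwords below are
# placeholders – substitute your own details.
import imaplib

OLD_HOST, OLD_USER, OLD_PASS = "imap.old-isp.example", "me@old-isp.example", "old-password"
NEW_HOST, NEW_USER, NEW_PASS = "imap.new-provider.example", "me@mydomain.example", "new-password"

old = imaplib.IMAP4_SSL(OLD_HOST)
old.login(OLD_USER, OLD_PASS)
new = imaplib.IMAP4_SSL(NEW_HOST)
new.login(NEW_USER, NEW_PASS)

old.select("INBOX", readonly=True)        # read-only: don't touch the source mailbox
status, data = old.search(None, "ALL")    # all message numbers in the folder

for num in data[0].split():
    status, msg_data = old.fetch(num, "(RFC822)")  # full raw message
    raw_message = msg_data[0][1]
    # Append into the equivalent folder on the new server (flags/date left at defaults)
    new.append("INBOX", None, None, raw_message)

old.logout()
new.logout()
```

You’d repeat that for each folder you care about; anything fancier (preserving flags, deduplication, huge mailboxes) is where a proper migration tool earns its keep.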

The domain part is important because it means that you get to choose any email address you like at that domain. Providing you continue to pay for the domain and hosting, you never need to change email addresses ever again. I also think that if you’re running a business, a domain name makes things a lot more professional.

So I completely disagree with Ofcom’s assessment that ISPs are charging too much to keep old email addresses. I think this would become less of a problem if ISPs allowed you to host a domain name with them for the purposes of email (or free web space – which is still a thing, apparently). You either transfer in or buy (cheaply) a domain and use it with the ISP for your email. When you need to move ISPs, you provide the new ISP with the domain name and they take care of the move for you. How about that? That could potentially work.

The term Visual Effects (VFX) usually refers to post-production work to integrate CG or model photography into live action footage. Special Effects (SFX), in turn, usually refers to practical, on-set effects like explosions, smoke, mechanical effects, etc.

When I first started working in the VFX industry way back in 2001, CG post-production had become the dominant force in VFX. Few productions were using miniature/model effects, and even fewer animatronics (to the point that, in 2004, Jim Henson’s Creature Shop packed up and moved back to Los Angeles from London).

MPC made strides in a number of CG innovations, including its Alice crowd simulation software which was used on a number of big productions. Over the years we’ve seen major improvements in motion capture, character rigging, CG modelling and compositing – generally all the usual fields that CG VFX is put to use in.

But as film makes way for increasingly ambitious television productions, new techniques have to be found in order to reduce costs and speed up production turnaround.

The Star Wars films have always been a major innovator in the world of visual and special effects. Right from the first film back in 1977, George Lucas had the foresight to start his own company for the purposes of producing the complex visual effects needed to tell his story.

Those complex visual effects have mainly been restricted to film only – the cost of producing something on that scale for television has been pretty prohibitive. Attempts at the first Star Wars TV special were interesting, to say the least:

Getting back to the evolution of VFX tools, the games industry was also making progress with more and more complex video games. This too required substantial development and R&D, and you’ll find that VFX and the games industry share many of the same tools and techniques – the two industries complementing each other.

Fast forward to Disney’s “live-action” The Jungle Book and subsequent The Lion King. Director/Producer Jon Favreau and his team developed brand new techniques to help create the films. This in turn has been used in the first ever Star Wars television series, The Mandalorian.

In order to create the fantastical sets, landscapes and other backgrounds, the production brings the old technique of rear projection bang up to date by using giant LED screens to produce a highly detailed background. This allows filmmakers to shoot practically, whereas before you’d typically shoot on green or blue screens and replace the backgrounds in post-production. My biggest issue with that approach is that scenes which use green/blue screens never have fantastic lighting. If you’re shooting a scene which is, for example, set outside but shot indoors with green screen compositing, it never looks real. The lighting is a dead giveaway. With this new virtual set system, the lighting is a lot more accurate and realistic.

The virtual set system obviously requires some VFX to be produced in pre-production to display on the giant LED screens, so this technique is bringing together VFX and SFX – creating a blurred line between practical and post.

What’s interesting is that the system uses Epic Games’ Unreal Engine, which is probably best known for the video games Unreal Tournament and Fortnite. Filmmakers can make live changes to landscapes and environments on the fly through the use of Unreal Engine. All these techniques are thanks to ILM’s Stagecraft Virtual Production team, Epic Games and Jon Favreau’s Golem Creations.

It’s all very impressive, and I consider it a major game changer – to the point that, if they haven’t already, the entire team responsible should be given an AMPAS scientific award for filmmaking innovation ASAP. It’s certainly the most exciting FX technology that I’ve seen in the past 20 years.

The Mandalorian airs exclusively on Disney+, the new streaming service from Disney, and will be available to watch in the UK from the 24th of March (I can’t wait!).

For the love of all things holy, I cannot believe this company. 5 days’ compensation is better than nothing, but when you consider it was a full 27 days, it still feels rather stingy. But that’s not what’s got my goat. After reading the initial blurb, there’s a link to an update site which allows you to put in your name and email address.

ALAS!

They’ve not put a valid SSL (or TLS, if you prefer – technically it should be referred to as TLS these days, but people are set in their ways) certificate on their site. Which means that any form data transmitted will be sent unencrypted between the user’s browser and the server. This could (unlikely, but still possible) allow the data to be sniffed and captured by a third party.

Another method is spoofing the southwesternrailway.com domain. I could register a domain such as southwestermrailway.com (as an example) and duplicate the same hostname and site contents (changing the form details so that anything submitted is sent to me or to a file on the server), leaving out the SSL certificate. I could potentially hoover up vast amounts of data, as people don’t bother to check the URL or SSL certificate.

In any event, putting an SSL/TLS certificate on your site is vitally important these days, and it’s not difficult to do. I’m still amazed that Bafta.org hasn’t put its entire site behind SSL/TLS (try going to https://www.bafta.org, and it’ll redirect you back to non-SSL content), nor has Milk VFX, which solicits job applications via an unencrypted form. Bad, Milk VFX, bad!
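If you want to check a site for yourself, a quick sanity test is to attempt a verified TLS handshake against it. Here’s a minimal sketch using Python’s standard ssl module – the hostname is just an example, and a passing check only tells you the certificate is valid, not that every form on the page submits securely:

```python
# A rough sketch: attempt a verified TLS handshake against a host and report
# whether the certificate checks out. Uses only Python's standard library.
import socket
import ssl

def check_tls(hostname: str, port: int = 443) -> None:
    context = ssl.create_default_context()  # verifies the chain and the hostname
    try:
        with socket.create_connection((hostname, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
                print(f"{hostname}: certificate OK, expires {cert['notAfter']}")
    except ssl.SSLCertVerificationError as err:
        print(f"{hostname}: certificate problem – {err.reason}")
    except (ssl.SSLError, OSError) as err:
        print(f"{hostname}: TLS connection failed – {err}")

check_tls("www.bafta.org")  # example host mentioned in the post
```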

Update: Looks like they’re using an external Salesforce CRM to capture the information. The JavaScript form code is hosted securely – thank goodness – and it looks like the form data is also submitted securely to Salesforce servers. Even so, I’d still be pretty wary about any site without a proper SSL certificate and encrypted traffic between the browser and server, and not everybody is going to want to scour the page’s source code to determine what’s going on.

Here are my biggest gripes with this new foldable phone craze:

  • When folded up, the phone’s… girth… appears to be considerably more than that of a non-foldable phone. It saves space length-wise, but not thickness-wise. It’ll probably fit in most men’s trouser pockets just fine, but it’ll be bulky:

“Why, is that a Galaxy Z Flip in your pocket, or are you just a pervert?”

  • Easily damageable screens, and a potentially fragile hinge with a limited lifespan. These phones require far greater care and attention than a regular phone, which negates the whole reason for having a folding phone in the first place. You want durability. You want to save space (both in length and thickness). You should not have to pick one or the other.
  • They don’t look that much different from the flip phones (you know, the ones with a physical keypad on one half, and a tiny non-touch screen on the other) from the ’90s and early 2000s.

In short – foldable phones, in their current state, are simply cute and fluffy Mogwais just waiting to turn into Gremlins. It’ll cost you an arm and a leg at this early stage of the technology, and if you damage your phone, possibly a spleen or kidney to continually replace the screen.

They say (and rightly so) that you don’t buy a Mac to play games. And yet, how do you explain Apple Arcade – the £4.99/month subscription service from Apple which provides a selection of high quality games (albeit no AAA titles) across iOS, iPadOS, tvOS and macOS devices?

On my old 2018 15″ MacBook Pro, I could play the same games I had on my iPhone on my Mac – and the performance wasn’t too bad. It’s even better on the 2019 16″ MacBook Pro, of course. But Macs weren’t really intended for heavy gaming – that has long been the domain of games consoles such as the PlayStation, Xbox or Nintendo Switch. And gaming PCs, of course – Nvidia graphics, quad/hexa/octa-core CPUs with 16GB+ RAM and superfast SSD drives.

Yet modern Macs have quad/hexa/octa-core CPUs, 16GB+ RAM and superfast SSD drives – and they still can’t play AAA titles even if they were ported to macOS. In part this is down to Nvidia vs AMD graphics. Nvidia has a long-established foothold in the graphics market on PCs, yet AMD’s graphics power the likes of the Xbox and PlayStation (and will do in the next-generation consoles coming this year).

Nvidia vs AMD

Macs did once have Nvidia GPUs, but due to a long-running spat between Apple, Intel and Nvidia, things were never the same. This is not to say AMD produce inferior graphics chipsets – as we’ve seen, they’re used in today’s modern consoles alongside AMD CPUs too. And AMD has just released a 64-core CPU capable of 128 threads. This is a monster of a CPU (with a monster price – $4k for the CPU alone).

But Mac graphics have never been particularly powerful for gaming – primarily because Apple has been concentrating on professional creative workflows rather than 3D gaming. And MacBook Pros have been very slender machines, which makes designing thermals to keep the machine cool a bit of a challenge.

Another problem with Macs is that now macOS Catalina has gone fully 64-bit, 32-bit titles will not work. Goodbye Team Fortress 2 – many a wasted hour was spent laughing long and hard playing that game.

But with the 16″ MacBook Pro with an Intel Core i9 processor and an AMD Radeon Pro 5500M with 8GB of video RAM, I can finally play Fortnite at reasonable framerates. The only downside is that since the release of macOS Catalina 10.15.3, the native Fortnite client takes around 4-5 minutes to load every time. With 10.15.2, it was near instantaneous. Also, with the newer thermal design, the CPU fans will ramp up and it does become quite noisy – so I resort to using headphones.

Nvidia’s GeForce Now – a potential solution to all Mac gamer problems?

Nvidia has a possible solution to the Mac/older PC problem. They have a subscription service which lets you play games you already own (including Fortnite) by providing a hosted virtual machine with one of their high-end graphics cards. You’re effectively playing the game on their server and streaming the video back to your machine. This relies on:

  • Having a fast connection (50Mbps+ recommended)
  • Using ethernet rather than Wi-Fi
  • Low latency

You also need to own the PC versions of your games. Fortnite is free to play, so as long as you have your Epic Games credentials, you’re all good. GeForce Now requires that you have logins for the games you’re playing. Nearly all games these days require some form of internet connection, so this isn’t much of a problem. Many of them are available from Steam anyway.

The downsides to this are:

  • You’re giving credentials to a third-party service (Nvidia) which stores those credentials on their platform. The logins are usually connected to accounts where credit/debit card details are stored.
  • On a Mac, some symbols are only accessible via the option key – if you have a particularly complex password, good luck trying to enter them via the GeForce Now Mac client. Oh yes, copy and paste between the Mac and the client isn’t supported.
  • Nvidia does not support any form of two-factor authentication on their accounts. This is very bad.

So you’ve got to be very trusting that Nvidia will keep your credentials safe. And you’ll need to ensure that your GeForce Now account’s password is a strong one. Nvidia really need to get their arse into gear and deploy 2FA as soon as they bloody well can. They also need to fix their SPF and DMARC records, because all Nvidia store email goes to spam as a result. This is basic, basic stuff.

Nvidia needs to go back to email school and learn all about SPF and DMARC
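For the curious, SPF and DMARC policies are just DNS TXT records that anyone can look up. Here’s a minimal sketch using the third-party dnspython package (pip install dnspython) – the domain is a placeholder, and a missing record is only one of several reasons mail can end up in spam:

```python
# A rough sketch: look up a domain's SPF and DMARC policy records, which are
# published as DNS TXT records. Requires the dnspython package.
import dns.resolver

def txt_records(name: str) -> list[str]:
    """Return all TXT strings published at the given DNS name."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
        return [b"".join(record.strings).decode() for record in answers]
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return []

domain = "example.com"  # placeholder – substitute the sending domain you care about
spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]

print("SPF:  ", spf or "none published")
print("DMARC:", dmarc or "none published")
```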

Fortnite under GeForce Now is very good. Initially it felt as if there was a little bit of lag (latency) when running under the Balanced setting, though that seems to have passed and gaming feels as good as running it locally. As I run my 16″ MBP via an external monitor, it’s limited to 1920×1080, which is a decent resolution to run most games at high settings. GeForce Now runs Fortnite well with high settings enabled, connected to Zen Internet via ethernet at 300Mbps download/50Mbps upload.

Fortnite using Nvidia’s GeForce Now on a Mac. High framerates! (Match sped up x 2)

In terms of loading speeds, GeForce Now and Fortnite are considerably faster than the native macOS Fortnite client. And the Mac’s fans never ramp up at all during gameplay. But I’d ideally still like to play Fortnite natively – if only the loading time issues can be resolved.

Can’t run Team Fortress 2 on macOS Catalina because of 32-bitness? GeForce NOW *can*.

The biggest bugbear is that GeForce Now doesn’t support one of the biggest titles of the past 7 years – Grand Theft Auto V.

GeForce Now has two subscription tiers: free, with one-hour sessions, and a limited-edition Founders tier which gives longer sessions and priority access (whatever that means). That said, it is £4.99/month with a 90-day trial before your card is charged, which is the cheapest and most generous I’ve seen.

Beware of the Shadows

There are alternatives to Nvidia’s video game streaming. One of them is Shadow. I’ve tried them before. They essentially provide you with a fully virtual Windows PC with an Nvidia GeForce graphics card. You install games as you would under Windows. Unlike Nvidia’s GeForce Now, you have disk space – and, as such, a quota – to work with.

The biggest stumbling block I found with Shadow was the latency and overall streaming performance. Whether over ethernet or Wi-Fi, the lag was very noticeable. And it was extremely expensive for what it is. You’d be better off saving the money towards a console or mid-range gaming PC. Even now, you have to pre-order, with some specifications unavailable until 2021.

There’s also Google’s Stadia – another streaming platform, designed to work across TVs, laptops and tablets. From what I understand you’ll need to buy hardware (at the very least a controller) and a subscription, and judging by the website, you need to buy games directly from Google to play on Stadia rather than bringing your existing library with you. That limits things somewhat, and makes everything more expensive if you already own titles on a different platform.

My recommendation

The current range of consoles – especially Microsoft’s Xbox – is shaping up nicely as a good all-round gaming system. The Xbox has introduced mouse and keyboard support which, when developers take advantage of it, gives PC-like gaming at a fraction of the cost. The next generation of consoles will also introduce SSDs for storage, which means much faster loading times. It’s also possible to stream from the console to a Mac or PC over the local LAN, should you so wish.

But for the Mac user, regardless of whatever model you may be using, a combination of Apple Arcade and GeForce Now may be a good option – providing Nvidia continues to add titles, fix bugs and add essential features (as I mentioned earlier: copying and pasting between environments, and 2FA protection of Nvidia accounts).

I’ve been looking at the stats for this site, and along with the lack of active blogging recently, I’ve decided to head back to WordPress.com for the hosting and ditch the likes of DigitalOcean and Cloudflare.

I’ve paid up front for two years, which works out at about a third of what I’m paying now. And I’m also saving money by not paying for plugins such as WP Rocket (caching/optimisation), Foobox (lightbox for images) and the Wayfarer theme. Though I am still using the Wayfarer theme, because it comes as part of my WordPress.com plan.

And, I don’t know whether it’s just me or not, but I’ve noticed that with the Premium plan I can now add Google Analytics. I’m sure that when I signed up early last year the Premium plan didn’t have this – only the Business or eCommerce plans had that option. So good news!

What’s not such good news is that trying to import a media library (images, etc.) is not the easiest thing to do with WordPress – so I’m ditching all the articles from last year and starting anew. Besides which, one of my bugbears about search engines is that a lot of the stuff they archive can be so old and out of date that the information is useless. It can really hinder searching for technical information. So the posts in this blog may not last more than 1-2 years before biting the dust.

Still a bit more work to do around here, but it’s nearly done. Comments should continue working as before.