I generally love the Apple Magic Mouse 2 with its gesture-based touch surface (scrolling sideways has never been easier), but when it runs out of battery and you need to charge it – well, you’d better have a backup.

Thankfully I do, because about a year or so ago I bought a mechanical keyboard and gaming mouse – similar to what I was using at home – for work when I was using a Dell laptop rather than the Mac Mini that I use now. I still have that mouse, and it’s being put to good use while the Apple mouse has a little rest.

Apple mouse on its back while the Corsair takes over

As my readers are aware, I recently sold my 2018 15″ MacBook Pro (2.6GHz 6-core i7, 16GB RAM, 1TB SSD) to make way for the 2019 16″ MacBook Pro (2.3GHz 8-core, 32GB RAM, 4TB SSD). I sold it to a work colleague for what I think is a decent buyer’s price – especially when you consider it also came with over a year of AppleCare remaining.

What is interesting is that my colleague lives in another country, and there was some initial confusion with the AppleCare team in Ireland as to whether it would be valid there. At first I was told yes, then no. But that was okay – the machine was in good working order and had lasted me a full year without any problems whatsoever – and we agreed that if it did go wrong, I’d sort it out here in the UK.

Three weeks later, I heard from my colleague to say that he’d spotted a dead pixel. Fair enough – I usually used the MacBook Pro with the lid closed, an external keyboard and mouse, and an external monitor, with the machine always plugged into the mains.

So he took it to an Authorised Apple Reseller in his country, and they both noticed that the machine wasn’t sitting flush on its rubber feet. It turned out the battery had swollen and was pushing the aluminium case out of shape.

And the very nice people at the Authorised Apple Reseller replaced the whole screen, the Touch Bar, the keyboard and trackpad, the battery and the bottom of the machine. For free. The AppleCare warranty WAS valid there. So effectively my colleague has got a 70% new 2018 MacBook Pro (only the main logic board, SSD and RAM haven’t been replaced), which should hopefully last him 3-4 years beyond the AppleCare warranty (assuming the battery doesn’t swell again – a design fault, maybe?).

This kind of makes me a bit worried about my recent purchase – but at the moment there are no known reports of 2019 16″ MacBook Pro batteries swelling up. Given that the keyboard and system internals have been given a complete redesign, maybe that won’t impact the lithium-ion battery as much? At least I have AppleCare+ on this new Mac in case things do go wrong. In fact, I have AppleCare on my iPhone 11 Pro Max and the AirPods Pro too, because in all the years/decades I’ve been using Apple kit, it’s been well worth it.

But I do wish Apple would do a better job of selling AppleCare+ outside of the US and Canada. For example, being able to pay monthly like in the US would be good – the cost of AppleCare+ for this thing is fairly eye-watering. And it wasn’t until fairly recently that AppleCare+ for Macs was introduced – it didn’t exist back in 2018. The business of trying to figure out whether a warranty bought in the UK is valid in another country is a bit of a bugbear too. The parts are all the same – it all ships from China – so I can’t see why the warranty shouldn’t be valid anywhere there are Apple Stores or Authorised Apple Resellers.

Finally, AppleCare+ would benefit from onsite support as well as the ability to take a machine to an Apple Store/AAR. You won’t get any kind of onsite support from Apple unless you have bought hundreds or thousands of machines from them – in which case, you get onsite repairs from IBM. But if you’ve only bought one or two £50k Mac Pros loaded to the hilt, you’ve still got to drag them to your nearest Apple Store/AAR for repair. This is partially why I have never bought another iMac – I don’t want to drag a bloody great machine halfway across Surrey to a London Apple Store for repairs. Dell absolutely lead the way here with their next-day onsite repairs. And Dell offers four-year warranties, whereas Apple only goes to a maximum of three.

But do I still recommend AppleCare/AppleCare+? Yes, absolutely – this is another case where Apple and/or their authorised partners go out of their way to fix things. But it’s still a long way behind the likes of Dell and other PC manufacturers.

Since Apple started producing its own silicon for the iPhone, the A-series of ARM-based chips has gone from strength to strength. There’s no better device to demonstrate the muscle of Apple-designed SoCs than the iPad Pro. It’s a very capable multitasker – great for media consumption and even pretty decent when it comes to media creation too. But are iOS and iPadOS limiting the series’ potential?

If Apple intends to put its own silicon at the heart of the Mac, it needs to be able to run all current software at the same or better performance than the current Intel platform offers. Apple has only just released the Mac Pro, a full-on Intel-based Mac with up to 28 cores (56 threads) – a beast, and possibly one of the best-designed PCs in existence according to several reviews I’ve seen (it’s so clean and manageable inside!). It seems they intend to release a 14″ 10th-generation Intel-based MacBook Pro at some point this year too.

Yet in the PC world, AMD has the Ryzen Threadripper 3990X processor with a stupidly insane number of cores – 64 (128 threads) – which is overkill for all but the most intensive applications. For those that need the performance, I don’t think the ARM64 architecture will get to the point where it can compete in that space for a good number of years. Certainly, if Apple were to release a Mac in 2021 with an Apple-designed SoC – even if it’s the standard MacBook (not the Pro) model – that means introducing tools to convert existing x86 code to ARM64 and vice versa. Nobody is ever going to run Logic Pro or Final Cut Pro on a simple MacBook, but how powerful does the Apple processor have to be in order to perform the code translation? Or how much work will developers have to put in to create a universal binary that runs on both platforms? It’s Rosetta all over again.

Then there is the question of Windows. Intel Macs can run Windows natively via Boot Camp, or via virtualisation within macOS. But if Apple starts moving to ARM processors, this will obviously break that feature – which is very useful for those who develop for both the macOS and Windows platforms.

Microsoft has done some legwork porting Windows to ARM. They’ve even released a Surface device (the Surface Pro X) which runs Windows on ARM. But there are so many limitations with the platform that adoption is pretty terrible (and expensive) right now. Apple could potentially update Boot Camp for use with ARM Windows, but until the Windows on ARM platform is sufficiently mature, I don’t think it’s worth it. Even with x86 emulation, it’s not going to be good enough.

Then there are the Thunderbolt 3 ports on current Macs. Dongle city. Thunderbolt is very much an Intel thing, so Apple would still have to continue licensing it from them as well as perform extensive testing to ensure existing peripherals continue to work.

The important thing for me is that Apple doesn’t try to force an iPad-like experience onto macOS. If macOS is going to go ARM, I want the macOS experience, and application performance around the same mark as the Core i9 and AMD Radeon Pro 5500M in my MacBook Pro (which has got to last me 4-5 years before I can afford another major Apple purchase).

So at what point do you release an ARM-based Mac (if at all)? Difficult to say, but I’d say 2021 would still be far too early. It’s not as though we’ve reached a plateau of power/performance, which was certainly the case with the G4 and G5 processors. IBM pretty much forced Apple’s hand back then, because it just wasn’t possible to put a G5 processor into a laptop.

So maybe Apple should keep ARM for its mobile devices, and switch to AMD for its processors instead. AMD have leapfrogged Intel at an important milestone when it comes to die shrinkage – and, after all, they devised the whole x64 architecture anyway. And AMD must be pretty decent given that both Sony and Microsoft will be using their CPU and GPU technology in the PS5 and Xbox Series X. Any all-AMD Mac/MacBook Pro would be a decent all-rounder.

It’s been argued that the x86 architecture is old and out of date – and it’s true that it has been around for a very long time. But ultimately it’s allowed those of us with a foot in both the Windows/PC and Mac worlds to co-operate with each other like never before, and to do stuff that just wasn’t possible back in the days when Macs were running 68000 or G4/G5 processors.

Let me start off by saying that I’ve been working with electronic communications since I was about 8 or 9 years old. I’m now 43, nearly 44. I started off being allowed to “play” (read: carefully dictate messages) with a major international re-insurance broker’s telex system whenever my dad took me to work with him. I found it absolutely fascinating. And the beautiful mechanical DEC keyboards were a joy to type on. Much later on I’d spend a few summers on work experience at the same company, telexing and working with spreadsheets and FileMaker Pro databases on expensive early Macintosh kit.

At secondary school I wrote an internal email system in BASIC which made use of the school’s Windows network. It was super simple and merely demonstrated how a typical email system would work. But work it did. I did all this for my Computer Studies GCSE.

The combination of work experience and the GCSE project very much defined my career, because for the past 24 years I’ve been a systems administrator. My first ever gig after leaving university was setting up a local Norwich ISP. This included providing email services for both dial-up (remember that?) and web hosting customers. Things back then were much simpler than they are now. We had very few spammers, phishing was a mere twinkle in scammers’ eyes, and anti-virus was something that only naughty people caught. Generally, any filtering was performed client-side. We didn’t have much in the way of webmail – everything was POP3. IMAP was considered a novelty.

But even then, it took some effort to manage and maintain an email system. As the years went on, filtering in Sendmail (for it was the de facto standard at the time) became necessary, and both commercial and free anti-virus scanning became essential. Then came SpamAssassin integration. When I was working for The Moving Picture Company (MPC) in London, I looked after the main mail servers. I split anti-spam filtering (powered by SpamAssassin) off onto its own server – which cost us nothing, as I used an old render farm machine, whereas Barracuda’s anti-spam system, which we evaluated at the time (circa 2002-2003), was merely an expensive pimped-up SpamAssassin box with a fancy user interface – and replaced the ageing 486 that was powering the entire company’s email with something beefier, as well as performing a major upgrade of Exim and rewriting its filtering/configuration files. Oh, and integrating Mailman for internal mailing lists, and writing some PHP code to make it all look prettier and easier to use for the VFX producers. All that took a LOT of work. But when the business/non-production side wanted Microsoft’s bloated and super-expensive Exchange, I explained that I could implement a system at a third of the price which could do everything Exchange could – and I was ignored. They went with Exchange, and old muggins here had to implement a split email system (which worked well enough).

During all this time, my personal email was hosted by myself. I usually had some kind of Linux virtual machine running Exim and some IMAP/POP3 daemon, or in some cases a Windows virtual machine and MDaemon. MDaemon was (and still is) a lovely Windows-based mail system. Very comprehensive. But for a single user or household it’s flipping expensive, and I eventually had to give it up. There were other times when I hosted my email on a cPanel account or server. But the point is that I’ve been managing my own email with my own domain (the one you’re reading this post on, in fact) since 1997. I went through more ISPs than most people have had hot dinners, but my email address always remained the same.

When Google’s Gmail came on the scene around 2004, I thought it was one of the best web-based email systems around. It beat the living daylights out of Yahoo! and Hotmail. It had genuinely useful features. But you couldn’t attach a domain to it: you had to make do with an @gmail.com address, and you had to put up with advertising. This also meant no easy support from Google. But in 2006, Google started trialling Google Apps For Your Domain. It was initially free, and allowed you to attach a domain to Gmail and use it alongside other Google applications too. I started getting involved in the Google forums helping to support it, as I had liked it enough to move my email over to it. Being able to use just a web browser for email – at a time when I was very disappointed in every dedicated email client – was wonderful. It meant that I could use different web browsers, but the functionality would remain the same. My bugbears with dedicated email clients included fixed-width fonts and word wrapping, bad email filtering, and insane quoting methodology. I kind of liked Outlook for a while, and there were some workarounds for the quoting system, but when Microsoft updated Outlook, everything broke and I gave up.

When Google started offering paid versions of Google Apps For Your Domain (which had subsequently become just Google Apps), I started to pay for it. £5/month (well, less – but you need to factor in VAT). It enabled me to turn off advertising, so I had email privacy (with the usual caveats: you need to allow an email system to scan incoming email for spam, phishing and viruses, but all this is automated and no human sees it) and a much bigger email quota to boot. And I had official support.

When I worked for Imagineer Systems (now Boris FX), I migrated the email system from a single virtual machine running on a dedicated box to Google Apps for Business. The cost saving alone was worth it. It just made everything easier and simpler to maintain.

Meanwhile, back in workland (during my time at Memset), I was supporting customers who were also rolling their own email – albeit mainly via cPanel/WHM, which does provide a very comprehensive set of tools. Some customers rolled their own separate Exim or Postfix configs, but mainly it was cPanel. Many a time I discussed us becoming a Google partner and supporting Google Apps, but Memset rolls its own cloud services and it was never going to happen.

Where I work now, we use G Suite. And it makes light work of maintaining a corporate email system. It’s survived a company rebranding easily enough, and the tools and services it provides are having a very positive impact on the business. We’re looking to extend that too, so it’s all good. My experience working with G Suite over the years is paying off!

Phew. After that long introduction, I’d like to get around to the whole reason I’m posting this thing in the first place: the cost of ISP-branded email and its use after you move ISPs.

As you can appreciate from the above, a lot of work goes into managing email systems, and it’s also not cheap depending on the system you go for. Many ISPs used to roll their own email systems using open source applications like Sendmail, Exim, Postfix, Amavis, etc., but when Google opened up Google Apps for Domains to ISPs, a number of them switched to that.

Then Google ended Google Apps for ISPs, and they had to move to yet another system. Some moved to Yahoo!, some moved back to hosting email internally again, and others to different externally hosted services.

And many did move to externally hosted services because maintaining a functioning email system for thousands of customers – even using open source tools – takes considerable time, effort and money. Scaling such a system is expensive. Keeping it functioning when many hundreds or thousands of people keep hitting the POP3 or IMAP server every few minutes requires monitoring and maintenance.
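Even the basic availability side of that monitoring has to be automated. As a flavour of it, here’s a minimal sketch in Python (the hostname is just a placeholder) of the kind of probe a monitoring box might run against an IMAP front end:

```python
# Trivial IMAP availability probe using only the standard library.
# The hostname below is a placeholder - point it at your own server.
import imaplib

def probe_imap(host: str) -> bool:
    """Return True if the IMAP server answers a NOOP over SSL."""
    try:
        # The timeout keyword requires Python 3.9+.
        server = imaplib.IMAP4_SSL(host, 993, timeout=10)
        status, _ = server.noop()  # cheap "are you alive?" round trip
        server.logout()
        return status == "OK"
    except (OSError, imaplib.IMAP4.error):
        return False

if __name__ == "__main__":
    host = "imap.example.com"
    print(f"{host}: {'up' if probe_imap(host) else 'DOWN'}")
```

In real life you’d run something like this from cron or a monitoring system every few minutes and alert on failures – multiply that by webmail, SMTP, POP3 and storage, and the ongoing cost becomes obvious.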

So why should ISPs be expected to maintain your email, for free, and for life, when you leave their broadband service? It doesn’t make any kind of sense to me, either economically or practically.

Apparently Sky do let you keep your email address when you leave them, but as they’re using Yahoo!, I doubt it costs them much to do so. Yahoo! does the heavy lifting, and you probably get adverts within your mail, which recoups the cost of maintaining your account.

The key thing here is: how much do you want to pay for your email? It’s never free – there is always some cost attached to it. One option may be using an ad-blocker, but this deprives the provider of the income that pays for your hosting. Another option is to move to a dedicated email provider such as Yahoo!, Gmail or Outlook.com – but you’ll end up having to cope with adverts. Yahoo! offers a paid upgrade option which gets rid of them, as does Outlook.com (Outlook Premium).

You could keep using your old ISP mail when you move ISPs – and for the likes of BT and TalkTalk, who charge for this, it’s not an unreasonable fee in comparison to other hosting options out there. But here’s what I would do:

  • Buy a domain. Any domain. They don’t need to be expensive, but you want to find something that’s going to last a bloody long time.
  • Find a dedicated email hosting provider. Follow any instructions they give on how to set things up (or get your family IT tech support to help out!) – or consider the likes of G Suite, Office 365, or even Rackspace. (There’s a quick sketch after this list for checking that your new mail records have gone live.)
  • Depending on how you have your email set up at the moment, you’ll either need to migrate all existing mail into the new hosting provider, or move your email to the new account within your email client. With Outlook, it may be easier to export your old mailboxes as PST files and import them into the new account, then delete the old ISP mail account afterwards. Again, your new email hosting provider can help you with this (or your friendly family IT manager can!).
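As a quick sanity check once the DNS side is done, a few lines of Python with the dnspython library will show whether your new provider’s MX records have gone live. This is just a sketch – the domain below is a placeholder for your own:

```python
# Quick MX record check using dnspython (pip install dnspython).
# "example.com" is a placeholder - substitute your own domain.
import dns.resolver

def print_mx_records(domain: str) -> None:
    """Look up and print the MX records for a domain, lowest preference first."""
    answers = dns.resolver.resolve(domain, "MX")
    for record in sorted(answers, key=lambda r: r.preference):
        print(f"{domain}: preference {record.preference} -> {record.exchange}")

if __name__ == "__main__":
    print_mx_records("example.com")
```

If the output shows your new provider’s mail servers rather than your old ISP’s, mail is being delivered to the right place.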

The domain part is important because it means that you get to choose any email address you like at that domain. Providing you continue to pay for the domain and hosting, you never need to change email addresses ever again. I also think that if you’re running a business, a domain name makes things a lot more professional.

So I completely disagree with Ofcom’s assessment that ISPs are charging too much to keep old email addresses going. I think this would become less of a problem if ISPs allowed you to host a domain name with them for the purposes of email (or free web space – which is still a thing, apparently). You either transfer in or buy (cheaply) a domain and use it with the ISP for your email. When you need to move ISPs, you give the new ISP the domain name and they take care of the move for you. How about that? That could potentially work.

The term Visual Effects (VFX) usually refers to post-production work integrating CG or model photography into live-action footage. Special Effects (SFX), in turn, usually refers to practical, on-set effects like explosions, smoke, mechanical effects, etc.

When I first started working in the VFX industry way back in 2001, CG post-production had become the dominant force in VFX. Few productions were using miniature/model effects, and even fewer animatronics (to the point that in 2004, Jim Henson’s Creature Shop packed up and moved back to Los Angeles from London).

MPC made strides in a number of CG innovations, including its Alice crowd simulation software, which was used on a number of big productions. Over the years, we’ve seen major improvements in motion capture, character rigging, CG modelling, compositing – generally all the usual fields that CG VFX is put to use in.

But as film makes way for increasingly ambitious television productions, new techniques have to be found in order to reduce costs and speed up production turnaround.

The Star Wars films have always been a major innovator in the world of visual and special effects. Right from the first film back in 1977, George Lucas had the foresight to start his own company for the purposes of producing the complex visual effects needed to tell his story.

Those complex visual effects have mainly been restricted to film – the cost of producing something on that scale for television has been pretty prohibitive. Attempts at the first Star Wars TV special were interesting, to say the least.

Getting back to the evolution of VFX tools: the games industry was also making progress with more and more complex video games. This too required substantial development and R&D, and you’ll find that VFX and games share many of the same tools and techniques – each industry complementing the other.

Fast forward to Disney’s “live-action” The Jungle Book and the subsequent The Lion King. Director/producer Jon Favreau and his team developed brand-new techniques to help create those films. These have in turn been used in the first ever live-action Star Wars television series, The Mandalorian.

In order to create the fantastical sets, landscapes and other backgrounds, the production brings the old technique of rear projection bang up to date by using giant LED screens to display highly detailed backgrounds. This allows filmmakers to shoot practically, whereas before you’d typically shoot on green or blue screens and replace the backgrounds in post-production. My biggest issue with that approach is that scenes which use green/blue screens never have fantastic lighting. If you’re shooting a scene which is, for example, set outside but shot indoors with green screen compositing, it never looks real – the lighting is a dead giveaway. With this new virtual set system, the lighting is a lot more accurate and realistic.

The virtual set system obviously requires some VFX to be produced in pre-production for display on the giant LED screens, so this technique is bringing together VFX and SFX – creating a blurred line between practical and post.

What’s interesting is that the system uses Epic Games’ Unreal Engine which is probably best known for the video games Unreal Tournament and Fortnite. Filmmakers can make live changes to landscapes and environments on the fly through the use of Unreal Engine. All these techniques are thanks to ILM‘s Stagecraft Virtual Production team, Epic Games and Jon Favreau’s Golem Creations.

It’s all very impressive, and I consider it a major game changer – to the point that, if they haven’t already, the entire team responsible should be given an AMPAS scientific award for filmmaking innovation ASAP. It’s certainly the most exciting FX technology that I’ve seen in the past 20 years.

The Mandalorian airs exclusively on Disney+, the new streaming service from Disney, and will be available to watch in the UK starting the 24th March (I can’t wait!).

I noticed this advert on the train today and thought long and hard about it. The big feature with most gaming consoles and games is the ability to compete with other players online. The Nintendo Switch makes it possible to do this offline and on the go too.

It’s a very social console – my employer recently bought a Switch with multiple controllers for staff use (we have games nights here – whether it be classic console gaming or card games – something for everyone).

She’s going to lose the game – she’s too busy coyly playing with her hair, having noticed the gentleman sitting opposite her across the aisle! There go her chances at the E-Sport Championships! Drat!

But when I looked at the advert, it felt more like an advert for a dating site than one for a games console! The background characters make me chuckle – romance at the back of the bus, and she’s looking completely unimpressed! I don’t know what’s special about that bus, but I’ll have whatever it’s having.

Maybe this is where I’ve been going wrong for the past 7-8 years – I need to buy a Nintendo Switch to find a wife! Though I do wonder what the divorce statistics are for couples that spend too much time playing computer games and not with each other (oo-er missus).

For the love of all things holy, I cannot believe this company. 5 days’ compensation is better than nothing, but when you consider it was a full 27 days, it still feels rather stingy. But that’s not what’s got my goat. After reading the initial blurb, there’s a link to an update site which allows you to put in your name and email address.

ALAS!

They’ve not put a valid SSL (or TLS, if you prefer – technically it should be referred to as TLS these days, but people are set in their ways) certificate on their site. This means that any form data will be sent unencrypted between the user’s browser and the server, which could (unlikely, but still possible) allow the data to be sniffed and captured by a third party.

Another method is spoofing the southwesternrailway.com domain. I could register a domain such as southwestermrailway.com (as an example), duplicate the same hostname and site contents (changing the form details so that anything submitted is sent to me or saved to a file on the server), and leave out the SSL certificate. I could potentially hoover up vast amounts of data, as most people don’t bother to check the URL or SSL certificate.

In any event, putting an SSL/TLS certificate on your site is vitally important these days, and it’s not difficult to do. I’m still amazed that Bafta.org hasn’t put its entire site behind SSL/TLS (try going to https://www.bafta.org, and it’ll redirect you back to non-SSL content), nor has Milk VFX, which solicits job applications via an unencrypted form. Bad, Milk VFX, bad!
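If you want to test a site yourself, a short Python sketch like this (the hostname is just an example) will attempt a verified TLS handshake and print the certificate’s expiry – a site with a missing or invalid certificate simply fails the handshake:

```python
# Minimal TLS certificate check using only the standard library.
# The hostname is an example - substitute the site you want to test.
import socket
import ssl

def check_certificate(hostname: str, port: int = 443) -> None:
    """Connect over TLS and print the certificate's subject and expiry.

    ssl.create_default_context() verifies the chain and hostname for us,
    so a bad certificate raises ssl.SSLCertVerificationError.
    """
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            subject = dict(item[0] for item in cert["subject"])
            print(f"{hostname}: issued to {subject.get('commonName', '?')}")
            print(f"{hostname}: expires {cert['notAfter']}")

if __name__ == "__main__":
    check_certificate("www.example.com")
```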

Update: Looks like they’re using an external Salesforce CRM to capture the information. The JavaScript form code is hosted securely – thank goodness – and it looks like the form data is also submitted securely to Salesforce servers. Even so, I’d still be pretty wary of any site without a proper SSL certificate and encrypted traffic between the browser and server, and not everybody is going to want to scour the page’s source code to determine what’s going on.

Here are my biggest gripes with this new foldable phone craze:

  • When folded up, the phone’s… girth… appears to be considerably greater than that of a non-foldable phone. It saves space length-wise, but not thickness-wise. It’ll probably fit in most men’s trouser pockets just fine, but it’ll be bulky:

“Why, is that a Galaxy Z Flip in your pocket, or are you just a pervert?”

  • Easily damageable screens, and a potentially fragile hinge with a limited lifespan. These phones require far greater care and attention than a regular phone, which negates the whole reason for having a folding phone in the first place. You want durability. You want to save space (both in length and thickness). You should not have to pick one or the other.
  • They don’t look that much different from the flip phones (you know, the ones with a physical keypad on one half, and a tiny non-touch screen on the other) of the ’90s and early 2000s.

In short – foldable phones, in their current state, are simply cute and fluffy Mogwais just waiting to turn into Gremlins. They’ll cost you an arm and a leg at this early stage of the technology, and if you damage yours, possibly a spleen or a kidney to keep replacing the screen.