Bad Apple: Why Some Developers Are Switching Platforms

In the early days of the Mac versus Windows debate, one of Microsoft’s main advantages was easier development tooling on the Windows platform. While the Mac OS was notoriously difficult for developers to work with, Microsoft opened APIs, produced a wealth of development tools (for beginners and advanced programmers alike), and made developers feel like their most important customers. The end result was a thriving third-party ecosystem.

Mac OS seemed less and less viable for developers. Then, something changed.

Apple on the Upswing

In the late nineties, with Apple on the verge of bankruptcy, the board made the decision to bring back company founder Steve Jobs by buying his company NeXT. Apple inherited NeXT’s operating system, based on a Mach microkernel and BSD Unix, as well as Jobs himself and his design team. Over the following few years, Apple transitioned from legacy Mac OS to Mac OS X (now simply called macOS), founded on new Mach/Unix underpinnings.

One of the biggest benefits of this new operating system? A wealth of new tools available for programmers. Mac developers suddenly had native access to a world of Unix-based, open source development tools. And once Apple moved from PowerPC to Intel chips, virtualization opened the door for developers to use a single platform to develop for the big three — Mac, Windows, and Linux. With the release of iOS, Macs became even more popular as development machines, since the lucrative field of native iOS development requires a Mac.

When Apple released Swift, the successor to Objective-C, as the programming language of choice for both Mac and iOS, it quickly became popular among developers, ranking as the second most-loved language (behind Rust) in developer surveys. Thanks to all of the above factors, as recently as 2016, the Mac enjoyed the top spot among development platforms.

Apple’s Missteps

In spite of these gains, Apple has increasingly come under fire for a perceived abandonment of high-end users, especially developers. After four years with no updates to the Mac Pro (and lackluster updates to the MacBook Pro), some developers began to feel that Apple wasn’t taking them seriously as a market segment. As new technologies like improved processors and graphics cards became available for Windows machines, many developers started to think seriously about switching — after all, what developer doesn’t want to target the newest, nicest systems on the market?

Additionally, Apple’s decisions regarding the Mac App Store (MAS) and digital app signing have continued to alienate some developers. While some hoped the MAS would duplicate the success of the iOS App Store, Apple never truly gave the MAS the same level of attention, leading many developers to pull their applications from it in favor of selling them directly.

In contrast, Microsoft is starting to appeal to developers again — as it did in the days of Windows 95 and 98 — with improved hardware, more reliable software, and its own app store. And under CEO Satya Nadella, Microsoft has been loosening the reins on its development environments, a move many programmers view as more open than Apple's approach. The company's new goal seems to be making things as friendly as possible for developers, with versatile, easy-to-use tools.

In other words, more powerful hardware, easy access to development tools, and less of a “walled garden” approach to applications have all created an environment where some developers have started migrating back to Windows as their platform of choice.

Choosing the Best Option

That said, many developers are still on the fence. The question remains: which is the better development platform? Unfortunately, the answer is complicated and depends largely on the goals of each developer.

For cross-platform developers, the Mac remains the only platform that can legally run all major operating systems. While Windows and Linux can be run on either a Mac or a Windows machine, macOS can only be run on a Mac. Additionally, developers looking to create native iOS applications still need to use a Mac. And while Apple has received criticism for lagging behind in hardware improvements, updates for the Mac Pro, iMac, and MacBook Pro were all announced or unveiled in 2017 (along with an all-new iMac Pro). In a rare moment of transparency, Apple executives admitted they had mishandled the Mac Pro and publicly recommitted to their high-end users.

However, developers looking to create software primarily for the Windows platform may prefer to use a Windows machine. While Windows can run on a Mac through virtualization (or via Apple's dual-boot Boot Camp utility), nothing quite compares to the convenience and native support of a Wintel solution.

For programmers looking to develop in Linux, the choice of Mac or Windows comes down to a question of hardware and cost. Apple’s hardware still enjoys a stellar reputation, but some PC manufacturers have started to close the gap in design and quality, offering competitive designs that can be purchased for substantially less money.

Whatever choice you make as a developer, the fact remains: Apple’s missteps have jeopardized their position as the primary platform for software development — and now, the field is more competitive than it’s been in two decades. Whether Microsoft or Apple will gain more ground remains to be seen.