Apple must truly hate gaming, or suffer from a serious case of Not Invented Here with their Metal stuff. As if any serious gaming studio would target Metal, which doesn't run on Windows.
In fact, they couldn't get their act together and keep up with current versions, and as a result titles like Elite Dangerous were being shut down on the Mac anyway. Reason: macOS's OpenGL is stuck on an old version (4.1, while compute shaders require 4.3).
To be fair, most games today are built with Unity3D, Unreal Engine, etc., which all support Metal already. Hardly anyone writes their own game engine these days, and those who do probably have the resources to support Metal.
Overall still a bummer though.
The problem is still that Apple is forcing them to invest resources for no reason other than to advance its vendor lock-in. And if you're the developer of a small high-performance 3D graphics and GPU computing library like me, it's just a giant middle finger from Apple: I will either need to drop OpenGL/OpenCL or drop Apple. There is no way I can afford to offer both, especially since I'd need to buy Apple hardware to test things.
The Witcher 3 uses REDengine; GTA V uses RAGE; the Battlefields and SW: Battlefront 1 and 2 use Frostbite IIRC; the two new Tomb Raiders are on Horizon; Rainbow Six Siege and the Assassin's Creed games are on Anvil; Overwatch and SC2 have their own engines, as does League of Legends; the CoD games are on a heavily customized id engine; Minecraft is custom; Bethesda has its own engines for Skyrim and Fallout; Path of Exile is custom too. All taken from Steam's 100 most played.
That's a nice list, and quite complete. Many console exclusives also use custom engines, by the way, e.g. Decima for Horizon: Zero Dawn, Killzone, and Death Stranding; Naughty Dog has its own engine (I don't know the name); etc.
The OP's point was that the companies that make these engines can afford to invest in supporting an additional back-end API, though. I think it's hard to argue that any of the companies developing these engines would be unable to add a Metal back-end as well. Many of them already work across a pretty wide range of back-ends anyway; Xbox One, PS4, and Switch all use completely different APIs, for example. I think most of the work is not in adding an additional back-end like Metal, but in tuning the back-end for a specific piece of hardware (Nvidia vs. AMD vs. mobile GPUs, etc.).
Whether companies are actually willing to invest in a Metal back-end remains to be seen, but considering many of them license their engines for commercial use, I would be surprised if the major players simply ignored Metal.
I tend to agree with Jonathan Blow's comments on Twitter that the low-level graphics API should be just that: as low-level as possible, small, focused, and not actually intended (though still possible!) to be used directly. Engines or higher-level APIs can be built on top of it, with the option to drop down to the lowest level when needed (which will probably be rare).
DirectX will definitely not be this API because it is Windows specific. Likewise for Metal because it is Apple-specific. Blow appears to be of the opinion that Vulkan is also not moving in the right direction, because it is becoming too complex, and trying to be too many things for too many applications at the same time.
If true, in a sense it's not that surprising that Apple is doubling down on their own API. I think they should consider open-sourcing it, though, and develop something like MoltenVK in reverse (Metal implemented on top of Vulkan) for Windows/Linux.
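To make the multiple-back-end argument concrete, here is a minimal, purely illustrative sketch (in Python, with made-up names like `RenderBackend` and `pick_backend`, not taken from any real engine) of the abstraction engines typically use: the renderer talks to one small interface, and each graphics API lives behind its own implementation, so adding a Metal back-end is additive work rather than a rewrite.

```python
# Hypothetical sketch of an engine isolating platform graphics APIs
# behind one backend interface. All names here are illustrative.
from abc import ABC, abstractmethod
import sys


class RenderBackend(ABC):
    """Minimal surface the engine's renderer talks to."""

    @abstractmethod
    def name(self) -> str: ...

    @abstractmethod
    def draw_mesh(self, mesh_id: int) -> str: ...


class VulkanBackend(RenderBackend):
    def name(self) -> str:
        return "vulkan"

    def draw_mesh(self, mesh_id: int) -> str:
        # Real code would record commands into a VkCommandBuffer here.
        return f"vulkan: draw mesh {mesh_id}"


class MetalBackend(RenderBackend):
    def name(self) -> str:
        return "metal"

    def draw_mesh(self, mesh_id: int) -> str:
        # Real code would encode commands into an MTLCommandBuffer here.
        return f"metal: draw mesh {mesh_id}"


def pick_backend(platform: str) -> RenderBackend:
    """Choose a backend per platform; adding one is a local change."""
    return MetalBackend() if platform == "darwin" else VulkanBackend()


if __name__ == "__main__":
    backend = pick_backend(sys.platform)
    print(backend.draw_mesh(42))
```

The point of the pattern is that the per-platform tuning the parent comment mentions happens inside each backend, while the rest of the engine never changes.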
The top 10 most played on Steam today use UE4 (2), Source 2, Source (2), and custom engines (5: AnvilNext, RAGE, Evolution). That's a lot of variety; there's almost no reuse.
With a bit of luck, Godot Engine. Sort of a dark horse, but I like it, and my very smart corporate-programmer brother likes it. He says it's designed the way a programmer would design it: everything's a node. I did a game in Unity (which has become overcomplicated) and had a surprisingly easy time jumping into Godot.
Go back in time six years ago. What were Apple’s choices?
(1) continue to live with the deficiencies of OpenGL. Remember that, over time, it had come to fail at one of its primary purposes which was to provide efficient access to GPU hardware. Further, sticking with OpenGL would be to accept the leadership of a group that had allowed its flagship standard to falter.
(2) They could marshal their resources and create the better API that the Khronos Group wouldn’t/couldn’t.
They really had no choice. Note that Vulkan wasn’t announced until after Metal was released.
The gripes in this thread should really be leveled at the Khronos Group, which fumbled its stewardship of OpenGL and, with it, the chance to lead open GPU APIs.
That timeline is pretty generous to Apple. Metal, Vulkan, and DX12 are all reworked versions of Mantle.
The entire point of Mantle was to be a proof of concept that could be reworked into a cross-platform API (which became Vulkan). There was plenty of work already being done at Khronos in 2014 (and Apple knew this), and they went out and released Metal anyway.
I also blame Microsoft for the same thing; early parts of the DX12 docs were taken word for word from the Mantle docs, that's how similar they are. But Microsoft at least had a couple of decades of maintaining a competing API, while Apple went out and created a new one for some reason.
Talk about rewriting history: Mantle was never supposed to become Vulkan. That only happened because AMD was generous; otherwise Khronos would still be pondering what OpenGL Next should look like.
While I get the concern, everybody's history here is backwards. Apple released Metal two years before Vulkan. Why? Because OpenGL wasn't hacking it anymore and no longer mapped well to how modern GPUs actually work. Vulkan copied Metal, not the other way around.
Nor do I think they should have turned around and dropped Metal for Vulkan once it became available, or slowed the pace of progress until the rest of the market caught up. That doesn't make sense.
Also, Apple is perhaps the largest GPU vendor in the world, with 200-250M GPUs shipped in 2017. That is 4-5X Nvidia's volume! Apple is also investing heavily in AI, from tools to devices to GPUs, so being able to customize may have tremendous value.
It is quite possible that Apple sees owning its interface stack as a way to keep its software-hardware war chest a couple of years ahead of the competition. In mobile, that has been paying off over the last 5 years, as they have consistently crushed everyone else by 2-3X.
Does it matter anymore? People are using less and less of OpenGL's higher-level functionality; most graphics code now lives in the engine. OpenGL is getting very outdated; who, starting a project today, would choose it over Vulkan, DirectX, or Metal? I would bet most small shops would prefer to use some sort of middle layer or engine from a third party. That pushes the problem of implementing the lower layers on Vulkan, DirectX, or Metal onto a small group of specialists.
No, games aren't going to target Metal to support the Mac, any more than printer manufacturers are going to go out of their way to support AirPrint to make printers Mac-compatible.
What developers will do is go out of their way to support iOS, and supporting the Mac is just a side benefit. Just like almost every printer company supports the Mac as a byproduct of wanting to support iOS.
Hypothesis: There are more machines in consumer hands which support Metal than DirectX.
This may sound crazy, but remember there are billions of iOS devices out in the world, and I don't think Xboxes plus Windows game machines count in the billions.
It's true Apple hasn't won the hardcore gamer market, but they are no longer the niche player that had to cater to Windows users.
If you're counting only gaming PCs (i.e. device used mainly for demanding 3D games) you should also count only gaming Macs/iPads/iPhones. How many are there in the world?
> This may sound crazy, but remember there are billions of iOS devices out in the world, and I don't think Xboxes plus Windows game machines count in the billions.
They should do, albeit not for gaming: most office machines run Windows, which supports DirectX. You won't be playing games on them, though.
Are there more Android devices that actually have hardware that can play high-end games decently? The average Android phone is a low-end phone; with an average selling price of $225 across all Android phones, how could they not be?
OpenGL ES is part of the Android platform; they all support it. The stats page doesn't even include 'not supported' [1].
Being able to run anything slightly demanding is another matter, but you can't argue there's no support.
Also, the benchmark you linked measures application load time, which is heavily influenced by storage speed and load method (Android sometimes has to JIT-compile), and is almost unaffected by graphics performance beyond the bus between CPU/memory and GPU.
Being able to run something suboptimally doesn't turn into sales. I'm sure that the owner of a $70 Blu R1 HD is not going to be spending money on high end games.
It really isn't. The fastest GPU available is a Vega 64 underclocked to roughly the performance of a normal Vega 56; a 1080 Ti is ~50% faster. Even if you connect an external 1080 Ti, it's constrained by TB3 bandwidth.
> In fact, they couldn't get their act together, keep with current versions, and as a result titles like Elite Dangerous were being shut down anyway. Reason: OpenGL stuck on an old version without support of compute shaders.
https://forums.frontier.co.uk/showthread.php/424243-Importan...
https://support.frontier.co.uk/kb/faq.php?id=228