Apple announces M3 Ultra—and says not every generation will see an “Ultra” chip

An M4 Max has about the same power as an M1 Ultra.

The Ultra had the advantage that no new design was needed, just an interconnect to double the cost and power. I could imagine they could design a Max-like chip with 16 or 20 performance cores instead of 12. I can't quite see why you wouldn't base it on an M4 Max with extra cores instead of an M3 with extra cores.
 
Upvote
31 (36 / -5)

NYReichman

Seniorius Lurkius
17
Subscriptor
Honest question: what are the people who need the Ultras doing with them?
We're using Pro Tools and the Dolby Atmos Renderer to mix albums and TV shows on one Mac, where it would have taken two or three Macs in the past to do the same job. Logistically easier. Mac Studio Ultras are amazingly cheap on the price/CPU vector.
 
Upvote
144 (144 / 0)

ColdWetDog

Ars Legatus Legionis
13,270
Subscriptor++
This is entirely from memory, but doesn't the M3 lack the AV1 encode functionality that M4 has? This seems most likely the reason for segmentation, here, to try and force studios into full-fat Pros with coprocessors/GPUs instead of Studios.
I can do pretty much any encoding that I need on an M3 at a quite zippy pace. Maybe for a production studio there would be an advantage but as a single user, the M3 is better than anything I've used in the past.
 
Upvote
10 (12 / -2)

Matey-O

Ars Scholae Palatinae
1,313
Subscriptor
Knowing nothing about the details, but making a completely uneducated guess: you have to have two perfect M3 Max chips and a perfect interposer, and you've got to join them up perfectly. I'm thinking this is an issue with yields. If doing it were easy, we wouldn't have seen delays in the M3 Ultra rollout, and the M4 Ultra would be available on day one. (Do the chip designs have additional P and E cores to improve yields?)
 
Upvote
87 (87 / 0)

DistinctivelyCanuck

Ars Scholae Palatinae
2,268
Subscriptor
There's another possibility... (Having seen this in my own career)
You have two sets of teams working silicon, the "odd" and "even" teams... so you end up with M1/M3/M5(eventually) and that 'odd' team works the challenges that result in 'Ultra' Silicon
You have the 'even team' that does the M2/M4 etc.

Both teams highly capable, both working the same overall product: but slightly different focuses, especially due to the timelines involved
 
Upvote
68 (68 / 0)

caramelpolice

Ars Scholae Palatinae
1,375
This is entirely from memory, but doesn't the M3 lack the AV1 encode functionality that M4 has? This seems most likely the reason for segmentation, here, to try and force studios into full-fat Pros with coprocessors/GPUs instead of Studios.
Neither M3 nor M4 has AV1 encode, only decode.

It's odd to me that they would use the M3 moniker for a binned version in the iPad Air and then specify an old chipset for this Ultra. It's just a name, why not call it something else? M3.5? M4SE?
M3 and M4 have different CPU architectures and manufacturing nodes. Calling this M3 means this is using the previous ones.

There's another possibility... (Having seen this in my own career)
You have two sets of teams working silicon, the "odd" and "even" teams... so you end up with M1/M3/M5(eventually) and that 'odd' team works the challenges that result in 'Ultra' Silicon
You have the 'even team' that does the M2/M4 etc.

Both teams highly capable, both working the same overall product: but slightly different focuses, especially due to the timelines involved
Both M1 and M2 had Ultra chips.
 
Upvote
66 (68 / -2)
This is entirely from memory, but doesn't the M3 lack the AV1 encode functionality that M4 has? This seems most likely the reason for segmentation, here, to try and force studios into full-fat Pros with coprocessors/GPUs instead of Studios.
The Pros can't use discrete GPUs; Apple doesn't support third-party GPUs anymore.
 
Upvote
26 (26 / 0)
Wow, it's a $1500 price bump to add 4 performance cores to the Ultra. I'll consider the base Ultra, but there's no way I'll pay $1500 for another 4 cores.
Very diminishing returns, except when that $1,500 offsets against time saved: for creative pros, a minute here and there over the 2-3 years the Mac will be in use adds up.
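To make the time-saved argument concrete, here's a back-of-the-envelope sketch; the billable rate and workday count are illustrative assumptions, not figures from the thread:

```python
# Back-of-the-envelope: when does the $1,500 four-core upgrade pay
# for itself? The $100/hr rate and ~250 workdays/year are assumptions.
upgrade_cost = 1500                       # USD
billable_rate = 100                       # USD per hour (assumed)
workdays = 3 * 250                        # roughly 3 years of weekday use

minutes_to_break_even = upgrade_cost / billable_rate * 60
per_day = minutes_to_break_even / workdays

print(f"{minutes_to_break_even:.0f} minutes total, "
      f"{per_day:.1f} min/day over 3 years")
# -> 900 minutes total, 1.2 min/day over 3 years
```

So at those assumed numbers, shaving a bit over a minute per working day covers the upgrade.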

Personally I would love a base M4 Max Studio for myself. The creative apps I use would fly on it.
 
Upvote
35 (35 / 0)

fazalmajid

Ars Praetorian
468
Subscriptor++
Honest question: what are the people who need the Ultras doing with them?
Software development thanks to the higher core count, and LLMs thanks to unified memory: the higher RAM limit lets you run larger models than even the highest-end nVidia H200 cards can hold (albeit with slower processing due to a slower GPU/NPU).

The new AMD Strix Halo (and desktop variants thereof) may give it a run for its money, though, and of course the nVidia Digits.

My 2022 $5,000 M1 Ultra 128GB Mac Studio is obsoleted by the M4 Mac Mini, however, so these machines have a short relevance span, even if they are usable for more than 3 years.
 
Upvote
40 (47 / -7)

jacs

Ars Centurion
291
Subscriptor
Honest question: what are the people who need the Ultras doing with them?

I didn't need my 64GB M1 Ultra. It replaced a 2018 mini and was major overkill for my usage. But I had the opportunity to purchase it when it came out and I expect to get many more years usage out of it.
 
Upvote
27 (27 / 0)

StrangeOnion

Smack-Fu Master, in training
92
Honest question: what are the people who need the Ultras doing with them?
My (hopefully) intelligent guess is stuff like software dev & compilation, simulation, visualization, “high end” film & audio, scientific computations, etc where time saved = a lot more money saved.

As some say, if you have to ask, it’s likely not for you.
 
Upvote
43 (43 / 0)

famousringo

Ars Scholae Palatinae
1,046
Subscriptor
Software development thanks to the higher core count, and LLMs thanks to unified memory: the higher RAM limit lets you run larger models than even the highest-end nVidia H200 cards can hold (albeit with slower processing due to a slower GPU/NPU).

The new AMD Strix Halo (and desktop variants thereof) may give it a run for its money, though, and of course the nVidia Digits.

My 2022 $5,000 M1 Ultra 128GB Mac Studio is obsoleted by the M4 Mac Mini, however, so these machines have a short relevance span, even if they are usable for more than 3 years.
Strix Halo (and Digits too, from what I've seen) have the memory bandwidth of an M4 Pro. They don't compete with an M4 Max or M3 Ultra. Though of course Digits at least benefits from almost the entire AI software environment targeting nVidia GPUs.
 
Upvote
15 (15 / 0)

Still Breathing

Ars Centurion
239
Subscriptor
Honest question: what are the people who need the Ultras doing with them?
Can't speak for everyone, but for me I have a home-grown astronomy app. The telescope shoots at 24MP, in mono, with an automated 7-position filter wheel, for R, G, B, L, S2, H-A, O3 images. These are all combined to provide a multi-spectral image, but to provide temporal averaging, I'll typically shoot a few hundred frames over a session, with the mount tracking what I'm shooting.

I'll frequently end up with 50,000 MP of image-data to reconcile, which is a mathematically-intensive process, even when I do it on the GPU. You need to register all the frames with each other before combining them, to get the best image; you need to watch out for Elon [redacted] Musk's satellites all over the place, as well as the occasional plane going overhead, etc. I use an FFT auto-correlation on the GPU to self-register the images but there are other methods.
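For anyone curious what FFT-based self-registration looks like, here's a minimal NumPy sketch of phase correlation for pure integer translations. This is a generic textbook version, not the poster's actual pipeline, and it ignores rotation, subpixel shifts, and trail masking:

```python
import numpy as np

def phase_correlate(ref, img):
    """Estimate the integer (dy, dx) shift of `img` relative to `ref`
    via FFT phase correlation (handles pure translation only)."""
    F_ref = np.fft.fft2(ref)
    F_img = np.fft.fft2(img)
    cross = np.conj(F_ref) * F_img
    cross /= np.abs(cross) + 1e-12        # keep phase, drop magnitude
    corr = np.fft.ifft2(cross).real       # sharp peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # shifts past the half-frame point wrap around to negative offsets
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

A subpixel version would fit a curve around the correlation peak, and a real stacking pipeline would mask satellite and plane trails before correlating.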

Once all that is done, there's a massive "combine them into one image" to do... It ends up looking gorgeous though. Overall processing can take hours with everything maxed out.

I used to have an M1 Ultra, bought as soon as it came out. I upgraded to an M4 Max MBP when that came out, and now (even if it's "only" the M3) I'm looking at 2x the CPU/GPU/memory of the M4 Max if I got the M3 Ultra... I still love the portability of the MBP, so I'm a little torn right now, but I am considering it...
 
Upvote
81 (81 / 0)

Still Breathing

Ars Centurion
239
Subscriptor
An M4 Max has about the same power as an M1 Ultra.

The Ultra had the advantage that no new design was needed, just an interconnect to double the cost and power. I could imagine they could design a Max-like chip with 16 or 20 performance cores instead of 12. I can't quite see why you wouldn't base it on an M4 Max with extra cores instead of an M3 with extra cores.
I think that depends on your use-case. See my reply just above, but I got about 1.5x the performance out of the M4 Max over the M1 Ultra.
 
Upvote
18 (18 / 0)

gkaimakas

Seniorius Lurkius
9
Subscriptor
My 2022 $5000 M1 Ultra 128GB Mac Studio is obsoleted by the M4 Mac Mini, however, so these machines have a short relevance span, even if they are usable for more than 3 years,
The fact that someone would consider a machine obsolete the moment a better one appears just baffles me.
 
Upvote
37 (42 / -5)

berjb

Ars Centurion
234
Subscriptor
Honest question: what are the people who need the Ultras doing with them?
Currently I've got an M2 Ultra with 128GB RAM. I work on film/streaming VFX. For me it's software development and evaluating Houdini sims, Maya animation and rigging, Nuke compositing, rendering, etc. Mainly I need the RAM but the CPU/GPU performance is great too.
 
Upvote
30 (30 / 0)

Shadowself

Ars Scholae Palatinae
629
Subscriptor++
What I don't understand is, since the "Ultra" chips are just two Max chips fused together, shouldn't it be (relatively) trivial for them to make an M4 Ultra given the M4 Max already exists?
When the M4 first came out there were reports that the architecture of the M4 was quite different from the M1 through M3 architectures. This led to two different speculations in the industry.
1) The new architecture would eliminate the need for the same type of interposer between the two M4 Max implementations making it "easier" to create the equivalent of the M4 Ultra. It is very possible that Apple has run into issues joining two M4 Max into one M4 Ultra due to this new architecture.
2) The way the Ultras were constructed made doing the whispered Mx "Extreme" (effectively four Max chips welded together for a true, high end workstation chip for the Mac Pro) an impossibility. The new M4 architecture was rumored to get around this blockage by making the way the Mx chips could be bound together more technically achievable.

All this was speculation around the time the first M4s started showing up. We'll just have to see if an M4 Ultra does show up based on this new architecture.
 
Upvote
26 (26 / 0)
@Andrew Cunningham it looks like you've got a copy-paste error in your table there: not every Ultra chip has 819.2 GB/s memory bandwidth.

edit 2: My bad, they do all have 819.2 GB/s memory bandwidth. I completely assumed that the 96GB M3 Ultra config had a cut-down memory interface like we saw for the binned M3 Max chips.

Also:
The existence of the M3 Ultra puts to rest some lightly sourced speculation from last year, suggesting that the M3 Max was shipping without the silicon used to fuse two Max chips together into a single Ultra chip.
The M3 Max die (TMQG40) did not have the D2D interface. This is a different die. If we eventually get the die markings from the M3 Ultra, I reckon it will bear this out. The original source for this is TechanaLye, which is a completely reputable reverse-engineering firm.

edit: Link to EE Times Japan article contributed by Yoji Shimizu of TechanaLye with additional details about M3 Max https://eetimes.itmedia.co.jp/ee/articles/2401/25/news052_2.html
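For what it's worth, the 819.2 GB/s figure falls straight out of the usual transfer-rate-times-bus-width arithmetic. The 1024-bit bus and LPDDR5-6400 assumptions below come from teardown reports of two fused Max-class dies, not from Apple's spec sheet:

```python
# Memory bandwidth = transfer rate x bus width.
# A 1024-bit bus at 6400 MT/s is an assumption based on teardown
# reports of two fused Max-class dies, not an Apple-published spec.
bus_width_bits = 1024
transfers_per_sec = 6400e6            # LPDDR5-6400

bytes_per_sec = transfers_per_sec * bus_width_bits / 8
print(bytes_per_sec / 1e9)            # -> 819.2 (GB/s)
```

Which also explains why every full-width Ultra config lands on exactly the same number.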
 
Last edited:
Upvote
16 (16 / 0)

The Real Blastdoor

Ars Tribunus Militum
2,313
Subscriptor++
Honest question: what are the people who need the Ultras doing with them?
In my case I guess you could say "data science" (although I'm old, so I'd call it "statistics").

I want as many CPU cores as I can get for the purpose of running simulations to test how statistical models perform in various scenarios. For a time I had a Linux Threadripper system (a 32-core 2990WX), back when Threadripper was a more cost-competitive product. Now Threadripper is crazy expensive, and I prefer Macs anyway, so I'd rather just have an Ultra. If the personal finances work out, I'll get the base Ultra (not paying $1,500 for another 4 cores).

So far, I don't really need the GPU power, so I'd prefer Apple to offer options that are more CPU-intensive rather than GPU-intensive. Maybe if they move to a more 'tile' based approach to CPU design that will become feasible.
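This kind of simulation study is embarrassingly parallel, which is why core count dominates. Here's a minimal generic sketch (a confidence-interval coverage check, not the poster's actual models) that splits replications across processes, one chunk per core:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def coverage_chunk(args):
    """One worker: run `reps` replications of drawing a normal sample
    and count how often the nominal 95% CI covers the true mean (0)."""
    seed, reps, n = args
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        sample = rng.standard_normal(n)
        half_width = 1.96 * sample.std(ddof=1) / np.sqrt(n)
        mean = sample.mean()
        hits += (mean - half_width) <= 0.0 <= (mean + half_width)
    return hits

def estimate_coverage(total_reps=8000, n=30, workers=4):
    """Farm independent chunks out to `workers` processes (one per core);
    runtime scales close to linearly with core count."""
    reps = total_reps // workers
    jobs = [(seed, reps, n) for seed in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        hits = sum(pool.map(coverage_chunk, jobs))
    return hits / (reps * workers)
```

Processes rather than threads sidestep the GIL for CPU-bound replications; on an Ultra you'd set `workers` to the performance-core count.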
 
Upvote
15 (15 / 0)

Got Nate?

Ars Scholae Palatinae
1,223
Fwiw it’s nice of Apple to admit they are using M3 but it’s super confusing to the consumer that M3 Ultra is better than M4 Max
At $4000 before addons, I don't think a consumer will be considering the M3 Ultra at all. The Pro who knows what they need will lap these up, but the Ultra is not targeted at the consumer.
 
Upvote
24 (24 / 0)

stilgars

Smack-Fu Master, in training
67
@Andrew Cunningham it looks like you've got a copy-paste error in your table there: not every Ultra chip has 819.2 GB/s memory bandwidth.

Also:

The M3 Max die (TMQG40) did not have the D2D interface. This is a different die. If we eventually get the die markings from the M3 Ultra, I reckon it will bear this out. The original source for this is Hiroharu Shimizu of TechanaLye, which is a completely reputable reverse-engineering firm.
Exactly, it is clearly a revised die for the combined M3 Max, considering they have Thunderbolt 5 and can reach 256 GB each (= 512 GB total).
 
Upvote
15 (15 / 0)