Review: Asus’ ROG Flow Z13 tablet takes the asterisk off integrated GPUs

Epimetheus_Secundus

Ars Scholae Palatinae
866
Subscriptor++
Which one? I’ve all but given up on that.
I'm sorry, I misread the article. I was thinking gaming laptop.

My gaming laptop that has reasonable fan noise AND doesn't blow scalding air on my right mouse hand is an HP Victus 16" with a 4060 and an i7/32GB/2TB.
 
Upvote
2 (2 / 0)
As a Tim, I love the phrase "Tim in Product". But what does it mean? Did someone fall into the code vat? That something extra that the PHB or marketing needs to close the deal? Unfortunately, a search just led me back to this article. Is it self-recursive, self-reflexive, or maybe just a bad BASIC construct...
I think he means you'd expect to see it as an on-the-go system, "but there’s a lot of frame-rendering and multicore processing power inside this glass-fronted slab with a keyboard cover."

At least that's how I read it. "Tim in Product" is a more relaxed way of saying: "John Doe in Product Marketing."
 
Upvote
0 (0 / 0)

Coolie

Ars Praetorian
407
Subscriptor++
A 60W or 65W USB-C power brick should work, but it's not guaranteed. The device is allowed to say what power profiles it supports, and if the charger and device don't have overlapping supported profiles it won't charge. It's possible that they require a 100W (or higher) USB-C power brick to charge via USB-C, but hopefully they do what they can with a 60W or 65W brick.
A spec-compliant 60W / 65W USB-PD charger (really an AC/DC power adapter) will almost definitely work with any spec-compliant USB-PD capable laptop.

The important point is not the wattage, really, but the voltage. Any 60W / 65W charger will be required to at least put out 20V. (That is, 20V @ 3A or 3.25A.) It is also mandatory that it supports 5V (up to 15W), 9V (up to 27W), and 15V (up to 45W); 12V and the variable-voltage PPS capability are optional.

Similarly, any USB-PD device (laptop / tablet / phone / etc.) which allows 100W is required to accept at least 20V: it will take whatever current the host/charger provides at that voltage, up to 5A. It doesn’t matter if the charger limits that 20V draw to 1.5A (i.e. 30W), 3A (i.e. 60W), or some other level; the device will accept whatever it can get at the 20V it needs.

(My personal experience: I have used and still sometimes use a 30W charger capable of 20V with my Surface Pro and both a Lenovo and a Dell laptop. It charges slowly and gives a ‘low power charging’ alert in Windows 11, but it does slowly charge the battery when the device is not being pushed.)

Even if the device is PD EPR enabled — 28V up to 140W, 36V up to 180W, and/or 48V up to 240W — I am pretty sure it must first support 20V (@ 5A) / 100W.

The only way a USB-PD device does not work with a spec-compliant 60W / 65W charger is if the device only asks for 12V; that means the device would be limited to 36W as per spec. That essentially excludes just about any laptop.

With non-spec-compliant chargers from alphabet-soup brands on Amazon / TaoBao / AliExpress / etc., you’re on your own…
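To make the negotiation concrete, here's a toy sketch of the fixed-profile matching described above (hypothetical values, not real USB-PD PDO message parsing): a sink that wants 20V simply takes whatever current a 65W charger offers at that voltage.

```python
# Toy sketch only: illustrative fixed-profile matching, not real USB-PD PDO parsing.
# Hypothetical 65W charger advertising the mandatory fixed voltages described above.
charger_pdos = {5: 3.0, 9: 3.0, 15: 3.0, 20: 3.25}   # volts -> max amps offered

# Hypothetical laptop sink: prefers 20V and can draw up to 5A at it.
sink_voltages = [20, 15, 9, 5]    # acceptable voltages, preferred first
sink_max_amps = 5.0

def negotiate(charger, voltages, max_amps):
    """Pick the highest common fixed voltage and the current both sides agree on."""
    for volts in voltages:
        if volts in charger:
            amps = min(charger[volts], max_amps)
            return volts, amps, volts * amps
    return None

print(negotiate(charger_pdos, sink_voltages, sink_max_amps))  # (20, 3.25, 65.0): charges, just more slowly
```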
 
Last edited:
Upvote
2 (2 / 0)

Voldenuit

Ars Tribunus Angusticlavius
6,554
4070 laptop GPU only has 8GB VRAM which is a limiting factor for a number of current games such as Indiana Jones, and certainly future titles. With 32GB (or more) of total system RAM available, the Strix Halo GPU could plausibly be assigned 10, 12, up to 16GB to work with, which should bring real benefits in those VRAM-limited scenarios.
You can allocate up to 24 GB to VRAM on the (as-reviewed) 32 GB Z13 SKU.

On the 128 GB SKU, you can allocate up to 96 GB VRAM.

It seems to be toggleable on-the-fly (no need for reboot).
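If anyone wants to sanity-check what the toggle actually exposes, a minimal sketch like this should do it (assuming a ROCm- or CUDA-enabled PyTorch build; purely illustrative):

```python
# Purely illustrative; assumes a ROCm- or CUDA-enabled PyTorch build is installed.
import torch

if torch.cuda.is_available():                        # torch.cuda also covers ROCm builds
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 2**30:.1f} GiB visible to the GPU")
else:
    print("No GPU backend visible to PyTorch")
```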

I'm really curious to see how a Strix Halo system with 96 GB allocated to VRAM would do in LLM AI tasks like running DeepSeek.

Or maybe not. SHHH! nobody test this, not until the shelves are filled!
 
Upvote
4 (4 / 0)

DNA_Doc

Ars Scholae Palatinae
720
<snipped> You could easily buy a laptop with a 4070 in it for this price. I'm not even talking about some 16"+ gaming monstrosity either. The Asus Zephyrus G14 with a 4070 is around $2200 and I bet you could get a pretty good deal on one in a few months when the 2025 version comes out.

<snipped>
You don't even need to spend that much. You can pick up something like a Swift X, with a Core Ultra 7, 32GB RAM, a 4070, a gorgeous 14.5" OLED display, and a lightweight metal build, for $1,299.
 
Upvote
1 (1 / 0)
4070 laptop GPU only has 8GB VRAM which is a limiting factor for a number of current games such as Indiana Jones, and certainly future titles. With 32GB (or more) of total system RAM available, the Strix Halo GPU could plausibly be assigned 10, 12, up to 16GB to work with, which should bring real benefits in those VRAM-limited scenarios.
I doubt it's much of a factor at settings you'd actually be able to use on a device with mobile-4060-level performance. I have a desktop 3070 and Indiana Jones runs fine; I just can't use the highest texture settings. I'm not saying it's not a potential problem, but I'd rather have a faster GPU than a GPU with more VRAM.
 
Upvote
1 (1 / 0)

Coolie

Ars Praetorian
407
Subscriptor++
Performance like an integrated RTX 4060
Subhead needs a word.

You mean “Performance like [a laptop] RTX 4060”?
(Kevin briefly pointed out that clarification in the article body: “It's not quite in the realm of dedicated desktop GPUs, of course.”)

There are no [CPU-]integrated GPUs from Nvidia, as far as I am aware? They are all dedicated GPUs, even if embedded as a chip on the motherboard.

Only CPU manufacturers so far have integrated the CPU and GPU, so AMD, Intel, Apple, Qualcomm, etc…

Of course, that might change if Nvidia becomes more ambitious with their newly-announced ‘Project DIGITS’ and they really mesh the GPU into the CPU package.
 
Last edited:
Upvote
0 (0 / 0)

Hyoubu

Ars Scholae Palatinae
682
I am really looking forward to seeing how the silent-mode performance of this compares to the other AMD APU releasing in the ZBook Ultra 14 G1a. I am getting one or the other, specifically for playing on an airplane in silent mode on those long flights. It’s nice that we are finally at that point, because while the Switch, Steam Deck, and ROG Ally are good, I just want a real laptop.
 
Upvote
1 (1 / 0)
I know the Z13 gets all the hype, but the X13 is by far and away the superior product

Having lived with them both, the X has superior thermals, sound, battery, keyboard, and far fewer bugs (seriously, the Z13 really cannot compete with a Surface for day-to-day software reliability)

Honestly, for CAD, the X13 is brilliant, and there's no comparison until we start looking at Razer, but they hate AMD, so the X13 is actually best in class IMHO
 
Upvote
4 (4 / 0)
According to Asus' Tech specs for the more expensive GZ302 model, the Asus Z13 comes with an Asus Pen active stylus that uses the MPP (Microsoft Pen Protocol) 2.0.

I didn't find any mention of stylus support in the review, other than a (hopefully) red-herring remark about trying to use a non-active stylus.

Was there an included stylus in the box or the press review kit?

I love my Surface Pros for notetaking; they fall behind for anything that truly needs a GPU. I've considered moving up to a Surface Pro with Thunderbolt 4 and adding an eGPU that way for things like Blender.

So this looks intriguing. But even Asus' own page for it doesn't list an active stylus or related compatibility. Which... yikes? To me, why even bother with the form factor if you're going to skip having a pressure-sensitive stylus? So it would be great to know for sure whether or not it supports one.
 
Upvote
0 (0 / 0)

Voldenuit

Ars Tribunus Angusticlavius
6,554
I love my Surface Pros for notetaking; they fall behind for anything that truly needs a GPU. I've considered moving up to a Surface Pro with Thunderbolt 4 and adding an eGPU that way for things like Blender.

So this looks intriguing. But even Asus' own page for it doesn't list an active stylus or related compatibility. Which... yikes? To me, why even bother with the form factor if you're going to skip having a pressure-sensitive stylus? So it would be great to know for sure whether or not it supports one.
JustJosh showed it with an active stylus and also mentioned in their review that it has pen support.

 
Upvote
2 (2 / 0)

TheFongz

Wise, Aged Ars Veteran
117
Calling the addition of a headphone jack and USB-A port a "nostalgia nook" makes you part of the problem.
Desire for these ports is not nostalgia. USB-A is still what 90% of new peripherals come with; not to mention the 100% of older, still-perfectly-functional peripherals. And a headphone jack is how one plugs in a pair of freak'n headphones. Ones which will never run out of charge nor require fiddly pairing.
 
Upvote
8 (9 / -1)
Calling the addition of a headphone jack and USB-A port a "nostalgia nook" makes you part of the problem.
Desire for these ports is not nostalgia. USB-A is still what 90% of new peripherals come with; not to mention the 100% of older, still-perfectly-functional peripherals. And a headphone jack is how one plugs in a pair of freak'n headphones. Ones which will never run out of charge nor require fiddly pairing.

Because I work in audio, a few years ago I bought myself an external sound card with two dedicated audio inputs and two dedicated audio outputs. At the time, I thought I was purchasing a handy, 'just in case' peripheral.

Little did I know it would become the most indispensable piece of equipment I own.

But I guess one wouldn't be playing Tomb Raider on the highest settings with a wired controller and headphones...
 
Last edited:
Upvote
4 (4 / 0)

Coolie

Ars Praetorian
407
Subscriptor++
Calling the addition of a headphone jack and USB-A port a "nostalgia nook" makes you part of the problem.
Desire for these ports is not nostalgia. USB-A is still what 90% of new peripherals come with; not to mention the 100% of older, still-perfectly-functional peripherals. And a headphone jack is how one plugs in a pair of freak'n headphones. Ones which will never run out of charge nor require fiddly pairing.
A bit harsh, and USB-C must be a much larger share of newer peripherals(?) — though my pile of older peripherals with captive USB-A cables is still going strong — but I am 100% with you on the 3.5mm jack, more-so as this is a laptop(-alternate).
 
Last edited:
Upvote
1 (1 / 0)

HiroTheProtagonist

Ars Praefectus
5,875
Subscriptor++
A bit harsh, and USB-C must be a much larger share of newer peripherals(?) — though my pile of older peripherals with captive USB-A cables is still going strong — but I am 100% with you on the 3.5mm jack, more-so as this is a laptop(-alternate).
I'm not necessarily using the absolute bleeding edge of peripherals, but the only PC peripheral in my arsenal that uses USB-C for basic connectivity is my Logi Streamcam. The only other device with a USB-C connection is my MX Vertical Mouse, which uses USB-C for the charging port on the device and otherwise plugs into my PC via USB-A.

I've seen some wireless mice/keyboards that use USB-C receivers, but they seem to be less popular than just using Bluetooth.
 
Upvote
2 (2 / 0)
Been looking for a good all-AMD workstation laptop to use (and of course immediately wipe the disk and install Linux as any workstation user should :p). This looks nice, but this article seems to be lacking a very important detail: RAM size, and any options on such (and if it is actually in there and I just read over it, sorry). Says 128GB on the Asus website, but I'm hesitant to believe that's actually RAM and not some SSD storage number.

I got the PX13 last month (Aug release), and Linux support is dismal and just starting to come online; it needs another 10 months to be stable. The 2022 X13 got solid support (i.e., all distros) on Linux last year; now I feel like I should have kept it.

The Asus storefront says 32GB only. I've never seen a 64 or 128GB option on any Flow model. But I'm glad Asus finally recognized 32GB as standard, as CAD apps suffered under 16.
 
Upvote
0 (0 / 0)
I'm not necessarily using the absolute bleeding edge of peripherals, but the only PC peripheral in my arsenal that uses USB-C for basic connectivity is my Logi Streamcam. The only other device with a USB-C connection is my MX Vertical Mouse, which uses USB-C for the charging port on the device and otherwise plugs into my PC via USB-A.

I've seen some wireless mice/keyboards that use USB-C receivers, but they seem to be less popular than just using Bluetooth.
I have an external SSD enclosure that came with a C-to-C cable, but the C port on my motherboard isn't very stable and tends to disconnect. Wound up replacing it with a C-to-A cable.
 
Upvote
0 (0 / 0)

Coolie

Ars Praetorian
407
Subscriptor++
I'm not necessarily using the absolute bleeding edge of peripherals, but the only PC peripheral in my arsenal that uses USB-C for basic connectivity is my Logi Streamcam. The only other device with a USB-C connection is my MX Vertical Mouse, which uses USB-C for the charging port on the device and otherwise plugs into my PC via USB-A.

I've seen some wireless mice/keyboards that use USB-C receivers, but they seem to be less popular than just using Bluetooth.
Not the main point of my post, but I get it. I have peripherals from 15-20 years ago which I still use with micro, mini, Superspeed micro and even full USB B. I don’t see myself getting rid of them any time soon. I’m making good use of my stock of USB A to USB-C adapters.

However, setting aside desktops (now a smaller percentage of computer sales than 10+ years ago), I cannot think of a mainstream laptop in the last 4 years — and likely longer — which did not come with at least one USB-C port (equaling or outnumbering USB-A ports in most cases), with some, like MacBooks, dropping USB-A altogether.

Plus, the new handheld gaming computers are all USB-C only too, as are new tablets and phones (as hosts).

10% of new peripherals being USB-C vs 90% being USB A just sounds off to me…

Again, I am wholly in agreement on the 3.5mm jack being an expectation on a laptop(-alternate).
 
Last edited:
Upvote
2 (2 / 0)

Voldenuit

Ars Tribunus Angusticlavius
6,554
Not the main point of my post, but I get it. I have peripherals from 15-20 years ago which I still use with micro, mini, Superspeed micro and even full USB B. I don’t see myself getting rid of them any time soon. I’m making good use of my stock of USB A to USB-C adapters.

However, setting aside desktops (now a smaller percentage of computer sales than 10+ years ago), I cannot think of a mainstream laptop in the last 4 years — and likely longer — which did not come with at least one USB-C port (equaling or outnumbering USB-A ports in most cases), with some, like MacBooks, dropping USB-A altogether.

Plus, the new handheld gaming computers are all USB-C only too, as are new tablets and phones (as hosts).

10% of new peripherals being USB-C vs 90% being USB A just sounds off to me…

Again, I am wholly in agreement on the 3.5mm jack being an expectation on a laptop(-alternate).
Gaming tablet means gaming mouse, and they are predominantly USB-A, especially if we're talking 4KHz and 8KHz dongles. So having more than one USB-A port would have been nice. Bluetooth mice are just too high-latency for gaming.

Another nitpick I have about the Z13 is that the SSD slot is the 2230 form factor instead of the more common 2280, meaning your upgrade choices are limited (although at least the NVMe slot is easily accessible: you don't have to disassemble or break open the chassis or melt glue like on some other portable devices; it's just a Phillips-head screw, pop open a cover, and replace the SSD).

EDIT: Nitpick #3: the USB-C and USB-A ports are all on the top side of the edges (in landscape mode), so short dongles will dangle. The PCB of the Z13 is actually huge (by tablet standards); it's larger than the PCB of a MacBook Air, so it would not have been impossible to place the ports lower.
 
Last edited:
Upvote
1 (1 / 0)

Tamerlin

Ars Scholae Palatinae
601
What I'm really hoping for is a ProArt P16 updated with this CPU, with the power envelope raised to the full 120 watts, an OLED display, and 128 GB of RAM. I'm also hoping that Asus will include TB4 and TB5 on it, so that it can use that new TB5 eGPU, with the possibility of adding one or two more via the TB4 port(s).

I don't game much; I'm a Houdini + Nuke user. I like having a machine that's portable but that can drive its CPU to some major overclocking when plugged in.

BTW, the 395+ has 3GHz nominal clocks; in turbo mode it's supposed to be able to hit 5.1 GHz with all cores provided that the device can supply enough power and dissipate enough heat for it.

And being full-fat Zen 5 cores, all 16 have dual AVX-512-equipped FPUs... plus whatever computing power the integrated GPU can offer.
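If you're curious which of those vector extensions a particular machine actually reports, a quick Linux-only sketch (purely illustrative) is enough to check:

```python
# Purely illustrative, Linux-only: list the AVX-512 feature flags the CPU reports.
with open("/proc/cpuinfo") as f:
    flags = next(line for line in f if line.startswith("flags")).split()
print(sorted(flag for flag in flags if flag.startswith("avx512")))
```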

The current ProArt P16 can drive the 12-core CPU at up to 80 watts and the built-in 4070 GPU at 105 watts. Asus should be able to deliver and cool 120 watts :)
 
Upvote
1 (1 / 0)

daverayment

Smack-Fu Master, in training
20
On-chip bandwidth for 4xLPDDR5X-8000 channels is 256GB/s. Memory bandwidth on the RTX 5080 is 960GB/s. That's certainly a big gap in the RTX 5080's favor, but it's not the 6-10x you are suggesting -- and the RTX 5080's power consumption is multiple times larger than the power budget for an entire Strix Halo laptop.
I appreciate the correction. I had mistakenly thought the bus was 128-bit, but it's 256-bit on the AMD chip. This is still half the bus width of Apple's solution, which is a shame.
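For what it's worth, the 256GB/s figure falls straight out of those numbers; a quick back-of-envelope check, assuming the 256-bit bus and LPDDR5X-8000 rate mentioned above:

```python
# Back-of-envelope check of the bandwidth figures quoted above.
bus_width_bits = 256                      # Strix Halo LPDDR5X interface (per this thread)
transfer_rate_mts = 8000                  # LPDDR5X-8000
strix_halo_gbs = bus_width_bits / 8 * transfer_rate_mts / 1000      # bytes/transfer * MT/s
rtx_5080_gbs = 960                        # figure quoted above
print(f"Strix Halo: {strix_halo_gbs:.0f} GB/s")                     # 256 GB/s
print(f"RTX 5080 advantage: {rtx_5080_gbs / strix_halo_gbs:.2f}x")  # ~3.75x, not 6-10x
```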

As for the 5080, yes, it's power hungry. However, memory bandwidth is not the only discriminator for performance. The powerful tensor cores on that card act as another multiplier over the mobile chips.

Still, mobile AI is an interesting avenue for future exploration, and I hope Ars can make it part of their testing in the future.
 
Upvote
0 (0 / 0)
I appreciate the correction. I had mistakenly thought the bus was 128-bit, but it's 256-bit on the AMD chip. This is still half the bus width of Apple's solution, which is a shame.

As for the 5080, yes, it's power hungry. However, memory bandwidth is not the only discriminator for performance. The powerful tensor cores on that card act as another multiplier over the mobile chips.

Still, mobile AI is an interesting avenue for future exploration, and I hope Ars can make it part of their testing in the future.
That's all true. When all is said and done, Strix Halo's GPU is still a midrange mobile solution -- it's just a midrange mobile solution potentially backed up by enterprise-level VRAM.

I suspect its peak AI performance / usefulness will be in scenarios where people need more than the 16 - 32GB commonly available on consumer GPUs. I genuinely don't know how many people that covers, given how much corporate spending is focused on AI these days.
 
Upvote
0 (0 / 0)