Is Vectric VCarve Pro compatible with a Snapdragon X laptop?

Today, I bring the dumb…

I am an Apple guy. I don’t like the idea of spending money on Microsoft products AT ALL. LONG history there that I won’t bother going into, but that being said, since Vectric refuses to make their products available for the Mac, I may concede and buy a new PC laptop to run VCarve Pro…

If I go the new laptop route, the new laptop would become my primary design machine, both in the shop and at my desk.

I know it’s complete overkill, but in looking at all my options (a new Mac laptop running an emulator, vs. X, vs. Y, vs. Z), I ran across the new “AI laptops” running Windows 11 Pro on a Snapdragon X Elite processor, which are getting great reviews and cost a bit less than a (more useful to me) MacBook Pro, and in most cases even less than an Intel i9 laptop…

Before I go that route, does anyone know if Vectric VCarve Pro is compatible with the new Snapdragon laptops, or will become so in the near future?

I’m having trouble getting the link, but what I’m looking at is essentially found on Amazon.

# S5507QAD-PS96


Probably not relevant for this but I’m going to put this here anyway.

Usual preface: I’m with PreciseBits, so while I try to only post general information, take everything I say with the understanding that I have a bias.

Short version: I wouldn’t bother with it even if it did work. In single-threaded work (which most of this will be), those processors are weaker than even a 12th-gen i3 without optimization (Link).

Speaking of that, you can/will have the same issues as running Parallels on an M1 or newer Mac: ARM emulation of non-optimized code. In the same vein, the Snapdragon cores SHOULD work, but with the same caveats and potential bugs due to emulation. I’m basing this on Apple users’ experience and the system requirements still listing x86/x64.
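One quick way to tell whether a given build is native ARM64 or would run through Windows’ x64 emulation layer is to read the Machine field in the executable’s PE header. A minimal sketch in Python (the Vectric install path in the comment is hypothetical; the machine codes come from Microsoft’s PE/COFF specification):

```python
import struct

# IMAGE_FILE_MACHINE values from the PE/COFF specification
MACHINE_NAMES = {0x014C: "x86", 0x8664: "x64", 0xAA64: "ARM64"}

def pe_machine(data: bytes) -> str:
    """Return the target architecture encoded in a Windows PE image."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file (missing MZ header)")
    # e_lfanew at offset 0x3C points to the "PE\0\0" signature
    pe_off = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_off:pe_off + 4] != b"PE\0\0":
        raise ValueError("missing PE signature")
    # The Machine field is the first WORD of the COFF header
    machine = struct.unpack_from("<H", data, pe_off + 4)[0]
    return MACHINE_NAMES.get(machine, f"unknown (0x{machine:04X})")

# Hypothetical usage -- point it at whatever .exe you want to check:
# print(pe_machine(open(r"C:\Program Files\Vectric\VCarvePro.exe", "rb").read()))
```

On a Snapdragon machine, an “x64” result means the app would run through emulation, with the caveats above.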

Regardless of application, I probably wouldn’t bother with an i9, or even some of the i7s, in this form factor. You are going to be so power- and thermal-limited that you won’t get a lot of extra performance out of them.

It’s been a version or so of Aspire since I last checked. But back then, on my main home system (5950X/RTX 3090), everything was still pretty much stuck waiting on a single thread. So at least in my last experience, on a system with resources to spare, the only thing that really mattered was the performance of the main working core. Memory can be an issue depending on what you are doing and whether you go into the “hidden” modeling resolutions.
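That “stuck waiting on a single thread” observation is just Amdahl’s law in action. A quick sketch (the 20% parallel fraction is an illustrative assumption, not a measured Vectric number):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only part of the work scales across cores (Amdahl's law)."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# If only 20% of a job parallelizes, even 16 cores barely help:
print(f"{amdahl_speedup(0.20, 16):.2f}x")  # ~1.23x
```

Which is why one fast core beats a pile of slow ones for this kind of work.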

If you really hate Microsoft enough (I do, and more every day): I have also run Aspire with WINE in Linux, with no noticeable issues or performance problems. Version 10 is even listed as Gold rated (Link). Your mileage may GREATLY vary though.

One last thing. For now I would recommend staying away from OLED for anything that’s mostly static screen work. Real-world results are still pending, but burn-in is still a thing (Link). On top of that, at least at a quick glance, they don’t list the type of OLED (QD-OLED, WOLED, etc.), which makes it even more of a question to me, since the test above was done on the best of the current tech. There’s also still an issue with text and sharp-line clarity due to the subpixel arrangement, unless the panel is a JOLED.

Hope that’s useful. Let me know if there’s something I can help with.


Thanks very much for the well-reasoned reply, @TDA. I had only just noticed the Snapdragon laptops, and I see them being hailed by all as the “future of computers”. Being a Mac guy at heart, I just saw them as the PC’s answer to the Apple Silicon chips, and – if successful – the direction Microsoft and others would go in the future.

Especially when I saw the claimed 18+ hour battery life and the benchmarks being shown compared to the i7 and i9 chips.

I haven’t had time to do any of the research behind threads, etc. I just see the core count, processor speed, and benchmarks. That’s all I had to go by, and I hadn’t considered whether software would have to be optimized for it, as had to be done when Apple went from Intel to Apple Silicon several years ago… That being said, I’m kicking myself because I was at Costco a week or so ago and they had a well-specced MSI i9 laptop on display for $699 and I didn’t pull the trigger (between paydays)…

Circling back to the Linux idea: even though I am a Microsoft developer at work, I really am just a “I want s**t that just works” kind of guy, and Apple does that for me. Microsoft is a short distance behind, but I find that with Windows, I spend more time trying to get Windows and the machine to do what they’re supposed to than I do actually getting things done (unlike with the Mac).

As such, Linux has always seemed to be 100% in the wrong direction of “s**t that just works” to me. Admittedly, I’ve never really taken a close look, because I get too frustrated if I have to fall back to secret squirrel monkey spit command-line crap to work with the machine.

Call me a lazy WYSIWYG programmer. :rofl:



No problem. Glad it was useful.

This obviously isn’t the kind of stuff I usually talk about here… But I can’t help myself so…

Yeah, in general I don’t really buy into the ARM “replacement” thing. From my perspective they aren’t universally better; they are better for specific applications, in most cases on efficiency/power or in specific multi-threaded workloads. There are some very specific places where they outright beat x86, but those are usually based on special instructions/optimizations… and x86 has those in spades.

That’s not to say that x86 isn’t bloated, or that it won’t eventually be replaced. I just don’t think it will be replaced by ARM soon (or ever, with the current architecture), other than by use case, e.g. the consumer level, where power and thermals are king or high performance isn’t the priority.

Yeah, there seems to be a lot of drive (and probably money) behind those recently, especially the Qualcomm stuff. You can make them look better by cherry-picking. Probably the fairest apples-to-apples comparison would be something with optimizations for both ARM and x86: mostly server, physics, modeling, etc. The two closest easy examples to get are (Link 1, Link 2). There’s no Snapdragon data there, so I’m using the M2 for the ARM side, as it has enough data. That comes with a lot of caveats, but it’s a decent baseline in my opinion. All of this falls by the wayside, though, as soon as you are talking about specific programs.

Just did something similar myself. Needed something for travel and passed up a really cheap, well-specced Lenovo. I didn’t need anything with real power behind it, so it wasn’t as big a difference as yours. But I know the feeling.

The most important thing, from my perspective, is what works best for you to get done what you need/want. Just so my biases are clear: between Apple and Microsoft, I find it’s choosing between the devil and the antichrist. They both raise various levels of concern for me on privacy, walled gardens, right to repair, closed source (hardware and software), etc. That’s on top of keeping them from doing things I don’t want, or having to do things “their way”. It’s slowly pushed me to Linux over the years.

For me at least, it’s led to my main system being Microsoft and Linux, with a bias towards Linux. But there’s a lot of software I can’t get versions of, or at the same performance level, in Linux. There’s also the fictional universe where I’ll someday get to game again, and while it’s getting better, there are still a lot of issues with that on the Linux side.

Linux has gotten better about not NEEDING the CLI as much. But I find I still need it even in Windows (PowerShell), so I’m probably not a good use case.

It depends on what you need. I run XCP-NG (VMs, Docker, K8s, etc.) and TrueNAS (NAS, Syncthing, Jellyfin, etc.) servers with a pfSense (local networking, VPN, IP blocking, etc.) network backend, at home and for business. In terms of “s**t that just works”, for the applications those can handle, they beat anything from/on Microsoft or Apple in my opinion, especially from a reliability, consistency, and maintenance standpoint ONCE they are set up. Again though, probably not the “average” use case.

Lol, whatever works. If I can get the same results with less typing/time I’m all for it. As long as I don’t have to constantly fix/correct it and can still access the underlying code.

Going to stop rambling now.