Apple Silicon M1 Ultra


kscherer
Which way is up?
 
Join Date: Aug 2004
Location: Boyzeee
 
2022-03-09, 13:16

The details are coming in.

That Apple hid the M1 Max's interconnect is amazing to me. Did anyone spot this in the industry before Apple made it known? I remember reading a lot about these things and the "x-rayed" details of the architecture, but I cannot recall anyone pointing out, "Hey, what's that row of pin-thingies?" Were they too small to be seen?

Anyway, the Ultra is a little beast — and a pricey one at that, but it's a lot cheaper than Intel's offering. The 28-core Xeon option for the Mac Pro is $7000 on its own!

A completely maxed-out Mac Studio is $8000, and is faster — and cheaper — than the Mac Pro minimum configuration with the 28-core Xeon upgrade ($13,000).

Swinging for the fences? Holy cow, I guess so.

Still, we have to remember that the Mac Pro has all of those slots for super-pricey GPUs that not very many people use. But, considering the Ultra's GPU is claimed to be faster than the very best single GPU available in the Mac Pro, I suspect >90% of pro needs will be met with the performance of this little character. It remains to be seen what thermal envelope the Ultra has and whether or not the Studio's cooling system will be able to keep up. If the M1/Pro/Max are any indicator, I suspect it won't have any trouble.

- AppleNova is the best Mac-users forum on the internet. We are smart, educated, capable, and helpful. We are also loaded with smart-alecks! :)
- Blessed are the peacemakers, for they shall be called sons of God. (Mat 5:9)
turtle
Lord of the Rant.
Formerly turtle2472
 
Join Date: Mar 2005
Location: Upstate South Carolina
 
2022-03-09, 13:58

You know, the fact that they figured out how to make a multi-CPU setup behave like one beast of a single socket is amazing. I know they talked about the performance hits for multi-socket configurations, but it isn't so bad that we couldn't do it. I anticipate the Mac Pro being multi-socket/CPU and an absolute beast. They will figure out a way so crap can be recompiled and work just as well with the multiple sockets and be just fine.

Really though, those are going to be some pretty specific apps and use cases.

I know I saw nothing about the interconnect before yesterday. It really is pretty wild.

Louis L'Amour, “To make democracy work, we must be a nation of participants, not simply observers. One who does not vote has no right to complain.”
Visit our archived Minecraft world! | Maybe someday I'll proof read, until then deal with it.
kscherer
Which way is up?
 
Join Date: Aug 2004
Location: Boyzeee
 
2022-03-09, 14:38

A theory rolling around here is that the Max has another secret: It's stackable!
chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2022-03-09, 14:46

Quote:
Originally Posted by kscherer View Post
Did anyone spot this in the industry before Apple made it known? I remember reading a lot about these things and the "x-rayed" details of the architecture, but I cannot recall anyone pointing out, "Hey, what's that row of pin-thingies?" Were they too small to be seen?
Nah, multiple people totally called this, especially since the M1 Pro had nothing like this. The Max looks a lot like the Pro’s GPU and memory controller copied and pasted (which is essentially what it is), but then at the bottom, there’s this flat row of stuff. What’s that? The interconnect.

Plus, at that point, the Gurman rumor about Jade 2C Die and all that had already been out for half a year. So we knew to look for it. Like, I wasn’t surprised by Srouji’s reveal. I was, however, kind of happy with how it all fell into place in his explanation!

Oddly, though, that rumor suggested there would be one more model, the 4C. Either they scrapped that one, or it’s really just two sockets of the Ultra, with no interconnect at all.
PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2022-03-09, 15:39

The interconnect between the two chips in the package isn’t completely new. Apparently some high-end Xeons use a similar technology. It is different from the chiplet approach AMD and Intel are currently using or moving towards. The explanation I heard was that it has more in common with high-end GPUs.

The GPU on the package sounds great and fast, until you find out each core maxes out at 200GB/s. That’s GTX 1070 territory, if memory serves, but I suspect I’m off. Maybe RTX 2070?
Bryson
Rocket Surgeon
 
Join Date: Feb 2005
Location: The Canadark
 
2022-03-09, 15:54

Quote:
Originally Posted by chucker View Post
Nah, multiple people totally called this, especially since the M1 Pro had nothing like this. The Max looks a lot like the Pro’s GPU and memory controller copied and pasted (which is essentially what it is), but then at the bottom, there’s this flat row of stuff. What’s that? The interconnect.

Plus, at that point, the Gurman rumor about Jade 2C Die and all that had already been out for half a year. So we knew to look for it. Like, I wasn’t surprised by Srouji’s reveal. I was, however, kind of happy with how it all fell into place in his explanation!

Oddly, though, that rumor suggested there would be one more model, the 4C. Either they scrapped that one, or it’s really just two sockets of the Ultra, with no interconnect at all.
What precludes the 4C existing as rumoured? Is there no room for a set of interconnects down the side?
chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2022-03-09, 16:43

Quote:
Originally Posted by Bryson View Post
What precludes the 4C existing as rumoured? Is there no room for a set of interconnects down the side?
Look at Apple's M1/Pro/Max graphic.

The M1 and the M1 Pro/Max actually don't have that much in common in this graphic, so ignore the M1.

But the Pro and Max are very similar. The Pro is basically the top two thirds of the Max, almost exactly.

Let's look at that in more detail. The bottom ~half of the M1 Pro can be found in the M1 Max, twice. The portions to the left and right are the memory controllers. So the M1 Pro has two of those, and the M1 Max has four, which is why it has twice the memory bandwidth. The chips between those memory controllers are the GPU cores. So, since the M1 Max has this entire section duplicated, it has twice as many GPU cores.

But the M1 Max doesn't end there — at the very bottom, it has something the M1 Pro lacks altogether. And that's the interconnect for the M1 Ultra. It's literally just two of these, facing each other, with a connector (that I understand has ~10,000 pins) in between.

The M1 Pro is really just an M1 Max with the bottom third chopped off — thus, fewer GPU cores, and lower memory bandwidth, and no interconnect. The M1 Ultra, in turn, is two M1s Max.

Critically, the M1 Ultra didn't require the M1 Max's chip design to change. The preparations for the interconnect were already there, and simply go unused for those who get an M1 Max.

But no further such unused mechanism exists on the side. So either the M1 Quad doesn't reuse the existing chip design, or it doesn't work with such an interconnect (it could just be two entirely separate chips, with an orchestrator controller*), or it doesn't exist. In the latter scenario, that doesn't really explain Gurman's rumor (and it's surprising, since that one rumor story from April 2021 correctly predicted M1 Pro as "Jade C-Chop", M1 Max as "Jade C-Die" and M1 Ultra as "Jade 2C-Die" — but also predicted, incorrectly, a "Jade 4C-Die"). And clearly, the M1 Ultra was already prepared before the M1 Max ever saw the light of day, since, y'know, interconnect right on the chip. So at the point the M1 Max shipped, they must have known that this isn't how the M1 Quad was gonna work. Either Gurman extrapolated something that wasn't there, or those plans were scrapped long ago — probably before even the original M1 shipped.
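The chop-and-double relationship described above can be sketched as a toy model. The figures here are the commonly cited public M1-family specs (assumptions for illustration, not anything from Apple's documentation):

```python
# Toy model of the M1 family as derivations of one die design:
# Pro ~= Max with the bottom third chopped off; Ultra = two fused Maxes.
# Spec figures are assumed from public reporting, not official.

M1_MAX = {
    "gpu_cores": 32,
    "memory_controllers": 4,
    "memory_bandwidth_gbps": 400,
    "interconnect": True,
}

def chop(max_spec):
    """M1 Pro: half the GPU cores and memory controllers, no interconnect."""
    return {
        "gpu_cores": max_spec["gpu_cores"] // 2,
        "memory_controllers": max_spec["memory_controllers"] // 2,
        "memory_bandwidth_gbps": max_spec["memory_bandwidth_gbps"] // 2,
        "interconnect": False,
    }

def fuse(max_spec):
    """M1 Ultra: two Maxes joined at the interconnect, so everything doubles."""
    return {
        "gpu_cores": max_spec["gpu_cores"] * 2,
        "memory_controllers": max_spec["memory_controllers"] * 2,
        "memory_bandwidth_gbps": max_spec["memory_bandwidth_gbps"] * 2,
        "interconnect": False,  # the die-to-die link is consumed by the fusion
    }

m1_pro = chop(M1_MAX)
m1_ultra = fuse(M1_MAX)
print(m1_pro["memory_bandwidth_gbps"])   # 200
print(m1_ultra["gpu_cores"])             # 64
```

The point of the sketch is just that one physical die design yields all three products, which matches the "interconnect was there all along" observation.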

If the M1 Quad isn't a thing that exists, perhaps there won't be an M1 Mac Pro at all, and instead, it's the M2 Pro/Max/Ultra that give us clues, about a year from now, of how that will work.

*) which brings its own set of problems. One off-hand remark from Srouji that didn't make sense to me to highlight at first (I thought the presentation wasn't very well-structured, TBH) was that he was proud that developers didn't have to adjust their programming model. This is true with the M1 Ultra as well, even if you do GPU programming; the entire thing shows up as a single "device" in Metal, for example. Well, if they treat the M1 Quad more the way we used to do dual-socket CPUs, that may not be so simple any more. They might have to treat the GPU as two separate "devices". The Neural Engine, too. And for the CPU, while it doesn't really matter whether your code runs on 40 cores on the same chip, or 20 cores each two chips, it does have implications for performance: the cache between the two will be slow to keep coherent, which introduces new latencies, which makes this very tricky to get right. And all that for what, exactly? Just so people with a $6k and up Mac get a chip that, in very specific scenarios, is faster? That may not be worth it at all.
kscherer
Which way is up?
 
Join Date: Aug 2004
Location: Boyzeee
 
2022-03-09, 16:56

I wonder if the Mac Pro tower is just a Mac Studio computer inside a box that offers PCI and SSD slots, then? The fool thing is already fast enough, so maybe it's just the same computer inside a box with expansion slots?

I don't know anymore.

709
¡Damned!
 
Join Date: May 2004
Location: Purgatory
 
2022-03-09, 17:00

Mac Pro tower breadbox more like it.
chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2022-03-09, 17:04

Quote:
Originally Posted by kscherer View Post
I wonder if the Mac Pro tower is just a Mac Studio computer inside a box that offers PCI and SSD slots, then? The fool thing is already fast enough, so maybe it's just the same computer inside a box with expansion slots?

I don't know anymore.
Yeah, I've been wondering that. But if so, they might as well have announced it yesterday.

I think there's another shoe yet to drop. And maybe that shoe has nothing at all to do with the SoC! Apple Afterburner 2? Mac Pro Wheels: now just $299 and CNC-etched from Tibetan bamboo?
psmith2.0
Mr. Vieira
 
Join Date: May 2004
Location: Tennessee
 
2022-03-09, 18:11

Quote:
Originally Posted by kscherer View Post
I wonder if the Mac Pro tower is just a Mac Studio computer inside a box that offers PCI and SSD slots, then? The fool thing is already fast enough, so maybe it's just the same computer inside a box with expansion slots?

I don't know anymore.
I think the new, AS Mac Pro is that rumored “smaller tower”. I don’t think they’re working on two sizes of Mac Pro. I don’t think it has to be as big as they’ve always been. If this Mac Studio is the performance side of things, the Pro variant only needs room for some slots and whatever expansion pros want. And maybe enough room to double up on the M1 stuff?

But I think the next Mac Pro will be a good bit smaller than the current one, and the only thing really making it “pro” are the card slots, extra bays and things for specific, professional fields (audio, video, etc.).

I think we’ve seen the pro tower at the biggest it’ll ever be. It may not even need wheels anymore!
chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2022-03-09, 18:18

Quote:
Originally Posted by psmith2.0 View Post
I think the new, AS Mac Pro is that rumored “smaller tower”.
My current thinking is that what Gurman thought would be a small tower is actually the Mac Studio.

Quote:
Originally Posted by psmith2.0 View Post
If this Mac Studio is the performance side of things, the Pro variant only needs room for some slots and whatever expansion pros want. And maybe enough room to double up on the M1 stuff?
But I think the next Mac Pro will be a good bit smaller than the current one, and the only thing really making it “pro” are the card slots, extra bays and things for specific, professional fields (audio, video, etc.).

Yep.

Quote:
Originally Posted by psmith2.0 View Post
I think we’ve seen the pro tower at the biggest it’ll ever be. It may not even need wheels anymore!
They're good wheels, John.
kscherer
Which way is up?
 
Join Date: Aug 2004
Location: Boyzeee
 
2022-03-09, 18:39

I'm pretty certain of a few things:

1) Apple Silicon M-series processors did not design themselves overnight. They were likely in development for 5+ years.

2) The M1 through M1 Ultra series were all thought of/conceived at the same design meetings, and thus all were designed together.

3) Apple's announcement yesterday informs us that UltraFusion was designed in from the very beginning. As Chucker noted, the Pro variant was likely clipped off the Max, and not the other way around, so the Pro, the Max, and the Ultra were all conceived simultaneously. Perhaps even the M1 is just composed of some basic bits clipped off the Pro and stitched back together?

4) Knowing the above, it is possible to conclude that Apple's 1st-generation Mac chips were all conceived to lead harmoniously to a bunch of other bits that will be available in the Mac Pro tower — such as Afterburner-type cards. Upgrade options and how they interconnect with the new architecture are all part of one meeting given by Johny Srouji and team. They all got together in a dark room and said, "Here's what we want to do …" and laid the foundations for the next 10 years, up to and including what that would mean for the Mac Pro and its internal "upgrades". This thing has been thought through very carefully by the world's leading team of chip architects, and it appears they have thought of everything well in advance, so that each piece of the puzzle is not only surprising, but downright revolutionary in the industry.

5) The M5 is being conceived right now, even as we talk about the M1. It is already being prototyped, and tooling is being developed. For Apple to hit such strides with gen-1, I cannot even imagine what gen-5 will deliver.

6) The future is bright!

Quote:
Originally Posted by chucker View Post
They're good wheels, John.
The wheels *are* coming off the bus, and wouldn't you know it? There's an amphibious vehicle underneath!

chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2022-03-09, 19:01

Quote:
Originally Posted by kscherer View Post
Perhaps even the M1 is just composed of some basic bits clipped off the Pro and stitched back together?
Well, at the core level, yes, certainly. All of them use the same Firestorm (performance) / Icestorm (efficiency) cores, just different counts. (4+4 on the M1, 8+2 on the Pro and Max, 16+4 on the Ultra.) All of them use the same GPU cores, although the Pro, Max, and Ultra each double those. The Neural Engine is 16-core on all of them, I think, except for the Ultra, which naturally has 32 (being two Maxes). The memory controller is different; the M1 has LPDDR4X memory (which could be one reason it's limited to 16 GB), whereas the Pro/Max/Ultra have LPDDR5.
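A quick sketch of those core counts (figures as cited in this post; treat them as assumptions, not Apple's official spec sheet):

```python
# CPU core configurations across the M1 family,
# split into performance (p) and efficiency (e) cores.
# Counts are as cited in the thread; treat them as assumptions.
configs = {
    "M1":       {"p": 4,  "e": 4},
    "M1 Pro":   {"p": 8,  "e": 2},
    "M1 Max":   {"p": 8,  "e": 2},
    "M1 Ultra": {"p": 16, "e": 4},
}

for name, c in configs.items():
    print(f"{name}: {c['p']}P + {c['e']}E = {c['p'] + c['e']} cores")

# The Ultra really is two Maxes at the CPU level:
ultra, mx = configs["M1 Ultra"], configs["M1 Max"]
assert ultra["p"] == 2 * mx["p"] and ultra["e"] == 2 * mx["e"]
```

Note the Pro and Max being identical here: at the CPU level they only diverge in GPU count, memory bandwidth, and the interconnect.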

The blocks of cores are arranged rather differently. Contrast the M1 and the M1 Pro. For example, the M1's GPU is very "we found some more room over here". The M1 Pro's is much tidier and more deliberate than that, and as I've said, the M1 Max then simply doubles that entire arrangement underneath.

So I think the M1 was actually conceived a little earlier (or perhaps not earlier, but by a separate team), and may even have originated as more of an A14X.

I expect the M2 to inherit some of the M1 Pro's features. LPDDR5, for example, and support for more external displays. And maybe we'll see the M1 layout be a bit of a one-off, with the M2 more as a slimmed-down M1 Pro, but — of course — with Avalanche/Blizzard cores.

Quote:
Originally Posted by kscherer View Post
4) Knowing the above, it is possible to conclude that Apple's 1st generation Mac chips were all conceived to harmoniously lead to a bunch of other bits that will be available in the Mac Pro tower — such as Afterburner-type cards. Upgrade options and how they interconnect with the new architecture are all part of one meeting given by Johnny Srouji and team. They all got together in a dark room and said, "Here's what we want to do …" and laid the foundations for the next 10 years, up to and including what that would mean for the Mac Pro and its internal "upgrades". This thing has been thought out very hard by the world's leading team of chip architects, and it appears they have thought of everything well in advance, so that each piece of the puzzle is not only surprising, but downright revolutionary in the industry.
I think there were very broad big-picture plans, yep. Part of the answer to "why hasn't any other company done this?" is: because Apple is unusually patient about this, whereas many companies have an internal "how does this help our next quarter's financials?" approach. If it's 2008, and you propose "in two years, we're going to brand our CPUs our own way; in four years, we're gonna start our own ARM chip design, not relying on Cortex any more; after that, we'll begin using those CPUs in more and more places; in over a decade, we'll even switch the Mac to them" at a meeting, you'll be laughed out of the room. "Why not just keep using Intel?" "What's wrong with Qualcomm?" Well, whoever proposed that, and whoever listened, took the proposal seriously, and made it happen.

Patience isn't the only reason this worked, but I firmly think it's a big one.
kscherer
Which way is up?
 
Join Date: Aug 2004
Location: Boyzeee
 
2022-03-09, 19:40

Quote:
Originally Posted by chucker View Post
Patience isn't the only reason this worked, but I firmly think it's a big one.
Sounds like a 101 lesson to me.

Also, 114 billion transistors. I know there are other chips out there with more, but 114 billion! I lack the chops to even understand how they design this stuff, let alone make any of it.

PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2022-03-10, 08:59

Looks good. Performance of the Ultra nearly matches the three-year-old Zen 2 Threadripper 3990X on multi-threaded tasks in Geekbench, and it blows past it on single-core performance, but given the age of the 3990X's Zen 2 architecture that was expected. Performance per watt is no contest for Apple. Of course, AMD just launched the Zen 3 5000-series Threadripper yesterday; it'll be interesting to see how it stacks up. Not bad for a chip with less than half the cores and threads. Slap four in the package of the Mac Pro and Apple has a real beast.
psmith2.0
Mr. Vieira
 
Join Date: May 2004
Location: Tennessee
 
2022-03-10, 09:15

At some point doesn't all the tail-chasing/chest-beating of "who's got the fastest processor this week" get a bit pointless/silly, big picture? It's all so fast/capable that those things probably don't even truly matter in real-life, day-to-day use. No human can notice those differences, and no normal person is going to switch platforms/workflows on a dime, multiple times a year, because some competitor comes out with a chip that has a slightly higher bench rating/review. This stuff will leapfrog each other until the end of time. One week Apple will be on top, two weeks later Intel will release something that holds the title until [fill in the blank] unveils something even More Fasterer™.

It's probably areas like heating, efficiency, battery life, etc. where the true differences will lie...because they're all plenty "fast" at this point. I don't think there's a task anyone could perform that would truly push/max out any modern-day desktop processor, is there?

Apple's edge, as always, will be that they're the only outfit(?) doing the entire thing - hardware design, OS, the guts/brains, etc. - which means they're going to be able to tweak and fine-tune how everything works together in a way that the others just can't. They're all fast, but surely Apple will have an advantage in other areas over companies making processors that run on this other company's boxes, using yet another company's OS and...

Even if Apple didn't have the absolute "fastest" gear, I'm betting they'll have the most power efficient, cool-running and battery-sipping takes...all of which matters more in real world usage than slight differences in performance no Photoshop jockey or animator/3D artist is going to truly notice on the job.

"Ooh, my four-month-old rig feels a few MHz slower today, ever since Intel announced that Camp Crystal Lake stuff yesterday..." - said no one, ever.

Buy for the platform, for the long haul/big picture...not the ever-changing, leapfrogging-every-three-months specs. Right now, unless you're just rabidly anti-Apple/Mac for some deep-seated reason(s), throwing in with Apple seems to be a safe, smart move. I certainly don't need to be convinced of anything, after 29+ years on the platform...through both the aimless, lean years and the iGlory days. The Apple Silicon era has only strengthened that connection/loyalty. I now know there will be a solid Mac for me for the remainder of my life, be it five years or 35.

Or am I simplifying it too much? Wouldn't be the first time...

Last edited by psmith2.0 : 2022-03-10 at 09:40.
PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2022-03-10, 10:13

Quote:
Originally Posted by psmith2.0 View Post
At some point doesn't all the tail-chasing/chest-beating of "who's got the fastest processor this week" get a bit pointless/silly, big picture?
It’s not about chest beating. Depending on what kind of work you are doing with that system, it can make a big difference. In some cases having the faster system means you get the job and a competitor doesn’t.

It’s also not about OS platforms; that’s even worse when it comes to chest beating (your post is a great example of this, and I don’t say that to be insulting). There are plenty of cross-platform apps that neutralize the need to worry about that nonsense. A reasonable person picks based on what works best and gets the job done most efficiently and cost effectively, regardless of the other aspects of the system (OS, CPU brand, etc.).
chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2022-03-10, 10:41

Quote:
Originally Posted by psmith2.0 View Post
At some point doesn't all the tail-chasing/chest-beating of "who's got the fastest processor this week" get a bit pointless/silly, big picture?
Nah.

I mean, it's perfectly valid not to be too invested in the topic. But there are three reasons that chest-beating is great:
  • capitalism! The competitive drive keeps everyone (including Apple) on their toes and gives us better and better products
  • performance is not perfect just yet. For your uses, maybe — but there are still niche applications where a task taking ten seconds vs. it taking five minutes makes a huge difference. Can you quickly tweak settings here and there and see a different result, or is that another coffee break, so you won't even bother trying?
  • heat and ultimately the environment! Faster CPUs also have trickle-down effects. You can run them at a lower speed and save space, battery, power, and heat.

Quote:
Originally Posted by psmith2.0 View Post
No human can notice those differences,
But that's not entirely true. There are still things out there, even for consumer features, that were prohibitively slow just a decade ago and aren't now. Your phone's camera (well, not yours; you have an SE) is amazing in large part because the chip does a ton of work, at breathtaking pace, to correct for errors in the sensor, for human perception, for expectations, etc. And it can even do some of that for video as well, at 50fps!

Quote:
Originally Posted by psmith2.0 View Post
and no normal person is going to switch platforms/workflows on a dime, multiple times a year, because some competitor comes out with a chip that has a slightly higher bench rating/review.
This is true for most people, but there are also effects here. Apple didn't move away from Intel to spite them. They moved for a number of reasons, including wanting to control their own destiny, but also because they were unhappy with the long-term roadmap. Just like, two decades ago, they were unhappy with IBM's and much happier with Intel's.

So you and I may not move for that reason, but companies ultimately do. And heck, at some point, even you might. If you're doing heavy processing, and a $2,000 device can get the task done so fast that you can literally move a slider around and see results in almost real time, but your current device, on a different platform, takes 90 seconds to calculate each image, that's when you start thinking: should we buy one of those other guys' computers, just so some of my colleagues can see results faster? And once you do that, the team might suddenly decide to move altogether.

Quote:
Originally Posted by psmith2.0 View Post
I don't think there's a task anyone could perform that would truly push/max out any modern-day desktop processor, is there?
Sure there is. Plenty of software still takes something like ten minutes to compile. That's annoying, and it really impedes your workflow. So you take all kinds of shortcuts so you don't compile the entire thing each time, but those come with downsides, such as the behavior being not exactly the same, in edge cases.

Performance opens up entire new use cases that were hard to conceive before. Like the iPhone 13's Cinematic mode. Not only is the chip so fast it can figure out, live, which portion of the image you want to focus on; it even does so in a way that lets you adjust the focus later on.

And in any case, this is the M1 Ultra. It's not like Apple is saying "from now on, everyone needs a CPU with 20 cores". Probably not for many years to come.
psmith2.0
Mr. Vieira
 
Join Date: May 2004
Location: Tennessee
 
2022-03-10, 10:52

I wasn't calling anyone a gorilla, or ragging on this thread or anyone in it. I just think getting into the weeds over every spec - that's going to change/volleyball anyway - is a tire-spinning thing. Stuff is fast, faster than ever before. It'll be faster again in a short period. And then some more after that.

Quote:
Originally Posted by PB PM View Post
A reasonable person picks based on what works best and gets the job done most efficiently and cost effectively, regardless of the other aspects of the system (OS, CPU brand, etc.).
And that's exactly what I've done. At least back when I got in, it was the route to go. It's what I learned on, and those early jobs I had in the mid-late 90's in Orange County and San Diego...none of those art/graphics departments were PC/Windows-based. Over time, I was invested in the apps and my knowledge. I wasn't about to get a "cheaper" Windows PC, late-90's/early-mid 2000's. I'll own - and use/earn with - one Mac in the same span of time others I know will be on 2-3 machines rendered useless by God-knows-what. It doesn't happen near as much now as it did 10-15 years ago, I will say that. It's been several years since I've heard a good ol' fashioned virus/malware horror story, so that tells me that either a) Microsoft has improved on things, or b) everyone just automatically installs/runs utilities to keep things running smoothly. Or a bit of both.

I'm thankful I never had to make that stuff part of my computer-using life. I take it for granted that my stuff is just going to work/perform with a minimum of hassle. That's addicting.

Quote:
Originally Posted by chucker View Post
heat and ultimately the environment! Faster CPUs also have trickle-down effects. You can run them at a lower speed and save space, battery, power, and heat.
That's exactly what I said.

I'm not against "specs" and performance, at all. Why would I be? I don't want to buy an underperforming piece of shit, regardless of the price or logo on the box. But I never have. I think it's actually a testament to Apple's build quality/aging/OS support that I'm on a 2013 MacBook Pro and 2016 iPhone SE and still getting everything I need doing done. Just more confirmation that I went a good route ages ago. I don't want to buy a new setup/system every 2-3 years (Mac or phone), and it's cool that I don't have to. I've definitely never been forced to, at least. But I also do a specific type of work at this point, the past 6-8 years, that can be done on modest gear/specs. YMMV, obviously.

I just don't think year-to-year, leapfrogging-every-few-months "matters" much, that's all. It's going to happen, all platforms/processors.

I don't work in the "industry", I'm not a developer, etc. so a lot of this stuff...yeah, it doesn't direct/impact me (or people like me), and I'm going to downplay/ignore/be unaware of things like compiling, etc. If you're in a different boat, more power to you. Enjoy...it's a great time to have access to fast, powerful stuff. Whichever platform you're on. Sounds like everyone is upping their game. That's kinda what's supposed to happen...Intel should be getting their shit together and trying to best Apple. And Apple should be spurred on by whoever else is making waves. They have to think/worry about that kind of stuff, I don't. I just get to enjoy the fruits of it all.

My years of demanding, higher-end print-based work are far behind me, so it's been ages since I've truly needed cutting-edge performance, so that colors my outlook/stance on such things too. I just really don't care. Which are sometimes my favorite words to say.

Last edited by psmith2.0 : 2022-03-10 at 11:32.
Matsu
Veteran Member
 
Join Date: May 2004
 
2022-03-10, 13:40

Our computers can do more and more, and so too our low-end computers can do more and more. I know new software features tend to take up all the extra power and then some, but... I think about what's become relatively basic function over the years. Office tasks, web browsing, video conferencing, streaming... A machine that costs relatively little does all these things easily. I think audio, photo, and video processing/editing/compositing, 3D modelling and advanced simulation, maybe more or less in that order, are also becoming easier to do on comparatively more entry-level machines. It's kind of amazing what can be done on an iPad or a phone, for instance. I'm still editing fairly large pictures with relative ease on a 9-year-old machine... Ingestion can take some time, batch processing/export can take some time; I just walk away from the machine at that point. The rare image that gets a multi-layered pass through Photoshop is not always perfectly snappy, but everything is more than useable... If even entry-level machines are faster than what I'm running, at least for photo work, I think we're getting to the point where a basic machine leaves little to be desired.

psmith2.0
Mr. Vieira
 
Join Date: May 2004
Location: Tennessee
 
2022-03-10, 14:26

iPads are so impressive these days. You don’t even have to get a pro model to do so many things.
PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2022-03-10, 14:40

Indeed, the power of these modern CPUs is frankly amazing. If someone had told me ten years ago that you could put together an 8-20 core machine for under $3k, I would have laughed. At the time consumer chips were quad core, and high-end ones had hyper-threading. Today you can.