- cross-posted to:
- technology@lemmy.ml
- games@lemmy.world
It’s a bit clickbaity to say Intel “lost” the PlayStation business when they didn’t have it to begin with.
Sony has been using AMD CPUs for a couple of generations of PlayStation now. Moving over to Intel would have screwed up backward compatibility: it would have added a ton of work, hurt efficiency for anything meant to stay backward-compatible down the line, and thrown a monkey wrench into the works for any developers publishing for both generations during the switchover. The article touches on this a little bit.
Intel would have needed to offer a miraculously sweet deal for Sony to even consider switching, and with Intel generally struggling right now I can't see how they'd pull that off.
x86/x64 code is pretty much 100% compatible between AMD and Intel. On the GPU side it's not that simple, but Sony would've "just" had to port their GNM(X) graphics APIs over to Intel (Arc, presumably), much like most PC games work the same way across Nvidia, AMD and Intel GPUs. They have to do some of that work anyway even when staying with AMD, because the PS4's GCN isn't 1:1 compatible with the PS5's RDNA2 on an architectural level, and the PS4's Jaguar CPU isn't even close to the PS5's Zen 2.
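To illustrate the CPU-side point: the same x86-64 binary runs unmodified on either vendor's chips, and about the only difference you can even observe at the ISA level is the vendor string reported by CPUID. A minimal sketch (GCC/Clang on x86-64, using the compiler's `<cpuid.h>` helper; this is just an illustration, not anything from the article):

```c
/* Minimal sketch: the same x86-64 binary runs on AMD and Intel alike;
 * CPUID leaf 0 just tells you which vendor you happen to be on. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    /* The vendor string is packed into EBX, EDX, ECX (in that order). */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    printf("Running on: %s\n", vendor);  /* "AuthenticAMD" or "GenuineIntel" */
    return 0;
}
```

Build it once with `gcc vendor.c -o vendor` and the resulting binary runs on either vendor's x86-64 hardware; the real porting cost in a vendor switch sits on the GPU/driver side, not here.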
Other than that, you’re right. Sony wouldn’t switch to Intel unless they got a way better chip and/or way better deal, and I don’t think Intel was ready with a competitive GPU architecture back when the PS5’s specifications were set in stone.
Thank you for correcting and clarifying!
> A dispute over how much profit Intel stood to take from each chip sold to the Japanese electronics giant blocked Intel from settling on the price with Sony,
Yeah, that's what I thought: Intel simply isn't competitive. For a console SoC, the CPU part eats about twice the power of AMD's, and the GPU part costs about twice as much to make because it needs roughly twice the die area to match AMD on performance. That creates design restrictions and makes the whole system more expensive to build.
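To make the die-area point concrete, here's a rough back-of-envelope sketch. Every number in it is an illustrative assumption (wafer price, defect density, die sizes), not a figure from the article: the idea is just that cost per good die grows worse than linearly with area, because a bigger die means fewer candidates per wafer and a lower yield.

```c
/* Back-of-envelope die-cost sketch. All numbers are assumptions for
 * illustration only, not real pricing or real die sizes. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Rough dies-per-wafer estimate (ignores scribe lines, edge exclusion). */
static double dies_per_wafer(double wafer_diam_mm, double die_area_mm2) {
    double r = wafer_diam_mm / 2.0;
    return (M_PI * r * r) / die_area_mm2
         - (M_PI * wafer_diam_mm) / sqrt(2.0 * die_area_mm2);
}

/* Simple Poisson yield model: yield = exp(-defect_density * area). */
static double yield(double die_area_mm2, double defects_per_mm2) {
    return exp(-defects_per_mm2 * die_area_mm2);
}

int main(void) {
    const double wafer_cost = 12000.0;       /* assumed $ per 300 mm wafer */
    const double d0 = 0.001;                 /* assumed defects per mm^2   */
    const double areas[] = { 150.0, 300.0 }; /* hypothetical GPU die sizes */

    for (int i = 0; i < 2; i++) {
        double n = dies_per_wafer(300.0, areas[i]);
        double y = yield(areas[i], d0);
        printf("%.0f mm^2: ~%.0f dies/wafer, yield %.0f%%, ~$%.0f per good die\n",
               areas[i], n, 100.0 * y, wafer_cost / (n * y));
    }
    return 0;
}
```

With those made-up numbers, going from 150 mm² to 300 mm² roughly 2.5×'s the cost per good die, which is the kind of dynamic that makes a "needs twice the area for the same performance" design so hard to sell into a cost-sensitive console.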
I'm surprised Intel was even in the game; they'd obviously have to sell at a loss to compete. Maybe it was just to keep AMD from getting a sweet deal?
If Intel were serious about it and had confidence in their technology, they'd have taken a bad deal now and improved their tech to make it profitable later. But Intel already has profitability problems, so maybe they can't stomach another loss-making investment. When AMD took the console business from Nvidia/Intel, they made a cutthroat offer that Nvidia refused to compete with, and now AMD is dominant and makes good money on consoles.
Another good decision by Intel execs. /s
I was surprised to see that their negotiations broke down over price/cost rather than technology (an unproven node, and to my knowledge Intel doesn't really have any experience with the semi-custom x86 business).