I mean, if the new gen of GPUs has accelerators, it makes sense; actually, out of curiosity, does any of the new Intel stuff have any of that?
I'm still on the old i5 chips.
I’m having a hard time understanding your question, but I’ll try my best:
“if the new gen of GPUs has accelerators”
GPUs are pretty much nothing but [graphics] accelerators, although they are increasingly general-purpose for parallel computation and have a few other bits and pieces tacked on, like hardware video compression/decompression.
If you typo’d “CPU,” then the answer appears to be that Intel desktop CPUs with integrated graphics are much more common than AMD CPUs with integrated graphics (a.k.a. “APUs”), because Intel sprinkles them throughout their product range, whereas AMD mostly leaves the mid- to top-end of their range sans graphics on the assumption that you’ll buy a discrete graphics card. The integrated graphics on the AMD chips that do have them tend to be way faster than Intel integrated graphics, however.
If you mean “AI accelerators,” then the answer is that that functionality is inherently part of what GPUs do these days (give or take driver support for Nvidia’s proprietary CUDA API), and CPUs from both Intel and AMD are also starting to come out with dedicated AI cores (NPUs).
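To make that concrete, here’s a minimal sketch (assuming PyTorch, which is just one common stack) of why GPU acceleration is usually “free” from the software’s point of view: you ask the framework for whatever accelerator it can see and fall back to the CPU otherwise:

```python
# Minimal sketch, assuming PyTorch is installed. Most AI software targets
# whatever accelerator the framework can see -- on Nvidia hardware that
# means the GPU via CUDA -- and falls back to the CPU otherwise.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(1024, 1024, device=device)
y = x @ x  # the matrix multiply runs on the GPU if one was found
print(f"ran on {device}, result shape {tuple(y.shape)}")
```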
“does any of the new Intel stuff have any of that? I’m still on the old i5 chips”
“Old i5 chips” doesn’t mean much – that just means you have a midrange chip from any time between 2009 and now. What matters is the model number that comes after the “Core i5” part, e.g. “Core i5-750” (1st gen, from 2009) vs. “Core i5-14600” (the most recent gen before the rebranding to “Core Ultra 5,” from just last year).
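(If you ever want to decode one of those model numbers yourself: the digits before the last three are the generation. A rough sketch of that heuristic – the parsing details here are my own illustration, not anything official from Intel:)

```python
import re

def intel_core_generation(model: str) -> int | None:
    """Rough heuristic: 3-digit Core iX models (e.g. i5 750) are 1st gen;
    for 4- and 5-digit models, the digits before the last three are the
    generation (i5 2500 -> 2, i5 14600 -> 14). Not official, and some
    oddball SKUs will break it."""
    m = re.search(r"i[3579][\s-](\d{3,5})", model)
    if not m:
        return None
    digits = m.group(1)
    return 1 if len(digits) == 3 else int(digits[:-3])

print(intel_core_generation("Core i5 750"))    # -> 1
print(intel_core_generation("Core i5-14600"))  # -> 14
```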
As far as “it makes sense” goes, to be honest, an Intel CPU would still probably be a hard sell for me. The only reason I might consider one is if I had some niche circumstance (e.g. I was trying to build a Jellyfin server and having the best integrated hardware video encode/decode was the only thing I cared about).
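(For context, that scenario is basically Intel Quick Sync. A hedged sketch of what the transcode step would look like, driving ffmpeg’s QSV decoder/encoder from Python – the filenames are placeholders, and it assumes an ffmpeg build with QSV support:)

```python
# Hedged sketch of a Quick Sync transcode for a media server like Jellyfin.
# Assumes an ffmpeg build with QSV support; filenames are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv",       # decode on the iGPU's fixed-function block
    "-c:v", "h264_qsv",      # Quick Sync H.264 decoder
    "-i", "input.mp4",
    "-c:v", "hevc_qsv",      # Quick Sync HEVC encoder
    "output.mkv",
], check=True)
```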
What I really had in mind when I say it makes me want to buy Intel (aside from joking about rejecting “AI” buzzword hype) is the new Intel discrete GPU (“Battlemage”), oddly enough. It’s getting to be about time for me to finally upgrade from the AMD Vega 56 I’ve been using for over seven(!) years now, so I’ll be interested to see how the Intel Arc B770 might compare to the AMD Radeon RX 9070 (or whichever model it’s competing against).
These companies are putting NPUs into their processors, and nobody will ever build the software to use them because everything is done on GPUs. It’s a dog and pony show.