AMD to buy ATi?!

AML

Well-known Member
Been hearing about this on different web sites.

What do you guys think will happen after they merge?

I assume AMD will advertise their chips as working best with ATi video cards, but I doubt they would ever make ATi cards exclusive to AMD chips or anything like that.

It would be nice to see them making PCs together. AMD chips, ATi GPUs and PPUs, all in one box.
 

AML

Well-known Member
Interesting read indeed. But this quote:

"Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete, if it doesn't, Intel will leave it in the dust, and the company will die."

Is quite surprising.

Don't really know what to make of it. But I'm guessing that AMD and ATi want to make a CPU with a built-in GPU?
Thereby getting rid of the GPU card?

What about mini cores? How many cores will a CPU have? And at what speed? How will this affect gaming?

Too confusing for little old me!:confused:
 

cwick

Novice Member
Putting the GPU into the CPU makes sense and seems inevitable. As the CPU transistor count rises it pulls in more and more functionality from the supporting chipsets, and GPGPUs are too useful to be left just for graphics.

Core counts will continue to rise (Sun's Niagara is at 8 four-way hyper-threaded cores today - so 32 hardware threads!), and putting the CPU and GPU on the same piece of silicon will lead to a massive leap in capability (downside: your next upgrade will be more expensive - you have to throw away both the CPU and the GPU. And we know how quickly Intel et al like to change socket designs, so best budget to change your motherboard too).

The downside to all this is that the software has to be scalable to take advantage of additional cores. Not just 'multi-threading', but full parallelism - if there are 8 cores, use 'em. If there are 32, use all of those too. That's much harder to do, and I know it's a major area of research - from language design to software tooling and compilers, parallelism is being treated as a major paradigm shift in the industry. Keep in mind that Moore's Law has given software developers progress almost for free thus far - write your code once, and just wait for clock speeds to increase to see it go faster. Now you have to write with the expectation that you'll get not higher clocks (at least, not significantly higher), but more cores. It's a different mindset, and will require everyone to start over.

Which is great news if you're a software developer ;)
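To make the "if there are 8 cores, use 'em; if there are 32, use all of those too" idea concrete, here's a minimal Python sketch of core-count-agnostic parallelism. The function names and the sum-of-squares task are purely illustrative - the point is sizing the worker pool from the machine's core count rather than hard-coding it:

```python
# Minimal sketch: instead of hard-coding a thread/process count, size
# the worker pool from os.cpu_count() so the same code scales to
# however many cores the machine has (8, 32, or more).
import os
from concurrent.futures import ProcessPoolExecutor

def work(chunk):
    # Stand-in for a CPU-bound task (illustrative only).
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=None):
    # Default to one worker per available core.
    workers = workers or os.cpu_count() or 1
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Each chunk runs on its own core; combine the partial results.
        return sum(pool.map(work, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1000))))
```

The same code runs unchanged on an 8-core or 32-core box; only the pool size differs. Whether real workloads split this cleanly is exactly the hard research problem described above.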
 

mikeyparkster

Standard Member
I heard about this the other day. A bit worrying considering I have an Nvidia GPU and an AMD processor lol... prospects for future upgrades look worrying...
 

AML

Well-known Member
mikeyparkster said:
I heard about this the other day. A bit worrying considering I have an Nvidia GPU and an AMD processor lol... prospects for future upgrades look worrying...
:D Me too.

But I don't think it means they will suddenly stop working together.

I think it's going to be a long time before we see any major shift in the industry like what is being talked about. The current setup of having a CPU and a GPU works well and brings in the money.
Why try to fix what isn't broken?

When they do go that way, it will make things interesting and maybe even make PCs smaller? Right now things like quad SLI take up so much space!

What about PPUs? Are they gonna go the way of the dodo? Or will they find a way to integrate those too?
 
SuperSaiyan4

It wouldn't make any sense to put the CPU and GPU in one - it's too complicated in the sense that a graphics card works with the board, i.e. memory, chipset and also cooling. GPU+CPU = massive heat output AND no prospect of being able to upgrade the GPU or CPU separately.
 

Rasczak

Distinguished Member
And we know how quickly Intel et al like to change socket designs, so best budget to change your motherboard too
Apart from the hassle factor of dismantling your PC, motherboards aren't an expensive upgrade - surely if you're upgrading your CPU you will frequently reap more of a benefit from upgrading your MB at the same time?
 

AML

Well-known Member
mikeyparkster said:
nah never the psu intergrated with that...cant put ALL your eggs in 1 basket can ya
Not PSU, PPU (physics processor).
Recently there's been lots of talk regarding PPUs - the one from Ageia, as well as ATi deciding to use a third graphics card as a PPU.

I was just wondering what they would do now that they have merged. They may integrate the PPU with the CPU/GPU.
 

AML

Well-known Member
SuperSaiyan4 said:
It wouldn't make any sense to put the CPU and GPU in one - it's too complicated in the sense that a graphics card works with the board, i.e. memory, chipset and also cooling. GPU+CPU = massive heat output AND no prospect of being able to upgrade the GPU or CPU separately.
Well, that's how this merger would benefit both companies as one. When you upgrade your CPU you are also upgrading the GPU, and vice versa.
Since the GPU needs updating more often, I guess AMD wins out on that one.

They would really need to bring costs down though. They wouldn't dare charge the cost of a high-end CPU and GPU at the same time, would they?
 

Singh400

Distinguished Member
Trust me, this is not being done just to get one up on the competition. Both AMD and ATI have a game plan. What better way to achieve it than this?

The entire PC/entertainment market is going to change once Vista is released. With CableCARD technology coming with Vista, and ATI at the forefront of releasing CableCARDs, this was an excellent move by AMD.

This is going to affect the entire PVR/DVR market. This is the right time for AMD to start marketing their mobile CPUs. With ATI's mobile chips already having a stable market, and ATI's AMD-only motherboard market achieving increasing success as well with their integrated graphics et al, the only thing missing is the CPU. And guess who is going to provide the CPUs?

This is what I perceive from AMD's bold move, and I am 99% sure this is exactly what AMD thought as well!
That's what a friend said on another board. Makes sense IMO.
 

semiskimmed

Distinguished Member
cwick said:
you have to throw away both the CPU and the GPU. And we know how quickly Intel et al like to change socket designs, so best budget to change your motherboard too).
I wouldn't say you'd throw away kit, as there's always someone else further down the line that doesn't like to sit on bleeding-edge hardware, so there's always a market for used bits. But the price of upgrading will go up, as you'll be doing it in one go :(
And what's this about Intel wanting to keep changing sockets? They've kept LGA775 from the last-gen P4s and continued with the Conroes. They need new BIOSes, which may or may not appear - I don't know that much about it tbh - but they've not changed socket this time.
 

WelshBluebird

Novice Member
semiskimmed said:
And what's this about Intel wanting to keep changing sockets? They've kept LGA775 from the last-gen P4s and continued with the Conroes. They need new BIOSes, which may or may not appear - I don't know that much about it tbh - but they've not changed socket this time.
You still need a new board though, as older LGA775 boards won't work with Conroe, because Conroe needs a different voltage (or something like that).

But yeah, it's better than changing socket, as you could get a Conroe-ready board with a cheap Pentium 4 / D CPU for now, then upgrade to a Conroe / Kentsfield later on.
 

semiskimmed

Distinguished Member
I wouldn't be surprised if a lot of BIOSes are released for the older 775 boards though; it's only a matter of time for the guys that can control all the voltages etc. via a BIOS rework ;)
As long as the CPU fits it should be fine. There are a few boards that will support the Conroes but were built for the older CPUs, as Intel made a BIOS for them :thumbsup:
 
