You're making a lot of assumptions there.
Well yes, the PS4 and Xbox One are both semi-custom designs with similar building blocks, but by the looks of it the PS4 has 3 memory controllers while the Xbox One has 2. See the chip diagrams below.
xbone: http://www.chipworks.com/en/technica...-the-xbox-one/
ps4: http://www.chipworks.com/en/technica...ore-processor/
So yes, the building blocks are there, but these are still totally custom architectures. The OS on each device might have ten things running in the background, while a normal Windows/Linux box could have a hundred, and that alone might be enough to hide whatever latencies are in there. I don't know, as I'm no microarchitecture engineer, but I don't think you can claim that just because it works on a console it will work on a general-purpose computer.
How certain are you about that? More cache can increase the latency of storage and retrieval, since there's simply more to search through for a given bit of data. Maybe the cores are tuned to expect a given latency and more cache would throw them out of whack (technical term).
Here's a good read on Jaguar's memory and cache architecture if you'd like to read up on it.
http://www.realworldtech.com/jaguar/6/
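For what it's worth, here's the kind of quick-and-dirty test that shows the effect I'm talking about: a pointer-chasing microbenchmark. This is just a sketch; the buffer sizes, iteration count, and use of rand() are my own arbitrary choices, nothing taken from Jaguar itself. The idea is that the average latency per load steps up each time the working set spills out of a cache level, which is why "more cache" isn't automatically free.

```c
/*
 * Rough pointer-chasing sketch: measures average latency of a chain of
 * dependent loads as the working set grows past each cache level.
 * Sizes and iteration counts below are arbitrary assumptions for illustration.
 */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Walk a single random cycle of indices so the prefetcher can't help. */
static double chase(size_t n_elems, size_t iters)
{
    size_t *next = malloc(n_elems * sizeof *next);
    if (!next) return -1.0;

    /* Sattolo's algorithm: shuffle into one big cycle covering every slot. */
    for (size_t i = 0; i < n_elems; i++)
        next[i] = i;
    for (size_t i = n_elems - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;   /* rand() is plenty for a sketch */
        size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    volatile size_t idx = 0;
    for (size_t i = 0; i < iters; i++)
        idx = next[idx];                 /* each load depends on the last one */

    clock_gettime(CLOCK_MONOTONIC, &t1);
    free(next);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    return ns / (double)iters;           /* average ns per dependent load */
}

int main(void)
{
    /* Working sets from ~32 KB (fits in L1) up to ~32 MB (well past L2). */
    for (size_t kb = 32; kb <= 32 * 1024; kb *= 2) {
        size_t n = (kb * 1024) / sizeof(size_t);
        printf("%6zu KB: %.1f ns/load\n", kb, chase(n, 10 * 1000 * 1000));
    }
    return 0;
}
```

Compile with something like gcc -O2 -o chase chase.c; the jumps in ns/load between sizes roughly mark where each cache level runs out, and bigger caches generally buy you capacity at the cost of a few extra cycles per hit.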
Not sure where you're getting that. Maybe, but I don't see the evidence one way or the other.
In order of bolded text:
- Sure, if it works well.
- Some are, yes.
- "Big" is a vague term. 20% might be big, but if 20% = 3 fps then... not so much.
- That's the ticket right there: we don't know what "properly configured" means or costs.
- It might not actually be cheap if the development costs are huge.
- And they sell it as a server CPU, where thread count is usually the performance metric (as opposed to memory bandwidth, IPC, or whatever).
- You seem to be arguing that they do, and that it's in the Xbox One/PS4.
Actually, MS paid AMD for the development costs, and AMD gets a cut on each chip made.