Like that helped nV30. :clap:
..."You may say that I’m a dreamer, but I’m not the only one..." [u]Imagine[/u] by John Lennon :p
Absolute nonsense!
Quote:
Inferior product on a better process can change the tide.
Proof 1: Netburst @ 65nm didn’t turn the tide against K8 @ 90nm; the design of the new architecture did!
Proof 2: RV670 @ 55nm didn’t turn the tide against G92 @ 65nm; the new design of RV770 will…
Proof 3: NV35 @ 130nm didn’t turn the tide against R300 @ 150nm, ‘cos R300 was the superior design.
The list goes on, and the point is: design always wins against manufacturing process.
P.S.
Oooh, I see now that DilTech and Macadamia beat me to replying to this dreamer’s theory of miraculous manufacturing processes.
I didn’t say it would do miracles. But if you are some 10-20% behind on the same process, you can catch up by being a process ahead: 65 vs 45nm, etc.
The P4 is also a whole other issue, due to leakage at extremely high frequencies.
For those with reading comprehension problems: "can change" doesn’t equal "will change".
Quote:
Inferior product on a better process can change the tide.
Oh, Larrabee IGP on value Sandybridge will be about 10x of G33, with value Nehalem being some 6x. They haven’t disclosed any discrete GPU performance hints.
RV770XT and PRO are 256bit
512bit internal ring
We've posted that RV770XT would be 512-bit, but our usually very reliable sources misinterpreted some parts of the information. The card is 512-bit internally, as ATI has its internal ring bus memory interface, but externally we are talking about a 256-bit card.
Both RV770XT and PRO will be the same and they will work with 256-bit memory. The RV770XT pairs with GDDR5 memory, and the ultra-high speed should help the card improve performance, but from what we know, Nvidia’s GT200, the GeForce GTX 280, will work with a real 512-bit memory controller.
GT200 is in a higher league altogether, and we believe that GT200 will easily beat RV770XT, but at the same time it will come with a much higher price, almost twice as high. It’s now less than a month before we see these cards in action.
Source : Fudzilla
If that is true, I don't think they'll stand any chance, not even to the 260.
I predict that once again nVidia is going to rule AMD for 2008. I don't care what anyone says about price performance ratio, it's not about that, it's about maximum performance.
Absolute performance is pointless if no games require it. 160 fps yay.
Tell me someone who really has BOUGHT 2x 9800GX2s. Yeah, a friend of yours... and another... and another... Now, do I even have to name someone who has bought an 8800GT/GTS? Yeah, no. ;) Get it?
Do you get the fact that they have quite the same amount of bandwidth, despite the 512-bit bus for the Nvidia card? Do you have any clue how this is possible? No? :rolleyes:
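To see how a 256-bit card can keep up with a 512-bit one, just run the arithmetic: peak bandwidth is bus width in bytes times the per-pin data rate. A minimal sketch in Python, assuming the 4.0 Gbps GDDR5 parts from the Qimonda press release quoted later in this thread and a ballpark 2.2 Gbps GDDR3 clock for the Nvidia card (my assumption, not a confirmed spec):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 256-bit bus with the 4.0 Gbps GDDR5 parts from the Qimonda announcement
rv770_bw = bandwidth_gb_s(256, 4.0)
# 512-bit bus with GDDR3 at roughly 2.2 Gbps (an assumed clock, not a confirmed spec)
gt200_bw = bandwidth_gb_s(512, 2.2)
print(f"RV770: {rv770_bw:.1f} GB/s vs GT200: {gt200_bw:.1f} GB/s")
```

With those inputs the two cards end up within about 10% of each other, which is the whole point of ATI betting on GDDR5 instead of a wider, more expensive PCB.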
Quite logical, as not every gamer had 600/1200 bucks to spend on a single or dual card setup... really don't get ya point there... If ya want the fastest cards out here you need to spend cash...
And until I see real FPS numbers instead of all these rumours and tech talk, no one can tell how these cards will perform. All these nice numbers mean zilch to me, as I want to see it perform on my screen, not on the stickers on the box... and that counts for both teams...
I just buy the fastest thing out there and keep that for a year (or two) If the card is green or red who cares.... I just want something to replace my 8800GTX...
While I do tend to buy the fastest thing out there, I also do so with the reason that it's legitimately worth that cost. If not for the fact that I got my 8800Ultra, I would've gotten the 8800GTX simply because paying $200 more for a marginally faster card (in all the games I care about anyways) would not have been worth it. For the same reason that if a 4870 kills every game out there, then spending $300 more on a GTX280 or whatever to play the game at 120 fps vs. 80 fps is going to be hard to justify. It all depends on just how much of a gap the cards bring in games and how big of a gap the pricing is.
Any word on the layout of the HD-4850/HD-4870 ? Last thing I read (dunno where) was that the PCB will equal the one of the HD-3870.
I can name quite a few games that could use some more horsepower...
http://techreport.com/r.x/radeon-hd3...-1920-high.gif
http://images.hardwarecanucks.com/im...GTX/GTX-60.jpg
http://images.hardwarecanucks.com/im...GTX/GTX-73.jpg
Etc....
No, the pcb will be different. I believe this old picture is still the only pcb picture out on the internet.
http://img.techpowerup.org/080522/rv670vsrv770_pcb.jpg
Probably a nice jump without AA/AF, and a bigger one with. Which should be pleasant... enough for the G92b (which should be bigger AND slower at the same time)
Now that 8xAA seems usable, watch some G92 cards drop :D
regards
Quote:
Qimonda Wins AMD as Partner for Launch of New Graphics Standard GDDR5
Qimonda AG, a leading manufacturer of memory products, today announced that the company has won AMD as launch partner for the new graphics standard GDDR5. Qimonda already started mass production and the volume shipping of GDDR5 512Mbit components with a speed of 4.0Gbps to AMD, a leading global provider of innovative processing solutions in the computing, graphics and consumer electronics markets.
GDDR5 will become the next predominant graphics DRAM standard, with a tremendous memory bandwidth improvement and a multitude of advanced power-saving features. It targets a variety of applications, starting with high-performance desktop graphics cards, followed by notebook graphics. Later on, introduction in game consoles and other graphics-intensive applications is also planned.
“We are very proud to supply AMD with GDDR5 volume shipments only six months after first product samples have been delivered,” said Robert Feurle, Vice President Business Unit Specialty DRAM of Qimonda AG. “This is a further milestone in our successful GDDR5 roadmap and underlines our predominant position as innovator and leader in the graphics DRAM market.”
“Qimonda’s strong GDDR5 roadmap convinced us to choose them as a primary technology partner for our GDDR5 GPU launch,” said Joe Macri, Sr. Director, Circuit Technologies, AMD. “Both the early availability of first samples and volume shipments added great value to the development and launch of our upcoming high-performance GPU.”
More information on Qimonda’s GDDR5 products is available at: http://www.qimonda.com/graphics-ram/gddr5/index.html
Qimonda
http://www.techpowerup.com/60869/Qim...ard_GDDR5.html
http://www.xtremesystems.org/forums/images/xslogo.gif
Since when is this Xtreme Budget Systems? Maximum performance is what we need. :rolleyes:
Atom 1.33GHz (45nm) has a lower CPU score than a Celeron 353 900MHz (90nm), and that's with a worse companion chipset as well. Process will always be beaten by a superior architecture, even if it's only clock for clock.
R600 was ATI's Netburst, RV670 was its Core Duo & RV770 is their Allendale. Still no Conroe yet, though.
Seems like everyone is forgetting that the GT280 will have to fight the 4870X2.
You forgot the wattage and size. Process is not always beaten by a superior architecture, unless the architecture is superior enough to offset the process gain.
Take 2 different GPUs: A has 20% higher IPC than B, but A is a 1GHz GPU on a 65nm process, same as B on 65nm. If I make GPU B on a 45nm process and run it at 1.3GHz, then B wins. Very simple. If B still runs under 1.2GHz, A wins. Assume they all have the same TDP.
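The comparison above can be sketched as a toy model, assuming throughput is simply IPC times clock and TDP is equal (both assumptions come from the post, not from any real GPU data):

```python
def relative_perf(ipc: float, clock_ghz: float) -> float:
    # Toy model: throughput scales linearly with IPC and clock
    return ipc * clock_ghz

gpu_a = relative_perf(1.2, 1.0)          # A: 20% higher IPC, 1.0 GHz on 65nm
gpu_b_shrunk = relative_perf(1.0, 1.3)   # B shrunk to 45nm, clocked at 1.3 GHz
print(gpu_b_shrunk > gpu_a)              # the shrink wins: 1.3 > 1.2
print(relative_perf(1.0, 1.15) > gpu_a)  # under 1.2 GHz, A still wins
```

The break-even point is exactly where B's clock gain equals A's IPC lead, which is the poster's 1.2 GHz threshold.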
FischOderAal,
Just the version with the pasted GPU is fake. AFAIK, there's no reliable info on what the card with the dummy GPU actually is.
The RV770XT is paired with GDDR5 thus has comparable BW to GT200... 512bit external bus would greatly increase cost(both for the manufacturer and consequently consumer) due to PCB complexity. 512bit + GDDR5 = overkill; that would hurt sales and margins.
Wow... So only $300 for the RV770XT? Have any proof to back those claims? I thought its MSRP was $349, which, thanks to e-tailer price gouging, always ends up higher than that...
HAHA, love the cheap shot. I think you might be taking this a little too personally...
While a smart move for the budget limited, that's about as extreme as watching paint dry.
Why would you get low-end hardware? At the worst I would get performance/midrange stuff, and either way that's not exactly extreme. Smart buying and being extreme are two different things.
extreme: "exceeding the bounds of moderation"
While it is smarter to buy things like an 8800GT or 9600GT and an E8400, or even an AMD CPU, than a QX9650 and two 3870X2s for 3DMark benching, that isn't exactly what most people here consider extreme.
no problem
The biggest issue I have is that there are people who try to say that you have to buy the most expensive hardware to have a good rig, or that you can't spend xx on certain hardware. There are some situations where the extra $30 can make a huge difference (such as the 8800GTS 512MB: it can be bought for as low as $160, only like $30 over the cheapest 3870), but it also isn't smart to buy the 8800Ultra when the G92 GTS outperforms it. Thing is, it's hard to teach that balance without letting people make their own mistakes/decisions. I know I learned the hard way myself by not waiting for Conroe and going straight to AM2, but hey, it does what I want, so I'm fine with it for now. Besides, I think K10.5 will be a great cheap upgrade for me.
RV870 will be on a whole new architecture and the latest insider info revealed that it will have 1000 SP. RV740 is 45nm version of RV730 and will appear by end of the year.
http://bbs.chiphell.com/viewthread.p...extra=page%3D1
WTF is ATI doing? They have to make 2 GPU revisions to compete with the GT200? Well, that will mean dirt cheap RV700s!!!! :D
My understanding is perfectly fine, thank you. You gave a definition yourself; where does it talk about money... right, nowhere. Some are silence extremists, some are cooling extremists, some are performance extremists; to each his own. Going by your reasoning, the noob that buys the expensive stuff and runs it stock is extreme, while the guy that buys midrange, performs all the vmods and whatnot, and blows the noob out of the water performance-wise is not extreme.
BS
Well, this site's address says everything:
http://bbs.chiphell.com/
bbs = big bulls*** :P
I love these leaked infos.
But see, I never said you can't blow the n00b out of the water. We're not talking about average people here at XS, though; you're being compared to people who just about all know how to OC and whatnot, and that's why the more expensive hardware can help push people like Shamino and Kinc towards the world records.
So yes, you can be extreme compared to the average person, but I was comparing you to the people here at XS, and that does result in having to spend some more money.
Fake but it has been rumored that AMD's reference RV770 will use a slightly tweaked RV670 PCB but with better components.
Who said anything about budget? We are talking about economics, finances, business strategies, etc. But still, comparing a ~$300 card to a $600+ card is pointless; compare it to the AMD card in the same price range.
Your answer-
Well, with it rumored to offer better performance than the 9800GTX, it would make sense to price it at $350, but I don't see AMD wanting to give up any sales, so they will price them lower.
I don't know how many times you have been asked, I know it is more than 5, but please read the thread or at least use the search button.
Also, Jawed was the one who just used 2000 ALU lanes as an example, explaining something else and it looks like certain people wanted to make a rumor out of it.
It is possible, though; it is only 1000 shaders on each GPU.
Chiphell usually has some pretty good information, only because they usually post ANY information/rumor/BS.
Yeah, but all of those games are old, bad, and boring. World in Conflict blew, I hated it; Crysis was a tech demo worth playing once; and Call of Juarez... ugh.
Oh, as for Tim's statement about the 256-bit bus hampering the RV700... you haven't been reading this thread or anything concerning ATI's new card, have you?
:rofl:
I have 6 USB slots at the rear of my tower PC not being used.
Now, would I get 6 times the speed, or, being USB, would it run in serial mode?
But they would be seen as 6 independent multi-core video cards; would this be in SLI or Crossfire mode?
The only problem I see is finding space for the 6 x 2,700 watt power supplies needed to power these wonders of technology.
:rofl::rofl::rofl::rofl::rofl::rofl::rofl:
This is all becoming a circus. Are there any concrete release dates yet?
It's going to happen on the 18th for the 4850, a week after for the 4870.
So strange, how end of last year AMD said no new cards, yet here we are talking about such a card. My head is doing cartwheels with all the rumours....so you say June 18th?
True! very true. Lots of people can be extreme in their own ways.
The likes of Shamino, Kinc, FUGGER etc. are what I see as the Xtreme end of the spectrum, where almost all the rest of XS members aspire to be.
Then you have your average enthusiasts that are happy with Q9450's or E8400's or even X4 9650's etc & overclocking to get higher performance.
Then you have the budget enthusiasts who buy E6xxx, E4xxx, E2xxx, X2's or X3's & overclock them to the hilt.
All are Xtreme in their own ways (for some Xtremely boring, others Xtremely ignorant etc) and it is that variety that makes XS work for even average Joe Bloggs to learn something new, get good advice or just have a laugh at all the fanboys going at each other.
I fall into the average/budget enthusiast camp, as I spend a set amount every year (£2,000-£3,000) to update 4 PCs & a server. Being Xtreme doesn't mean XtremelyBankrupt, although that might be an idea for a sister site for those that take it too far.
Make it some. And even if they could get all their hardware for free... you're still staring at the ocean but failing to see the water :D
You're missing the cooling costs, such as the expensive pots, LN2 dewars, LN2 supply, expensive thermometers & multimeters, etc etc.
"Corrected above".
I forgive you ;)
While we do get a lot of freebies, the amount of money you have to spend on hardware and the other stuff which is required to stay at the top is pretty insane.
Even if we (NH) had a deal with AGA for liquid nitrogen, it would still cost us a pretty penny.
(I graduate in 2 weeks, I'm finally going to start making money again. God damn, this is going to be one expensive summer...)
//Andreas
I agree; even if I were rich I wouldn't spend $400+ on a GPU or CPU.
i would.....
and a few smaller sys for xtreme oc and crunching
That's what I was going by. Shamino was recently interviewed, and I've talked to Kingpin a couple of times (and I've got to say he's an amazing guy; he definitely hasn't let the fame go to his head), and I can definitely say he doesn't get everything free, and that he's spent plenty of money on hardware (especially when trying to find a "golden" chip).
Those are shader clocks.. and it looks like they're overclocking
It does look like a shader clock, but I have to ask: where is the core clock? BTW, it does say core clock at the bottom.
And no, it doesn't look like they are overclocking.
Also, I would like to point out the fruits of Jawed's searching through certain companies' patents-
Quote:
Originally Posted by esp@cenet
Hurry up and release the cards already.
dog champing at the bit, drool slaver.
Here are the cooler pics: RV700, RV700XT, RV770PRO, RV730
Source, translated: http://translate.google.com/translat...F8&sl=de&tl=en
Original source: http://www.hardware-infos.com/news.php?news=2089
I want to do CF with my old 3870, please ^^
The Cooler Pictures of Radeon HD 4870x2, 4870, 4850, 4690 leaked
http://www.pczilla.net/en/post/21.html
I like how they are making the bases of the RV770PRO and RV730 coolers all copper now; that should make a nice difference. The only big complaint I have is: why did they go back to the R600-style cooler from the RV670? They should have left the bottom of the fan open, IMO; that would have allowed for better cooling of the power circuitry, and more air could also be taken in with less noise. At least they decided to put some heatpipes on the RV770/R700 coolers. I think it was a mistake not to do that with the 3870 and 3870X2, even if they didn't need it; it could have brought complete silence and better cooling.
I like the concept of the blower they're continuing with on the R700; it reminds me of the original monster 2900XTX cooler (which I think it was a shame never got released; even if the performance difference wasn't much, they still should have released it for the uber cooler it had). That thing had something like 4 heatpipes with a blower at the end; I would gladly deal with the length anytime for amazing cooling performance.
Anyone got any idea why the sides are left open at the end of the coolers (for the 4870 and 4870X2)?
Black and red = win! :)
I've said it before I'll say it again.
The HD 4850 Pro only requiring a single-slot cooler, even after the architecture upgrade, is a definite accomplishment on the part of ATI.
Perkam
Yup, that monster one that only a few OEMs got in the end. I still think ATI should have at least offered it as, say, a 2900XT Ultra edition with much higher stock clocks or something along those lines; Nvidia did that with G80 and made plenty of extra sales that way.
I couldn't agree with you more: 480 shaders with a single-slot cooler will be amazing. I just wonder how loud the fan will be and how hot it will run under load (the 8800GT is a good example of that).
Charlie is saying that the R700 should beat the GeForce 280.
http://www.theinquirer.net/gb/inquir...0-280-revealed
That anti-nV talk just hurts my brain. :shakes: Though at least it is something against the regular anti-ATI talk, it doesn't really help here.
It's all FUD! GT200 > R700 in terms of raw performance (so 3DMark benchers are happy), but on performance/price ATI wins.
Nah, I think the R700 will spank GT200 in 3DMark (except maybe Vantage; I think Nvidia had a lot to do with that, though); we're comparing 960 shaders (minimum, with a very low possibility of 1600), and 3DMark is extremely shader-intensive. As for in-game performance, it'll be close, but the GTX 280 will probably have a slight lead over the 4870X2. As big a leap as the RV770 is over the RV670, ATI is still playing catch-up, so it's hard to say at this point.
And guys, leave the fanboyism out of the thread, please. Give an honest guess of what you think will happen; there's no point in having a flamewar ruin a 40-page thread.
Well, if we extrapolate our non-existent data, i.e. 1x GeForce 280 is 3x faster than a 3870 and an RV700XT is nearly 2x a 3870, then the R700 should just about win. That's my 2 cents. ;)
See, the thing is, folding means nothing for in-game performance, and also, due to the law of diminishing returns, except at high detail I don't think the 4870 will be 2x the 3870.
nVidia probably used a slower CPU for their folding "benchmark". Ahem. As CUDA isn't as CPU-intensive as CAL is, for now.
As an ATI fan, I didn't mind nVidia having the performance crown and ATI the price/performance crown, but I wouldn't mind ATI taking the performance crown for a change... this because nVidia has been pretty much on top for the last 2 years.
Although the GTX 280 will be very powerful, I don't see any trouble for ATI, as nVidia's top model will consume 236W, with all the negative details that will bring.
ATI will have some powerful cards for a good price, while their top dual-GPU card can probably compete with nVidia's best card.
Weeee, I am so psyched for both the 4870 and GTX 280. The price of the 4870 makes it look very attractive; can't wait for leaked/rumoured bench numbers.
Are you thinking of the 4870 or the 4870X2 (which is the R700)?
Because two 4870s vs one GT280 could very well result in Nvidia being inferior, as said.
I don't think it's biased to make such a guess.
Yeah, it should. Whether it will is a guessing game for now
i'm not impressed by the stock 4870x2 cooler.... single heatpipe?
2 heatpipes.
I reckon 4870 vs GTX 260 is a much closer fight than most reckon; probably 3870 vs 8800GT v2, but this time around the former card IS attractive enough.
Two 4850s face off against the GTX 260, as posted earlier: same prices but probably better performance.
And the big question...
2 4870s vs GTX280. It's hard to estimate, but there might be a probability that BOTH are competent. :rolleyes: ;)
No way the 4850 is up against the GTX 260 with the current rumored specs. IMHO, the direct competition of the 4850 will be the G92b.
RV770 is not very powerful compared to GT200. Even if it were some 65% of what GT200 is (and some 130% of what G92 is, 105% of what G92b is), two of them would ideally give 130%, yes, but problems with CF and scaling make R700 some 90-95% of GT200. This is my guess. :p
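The estimate above can be checked with quick arithmetic. A sketch, taking the poster's 65% figure as given and treating CrossFire efficiency as the only unknown (the ~40-46% contribution of the second GPU is just what the 90-95% conclusion implies, not a measured number):

```python
def dual_gpu_perf(single_gpu: float, second_gpu_contribution: float) -> float:
    """Dual-GPU performance when the second GPU adds only a fraction of itself."""
    return single_gpu * (1 + second_gpu_contribution)

# Perfect scaling: 0.65 * 2 = 1.30 of GT200, the "130%" in the post
print(f"ideal: {dual_gpu_perf(0.65, 1.0):.2f}")
# The 90-95% conclusion implies the second GPU contributes only ~40-46%
print(f"low:   {dual_gpu_perf(0.65, 0.40):.2f}")
print(f"high:  {dual_gpu_perf(0.65, 0.46):.2f}")
```

So the whole argument hinges on CF efficiency; with anything near ideal scaling R700 wins, with poor scaling GT200 does.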
Dude, no way G92b will be 20+% faster than G92; it's just not going to happen. G92b is a direct shrink. Look at R600 to 3870: did you see any big IPC gains? The same thing will happen from G92 to G92b; it'll drive down prices and costs and bring up yields, but it won't do much for performance. You'd need it clocked at something like 800+MHz stock for it to be 20% faster than G92, and you know as well as everyone else that won't happen unless Nvidia plans on having low yields again.
Yeah, G92b is 55nm, so it's an optical shrink. It'll be better priced than at 65nm, but performance-wise it won't see any massive improvement.
I'm not firing up my calculator, but let's say that G92b will arrive clocked at (I'm not saying this originates from any source; this is pure speculation, not even well thought out :p: )
Core 800MHz
SPs 1998MHz
Mem GDDR5-3200
Which is feasible, considering that a 65nm 8800GT/8800GTS512/9800GTX can run those clocks (780 to 800MHz core, depending on your luck) with a decent air cooler (HR03-GT), except the RAM obviously.
There's also a chance (not that much, but still possible) that nVIDIA has tweaked the SPs and the MUL is working "properly" on G92b (it didn't work most of the time on G8x/G92/G94), which would result in even better numbers.
We just have to wait a tad to find out (for the info, the card more than likely won't be out before August).
Yeah. I just threw out the G92/G92b numbers without really thinking. If G92b is (only) G92 @ 55nm, then with what is nV going to compete against RV770? ...I haven't heard about low/mid range chips based on GT200. :|
Well, anyway, I think that R700 ends up somewhere between the GTX 260 and GTX 280, leaning towards the GTX 280. Though I am not very sure how big the difference between the 260 and 280 is. I've heard about 25%, which is IMO rather high to be true.
Nothing, except maybe the GTX 260, but that's probably designed to compete with a 4850X2 (assuming ATI makes one). Or perhaps GT200 is so powerful it will compete with the 4870X2 and there won't be any competition for the GTX 280 (like what happened with G80 vs R600, where the G80 GTX was so powerful that the GTS was in line with the 2900XT).
It is possible for the G92 GTS to run there at stock (I haven't seen any non-modded GT do it, though; don't know where you've seen that), but the thing is, G92's yields were bad enough at 65nm. I don't see how they'll be so much better at 55nm that the cards can run 800MHz stock (I predict probably around 700MHz for the 9900GT, and perhaps 750, 775 tops, for the 9900GTX). Also, why would G92b get GDDR5? You said it yourself: the reason Nvidia didn't bother with GDDR5 for GT200 (aside from more ROPs, though they could have changed their design to work around that) is that ATI would get just about all the GDDR5 in the channel. So they'd be hurting their yields even more by going with GDDR5; I just don't see it happening.
When's the launch date? Is it before or after gt200?
I believe it is somewhere near GT200, a bit later I think. So August it is. :P