-
yes, but onboard video makes it great for an internet machine, or folding, or a distributed computing platform.
I already have a gaming system.. don't need much functionality for my others in regards to 3D. Just sold 2 systems too... gotta build 2 more... but not with the Sapphire board!
-
Why would you want a DFI if you just want a folding board?
It is made for two things: gaming and benching....
and CrossFire shouldn't even enter that equation....
-
I don't want a DFI.. I want a board with decent overclocking options and onboard video. Not one company offers that with a current chipset.
-
Quote:
Originally Posted by cadaveca
I don't want a DFI.. I want a board with decent overclocking options and onboard video. Not one company offers that with a current chipset.
Yes, there are boards like that.. for 939, the MSI RS480 comes to mind.. ECS has a board like that.. the list goes on and on...
it just isn't at the same level of overclock that most people here are used to...
-
My point exactly. They are forgetting about a fair share of the market, imho. I know the majority of the market is not ready for such a beast, but it doesn't mean most of us aren't! How many college students gotta buy a vidcard and end up with a crap card or a PCI card, because it's all they can afford? How many here? I mean really... they bought the DFI and dealt with the lack of 3D performance... because of the other stuff it offers. If it had video, and having video did not affect performance... then why not?
-
Actually, there is always a small performance hit just for having it on the board... but only a few hardcore users would notice it...
-
Quote:
Originally Posted by nn_step
Actually, there is always a small performance hit just for having it on the board... but only a few hardcore users would notice it...
what website is this again?
oh yeah, we be hardcore users
-
Quote:
Originally Posted by Molester
what website is this again?
oh yeah, we be hardcore users
Hence I said none of us would want onboard graphics chips then...
-
Quote:
Originally Posted by nn_step
Hence I said none of us would want onboard graphics chips then...
Exactly :fact: , let's get off this BS onboard GFX subject, only ~1% would want it = waste of time
-
I think he was asking you to quote a source, because I think he, like me, disagrees with your statement.
The only way onboard video can hurt is if it's turned on, in which case it would use up system memory.
Otherwise, when properly implemented, there shouldn't be a performance decrease.
Now, I'm not saying that I want onboard graphics; personally I wouldn't pay for it, because for the extra price you can get a cheap PCI-E card that performs just as well or better and doesn't take up system memory.
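To put rough numbers on the "uses up system memory" point, here is a sketch with purely illustrative figures (the RAM size, carve-out, and display mode are assumptions for the example, not taken from any specific board). An enabled IGP costs both capacity (the UMA framebuffer carve-out) and bandwidth (display scanout reads from system RAM every refresh):

```python
# Illustrative numbers only (assumed, not measured on any real board):
# an enabled IGP costs system memory in two ways.

# 1) Capacity: the BIOS carves a UMA framebuffer out of system RAM.
total_ram_mb = 1024
uma_carveout_mb = 64                  # a common BIOS setting of the era
usable_mb = total_ram_mb - uma_carveout_mb

# 2) Bandwidth: display scanout reads the framebuffer from system RAM
#    on every refresh, even when sitting at the desktop.
w, h, bytes_per_pixel, refresh_hz = 1024, 768, 4, 60
scanout_bytes_per_sec = w * h * bytes_per_pixel * refresh_hz

ddr400_dual_channel = 6.4e9           # ~6.4 GB/s theoretical peak
share = scanout_bytes_per_sec / ddr400_dual_channel

print(f"usable RAM: {usable_mb} MB")
print(f"scanout: {scanout_bytes_per_sec / 1e6:.0f} MB/s "
      f"(~{share:.1%} of dual-channel DDR400)")
```

Both costs disappear when the IGP is disabled in the BIOS, which squares with the "no penalty when disabled" side of this argument; any residual difference on a real board would have to come from something else, such as memory-arbiter overhead, not from scanout.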
-
You are 100% correct. :fact:
If the onboard graphics is disabled, it will not degrade performance at all, not even a little bit. At least that's how it works with our chipsets. :D
Quote:
Originally Posted by Revv23
I think he was asking you to quote a source, because I think he, like me, disagrees with your statement.
The only way onboard video can hurt is if it's turned on, in which case it would use up system memory.
Otherwise, when properly implemented, there shouldn't be a performance decrease.
Now, I'm not saying that I want onboard graphics; personally I wouldn't pay for it, because for the extra price you can get a cheap PCI-E card that performs just as well or better and doesn't take up system memory.
-
There are sooo many advantages to having IGP it's stupid! One scenario for the non-htpc crowd would be if you had to RMA your new viddy card and didn't have another to use, especially with the switch from AGP to PCIe, you could RMA your part and continue using your computer till it's replaced.
-
Quote:
Originally Posted by IRQ Conflict
There are sooo many advantages to having IGP it's stupid! One scenario for the non-htpc crowd would be if you had to RMA your new viddy card and didn't have another to use, especially with the switch from AGP to PCIe, you could RMA your part and continue using your computer till it's replaced.
that's what i wanna say too :toast: ;)
-
But wouldn't it be wiser just to use a PCI graphics card...
a $40 one would outperform any onboard chip under $50 out there...
-
Quote:
Originally Posted by Grayskull
You are 100% correct. :fact:
If the onboard graphics is disabled, it will not degrade performance at all, not even a little bit. At least that's how it works with our chipsets. :D
Why don't you send me a reference board based on RS480, and I'll show these guys! :D I'd even pay for it... heck.. I want 2!
-
I already tried the RS480 and the RX480.
The only difference is the addition of the integrated graphics, and even when disabled the RS still ran 1-2% slower than the RX...
and the RX was the cheaper of the two...
-
Were they reference boards?
-
Yep.. they belonged to my friend Dan, but I convinced him to let me just do a couple of benches...
-
That's interesting.. still would like to try for myself.
-
Go for it...
i would like someone else to prove me right...
-
Quote:
Originally Posted by nn_step
But wouldn't it be wiser just to use a PCI graphics card...
a $40 one would outperform any onboard chip under $50 out there...
The RS480 has the X300 IGP, and an old PCI card will outperform it? Yes, I'd like to see that too. :stick:
BTW, what DirectX version does the PCI card support? And will it accelerate Vista's GUI?
Quote:
ATI has also taken this opportunity to graft a full DX9-class integrated GPU that carries surprisingly good pixel and shading performance to boot. It's also unique in using an optional motherboard-mounted local framebuffer that's architected to alleviate the possible performance penalty imposed accessing main system memory.
Hexus
-
I think the point is you can get an X300 for pretty damn cheap in PCI-E form.
-
Ya, that makes more sense; however, it would cost more to do that, and IGP allows SurroundView, which a PCIe card would not allow on a single-slot motherboard (I think).
-
Good SurroundView uses two graphics cards.. and remember, your slowest card/chip is what limits your FPS/settings in SurroundView...
Two 7800GTXs in parallel but not SLI would be best.. at least until we see about the R520/R580
-
IGP affects performance to some degree when activated? OK, disable it in BIOS - check!
IGP is a great backup in case the main PCI-E gfx goes bonkers and needs to be RMA'd? OK - check!
Euhhh... PCI-E gfx goes bonkers and IGP was disabled in BIOS in order to gain performance, guess what? No screenie available to go into BIOS to activate IGP = SOL!
Thanks, I'll go get meself an X300 as a backup on eBay for $30.00. - Check!
-
Quote:
Originally Posted by ReelMonza
Clear CMOS.
True, but bah.. $30.00+ more on the acquisition of the motherboard with IGP versus a backup gfx. I'll save meself the trouble and get the backup gfx; that's just me though.
-
In a benching board I want onboard 2D 4 meg discrete graphics, with a removable line going to a header and a jumper that forces it enabled or disabled as the default video... like on a server board with a discrete Rage XL. You can keep the silly infrared port... what a useless waste of space and BIOS data room.
That's all I need to fix bad flashes on vid cards etc.... If I can avoid having to plug something else in and pull this and that out of my slots, all the better. My backup card is typically the same as what I'm working on, so if I have to RMA it, I won't settle for an X300. Yeah, I have an old PCI vid card for "those moments". Nice if I didn't have to go fish it out all the time.
If you bench Pi a lot, you are using a PCI 2D vid most likely anyhow... minimal drivers and overhead....
-
Grouper? (897) Or has a new thread started and I missed it?
-
where would that thread be?
-
That is not how ATI SurroundView works. FPS is determined by the display the application is running on. If the app is run on the display driven by the external gfx, then that is what will do the rendering.
Quote:
Originally Posted by nn_step
Good SurroundView uses two graphics cards.. and remember, your slowest card/chip is what limits your FPS/settings in SurroundView...
Two 7800GTXs in parallel but not SLI would be best.. at least until we see about the R520/R580
-
So, will these new ATI products be available in Canada at launch this time? Country-wide?
-
It looks like CrossFire might be borked :(
-
OMG I really really hope that ain't true
-
Yeah, saw this over at the Inq too. If this is true Crossfire is born dead.
-
Not a problem for me, I'm more interested in Grouper. I do feel sorry for you dual card guys though.
-
I would only run one card at first, but if this is true then there's no point in CrossFire at all. This is a sad, sad year for ATI already; how can they make such a mistake?
-
I really do hope that this isn't true. I personally don't give a rats about CrossFire, BUT this could gut ATI :(
These sorts of problems/total screwups only occur when there is no OC'er/Mad Gamer involved in the process :rolleyes:
-
Yeah, I was following that story over at Rage3D.
If it's true, it would be the deal breaker for me.
I run all my games at 1920x1200 on my 24" CRT.
Something tells me that story is bogus though. ATI wouldn't be that stupid.
Only thing is, they specifically say that chip is used on the x8xx master cards, so it might not be true for the R520-based cards.
I am waiting 2 more weeks to see what ATI has to offer, or I am pulling the trigger and switching over to nVidia. :(
-
Maybe Grayskull could shed some light on this one.
-
Ya, c'mon Grayskull, say it isn't so
-
It wouldn't matter anyway. The cards would be godly expensive. According to AnandTech the XL version will retail for $499!!! Come on now!
-
Come on now guys, we all know this isn't true....
I mean, Macci and AnandTech have had CF running at different resolutions before....
-
ATI CrossFire - Presentation circa Late May '05
-
That article is almost 5 months old. It's pretty much irrelevant by now.
-
CrossFire is about dead at this point... but the single card solution boards might still be able to restore them...
-
Quote:
Originally Posted by nn_step
CrossFire is about dead at this point... but the single card solution boards might still be able to restore them...
:with:
-
Quote:
Originally Posted by situman
It wouldn't matter anyway. The cards would be godly expensive. According to AnandTech the XL version will retail for $499!!! Come on now!
What's so godly expensive about it?
People are buying the $600 eVGA 7800GTX KO x 2 for their SLI systems.
-
correction 579-456 :fact:
-
Quote:
Originally Posted by situman
That article is almost 5 months old. It's pretty much irrelevant by now.
Just part of the literature ATi was handing out at that point? And without any evidence to refute the rumors of no 1920x1200, it's not looking good for ATI. Makes me glad I got in on two of the eVGA 7800GT/SLI mobo combos listed in the Deals section. I might get a DFI/ATI mobo to play with, but it has no chance of getting into my game rig if CF can't do 1920x1200. Bah!
-
Quote:
Originally Posted by MaxT
What's so godly expensive about it?
People are buying the $600 eVGA 7800GTX KO x 2 for their SLI systems.
Put it this way: the XL version is "supposed" to compete with the 7800GT, assuming there won't be a 7800 Ultra. People can get a 7800GT for about $369 with a free mobo at eggy. I dunno, you do the math. Right now I don't know how the XL will perform. It might be meant to compete with the GTX, but who knows.
Secondly, just because people are willing to pay 600 dollars for a vid card doesn't make it inexpensive. It only means those people are willing to spend more than 99% of the people out there.
-
I for one think ATI should abandon the idea of dual vid card setups, and focus more on 1 card that outperforms 2 cards, and for cheaper; they'd make more money that way.
But if they do, they definitely need to make sure the chipset can fully handle 16x on both cards, with enough left over for sound cards etc.... none of this 8x/8x crap, and allow for the 4000x3000 res's
-
Molester, you think ATI making ONE video card that beats out 2 = more money for them? HAEL NO!
One top-end card = $500, $600 max. Two high-end cards = $400-500, maybe $600 per.
From a marketing standpoint, ATI does not need to kill nVidia; they merely need a slightly better product at the same price. So if ATI can make a pair of cards that compete with the 7800 series for about the same price, they'll make the money just fine. Especially with people who want one card STILL PAYING the $400-600 range for the high end.
-
Let me put it this way: let's say 1 nVidia high-end costs 400 bucks, so 2 would cost 800 for SLI. Now, if ATI could make a single card that beats the 2 nVidia cards and sell it for, say, 700, which would you buy? The 2 for 800, or the 1 for 700 that beats the 2 for 800?
-
I would buy the one that gets me the most FPS in WZ21.. the end...
-
Quote:
Originally Posted by Molester
Let me put it this way: let's say 1 nVidia high-end costs 400 bucks, so 2 would cost 800 for SLI. Now, if ATI could make a single card that beats the 2 nVidia cards and sell it for, say, 700, which would you buy? The 2 for 800, or the 1 for 700 that beats the 2 for 800?
This would be nice and all, but the fact is making one card that is faster than two others just isn't possible right now. I mean, that's like saying "why not make a 64-pipe card with 600 MHz and a GB of 2000 MHz RAM, and then price it under the competition, I bet that would sell!"
I'm sure it would; problem is the state of the technology, and the fact that the companies have to hit a certain price/performance rate to make any money at all.
-
lol, u guys are goin too far w/ this, all i was saying is i'd rather them stick w/ single card
-
then do it, no one will try to stop you...
-
Quote:
Originally Posted by Molester
lol, u guys are goin too far w/ this, all i was saying is i'd rather them stick w/ single card
Frankly, so would I, but neither nVidia nor ATI would let the other company get away with a market advantage like SLI, hence ATI moving now to CrossFire.
-
Right now I have the option to upgrade to a new mobo/GPU combo. I'm jumping to PCI-E of course, and I was going to get the DFI nF4, but then I came in here and saw that it's 2 more weeks for CrossFire. I'm gonna wait 2 weeks but that's it; if in 2 weeks they say "ooo, 2 more weeks", then I'm going nVidia. Also, if I can't play at 1920x1200 on my 2405FPW, it's a no thank you.
In my opinion ATI is way behind on this. SLI has been out and working for a hell of a long time now, the 7800 has been out for months, and no new cards from ATI, let alone this CrossFire board.... where is it.... hello ATI, wake up, catch up or risk falling forever behind.
-
One can only assume ATI has taken its time so they can put out a chipset above all others; well, at least, that's my 2 cents.
-
Quote:
Originally Posted by nn_step
Not to be an ass, but didn't you read the thread? DFI is making one called the CF-BT.
-
Ah hah, "Revv23", that there looks like an ATI board for sure. Even looks similar to a CrossFire that BT was talking about before he left on vacation about 8 days ago. It could easily look similar, since it has a place for two video cards, and the other ATI series seems only to have 1 long PCI-E video card slot according to the pictures I have seen on the net.
My understanding is that BT said that DFI was building a CrossFire board with no IGP, and that the name was as yet not finalized; LOL, CF-BT. Hehehehe.
Now I don't know a thing about resolutions or the like, and that is not board related anyway, but is something about the chipset on the ATI master card, so that horse is not my concern. We will all know more about the cards themselves after NDA and release is here.
The CF-BT will come after launch of the ATI cards anyway, so that piece of the video puzzle will be out for all to see and cuss or discuss. They can then go ATI CrossFire on whomever has a rocken board or go with Nvidia, and that will be up to the individual user.
I wonder where that picture came from as it is a cute board. Hehehehe.
RGone...
-
The board looks to be the Sapphire "Halibut". I hope that it gets the good caps. I like the MOSFET arrangement, but will wait and see what DFI brings.
-
Quote:
Originally Posted by markr
The board looks to be the Sapphire "Halibut". I hope that it gets the good caps. I like the mosfets arrangement, but will wait and see what DFI brings.
Sapphire is not making the Halibut; DFI originally had the RDX200, a CrossFire version of their LanParty board with better power management.
Now DFI has switched to the Halibut reference design (CF-BT), though they probably won't have the IGP on it, and the color scheme will be different.
RGone: I am sorry, my friend, but I was unable to understand your post. While I admit that I am somewhat less than sober, it may be the spirits or perhaps I am just not understanding your words... either way, I will try again to respond to your post tomorrow.
edit - RGone: I figured it out, I think. I'm not 100% on this, but I am pretty sure that the CF-BT will be of the Halibut reference design.
-
Quote:
Originally Posted by Revv23
not to be an ass, bt didnt you read the thread, dfi is making one called the CF-BT.
Yes I have, but I didn't see mention of when...
I heard soon and later, but not when...
*edit: I was using it as an example of a dual PCI-E x16 mobo with CrossFire support*
-
Quote:
Originally Posted by MaxT
Yeah, I was following that story over at Rage3D.
If it's true, it would be the deal breaker for me.
I run all my games at 1920x1200 on my 24" CRT.
Something tells me that story is bogus though. ATI wouldn't be that stupid.
Only thing is, they specifically say that chip is used on the x8xx master cards, so it might not be true for the R520-based cards.
I am waiting 2 more weeks to see what ATI has to offer, or I am pulling the trigger and switching over to nVidia. :(
I also hope this is only true for the X800 cards. I have the exact same monitor you do and, like you, would probably go blind if I had to use 52 Hz at 1920x1200 :p:
-
Good news: one of my contacts at Sapphire has been promising a Grouper sample for 2 months now, but unfortunately hasn't been able to deliver. I got in touch with another guy this weekend and should be receiving a board next week. I'll be testing it against DFI's nF4 Ultra.
-
Quote:
Originally Posted by Sampsa
Good news: one of my contacts at Sapphire has been promising a Grouper sample for 2 months now, but unfortunately hasn't been able to deliver. I got in touch with another guy this weekend and should be receiving a board next week. I'll be testing it against DFI's nF4 Ultra.
Now there's some good news :cool:
They gonna supply you with any new gfx cards too? ;)
-
Yay, more speculation.
I still hold out hope the delay is due to wanting to release the best they can.
-
No bias in that report :slap:
I think NVIDIA folks are still mad at ATI pulling this prank on them earlier this year :p:
I wonder when Tony gets back if he can release some more news on the boards he has been playing with.
-
Quote:
Originally Posted by Molester
Yay, more speculation.
I still hold out hope the delay is due to wanting to release the best they can.
As much of a direct attack on ATI as it is, a lot of the material expressed in those slides is not "speculation." On the contrary, a fair amount of the information is true. Perhaps you should have sifted through it before you sought to make a blanket defense.
deception``
-
Sorry if this is a stupid question guys but can someone just clear it up for me?
Does the DFI CF-DR exist anymore or has it been replaced by the DFI CF-BT?
In other words, are DFI releasing two different boards, or has one taken the other's place?
I ask because if you look around the web you can see the CF-DR listed for pre-order.
-
Quote:
Originally Posted by Afterburner
I just love how ATI not having nTune software is a liability... :rolleyes:
-
Quote:
Originally Posted by Afterburner
I'm not too crazy about Shader Model 3.0 or HDR, i think the cathedral or
building on the right without it in my opinion looks better detailed with the
toned down spot lights.
-
Quote:
Originally Posted by madpete
Sorry if this is a stupid question guys but can someone just clear it up for me?
Does the DFI CF-DR exist anymore or has it been replaced by the DFI CF-BT?
In other words, are DFI releasing two different boards, or has one taken the other's place?
I ask because if you look around the web you can see the CF-DR listed for pre-order.
There will be a DFI LanParty UT RDX200 CF-DR, and it'll be out in the beginning of October. I should get one of these also to play with.
-
Quote:
Originally Posted by Sampsa
There will be a DFI LanParty UT RDX200 CF-DR, and it'll be out in the beginning of October. I should get one of these also to play with.
Right, thanks Sampsa. :toast:
-
Quote:
Originally Posted by deception``
As much of a direct attack on ATI as it is, a lot of the material expressed in those slides is not "speculation." On the contrary, a fair amount of the information is true. Perhaps you should have sifted through it before you sought to make a blanket defense.
deception``
But it's also old, really friggin' old "bugs" being reported. You'd think they'd fix something that major, don't ya? :stick:
-
I want some unbiased benchmarks... and I want them yesterday.. And if someone from ATi is reading this, please have this goal met...
On a side note.. when do you guys think the CF-BT will be coming out?
-
Quote:
Originally Posted by nn_step
I want some unbiased benchmarks...and I want them yesterday..
Would be a bit of a novelty, wouldn't it?
G
-
The Sapphire Grouper has been reviewed by many sites, and yet there are no pre-orders of any sort. The guy that made the decision to cut corners on those caps should be on the unemployment line or be shot. If that guy is the President or Chairman or whoever the top dog is, the company should just fold.
-
Quote:
Originally Posted by situman
The Sapphire Grouper has been reviewed by many sites, and yet there are no pre-orders of any sort. The guy that made the decision to cut corners on those caps should be on the unemployment line or be shot. If that guy is the President or Chairman or whoever the top dog is, the company should just fold.
I would prefer to see him publicly caned to death...
-
Personally, I think Sapphire is wasting their time with the high quality caps; everyone here is going to get the DFI after all of Sapphire's BS, and everyone else won't know the difference.
At this point Sapphire would be better off offering these boards priced as low as possible, with all features enabled and a BIOS that doesn't kill caps. Personally, I don't care what brand caps are used, as long as the BIOS doesn't blow them.
-
Quote:
Originally Posted by REBEL900
I'm not too crazy about
Shader Model 3.0 or HDR, i think the cathedral or
building on the right without it in my opinion looks better detailed with the
toned down spot lights.
Well, if I was a sniper and I was sniping far-away targets, it looks like HDR would create a long-distance fog table at a shorter distance, thus HDR would be useless. Secondly, I like the way they cut out some of the quality (lack of clouds and such to make the gamma bright??????)
-
Quote:
Originally Posted by Revv23
Personally, I think Sapphire is wasting their time with the high quality caps; everyone here is going to get the DFI after all of Sapphire's BS, and everyone else won't know the difference.
At this point Sapphire would be better off offering these boards priced as low as possible, with all features enabled and a BIOS that doesn't kill caps. Personally, I don't care what brand caps are used, as long as the BIOS doesn't blow them.
I don't agree; I would probably go with Sapphire if they sorted out the cap issue, because I will never be able to afford a dual vid card system in the near future, so I have no need for the CrossFire on the DFI board and would prefer a single vid card board.
I care about what brand of caps are used if they affect my overclock.
G
-
The key word being "if".
The only reason Xbit even mentioned the cap issue was because they had a bad BIOS... otherwise, look at all of the great overclocking boards reviewed that didn't have problems...
-
Quote:
Originally Posted by Master_G
I don't agree; I would probably go with Sapphire if they sorted out the cap issue, because I will never be able to afford a dual vid card system in the near future, so I have no need for the CrossFire on the DFI board and would prefer a single vid card board.
I care about what brand of caps are used if they affect my overclock.
G
I'd second that; plus, CrossFire unable to do better than 1600x1200@60Hz is just... well, useless :fact:
-
LowRun, the product hasn't even been reviewed officially; stop spreading misinformation... You don't know for SURE what the final capabilities are.
and you are OT to boot
-
or let me prove it one way or another.. I am the most impartial person here.. All I care about is how high I can overclock and how to get more FPS in WZ21...
-
Quote:
Originally Posted by nn_step
or let me prove it one way or another.. I am the most impartial person here.. All I care about is how high I can overclock and how to get more FPS in WZ21...
Without it blowing up your other 'bits', or itself, or both in the process :cool:
-
Quote:
Originally Posted by alpha0ne
Without it blowing up your other 'bits', or itself, or both in the process :cool:
Hey, that was an accident, and it was in the pursuit of unlocking an A64...
-
Wouldn't you be more disappointed if he didn't blow up his other bits? That's Xtreme :)
-
Quote:
Originally Posted by nn_step
Hey, that was an accident, and it was in the pursuit of unlocking an A64...
Hehehehe, well that's different coz it's in a very good cause, you know, like curing all diseases, or making a certain bush permanently disappear :D
(looks over my shoulder, then hides under the bed............waiting for the knock on the door............or "them" kicking down the door :eek: :slash: )
-
Quote:
Originally Posted by alpha0ne
Hehehehe, well that's different coz it's in a very good cause, you know, like curing all diseases, or making a certain bush permanently disappear :D
(looks over my shoulder, then hides under the bed............waiting for the knock on the door............or "them" kicking down the door :eek: :slash: )
You know they don't read stuff like this because it gives them nose bleeds...
-
Quote:
Originally Posted by chew*
Well, if I was a sniper and I was sniping far-away targets, it looks like HDR would create a long-distance fog table at a shorter distance, thus HDR would be useless. Secondly, I like the way they cut out some of the quality (lack of clouds and such to make the gamma bright??????)
Werd, the clouds virtually disappear with HDR..
-
Quote:
Originally Posted by REBEL900
Werd, the clouds virtually disappear with HDR..
I saw that screenie a long time ago and wasn't impressed; Hanners at Elite Bastards assured me that a static image like this is not really representative of what it looks like in game.
But you're right! The clouds look overexposed and the detail on the buildings looks washed out; nVidia could have found a better pic to use than this.
There is such a thing as too much HDR! As I've seen in past games, it makes things like stone and tile look like they were in shrink wrap or epoxied... blech! :slapass:
-
Quote:
Originally Posted by LowRun
I'd second that, plus crossfire unable to do better than 1600x1200@60Hz is just... Well, useless :fact:
Josh at PenStar Systems has explored this limit.
Quote:
there is a problem with CrossFire that we have been able to determine to be true. The resolution and refresh rates are being limited by this solution. For users hoping to buy a CrossFire setup and drive high end monitors and LCD's, they are going to be in for a disappointment. The above stated limits are there, and there is currently no work around that we can see.
This IS a rumour :fact:
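For what it's worth, the resolution and refresh limits discussed above are consistent with the explanation widely reported at the time: the compositing path on the master card going through a single-link TMDS receiver, which is capped at a 165 MHz pixel clock. Treat that cause and the GTF-style timings below as assumptions for a back-of-envelope check, not a confirmed spec:

```python
# Sanity check: a single-link TMDS link is capped at a 165 MHz pixel
# clock. Total pixels per frame (active + blanking, GTF-style timings)
# times the refresh rate must stay under that cap.

TMDS_PIXEL_CLOCK_HZ = 165_000_000

# (total width, total height) including blanking - GTF-style timings
modes = {
    "1600x1200": (2160, 1250),
    "1920x1200": (2592, 1242),
}

for name, (tw, th) in modes.items():
    max_refresh = TMDS_PIXEL_CLOCK_HZ / (tw * th)
    print(f"{name}: ~{max_refresh:.0f} Hz max")
```

That works out to roughly 61 Hz at 1600x1200 and roughly 51 Hz at 1920x1200, which lines up with both the 1600x1200@60Hz ceiling in the PenStar quote and the ~52 Hz at 1920x1200 figure mentioned earlier in the thread.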