-
Quote:
Originally Posted by
Glow9
I do, but do you get mine? Paying $100 more for something that gives 10% more performance is pretty pointless, especially when the faster card runs hotter and uses more power. It even costs you more in the long run. I just think value is going to outweigh minimal performance differences for the next while.
lol, I am not arguing value for money in my previous post... we are specifically talking about performance differences and fair comparisons,
not dollar-for-dollar comparisons.
I am not saying one is better than the other.
Hey, I run an 8600GT in my 24/7 rig;
it's very fast for everything I need it for... to me your GTX 260 looks like a waste of money too, but I am not arguing that lol
-
Quote:
Originally Posted by
Yukon Trooper
Yeah, you do have to wonder what they are thinking, or IF they were thinking. I think the insanity of trying to compete with ATi has caused them to compete against themselves, and is affecting their engineering choices. It just doesn't make sense. They tried that once before with the GX2 and proved this multi-GPU stuff is a complete failure on all levels, and now they are gonna go backwards after having the top performing single GPU solution and wanna do it again?
It's insanity. That is THE definition of insanity...Doing the same thing over and over again, and expecting different results. :ROTF:
-
Quote:
Originally Posted by
T_Flight
Yeah, you do have to wonder what they are thinking, or IF they were thinking. I think the insanity of trying to compete with ATi has caused them to compete against themselves, and is affecting their engineering choices. It just doesn't make sense. They tried that once before with the GX2 and proved this multi-GPU stuff is a complete failure on all levels, and now they are gonna go backwards after having the top performing single GPU solution and wanna do it again?
It's insanity. That is THE definition of insanity...Doing the same thing over and over again, and expecting different results. :ROTF:
What's worse is they are going to do the same thing they did with the 7950GX2 and the 9800GX2. They'll bring it out, then do driver fixes for 3 months until the next architecture comes out, then they'll forget about it and leave the owners high and dry yet again. I hate to say it, but their track record on dual-GPU cards isn't very positive. Current 9800GX2 driver support is abysmal by any standard.
-
Uh Oh, I just had another thought. I wonder how painfully expensive watercooling one of these beasts might be? Might take its own rad, pump, fans, and the block for it would be a nightmare.
The first factory (EVGA) block I saw for one of those 9800GX2s was aluminum, and it was huge. I think they made it out of Al to save weight, to keep from breaking off the poor PCI-e port hanging on for dear life. :D Of course aluminum is useless in a loop containing Cu or brass, so that was a non-starter from the get-go. Can anyone imagine what a block like that would weigh in Cu? It would be like 4 pounds or some ridiculous number! Because the GPUs were facing each other, the cards were sandwiched around the big, massively thick block. I did see a Danger Den Cu block, but like I said, I would hate to see what it weighed. It's a double-sided full-cover block for the 9800GX2 only.
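For what it's worth, a rough back-of-envelope on that weight guess (a minimal sketch in Python; the block volume below is a made-up assumption just for illustration, only the metal densities are real):
Code:
# Copper vs. aluminium weight for the same full-cover block shape.
# Densities are real figures; the block volume is an assumed placeholder.
AL_DENSITY = 2.70     # g/cm^3
CU_DENSITY = 8.96     # g/cm^3
GRAMS_PER_LB = 453.6

block_volume_cm3 = 200.0  # assumed solid-metal volume of a big full-cover block

al_weight_lb = AL_DENSITY * block_volume_cm3 / GRAMS_PER_LB
cu_weight_lb = CU_DENSITY * block_volume_cm3 / GRAMS_PER_LB

print(f"Al: {al_weight_lb:.1f} lb, Cu: {cu_weight_lb:.1f} lb "
      f"({CU_DENSITY / AL_DENSITY:.1f}x heavier)")
# -> Al: 1.2 lb, Cu: 4.0 lb -- the same shape in copper is ~3.3x heavier,
#    which is how you end up in 4-pound territory.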
If nVidia does go ahead and do this (which I seriously hope they don't), it would be wise to situate the GPUs on the outside so block mounting would be easier. Of course, I hate to even give them any ideas for something that's a bad idea to begin with.
-
If the reports on power savings from going to 55nm on GT200 are accurate, then the heat-dump might not be too bad for a water-cooled setup.
-
-
Quote:
Originally Posted by
Envydia007
why not name it GTX300.
Because the GTX 300 series will be a next generation with the G300 core... the launch of that is planned for summer/fall 2009.
-
Quote:
Originally Posted by
Yukon Trooper
If the reports on power savings from going to 55nm on GT200 are accurate, then the heat-dump might not be too bad for a water-cooled setup.
Yeah, the 280 wasn't as bad as I thought it was gonna be. Guys had me scared of it before I ever got it. It's really not that bad on air, but it really comes alive with water. They have a lot in them, and you can get a lot more out of them with water.
I've had mine up to max clocks already just to see it, but the heat is out of control at max clocks. It needs water badly. Water really does help them. Talonman did a thread in the Water forum, and he has awesome clocks and temps. I can't even touch what he's getting except for a few seconds loaded, and then I have to back out.
I hope they get all these new cards to run a bit cooler, and maybe increase the shader clocking headroom a bit. I'm still a single card fan. If I need more I can always SLI another one. :)
-
Quote:
Originally Posted by
T_Flight
Talonman did a thread in the Water forum, and he has awesome clocks and temps. I can't even touch what he's getting except for a few seconds loaded, and then I have to back out.
Can I get a link?
-
-
Single card setups are definitely the preferred method of attack.
As far as heat and lower fabrication processes go, I can't wait until we're all running high-end setups on 250W PSU's again. :p:
-
Quote:
Originally Posted by
[XC] Synthetickiller
I'm in the same boat.
I don't think the GTX 265 is worth it.
I think you're looking at the GTX 285 or GTX 295.
If you dislike SLI or don't want to spend that much, you're looking at the GTX 285 effectively. I'm waiting to see pricing before I pick. I will not be getting a GTX 265.
That's my 2 cents on the matter.
Yeah, I think I will go for either the GTX 285 or 295, depending on the difference I have to put in. I've only had my GTX 260-216 for 7 weeks, so hopefully I won't have to put much toward a new card. We will see... good luck man. :up:
-
I'll be interested to see numbers and prices once the new cards are out,
although the card that is most catching my eye is the GTX 265 (depending on power consumption, heat output, and folding power).
-
I just hope they don't release these and then drop all support when the 300 series comes along; that's what happened when I bought the 7950GX2.
-
Quote:
Originally Posted by
shoopdawoopa
I just hope they don't release these and then drop all support when the 300 series comes along; that's what happened when I bought the 7950GX2.
It probably will happen again, like with the 9800GX2. I think NVIDIA is shooting itself in the foot once again. While I'm anxious to see this monster, I think it's a bad move. Why do I say it's a bad move? Well, remember the 9800GX2 making the GT200's performance at launch look mediocre? The GX2 made NVIDIA's statement about a ">100%" performance increase over G80 (more like what we saw from G70 to G80) a joke. What happened then? EOL... NVIDIA's pattern is to release single-core chips at the start, and if the competition beats them, bring out a dual-chip card and EOL it at the next product line... that's my opinion. I could be wrong about this card though :)!!
-
Quote:
Originally Posted by
LOE
Well, about ATI's dual-GPU approach - I can classify that as ONE CARD with 2 GPUs, but the NVIDIA solution has 2 PCBs, so it is pretty doubtful whether I can call this "a single card"; it is 2 cards stuck together with only one PCI-E slot.
I think this thing about 2 PCBs glued together and blah blah almost sounds like an elitist view; having 2 PCBs makes the card seem less impressive since it 'needs' two cards to perform, unlike just a single PCB. Practical aspects of heat and power aside, I don't see what the fuss is about. I mean, the Q6600 was no amazing feat: two dual cores jammed together. Yet you don't see people :banana::banana::banana::banana::banana:ing about that all day.
-
Quote:
Originally Posted by
noinimod
I think this thing about 2 PCBs glued together and blah blah almost sounds like an elitist view; having 2 PCBs makes the card seem less impressive since it 'needs' two cards to perform, unlike just a single PCB. Practical aspects of heat and power aside, I don't see what the fuss is about. I mean, the Q6600 was no amazing feat: two dual cores jammed together. Yet you don't see people :banana::banana::banana::banana::banana:ing about that all day.
/take off calm hat/
It means the NVIDIA design team can't get off their asses and design a unified PCB like ATI. A dual-PCB sandwich is a ham-handed throe of desperation. It would make sense if, say, they needed the extra PCB for additional VRAM or a beefier vreg circuit, but otherwise it just means they want to put out another card but don't want to do a whole lot of design work. Which is probably the same attitude they'll take with drivers.
(Don't compare it to Intel's Core 2 architecture; there aren't really any parallels.)
/replace calm hat/
love you nvidia!
-
Quote:
Originally Posted by
BulldogPO
Yep. HDMI is the way to go, not DisplayPort.
??
What is wrong with DisplayPort..?
.
-
Quote:
Originally Posted by
LOE
Well, about ATI's dual-GPU approach - I can classify that as ONE CARD with 2 GPUs, but the NVIDIA solution has 2 PCBs, so it is pretty doubtful whether I can call this "a single card"; it is 2 cards stuck together with only one PCI-E slot.
So you class a tower case that has 2 motherboards as two cases? :shrug:
-
Quote:
Originally Posted by
X.T.R.E.M.E_ICE
So you class a tower case that has 2 motherboards as two cases? :shrug:
Your analogy doesn't make sense.
It would be "So you class a tower case that has 2 motherboards as two computers"
And yes, you could certainly call them that.
-
Quote:
Originally Posted by
Xoulz
??
What is wrong with DisplayPort..?
.
Here is the start:
1) It's slower than HDMI/DVI since it uses dual encryption (it sends an encrypted message to the monitor/TV, then gets an offset back), then sends a packet to let it display the frame.
2) It's got less bandwidth than DVI or HDMI and degrades more over distance.
3) It is not an open standard; you have to pay to use it.
4) It offers no advantages for the user and no advantage for the distributor, only more latency.
5) It cannot be converted to another input without losing HDCP, making it useless if you only have native DisplayPort.
6) It blocks non-HDCP content natively, so once it reaches proliferation there will be no DRM-free media, since live signing will be disabled (live signing temporarily encrypts playback).
The list goes on further if you look around, but just know that it's bad and don't buy anything with it; hopefully we can kill this like DIVX discs.
-
Quote:
Originally Posted by
Envydia007
why not name it GTX300.
Don't worry, I'm sure that they will at some point.
-
Quote:
Originally Posted by
Sly Fox
Your analogy doesn't make sense.
It would be "So you class a tower case that has 2 motherboards as two computers"
And yes, you could certainly call them that.
The 4870X2 has 2 GPUs; does that mean it is 2 cards? I remember seeing a PowerColor card, I think it was, that had a second PCB for the output connections.
EDIT: I think people need to know the difference between a PCB and a card. LOL. This debate is getting really old.
-
GeForce GTX 295 benchmarks out?
-
So it's good at PhysX and G200-optimized games; it looks like it will make AMD get off their ass.
-
Quote:
Originally Posted by
BababooeyHTJ
Don't worry, I'm sure that they will at some point.
If it was ati, they would start busting out some fractions. :clap::ROTF::rofl:
-
OBR time to post yours I guess :D!!!
-
I dunno, this doesn't look like 9800GX2 vs. 3870X2; that was a big lead, and hence nVidia could sell their cards at a nice ~$600.
This lead is small and unimpressive. (Dead Space doesn't work on CF, bugger lol.)
When you can get a 4850X2 for $280 and OC it slightly so it's about the same as a 4870X2 (see the FS review), it really upsets all of the scales.
But I'll wait to see more real scores before making a conclusion. :D
-
I'm shocked to see it consumes less power than a 4870x2 according to those numbers. This means that a gtx285 can consume less than a 4870.
-
Honestly, it's surprising to me that the 295 has such a lead in every single bar on those graphs, but it is entirely possible... but hey, I think Catalyst 9.1 is gonna have to be good because of this, so I am happy.
-
The GTX 295 has a very good price ... by the way!
-
Quote:
Originally Posted by
OBR
The GTX 295 has a very good price ... by the way!
If it's around the $399 mark I will get one.
-
Quote:
Originally Posted by
OBR
The GTX 295 has a very good price ... by the way!
I hope! Though the only thing that is getting me worried is the heat this beast will produce :/
-
Quote:
Originally Posted by
SNiiPE_DoGG
Honestly, it's surprising to me that the 295 has such a lead in every single bar on those graphs, but it is entirely possible... but hey, I think Catalyst 9.1 is gonna have to be good because of this, so I am happy.
Why is it surprising, considering that the GTX 260 SP192 in SLI trades blows with the HD4870X2?
-
The total lead is what I think he's talking about.
-
Although they are NV-optimized games, I have to give them kudos for using a proper baseline.
-
Quote:
Originally Posted by
LOE
What NVIDIA does is make TWO almost identical cards and stick them together; it is not a DUAL GPU solution, it is a DUAL GFX CARD solution.
Got it?
Oh wait, I buy one card with two cores and two PCBs; what's the difference? They are both an easy way to gain performance, just two different approaches.
Both seem to be equally the low-tech way out for both companies to have crazy fast cards and not spend billions in R&D.
-
Quote:
Originally Posted by
LOE
A quad core has 2 dual core chips inside one package - but you still call it ONE processor, not two
Actually, the quad has 2 x dual-core processors. I look at the picture of the 295 and it looks like a single card. How does having 2 PCBs inside make it bad? The reason why Nvidia didn't go for a single PCB is that they would have to spend a crapload of money on developing a new design.
As the saying goes, if it ain't broke, don't fix it, something like that anyway.
-
LOE, quit being a hater. The initial benches look nice. Performance will only improve with newer drivers. I think tri-280 SLI and quad-295 SLI are going to be neck and neck, however. The only reason to switch would be for lower power consumption or to free up a slot. I'm not sure how well these will overclock due to the thermals of the whole thing.
-
I agree, LOE.
It's just that neither ATI nor NV has a single core in the near future that matches their X2 cards' power.
That would take lots more money and time (plus it won't be easy to match).
-
LOE, Stop crying. Does it matter how they do the job if the job gets done?
-
Quote:
Originally Posted by
LOE
I am not a hater; I already said the 295 will have better performance and power consumption than the Radeon X2.
It's just lame making two separate cards and selling them as one piece of hardware.
So if I take 2 GTX 280s and glue them together with epoxy glue - will that make it ONE CARD? Nvidia pretty much did exactly the same thing, only used screws instead of glue.
Nvidia made the 9800GX2 that way and they made this one this way. The die on the chip is too big to put two of them on one PCB; thermodynamics just doesn't allow for such a thing when you take into account all the other stuff they need on a PCB to run this G200 core.
Some people like dual-GPU solutions, and it's good Nvidia did finally answer the 4870X2 with the 295. I don't think it's lame or a desperate attempt, but rather an attempt to reach a certain market. Why are you hating on the design of it when it worked well with the 9800GX2?
-
Quote:
Originally Posted by
LOE
It's not about what is good or bad, it is about using the right terms.
The ATi X2 is a single (dual-GPU) card;
the 295 is DUAL (single-GPU) CARDS.
Just to illustrate my point, and I'm outta here before someone collapses in a nervous breakdown:
http://img392.imageshack.us/img392/7355/duallllmt8.jpg
Oh, and just in case someone wonders - I am an NVIDIA USER :)
A dual-GPU card is any card with two GPUs and one PCI Express blade/connector. These 295s share bandwidth and memory, which 280s in SLI do not.
-
No, there's no new tech. You were basically saying all this card is, is 2 280s in SLI on one card. But in SLI with the 280s, one card's memory and bandwidth are used while the other card's GPU is also processing, but not its memory. With this solution (295) the bandwidth, memory, and GPUs are all shared.
-
What is this, the Gamespot forum? Good job on the idiot proof illustration, LOE.
-
As a former 9800gx2 owner I advise no one to buy this card.
-
My 9800GX2 works fine and only cost $279 (EVGA), for only about a 15% drop in performance compared to SLI 260s (AFAIK)... AND it works on my Intel chipset... so it's GREAT for me!
-
Quote:
Originally Posted by
LOE
It's the same thing I said, a quad is 2 dual cores, so what's your point? :)
I look at the picture of the 295 and see two cards stuck to one cooler, and as I just said, I don't really have a problem with that, simply cuz I don't believe in SLI and will stay a single-card user until they find a better way to do multi-chip GPUs.
I just say this seems like the lamest and most desperate way to compete.
I had a 4870x2 and it uses Xfire if you want to get technical.
-
289 watts!! jesus this card is a winner... definitely going to buy one!
-
Well... I won't step up if it ain't worth it. Let's see what the numbers are before we jump to conclusions, but I'm more than happy with my GTX 260-216. It also depends very much on price too. When are these cards out?
-
It should put up some really nice folding numbers.
-
OBR.
If time permits, would you mind checking the memory IC part numbers on the 295?
Last I recall, NV were using Samsung ICs; if that's correct, with NV using a 1242MHz clock they would probably be using the K4J52324QH-HJ08, which is a 1200MHz 0.8ns 2.05V bin. If so, they won't have all that much headroom for overclocking.
Otherwise, the only other alternative would be the K4J52324QH-HJ7A, a 1300MHz 0.77ns 2.05V bin. Curiosity has the best of me :D
Just a small request, if time permits.
Thanks in advance.
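For anyone curious how those bins translate to headroom, a quick back-of-envelope (a minimal sketch in Python; the part numbers and the 1242MHz figure are from the post above, the rest is just max clock ~ 1 / access time, so treat it as an upper bound, not a promise):
Code:
# Rough GDDR3 headroom estimate: max clock is roughly 1 / rated access time.
# Part numbers and access-time bins are the Samsung figures quoted above;
# 1242 MHz is the rumoured stock memory clock.
bins_ns = {
    "K4J52324QH-HJ08": 0.80,  # nominally the 1200 MHz bin
    "K4J52324QH-HJ7A": 0.77,  # nominally the 1300 MHz bin
}
stock_clock_mhz = 1242.0

for part, t_access_ns in bins_ns.items():
    f_max_mhz = 1000.0 / t_access_ns
    headroom_pct = (f_max_mhz / stock_clock_mhz - 1.0) * 100.0
    print(f"{part}: ~{f_max_mhz:.0f} MHz max -> ~{headroom_pct:.0f}% over stock")
# -> HJ08: ~1250 MHz (about 1% over 1242), HJ7A: ~1299 MHz (about 5% over 1242),
#    which is why the HJ08 bin would leave almost no memory overclocking headroom.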
-
Quote:
Originally Posted by
jasonelmore
But in SLI with the 280s, one card's memory and bandwidth are used while the other card's GPU is also processing, but not its memory.
:shrug: :confused:
Also, for all intents and purposes, the 9800GX2 IS SLI. There is a ribbon that connects the two, just like SLI. You can also disable SLI in the control panel with it.
Now don't misunderstand me, I don't hate this implementation, but let's call a spade a spade... ;)
-
Quote:
Originally Posted by
jasonelmore
No, there's no new tech. You were basically saying all this card is, is 2 280s in SLI on one card. But in SLI with the 280s, one card's memory and bandwidth are used while the other card's GPU is also processing, but not its memory. With this solution (295) the bandwidth, memory, and GPUs are all shared.
http://www.techpowerup.com/reviews/Z...front_full.jpg
-
This card will make the GT300 card look bad, just as the 9800GX2 did for the GT200 at launch :/
http://resources.vr-zone.com/image_d...77927f6515.jpg
Soo what is this? Lower specifications than previously announced? Is this fake or not? It's really getting confusing. Can someone confirm this (points at OBR :D)?
http://forums.vr-zone.com/showthread.php?t=366972
-
If those specs are true, the 295 is nothing but 2 GTX260 216s with a die shrink slammed together...
You can use the EDIT feature instead of double posting. ;)
-
Quote:
Originally Posted by
jas420221
If those specs are true, the 295 is nothing but 2 GTX260 216s with a die shrink slammed together...
You can use the EDIT feature instead of double posting. ;)
Those specs are not right. It has already been confirmed that it will definitely have 240 shaders x 2, not 216, and from what I'm hearing the stock clocks are not right either.
-
So then it's a GTX 280 with a die shrink slammed together... ;)
Can you please elaborate on your point about the 2nd card's memory not being used in SLI... I know it's not shared, but AFAIK it most certainly is USED.
-
Quote:
Originally Posted by
dan7777
Well... I won't step up if it ain't worth it. Let's see what the numbers are before we jump to conclusions, but I'm more than happy with my GTX 260-216. It also depends very much on price too. When are these cards out?
Which card do you plan on getting if you do upgrade?
-
Quote:
Originally Posted by
jasonelmore
Those specs are not right. It has already been confirmed that it will definitely have 240 shaders x 2, not 216, and from what I'm hearing the stock clocks are not right either.
where :D?
-
Quote:
Originally Posted by
jas420221
So then it's a GTX 280 with a die shrink slammed together... ;)
Oh no! You mean it's just like every other dual-GPU card ever made? :shocked:
-
Quote:
Originally Posted by
trinibwoy
Oh no! You mean it's just like every other dual-GPU card ever made? :shocked:
From Nvidia...yes. :rofl:
-
Quote:
Originally Posted by
jas420221
So then it's a GTX 280 with a die shrink slammed together... ;)
Can you please elaborate on your point about the 2nd card's memory not being used in SLI... I know it's not shared, but AFAIK it most certainly is USED.
No, not quite. It uses the 260's architecture: it still uses a 448-bit bus instead of the 280's 512-bit interface. Bandwidth and the amount of memory are relative to 2x 260s; that's why you see the memory at 1.792GB instead of a solid 2GB.
BUT... and this is a big but... it has 240 shader processors per core instead of 192 or 216.
What I meant was that the GPUs' memory and bandwidth on both cards were scaled through the PCI Express slot and not the SLI connector; just the GPU is scaled through the SLI connector. Now that both cards are on one single card, they are scaled together through one single bus.
Ya got all of that, Trem?
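To put numbers on the 448-bit point (a minimal sketch in Python; the one-chip-per-32-bit-channel layout and the 999MHz memory clock are assumptions used for illustration, not confirmed specs, but the capacity arithmetic follows straight from the bus width):
Code:
# Why the card shows 1.792 GB instead of 2 GB: a 448-bit bus is 14 x 32-bit
# channels. Assuming one 512 Mbit (64 MB) GDDR3 chip per channel, capacity
# comes in multiples of 14 chips, not a neat power of two.
channels    = 448 // 32                # = 14 memory chips per GPU (assumed layout)
mb_per_chip = 64                       # 512 Mbit GDDR3 parts (assumption)
per_gpu_mb  = channels * mb_per_chip   # 896 MB per GPU
total_mb    = per_gpu_mb * 2           # 1792 MB across both GPUs

# Bandwidth per GPU: bus width * 2 transfers/clock (DDR) * memory clock.
# The 999 MHz clock is only an assumed figure for illustration.
mem_clock_mhz = 999
bw_gb_s = 448 / 8 * 2 * mem_clock_mhz / 1000

print(total_mb, round(bw_gb_s, 1))
# -> 1792 111.9  (i.e. 896 MB and roughly 112 GB/s per GPU, 1.792 GB total)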
-
Thanks for the explanation...you should type what you mean next time. ;)
Trem... LOL
I sure do jasonelmore, unberninja, Proamplitude, and w/e other names you used over at EOCF via Russian Proxy after you were permanently banned for being a complete idiot...
-
Yeah, Tremglox, formerly of EOCF, you can drop the act. Rev busted you out.
-
Quote:
Originally Posted by
jas420221
From Nvidia...yes. :rofl:
Nope, not just Nvidia. There is no theoretical or measured benefit to AMD's approach. It still takes up two slots and is still based on crossfire and still draws boatloads of power. It's just as slammed together as anything Nvidia ever did with their GX2's. Convincing yourself otherwise is just delusional.
-
If you insist Trini...
Once sideport is needed and enabled, then it will have its architectural advantages, yes? I know that has little to do with 2 vs. 1 PCB, but...
-
It doesn't make any sense to launch another high-end single GPU so soon. With the GTX 295, the GT300 would be kind of useless... especially in that timeframe.
Then again, nVidia already made that mistake with the GTX 280...
-
Quote:
Originally Posted by
trinibwoy
Nope, not just Nvidia. There is no theoretical or measured benefit to AMD's approach. It still takes up two slots and is still based on crossfire and still draws boatloads of power. It's just as slammed together as anything Nvidia ever did with their GX2's. Convincing yourself otherwise is just delusional.
Aftermarket cooling solutions are almost a no-go with the GX2 design; seems like a really good reason to prefer AMD's approach to me, and nothing delusional at all :shrug:
-
Quote:
Originally Posted by
LowRun
Aftermarket cooling solutions are almost a no-go with the GX2 design; seems like a really good reason to prefer AMD's approach to me, and nothing delusional at all :shrug:
Yeah you're right, if you're looking at after-market cooling solutions the single-PCB design will be easier to work with. But we were talking about the relative technical (in)elegance of the two approaches. Neither provides any tangible advantage to scalability or performance of multi-GPU.
And jas, sideport was promising on the X2 but it turned out just to be another high-bandwidth link. And when they realized that PCIE 2.0 provided enough bandwidth already sideport became redundant and was disabled. There needs to be more integration before multi-GPU moves to the next level. It's a moving target though since on-die bandwidth and processing needs will continue to outstrip any sort of inter-die communication or data-sharing approach.
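For a sense of the bandwidth gap being described (a rough sketch in Python; the PCIe 2.0 figures are the spec numbers, and the GTX 280's stock memory specs are used here as a stand-in for "on-die needs"):
Code:
# Inter-GPU link bandwidth vs. a single GPU's own memory bandwidth.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding, 16 lanes, per direction.
pcie2_gb_s = 5.0 * (8.0 / 10.0) * 16 / 8.0          # = 8 GB/s one way

# GTX 280 local memory: 512-bit bus, GDDR3 at 1107 MHz (2 transfers per clock).
gtx280_gb_s = 512 / 8 * 2 * 1107 / 1000             # ~141.7 GB/s

print(f"PCIe 2.0 x16: ~{pcie2_gb_s:.0f} GB/s per direction")
print(f"GTX 280 local memory: ~{gtx280_gb_s:.0f} GB/s")
# -> ~8 GB/s vs ~142 GB/s: any inter-die link (PCIe or sideport) is more than
#    an order of magnitude behind what each GPU already has on its own board.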
-
Don't know if anyone saw this yet from Expreview:
http://en.expreview.com/2008/12/16/g...html#more-1646
Shows some real potential.
-
Quote:
Originally Posted by
trinibwoy
And jas, sideport was promising on the X2 but it turned out just to be another high-bandwidth link. And when they realized that PCIE 2.0 provided enough bandwidth already sideport became redundant and was disabled. There needs to be more integration before multi-GPU moves to the next level. It's a moving target though since on-die bandwidth and processing needs will continue to outstrip any sort of inter-die communication or data-sharing approach.
Tell me something I don't know... :up:
-
Quote:
Originally Posted by
DeanZ
Performance looking good :up: all depends now on price really.
-
-
Quote:
Originally Posted by
adamsleath
ROFL!!!
-
Quote:
Originally Posted by
jas420221
If you insist Trini...
Once sideport is needed and enabled, then it will have its architectural advantages, yes? I know that has little to do with 2 vs. 1 PCB, but...
You guys view the sideport as this holy grail of performance unlocking, which is always just over the horizon.
The sideport is for facilitating faster inter-gpu transfers, so there's less stalling when a lot of resource transfers need to occur often. Most games have profiles created to stop these transfers anyway, so the hardware implementation is only really useful in a generalized case without a profile. It doesn't necessarily speed anything up when compared to a game with proper CF profile support where transfers are already minimal.
-
DeanZ, that link doesn't work.
-
Quote:
Originally Posted by
Sampsa
Let's hope sooooo...
-
YES, reviews will be tomorrow ... in limited numbers.
-
And what does that mean, 'limited numbers'?
-
Limited websites, hand-picked I assume... people they trust under NDA, etc.
-
Reviews will only be in a few magazines ... wait for tomorrow.
-
Quote:
Originally Posted by
Richard Dower
Limited websites, hand-picked I assume... people they trust under NDA, etc.
With Nvidia games list.
With Nvidia game settings.
With Nvidia etc...
-
Quote:
Originally Posted by
AbelJemka
With Nvidia games list.
With Nvidia game settings.
With Nvidia etc...
Hmmm, so sites that reviewed Nvidia's favourite 5 games will receive these new cards :ROTF: I already see Guru3D :D
-
Quote:
Originally Posted by
AbelJemka
With Nvidia games list.
With Nvidia game settings.
With Nvidia etc...
Ridiculous. NVIDIA can't force anyone to run the settings and games they want. They may make recommendations but that's it.
-
Quote:
Originally Posted by
Sr7
Ridiculous. NVIDIA can't force anyone to run the settings and games they want. They may make recommendations but that's it.
In Wonderland they can't...
"You want to review our new card before everyone else? OK, but we strongly suggest that you run the tests with those games and those settings. You can't? My bad, no more cards available to review for your site." :D
-
It will be out for sale on the first day of CES and the MSRP will be around 500€.
-
I'm guessing the cards will be well overpriced. Worth the money? Hmmmmm, I will hold tight; I think it's gonna be a stormy ride, this one lol.
-
Quote:
Originally Posted by
AbelJemka
In Wonderland they can't...
"You want to review our new card before everyone else? OK, but we strongly suggest that you run the tests with those games and those settings. You can't? My bad, no more cards available to review for your site." :D
Like AMD doesn't do that, oh wait. :rolleyes:
-
Found this picture; it shows more than the picture from Guru3D.
Looks good :yepp:
PRJ
-
Quote:
Originally Posted by
GoriLLakoS
It will be out for sale on the first day of CES and the MSRP will be around 500€.
Any reliable source for this? 500€ sounds about what I would expect though, like a $599 USD MSRP I would guess. Of course, Europe's high taxes and low availability at launch will probably push them to at least 550 EUR, but let's just hope NV won't pull another 8800 Ultra as far as price and availability go. :p: