
Thread: 3870 x2 review

  1. #51
    Xtreme Member
    Join Date
    Nov 2006
    Location
    Netherlands
    Posts
    472
    Quote Originally Posted by mascaras View Post
    Games need to support CF to benefit from the HD3870X2 (it works like two HD3870s in CrossFire mode)


    btw:

    + 2 reviews

    http://www.fpslabs.com/reviews/video...-review/page-2

    http://en.expreview.com/?p=219



    regards
    Those are the same reviews that were posted on page 1 of this thread.

    Let's see more coming in today (hopefully)....
    System Specs: * i7 2700K @ 4.8 Ghz * Zalman CPNS9900-A LED * Asus Maximus IV Extreme -Z * 16 GB Corsair Dominator GT CMT16GX3M4X2133C9 * Sapphire HD7970 crossfire * Creative X-Fi Titanium Fatality Pro [PCI-E] * Corsair AX 1200W * WDC WD1002FAEX + WDC WD1002FAEX * Optiarc AD 5240S * Dell U3010 @ 2560 x 1600 [DVI-D] * Steelseries 7G * Logitech G9 * Steelseries SX * Coolermaster Stacker STC T01 * Logitech Z-5500 * Sennheiser HD598 * Windows 7 Ultimate x64 SP1*

  2. #52
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Shintai View Post
    Kickass card? It's CF on a card. You could just buy two HD3870s... they would even be cheaper. And the heat... ouch...

    It's like putting two G92s on one card and calling it kickass. It just ain't. Living with all the CF/SLI bugs, especially with multiple monitors, is too horrible.

    Now give me a single-GPU card instead. And not another R600 heat monster.

    It needs to be the same price as two HD3870 cards or cheaper. Not more...
    This card delivers much more performance than the 8800 Ultra, has more features, and is cheaper.
    Deal with it, Mr. Intel/Nvidia-biased

    Performance difference, R680 vs. 8800 Ultra:
    Bioshock 1280x1024 = 4% Slower
    Bioshock 1920x1200 = 24% Faster
    Bioshock 2560x1600 = 39% Faster

    COJ 1280x1024 = 11% Faster
    COJ 1920x1200 = 24% Faster
    COJ 2560x1600 = 10% Faster

    COJ 1280x1024 4AA 16AF = 13% Faster
    COJ 1920x1200 4AA 16AF = 23% Faster

    Lost Planet 1280x1024 = 27% Slower
    Lost Planet 1920x1200 = 30% Slower
    Lost Planet 2560x1600 = 37% Faster

    Lost Planet 1280x1024 4AA 16AF = 18% Slower
    Lost Planet 1920x1200 4AA 16AF = 18% Slower
    Lost Planet 2560x1600 4AA 16AF = 10% Faster

    Crysis 1280x1024 = 13% Slower
    Crysis 1920x1200 = 2% Slower
    Crysis 2560x1600 = 12% Slower

    COD4 1280x1024 = 42% Faster
    COD4 1920x1200 = 32% Faster
    COD4 2560x1600 = 26% Faster

    COD4 1280x1024 4AA 16AF = 27% Faster
    COD4 1920x1200 4AA 16AF = 20% Faster
    COD4 2560x1600 4AA 16AF = 16% Faster

    NFS: Pro 1280x1024 = 32% Faster
    NFS: Pro 1920x1200 = 38% Faster

    NFS: Pro 1280x1024 4AA 16AF = 72% Slower
    NFS: Pro 1920x1200 4AA 16AF = 67% Slower

    Serious Sam 2 1280x1024 HAA 16AF = 30% Faster
    Serious Sam 2 1920x1200 HAA 16AF = 45% Faster
    Serious Sam 2 2560x1600 HAA 16AF = 78% Faster

    UT3 1280x1024 = 7% Faster
    UT3 1920x1200 = 24% Faster
    UT3 2560x1600 = 37% Faster

    F.E.A.R. 1600x1200 = 20% Faster
    F.E.A.R. 2048x1536 = 20% Faster
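
    To make those percentages concrete, here is a minimal Python sketch of how such deltas convert to relative-performance ratios and a single geometric-mean summary. The FPS interpretation ("N% faster" meaning N% more average FPS than the Ultra) and the subset of values used are assumptions for illustration, not extra benchmark data:

    [CODE]
    # Minimal sketch: turn the "% faster / % slower" deltas above into ratios
    # and summarize with a geometric mean. Illustrative only; mapping "N% slower"
    # to a (1 - N/100) FPS ratio is an assumption.
    from math import prod

    deltas_2560x1600 = {        # transcribed from the list above (no-AA runs;
        "Bioshock": +39,        # Serious Sam 2 was run with HAA/16AF)
        "COJ": +10,
        "Lost Planet": +37,
        "Crysis": -12,
        "COD4": +26,
        "Serious Sam 2": +78,
        "UT3": +37,
    }
    ratios = [1 + d / 100 for d in deltas_2560x1600.values()]
    geomean = prod(ratios) ** (1 / len(ratios))
    print(f"Geometric mean at 2560x1600: {(geomean - 1) * 100:+.0f}% vs. 8800 Ultra")
    [/CODE]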

  3. #53
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Posts
    881
    Quote Originally Posted by GoThr3k View Post
    How can this be disappointing? It's cheaper, performs better, and you can finally decode HD movies...
    But you can get a factory-OC'd GTX for about the same money. Hopefully it's just driver problems.

    As for the claim that Crysis can't use more than 128 shaders: what about people running 8800 GTs in SLI? That's 224 shaders total, and they seem to get a big performance boost in SLI.

  4. #54
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by v_rr View Post
    This card delivers much more performance than the 8800 Ultra, has more features, and is cheaper.
    Deal with it, Mr. Intel/Nvidia-biased

    @v_rr, FPS isn't everything; we need to know whether there is any stuttering while playing games, like with HD3870 CrossFire.

    with HD3870 CF I get high max FPS and a good average FPS, but I also get a lot of stutter! (that's the real "problem" of CF & SLI)

    average FPS and max FPS aren't the most important thing to me; what matters is whether games are 100% playable without stutter (driver optimizations), and unfortunately we can't see that in the reviews!!

    the HD3870X2 has a new chip, the PEX8547, that is supposed to optimize CrossFire; let's wait for more game tests (from XS users)


    regards
    Last edited by mascaras; 01-22-2008 at 08:32 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  5. #55
    Xtreme Enthusiast
    Join Date
    Nov 2006
    Location
    Jonesboro, AR
    Posts
    750
    Quote Originally Posted by naokaji View Post
    it's an issue with CrossFire specific to Crysis. future drivers should fix that, though.

    the card does run rather hot; a 79C GPU temp means "aftermarket cooler needed" to me. also, power consumption is 25W higher than with an 8800 Ultra.
    I agree. I have some MAJOR performance issues with CrossFire in DX10 Crysis. It doesn't run AFR mode in DX9 either, but SuperTiling.
    I am very pleased with the results of this card. Imagine how it will do on a decent water loop, overclocked? Insane.
    Q6600 G0 @ 3.8 (475x8) 1.496V || ASUS Maximus Formula X38 || 4x2GB Geil Evo One DDR2-950 5-5-5-15 ||
    Sapphire 4870X2 857/1000 || XSPC Delrin full coverage block || MCP655 || D-Tek Fuzion Nozzled ||
    Black Ice 480GTX 8 120mm Yate Loon push/pull || Silverstone DA850 PSU 70A +12V || Auzentech Prelude ||
    Vantage P15071 in Vista x64

  6. #56
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by Scimitar View Post
    I wonder if they ran Crysis 1.1 with all the hotfixes installed. I would think the Crysis numbers will be going up. Smoking performance in some of those other games.

    Nice 3DMark06 score; looks like a fun card to benchmark with.

    Edit:

    Check out the performance in Call of Juarez, it's actually playable in DX10 for the first time!


    In their screenshot it says "pre-release timedemo".

    EDIT: In the pconline review, I mean.
    Last edited by Tonucci; 01-22-2008 at 08:06 AM.

  7. #57
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    The FPS Labs review uses a Foxconn N68S7AA (nVidia 680i) SLI motherboard, which I believe uses PCIe 1.1, not 2.0 (correct me if I am wrong). From what I have read, the 3870X2 needs a PCIe 2.0 slot. Also, this is no different than the 7950GX2 vs. X1950XTX situation last time around.



    Source


    As you know, the ATI Radeon HD 3870 X2 board carries two graphics processors. According to the available information, the reference ATI Radeon HD 3870 X2 board will be fitted with 1 GB of GDDR3 memory operating at 2 GHz and connected to the processors over 256-bit buses. The product is meant to connect to a PCI Express 2.0 bus, with a dedicated bridge chip handling that interaction. It seems, though, that this role will be filled not by the PEX6347 as expected, but by the PEX8548. In the diagram, the bridge sits at the center of the board between the two processors, and four memory chips can be found near each processor.

    The following illustration shows that the transition from PCI Express 1.1 to PCI Express 2.0 provides a noticeable increase in performance: around 20-30%, depending on the application and the screen resolution.
    source
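
    For a sense of scale, the raw link bandwidth doubles between the two generations; here is a back-of-the-envelope sketch using the published PCI-SIG rates (2.5 GT/s vs. 5 GT/s per lane, 8b/10b encoding). The 20-30% figure above shows games gain much less than the 2x link speedup, so the link is only part of the bottleneck:

    [CODE]
    # Theoretical per-direction bandwidth of a x16 slot.
    # PCIe 1.1: 2.5 GT/s per lane; PCIe 2.0: 5 GT/s; both use 8b/10b encoding.
    def pcie_x16_gbs(gt_per_s: float, lanes: int = 16) -> float:
        data_gbit = gt_per_s * (8 / 10)   # 10 line bits carry 8 payload bits
        return data_gbit * lanes / 8      # Gbit/s -> GB/s, per direction

    print(f"PCIe 1.1 x16: {pcie_x16_gbs(2.5):.0f} GB/s")  # 4 GB/s
    print(f"PCIe 2.0 x16: {pcie_x16_gbs(5.0):.0f} GB/s")  # 8 GB/s
    [/CODE]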

  8. #58
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by OBR View Post
    Ultra is a discontinued product, it's EOL!
    so? there are still loads and loads of Ultras available...
    and does it matter? it makes things even worse for nvidia: the GTS will be their fastest card, and the 3870x2 will beat that one by an even bigger margin

    Quote Originally Posted by OBR View Post
    the difference between the Ultra and the GTS-512 is very small
    no, it's not just that pair... the difference between all of those cards is relatively small... you pay a lot for only 10 or 20% extra performance, which is why i never buy high-end cards

    but if you want the best of the best and have a high-res display, then you need a high-end card. compare at high res with AA and you will see the Ultra pull away. that's where the 3870x2 shines as well.

    Quote Originally Posted by OBR View Post
    2x GT in SLI are better than the R680 for the same price ...
    i'd say yes... BUT for that you need a buggy, overpriced POS 680 or 780 board

    i expect the 3870x2 price to come down. atm two 3870s cost below 350 euros in germany, and that's with GDDR4! i think the 3870x2 will drop to 350 euros in a few weeks... well, i hope so!

  9. #59
    Xtreme Member
    Join Date
    Jun 2006
    Location
    Portugal
    Posts
    117
    The results are, more or less, as expected. In my opinion they are not bad, not bad at all, but don't forget the "age" of the Ultra.
    Sorry for my poor English.

  10. #60
    Xtreme Member
    Join Date
    May 2006
    Posts
    313
    Quote Originally Posted by awdrifter View Post
    As for the claim that Crysis can't use more than 128 shaders: what about people running 8800 GTs in SLI? That's 224 shaders total, and they seem to get a big performance boost in SLI.
    Two GTs are NOT 224 shaders; they are 2 x 112 shaders rendering different frames.

    just like an E6600 is not 4.8GHz (ok, here maybe, but not at stock)
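
    A toy model of why SLI still helps even though no single frame ever sees the combined shader count: under AFR each GPU renders alternate frames with its own shaders. The frame time below is a made-up number, and real AFR scaling is well below this ideal:

    [CODE]
    # Toy AFR (alternate frame rendering) model: frame i goes to GPU i % 2.
    # Each frame is shaded by one GPU's 112 shaders (8800 GT); the 224 never
    # work on the same frame. Throughput still rises because two frames are
    # in flight at once. Hypothetical numbers, ideal scaling assumed.
    SHADERS_PER_GPU = 112
    FRAME_TIME_MS = 25.0   # assumed time for one GPU to render one frame

    def afr_fps(num_gpus: int) -> float:
        return 1000.0 / FRAME_TIME_MS * num_gpus   # per-frame latency unchanged

    print(f"1 GPU:        {afr_fps(1):.0f} fps, {SHADERS_PER_GPU} shaders per frame")
    print(f"2 GPUs (AFR): {afr_fps(2):.0f} fps, still {SHADERS_PER_GPU} shaders per frame")
    [/CODE]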
    System : E6600 @ 3150mhz, Gigabyte DS3, 4gb Infineon 667mhz, Amd-Ati X1900XT

  11. #61
    Xtreme Mentor
    Join Date
    Jul 2004
    Location
    Brooklyn, NY
    Posts
    2,771
    Quote Originally Posted by Unbornchild View Post
    The hardware was ready in early December (or even before that), has been posted several times here and there...
    I'm sure it's the drivers they're still working on to get them better.
    hahahhaaaa I was being sarcastic. Oops, my bad.
    Asus Rampage Formula X48
    Intel Q9650 @ 4.33GHZ
    OCZ Platinum DDR2-800
    Palit 4870x2
    Creative Xi-Fi Extreme Music
    Corsair HX1000
    LL 343B Case
    Thermochill 120.3
    2xMCP355
    KL 350AT
    KL 4870X2 FC WB
    DD Chipset Block

  12. #62
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Thessaloniki, Greece
    Posts
    1,307
    Quote Originally Posted by OBR View Post
    if the R680 has roughly the same performance as the Ultra, then it's comparable to the GTS-512, because the difference between the Ultra and the GTS-512 is small ... and the GTS-512 is cheaper than this R680 ...

    You can choose: the R680, with brutal heat, Ultra-like performance (i.e. an OC'd GTS-512) and a noisy cooler, for 600 USD! Or a GTS-512 that overclocks to the same performance, with a great cooler, for 400 USD ... which is better?
    The euro price also includes VAT, so you need to remove that first before making the conversion
    Seems we made our greatest error when we named it at the start
    for though we called it "Human Nature" - it was cancer of the heart
    CPU: AMD X3 720BE@ 3,4Ghz
    Cooler: Xigmatek S1283(Terrible mounting system for AM2/3)
    Motherboard: Gigabyte 790FXT-UD5P(F4) RAM: 2x 2GB OCZ DDR3 1600Mhz Gold 8-8-8-24
    GPU:HD5850 1GB
    PSU: Seasonic M12D 750W Case: Coolermaster HAF932(aka Dusty )

  13. #63
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    a) Having two smaller dies, and thus improving YIELD in the manufacturing process, is better than going for a monolithic design with 30% yield. ATI probably learned this from their R600.

    b) The CrossFire interconnect seems to be handled internally. The drivers take care of all that; it's not like Crysis or other games that handle CF badly will say "Oh no, a 3870 X2, time to stop working." Windows sees the card as a single card, not a CrossFired set of GPUs.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  14. #64
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    ATI has 320 SPs @ 741 MHz (1 MADD/clock) = 474 GFLOPS
    nV has 128 SPs @ 1350 MHz (1 MADD/clock) = 345 GFLOPS

    The ATI chip uses a VLIW architecture = up to 5 ops per clock cycle to execute shaders in parallel (instruction pairing).
    The nV chip uses 2 ops per clock cycle to execute shaders in parallel (don't remember the details offhand).

    Vector hardware (ATI) = needs a good shader compiler to extract parallelism from the code
    Scalar hardware (nV) = shader compiler efficiency is less critical

    Is this pretty much the situation?
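
    The arithmetic behind those two figures works out if one MADD counts as two flops (a multiply plus an add); a quick sketch, peak theoretical numbers only:

    [CODE]
    # Peak MADD throughput: SPs x clock x 2 flops (multiply + add per MADD).
    def peak_gflops(sps: int, clock_mhz: float) -> float:
        return sps * clock_mhz * 2 / 1000

    print(f"RV670 (one GPU): {peak_gflops(320, 741):.0f} GFLOPS")   # ~474
    print(f"G80 (Ultra):     {peak_gflops(128, 1350):.0f} GFLOPS")  # ~346
    [/CODE]

    Worth remembering that the 320 ATI SPs are organized as 64 VLIW units of 5 ALUs each, which is exactly why the shader compiler matters so much for reaching that peak.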
    Last edited by Eastcoasthandle; 01-22-2008 at 10:47 AM.

  15. #65
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by cegras View Post

    The drivers take care of all that; it's not like Crysis or other games that handle CF badly will say "Oh no, a 3870 X2, time to stop working." Windows sees the card as a single card, not a CrossFired set of GPUs.
    of course Windows "sees" the card as a single card (CF is internal); Windows only "sees" two cards if you have two HD3870X2s in CF. but in games it seems a single HD3870X2 behaves the same as HD3870 CrossFire (games need to support CF)


    a good example of that:


    HD3870X2 vs 8800 Ultra

    NFS: Pro 1280x1024 4AA 16AF = 72% Slower
    NFS: Pro 1920x1200 4AA 16AF = 67% Slower

    another example:

    my friend has been testing one since yesterday; in Lost Coast (the video stress test) the HD3870X2 gives the same performance as a single HD3870



    regards
    Last edited by mascaras; 01-22-2008 at 10:51 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  16. #66
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Montenegro
    Posts
    333
    Quote Originally Posted by awdrifter View Post
    Kinda disappointed; it beats the 8800 Ultra in the older games, but not in new games like Crysis, where we need the performance boost. If it were $399 it would be a great deal, but at $449 I might as well get the 8800GTX and OC that.
    ROFL, nvidia fan here everyone, in denial as always. hahahaha

    Disappointed? Use some Windex on those glasses, dude, because you don't know what you're talking about hahaha. He said disappointed. Have you looked at the Ultra prices, dude? You complain about the extra 50 bucks, yet the Ultra costs what, 700? Man, you just started my day on the mean side... IGNORANCE, man.
    Internet will save the World.

    Foxconn MARS
    Q9650@3.8Ghz
    Gskill 4Gb-1066 DDR2
    EVGA GeForce GTX 560 Ti - 448/C Classified Ultra
    WD 1T Black
    Thermalright Extreme 120
    CORSAIR 650HX

    BenQ FP241W Black 24" 6ms
    Win 7 Ultimate x64

  17. #67
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by cegras View Post
    a) Having two smaller dies, and thus improving YIELD in the manufacturing process, is better than going for a monolithic design with 30% yield. ATI probably learned this from their R600.

    b) The CrossFire interconnect seems to be handled internally. The drivers take care of all that; it's not like Crysis or other games that handle CF badly will say "Oh no, a 3870 X2, time to stop working." Windows sees the card as a single card, not a CrossFired set of GPUs.
    Eh?

    So having a bigger PCB, double the memory, a bridge chip, and extra I/O etc. on each chip is "better yield"?

    Who is saving money? Because it's not you and me

    Also, GPUs are easy to get high yields from, even when massively huge.

    Also, CF, even over the bridge chip as it is, is still CrossFire with all its ups and downs. So yes, Windows actually sees it as a CrossFire setup, and a dongleless one as well. Just because CF is always on in the drivers and you can't set it in CCC doesn't mean it's any different.
    Crunching for Comrades and the Common good of the People.

  18. #68
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by Eastcoasthandle View Post
    ATI has 320 SPs @ 741 MHz (1 MADD/clock) = 474 GFLOPS
    nV has 128 SPs @ 1350 MHz (1 MADD/clock) = 345 GFLOPS

    The ATI chip uses a VLIW architecture = up to 5 ops per clock cycle to execute shaders in parallel (instruction pairing).
    The nV chip uses 2 ops per clock cycle to execute shaders in parallel (don't remember the details offhand).

    Vector hardware (ATI) = needs a good shader compiler to extract parallelism from the code
    Scalar hardware (nV) = shader compiler efficiency is less critical

    Is this pretty much the situation?
    I think I read somewhere, while googling "Why does ATI have more SPs but do worse?", a post on the nvidia forums that went like this:

    Gah, I can't find it.

    The point was, if I remember correctly: if the drivers are written right, then in the best-case scenario ATI's R600 architecture can do twice the work of a G80, but in "normal" scenarios the G80 does more operations per second.

    Quote Originally Posted by Shintai View Post
    Eh?

    So having a bigger PCB, double the memory, a bridge chip, and extra I/O etc. on each chip is "better yield"?

    Who is saving money? Because it's not you and me

    Also, GPUs are easy to get high yields from, even when massively huge.

    Also, CF, even over the bridge chip as it is, is still CrossFire with all its ups and downs. So yes, Windows actually sees it as a CrossFire setup, and a dongleless one as well. Just because CF is always on in the drivers and you can't set it in CCC doesn't mean it's any different.
    Yeah, of course it's better yield. Yield is about the actual core itself. It's easier to put two cores together than to engineer a brand-new "native" dual core, which, due to its size, is likely to run into a ton of problems in the manufacturing process. If you can make a bunch of lesser chips at a higher yield rate, why not pair two and get more performance, if you do it right? I assume you know the rudiments of chip manufacturing, so I won't lecture you about that; it's what my whole point was based on anyway.

    I told you already: the first samples of the R600 were somewhere in the 20-30% yield range, due to an overly ambitious design and the sheer size of the chip.
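
    The small-die argument can be put in numbers with the standard Poisson yield model (yield = e^(-area x defect density)); every figure below is an assumption for illustration, not real R600/RV670 data:

    [CODE]
    from math import exp, floor, pi

    # Poisson yield model: Y = exp(-area * defect_density).
    # All numbers are illustrative assumptions, not real ATI figures.
    D0 = 0.005                  # defects per mm^2 (assumed)
    WAFER_AREA = pi * 150**2    # 300 mm wafer; edge/scribe losses ignored

    def good_dies_per_wafer(die_mm2: float) -> int:
        candidates = floor(WAFER_AREA / die_mm2)
        return floor(candidates * exp(-die_mm2 * D0))

    big, small = 420.0, 192.0   # hypothetical monolithic die vs. an RV670-class die
    print(f"{big:.0f} mm^2 monolithic: {good_dies_per_wafer(big)} good dies/wafer")
    print(f"{small:.0f} mm^2 small die: {good_dies_per_wafer(small)} good dies/wafer, "
          f"enough for {good_dies_per_wafer(small) // 2} dual-GPU cards")
    # Dies are tested before pairing, so a dual-die card only consumes
    # known-good dies; the defect rate never gets squared against the customer.
    [/CODE]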
    Last edited by cegras; 01-22-2008 at 11:06 AM.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  19. #69
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Montenegro
    Posts
    333
    What I don't understand is how people judge this card based only on a game that struggles to run smoothly even on triple SLI with 3 Ultras, and then come in here and claim it isn't good enough. Lmao, is Crysis the only game people play? I bought the game the first week it came out, only played it a few times, finished it, and thought it wasn't all that great considering the hype... Besides, the game was rushed out, so the performance should have been a lot better...

    Crysis fans, you don't count, trust me. And btw, I happen to own an 8800GTX, and I'm not a fanboy of either company. But it's funny how some of you defend your cards as if you're worshipping them. It's just hilarious to me.
    Last edited by RedBull78; 01-22-2008 at 11:09 AM.
    Internet will save the World.

    Foxconn MARS
    Q9650@3.8Ghz
    Gskill 4Gb-1066 DDR2
    EVGA GeForce GTX 560 Ti - 448/C Classified Ultra
    WD 1T Black
    Thermalright Extreme 120
    CORSAIR 650HX

    BenQ FP241W Black 24" 6ms
    Win 7 Ultimate x64

  20. #70
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by cegras View Post
    I think I read somewhere, while googling "Why does ATI have more SPs but do worse?", a post on the nvidia forums that went like this:

    Gah, I can't find it.

    The point was, if I remember correctly: if the drivers are written right, then in the best-case scenario ATI's R600 architecture can do twice the work of a G80, but in "normal" scenarios the G80 does more operations per second.



    Yeah, of course it's better yield. Yield is about the actual core itself. It's easier to put two cores together than to engineer a brand-new "native" dual core, which, due to its size, is likely to run into a ton of problems in the manufacturing process. If you can make a bunch of lesser chips at a higher yield rate, why not pair two and get more performance, if you do it right? I assume you know the rudiments of chip manufacturing, so I won't lecture you about that; it's what my whole point was based on anyway.

    I told you already: the first samples of the R600 were somewhere in the 20-30% yield range, due to an overly ambitious design and the sheer size of the chip.
    GPUs just don't like to be paired up; it's in their nature. It's the same reason SLI/CF will most likely forever make you pay for cores x memory size (each GPU needs its own copy of everything), plus the CF/SLI overhead penalty. So X2 cards from AMD and nVidia are simply yet another "let the customer pay".

    If they only had 20-30% yield, it would be for a single top speed bin, due to speed and power limits, because you can easily reuse GPUs that don't make the top bin. For the G80, the Ultra, GTX, and GTS series are a nice example of that; or the GT and GTS-512.
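
    In other words, an X2 board's advertised memory is two mirrored pools; a one-liner under the usual AFR assumption that every texture and buffer is duplicated on each GPU:

    [CODE]
    # AFR mirrors resources on every GPU, so usable VRAM is per-GPU, not the sum.
    TOTAL_VRAM_MB, NUM_GPUS = 1024, 2    # HD 3870 X2: 2 x 512 MB GDDR3
    print(f"On the box: {TOTAL_VRAM_MB} MB; "
          f"usable per frame: {TOTAL_VRAM_MB // NUM_GPUS} MB")
    [/CODE]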
    Crunching for Comrades and the Common good of the People.

  21. #71
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
    Quote Originally Posted by GoThr3k View Post
    How can this be disappointing? It's cheaper, performs better, and you can finally decode HD movies...

    Anyone with an 8800 Ultra, or a 3870X2 for that matter, doesn't need GPU decoding for HD movies. Oh yes, 1080p really brings 3GHz+ quad cores to a crawl

  22. #72
    Xtreme Cruncher
    Join Date
    Oct 2006
    Location
    Boston, Massachusetts
    Posts
    2,224
    As soon as the 3870 X2 hits stateside, I will have 2 of them on my Asus Maximus

  23. #73
    Xtreme Cruncher
    Join Date
    Jan 2005
    Location
    England
    Posts
    3,554
    Quote Originally Posted by ChaosMinionX View Post
    As soon as the 3870 X2 hits stateside, I will have 2 of them on my Asus Maximus
    +1 (but for my DFI )

    My Free-DC Stats
    You use IRC and Crunch in Xs WCG team? Join #xs.wcg @ Quakenet

  25. #75
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Vancouver
    Posts
    1,073
    Quote Originally Posted by Scubar View Post
    A little disappointed; it does beat the Ultra overall, but if it can't win in Crysis then it's not going to get my vote. That's the one game people would like to play at a decent framerate, and it's just not cutting it. Let's just hope the 8800GX2 (not worthy of being called a 9800) can do better.

    I do agree that this card's design is better than the GX2's, as having one PCB means it will be a lot easier for those of us who want to watercool our GPUs.
    It's the drivers currently hampering it in Crysis, and that takes time to overcome, given nvidia has tuned their drivers for that specific game
    " Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^



    Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance

    Rig 2
    i7 980x, H70, Antec Lanboy Air, Samsung md230x3, Sapphire 6970s CrossFired, Antec ax1200w, X-25M 160GB, 2 x OCZ Vertex 2 180GB, 12GB Corsair Vengeance, MSI Big Bang Xpower

