Page 1 of 2 (Results 1 to 25 of 26)

Thread: What performance do you get in World In Conflict?

  1. #1
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689

    What performance do you get in World In Conflict?

Post rig info:
Drivers and OS
Min / avg / max FPS
Game settings:
Screen resolution:

Q6600 @ 2.4GHz
2GB OCZ RAM, 1066MHz 5-5-5-15
XFX 8800 GTX @ stock
Windows Vista 64-bit
163.44 drivers

All settings maxed @ 1680x1050
16x AF, 16x AA

Min 11 FPS, avg 24, max 46 (benchmark)

I get better FPS in multiplayer, for some reason.
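For anyone collating the results posted in this thread into one table, here is a small sketch of a parser for the "Min 11 fps avg 24 max 46" reporting format above (the helper name is made up for illustration, it is not from any tool mentioned here):

```python
import re

def parse_result(line):
    """Parse a 'Min 11 fps avg 24 max 46' style line into a dict.

    Returns None if the line doesn't look like a benchmark result.
    """
    m = re.search(r"min\s+(\d+).*?avg\s+(\d+).*?max\s+(\d+)", line, re.IGNORECASE)
    if m is None:
        return None
    lo, avg, hi = (int(g) for g in m.groups())
    return {"min": lo, "avg": avg, "max": hi}

# The result line from the first post:
print(parse_result("Min 11 fps avg 24 max 46 (benchmark)"))
```

Posts that phrase their numbers differently would need extra patterns; this only handles the min/avg/max order used above.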
    Last edited by disruptfam; 09-23-2007 at 08:44 PM.
    Quote Originally Posted by skinnee View Post
    No, I think he had a date tonight...

    He and his EK Supreme are out for a night on the town!

  2. #2
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
I'll get back to you on this one. But you forgot one important bit of info: the screen resolution you're running at.

*EDIT*
This is just the built-in timedemo/benchmark.

See sig for system specs, running the 163.71 beta drivers.
    Last edited by Major_A; 09-23-2007 at 10:52 AM.
    C2Q Q9550 @ 4.03Ghz/4GB OCZ DDR 1066 @ 902Mhz /BFG GTX 260/Gigabyte GA-EP45-UD3P/2 X Samsung F1 1TB/Samsung F1 750GB/Samsung SH-S223Q/Corsair TX650/X-Fi Titanium Fata1ity/Swiftech H20-220/Logitech Z5300 Speakers/Samsung SyncMaster 2253BW Monitor/Win 7 Ultimate x64

  3. #3
    Banned
    Join Date
    Aug 2007
    Posts
    1,014
Good story, BUT TOO SHORT. 5.5 hours is just way too short.

  4. #4
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
I love games that come with a prebuilt benchmark for easy comparison between different setups, drivers etc. More of that, plz. Gonna install this game now and report back with the same settings Major_A used, but I fear that with my gfx card there will be a huge slowdown. xD

EDIT: Somehow I think this game is very CPU-dependent and probably utilizes dual core. Same settings used as Major_A, but with a C2D @ 3.75GHz and a 7900GTO 512MB 690/850MHz. Rest of the specs in my signature; also using the 163.71 beta drivers.
Attached thumbnail: wic-benchmark2.jpg (benchmark screenshot)
    Last edited by RPGWiZaRD; 09-23-2007 at 11:44 AM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  5. #5
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
Man, your system is destroying mine. I would think an 8800GTS is better than a 7900GTO, but that C2D is tearing it up.

  6. #6
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
Quote Originally Posted by Major_A View Post
Man, your system is destroying mine. I would think an 8800GTS is better than a 7900GTO, but that C2D is tearing it up.
Would be interesting to see a C2D vs AMD comparison with the same gfx card and RAM in this game. I guess it differs this much only when using the physics setting on high. I wouldn't be surprised if the difference was a lot smaller otherwise.
    Last edited by RPGWiZaRD; 09-23-2007 at 11:53 AM.

  7. #7
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
I tried setting the physics to medium and it made a whopping 1 avg FPS difference. Now I'm confused: I just turned on 2x AA and turned the AF up to 16x, and the performance is basically unchanged.

Would that mean I'm CPU-limited at 1280x1024 with the current settings?

  8. #8
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
Quote Originally Posted by Major_A View Post
I tried setting the physics to medium and it made a whopping 1 avg FPS difference. Now I'm confused: I just turned on 2x AA and turned the AF up to 16x, and the performance is basically unchanged.

Would that mean I'm CPU-limited at 1280x1024 with the current settings?
Yeah, I tried the physics setting on low and it didn't change the result by even 1 FPS. On the other hand, I got a ~13.5% FPS drop going from no AA & 4x AF to 2x AA and 16x AF (avg 42 FPS vs 37), and a further ~27.3% drop with 4x AA (avg 42 FPS vs 33), so yeah, it looks like you're CPU-limited.

I ran the benchmark in windowed mode at 800x600 now and kept an eye on CPU usage; it stayed between 60~80% pretty much the whole time, and both cores were utilized quite similarly. I'm so glad most games are finally starting to take advantage of dual core CPUs.
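A quick sanity check on those percentages: they are computed relative to the lower score. The numbers below come from this post; the helper itself is just an illustrative sketch, not from any benchmarking tool.

```python
def fps_gap_percent(higher_fps, lower_fps):
    """Relative gap between two average FPS scores, expressed as a
    percentage of the lower score (how the figures above were derived)."""
    return (higher_fps - lower_fps) / lower_fps * 100

# avg 42 FPS (no AA, 4x AF) vs avg 37 FPS (2x AA, 16x AF)
print(round(fps_gap_percent(42, 37), 1))  # -> 13.5
# avg 42 FPS vs avg 33 FPS (4x AA)
print(round(fps_gap_percent(42, 33), 1))  # -> 27.3
```

Note that dividing by the higher score instead would give ~11.9% and ~21.4%, so it matters which baseline you pick when comparing posted results.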
    Last edited by RPGWiZaRD; 09-23-2007 at 02:58 PM.

  9. #9
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689
I have a quad core; does it use all four cores?

My FPS seem low for my setup, I would think. Maybe it's because I'm using the 163.44 drivers?
Where do you download those beta drivers, by the way?

cheers

  10. #10
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
    http://www.nzone.com/object/nzone_do...etadriver.html

    Run the game windowed and open up your Task Manager and look to see if all the cores are being loaded. If I want to find out if a program is multi-threaded I run Throttlewatch. Either one will work.
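As a rough sketch of the judgment you'd make from Task Manager's per-core graphs, here is an illustrative helper; the thresholds and the function are my own assumptions for illustration, not part of Task Manager or ThrottleWatch:

```python
def looks_single_threaded(per_core_avg, hot=90.0, idle=25.0):
    """Given average utilization per core (percent), guess whether the
    workload is effectively single-threaded: exactly one core pegged
    near 100% while every other core stays mostly idle.
    The 'hot' and 'idle' thresholds are arbitrary cutoffs."""
    hot_cores = [u for u in per_core_avg if u >= hot]
    cool_cores = [u for u in per_core_avg if u < hot]
    return len(hot_cores) == 1 and all(u <= idle for u in cool_cores)

# e.g. the quad-core readings reported later in the thread:
print(looks_single_threaded([100, 18, 15, 12]))  # one core maxed -> True
print(looks_single_threaded([70, 65, 10, 8]))    # two busy cores -> False
```

In practice you would feed it samples taken while the benchmark runs; a dual-threaded game on a quad core would typically show two busy cores and two idle ones.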

  11. #11
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
Quote Originally Posted by CandymanCan View Post
I don't think it's very fair to use "medium", no AF, low AA, no shadows, etc.
The title of the thread is "What performance do you get in World In Conflict?", not "Let's Compare WiC Timedemo Scores". I'll run the benchmark again in a bit on "Very High".

  12. #12
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689
It seems to max one core at 100%, and the other 3 cores don't go over 20%, so I don't think it fully utilizes all four cores.

Also, how do you take a screenshot of your settings and so forth? I can't; print screen doesn't work.

cheers

  13. #13
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
Run on "Very High", no other options changed.

Warning be damned!

Quote Originally Posted by CandymanCan View Post
My point being: if people here are like "oh, your system owns mine", thinking their system is lacking, or "oh, you're CPU-limited", blah blah, the only reason is that he is using low details with hardly any AA/AF. I'm trying to prove the point that no matter how fast his CPU is, he won't beat a system with an 8800 in it; his FPS is what I'm getting with the details maxed, and his are on medium with only 4x AA.
I understand what you're saying.

Quote Originally Posted by disruptfam View Post
It seems to max one core at 100%, and the other 3 cores don't go over 20%, so I don't think it fully utilizes all four cores.

Also, how do you take a screenshot of your settings and so forth? I can't; print screen doesn't work.

cheers
To take a screenshot, hit the PrtScn button, open up Paint and press Ctrl+V to paste. There's your screenshot. Or, if you want to do it the faster way like I have been, download FRAPS and use its screenshot function. The unregistered version (like mine) will only take .bmp screenshots, so you still have to convert them so ImageShack/Photobucket don't complain about the size.

*EDIT*
To get technical: when I update my driver, I always change one "stock/standard" setting in the control panel, Texture Filtering Quality, from Quality to High Quality. I've done this since my Ti4400 and have only really seen a difference in a few games. I do know that it takes away a few FPS in benchmarks.
    Last edited by Major_A; 09-23-2007 at 08:31 PM.

  14. #14
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689
Print screen and paste in Paint works for me? I don't think there is any game that uses 4 cores. I think 4 cores are useless ATM, but down the track game developers will make good use of the extra cores.

Hopefully sooner rather than later.

  15. #15
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
FFS CandymanCan, don't take me for a fool, k thx. I know exactly what you're trying to say, but not everyone is like you think.

This ain't no "my e-penis is bigger than yours" thread. I used those settings only because Major_A was using them. Of course I know that the lower the detail settings and resolution, the bigger my advantage, but why do you always have to stick to max quality settings anyway? It's just a matter of what kind of comparison you want to make. And since you're an AMD owner, it's only natural for you to want to compare at the highest detail settings so that your disadvantage is as small as possible, since the GPU becomes the bottleneck. So don't come in here saying I'm posting to show off "look at how 1337z0r my C2D 3.75GHz is and how it 0wnz yer AMD", since you're no better yourself in that case, trying to show off your 1337 shiny DX10 8800GTS card. I also assume you don't like people ranting about their hardware, especially C2D CPUs, but let me tell you, I hate that as much as you do. Again, the only reason those settings were used is that Major_A was using them; otherwise I'd have thought default or maxed-out settings would be an easier reference, so people don't have to check the screenshots closely to ensure all settings are the same, as otherwise the comparisons are invalid.

Quote Originally Posted by CandymanCan View Post
Also, btw, you do know, RPGWiZaRD, that an Opteron 170 is a dual core CPU, lol. Judging from what you're saying to Major_A, it sounds like you're saying he has a single core CPU?
Check my signature a bit closer, plz; I also have an Opteron 165 @ 2.8GHz setup. I would shoot myself, being an "Xtreme Addict", if I didn't know that an Opteron 170 was a dual core CPU. I mentioned the dual core support only to add to the point that this game is very CPU-dependent, and dual core support is a great thing I wanna see more of. I thought it was worth mentioning in case anyone was wondering whether it was supported or not. I basically comment on dual core support in all new PC game threads now, as many people are interested in what the CPU usage is like; I did it with BioShock, MoH and now WiC so far.

Quote Originally Posted by CandymanCan View Post
If you dropped 27% using 4x AA, then increasing that to 8x or higher, I assume you would drop another 27%, which means that 42 average would be only 19 FPS, meaning you're video card limited, and we're CPU limited. Guess you can't have it both ways, lol.
Sorry, but the GeForce 7 series only supports up to 4x AA, otherwise I'd have tested higher. And of course you will be either CPU or GPU limited in a game (or perhaps RAM limited). I never said I wasn't GPU limited; I took it for granted that, by saying I got an FPS drop from higher AA and AF settings, people here on the XS forums would read that as "he (me, that is) is GPU limited then", rather than me only saying "you're getting CPU limited" while I'm not. It's clear that if I'm not CPU limited, I'm GPU limited, so there's no reason to spell it out. And yes, I need a GPU upgrade to balance this rig properly, but I'm gonna wait for a while still, as I think my 7900GTO still does a decent job, and being a 19" CRT owner I stick to 1280x960. Perhaps later I'll upgrade to a 9800GTS or 2950 Pro or something like that; the future will decide. But at least I think I can wait a little longer and get even more performance for the same price.

Quote Originally Posted by CandymanCan View Post
I'm trying to prove a point that no matter how fast his CPU is, he won't beat a system with an 8800 in it; his FPS is what I'm getting with the details maxed, and his are on medium and only 4x AA.
Again, you're treating max detail settings as if they were the only valid benchmark scenario, but that's YOUR opinion. I have an advantage at lower detail settings and you at the highest, but I wouldn't automatically call either of us "the winner" unless one of us won in both scenarios. But again, this ain't a benchmark-score bragging thread, like Major_A pointed out; it's a performance comparison between different setups and hardware. I'm not only interested in GPU performance, since I'm a beta tester of PCSX2, the PlayStation 2 emulator for PC, which is HIGHLY CPU-dependent depending on the game; in most games, if your CPU and GPU performance are at least somewhat balanced, you'll easily get CPU limited and the GPU impact can even drop to 0%. Between my AMD @ 2.8GHz and Intel @ 3.75GHz setups there's usually a 71~77% FPS advantage, or a ~40% clock-for-clock advantage, for the Intel setup (comparing with the same 7900GTO card in both, of course). Why am I mentioning this? Because it shows not everyone has the same interests; CPU performance is highly interesting to me as well, so I just cannot accept your scenario where GPU performance is always the only right thing to bench. Again, I'm not trying to show off; it's important to me to stay objective and logical when posting on a forum.

Quote Originally Posted by CandymanCan View Post
Major_A, your scores are gonna be like mine, pretty much. My point here, RPGWiZaRD, is that I'm just curious what he will get with the details I have in those pics.
No prob, I'm here to contribute if you thought otherwise. Ask and ye shall receive. These are with the same settings CandymanCan used, but please remember it's 4x AA, as that's the max my 7900GTO allows, unfortunately. And these are of course with the 24/7 settings I always use; I keep any system I have running at its maxed-out stable 24/7 overclock, like in my sig.
Attached thumbnail: wic-benchmark5.jpg (benchmark screenshot)
    Last edited by RPGWiZaRD; 09-24-2007 at 07:54 AM.

  16. #16
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689
With DX10 rendering, does that move more of the load off the CPU onto the GPU? If I turn it off, will the CPU be used more?

  17. #17
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
No, basically DX10 makes a video card work harder with little to no visual difference (disclaimer: this is an over-generalization). When Far Cry and the nVidia GeForce 6x00 series came out, people wanted to see what the new HDR tech looked like. Well, as soon as you turned on HDR in Far Cry with a 6x00 card, the performance went to crap. With any new tech on the first generation of a card (DX10 and the 8x00/HD 2x00), performance will likely not be as good as on the second generation.

So in short: play in DX9 mode if you can.

  18. #18
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
Yeah, I'm sure DX10 was made with the future in mind. While first-gen cards may have trouble taking advantage of the DX10 features, it doesn't have to stay that way forever. Obviously cards have to reach a certain level of performance capability/complexity before the benefits of DX10 can start to kick in. It's been like that with many other features in hardware history, so I don't see any reason for DX10 to be different. Personally, I'm not the kind of guy who likes to buy first-gen stuff; I always prefer waiting for the more mature parts to arrive before buying. That's why I'm not buying an 8800GTS, why I skipped the 7800GT and waited for the 7900 series instead, and why I'm now waiting for the NVIDIA 9800 or ATI X2950 series. I believe choosing stuff from "the 2nd wave" pays off in the long run, and since new series appear so often these days, especially in the graphics card market, there's no long wait if you always stick to 2nd-gen stuff. Which is released how often these days? Every 1~1.5 years or so, perhaps, and I'm sure most of you wouldn't upgrade sooner unless you're rich and can afford more frequent upgrades.

Of course, it's a good thing not everyone thinks this way, or else NVIDIA and ATI/AMD would be cursing at their customers for holding back and generating no sales.
    Last edited by RPGWiZaRD; 09-24-2007 at 10:51 AM.

  19. #19
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
I was going to wait until the nVidia 9x00 generation too, but BioShock was making my 7600GT cry. I knew a ton of kickass games were coming out soon as well, so I bought the 8800GTS. No complaints so far, except that I'm still paying for the card.

  20. #20
    Xtreme Cruncher
    Join Date
    Feb 2007
    Posts
    594
C2D @ 3.1GHz, X1950XT 256MB @ 662/2052

Major_A's settings & max settings
Attached thumbnails: Resize of wic.jpg, Resize of wicmax.jpg (benchmark screenshots)

  21. #21
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689
Hey Xcel, what's that program in the corner monitoring GPU temps?
Looks like it might come in handy; can you link me to it?

cheers

  22. #22
    I am Xtreme
    Join Date
    Oct 2005
    Posts
    4,682
Run the benchmark with your visual settings at the absolute minimum, hehehe.

I did... that was hysterical...

  23. #23
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Houston, TX
    Posts
    1,196
I haven't had an ATi card since my 9800 Pro, but that is probably ATI Tray Tools or something ATi-specific.

  24. #24
    Xtreme Cruncher
    Join Date
    Feb 2007
    Posts
    594
Quote Originally Posted by disruptfam View Post
Hey Xcel, what's that program in the corner monitoring GPU temps?
Looks like it might come in handy; can you link me to it?

cheers
Yep, it's ATI Tray Tools. There is a similar program for Nvidia cards called nVidia Tray Tools, although I have no idea if it has the same features as ATT. I hate the slow loading of Catalyst Control Center, and besides, ATT comes with all the same settings as CCC plus much more. You turn on the OSD with a predefined hotkey; it comes in handy and displays the FPS, clock speeds, temps, time, used memory, resolution and renderer. You can configure it to show other stuff as well, such as CPU & HD temps.

The FPS displays in my pics are a bit messed up since the GPU isn't rendering once the test is over. Normally, during the test or in-game, it works just fine.

  25. #25
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689
Run the benchmark with your visual settings at the absolute minimum, hehehe.

I did... that was hysterical...
lol, I'm going to try that now
