
Thread: NVIDIA Says AMD Reduced Image Quality Settings HD 6800 Series For Better Performance

  1. #101
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by slaveondope View Post
    I've got your IQ control right here.....



    Haven't seen a jaggy line yet.
    Hahaha, that's what I call an AA-technique without sacrificing performance.

    The rest of the thread's discussion is so zZZzzZZzzZZz.... Move on people.
    Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  2. #102
    Registered User
    Join Date
    Dec 2008
    Posts
    38
    It's not like NVIDIA hasn't been guilty of doing similar shady things in the past. What's the problem?

  3. #103
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by SKYMTL View Post
    On the left is a GPU rendering the scene. On the right is the software renderer, which by its very nature doesn't have ANY optimizations.
    It's a reference renderer which defines exactly what the scene should look like?

    Software rasterization by its very nature uses the same optimizations as hardware rasterization does - depending on the method used, of course. The only difference between the two is that the CPU does exactly the same work that the ROPs and TMUs do hardwired, and at the same time the CPU emulates the shaders.

  4. #104
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Calmatory View Post
    It's a reference renderer which defines exactly what the scene should look like?

    Software rasterization by its very nature uses the same optimizations as hardware rasterization does - depending on the method used, of course. The only difference between the two is that the CPU does exactly the same work that the ROPs and TMUs do hardwired, and at the same time the CPU emulates the shaders.
    From my understanding, they tried to use a method that is independent of the AMD / NVIDIA driver stacks.
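    For anyone who wants to sanity-check that kind of comparison themselves, here is a minimal sketch of diffing a GPU screenshot against a reference render. The file names are hypothetical, and it assumes both captures share the same resolution and framing:

    ```python
    # Minimal sketch: compare a GPU screenshot against a reference (software) render.
    # File names are hypothetical; both images must be the same size and framing.
    import numpy as np
    from PIL import Image

    gpu = np.asarray(Image.open("gpu_render.png").convert("RGB"), dtype=np.int16)
    ref = np.asarray(Image.open("reference_render.png").convert("RGB"), dtype=np.int16)

    diff = np.abs(gpu - ref)                                   # per-channel absolute error
    print("mean abs error per channel:", diff.mean(axis=(0, 1)))
    print("pixels off by more than 8/255:", round((diff.max(axis=2) > 8).mean() * 100, 2), "%")

    # Save an amplified difference map so filtering/AA deviations become visible.
    Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff_map.png")
    ```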

  5. #105
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Budaors, Hungary.
    Posts
    143
    Quote Originally Posted by Vardant View Post
    So you obviously have proof, if you're so adamant about this, even though NV admitted putting it in and said it's off by default and can't be turned on through CP, right?
    So you think that after they went through the process of developing, testing and implementing their own version in ForceWare, and of course formally asking reviewers to kindly test with it on, NVIDIA will disable it (losing about ~10% of free performance in selected titles) for end users, who have no visible indication, either in the ForceWare CP or in the games themselves, of what the status of this optimization is? We should believe this because they said so?

    Actually, an article testing the last 2 or 3 drivers from NVIDIA with the switching utility would be nice, to see what's what. Any reviewers up for it?

    If I'm proven wrong, shame on me. On the other hand, if they are pulling a "Hey, look!" pointing in the other direction while doing the same thing, then shame on them.

    Call me what you want, but I'm skeptical... the situation kind of reminds me of the good ol' 3DMark03 times.

    "We are going to hell, so bring your sunblock..."

  6. #106
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Aarhus, Denmark
    Posts
    314
    I was a bit surprised when I replaced my Radeon HD5870 with a GTX 580. Surprised at the difference in image quality. I went from a GTX 7900 ( I think ) --> Radeon 9800 --> 4870 --> 4870x2 --> 5870 and then a GTX 580.
    I switched to a GTX 580 due to the "lack" of general performance increase in the 6000 series and also due to the debate about image quality.
    So when I first fired up the GTX 580 to play some COD Black Ops I was really surprised at what I saw. I didn't expect any major difference going in, but boy was I surprised. The image is just so much sharper and the colors so much fuller. Maybe there is a better technical explanation, but I'm just describing what I see on the screen. Thinking that it could just be COD Black Ops, I tried another game... Left 4 Dead 2. I play L4D2 a lot, and since it's not a TWIMTBP title (and Steam complains about my graphics adapter being unknown) I didn't expect much difference to show. But boy was I wrong AGAIN!!! The same thing goes... a much, much sharper picture, more vibrant colors etc., and I'm talking about a game I have logged hundreds and hundreds of hours in, playing on high-end Radeon hardware. No matter what game I try it's the same story: much better image quality and more vibrant colors. You could argue that it's just a matter of adjusting settings in the control panel, and it could be, but since I just got the GTX 580 I haven't changed any settings in the control panel and it's at default settings.
    I'm a bit surprised about this and also a bit disappointed, because since the GTX 7xxx days and the GeForce FX + 3DMark 2003 days ATI has always been a guarantee of image quality for me. I hope that AMD/ATI will find their way back onto the right path and not play these cat-and-mouse games where they do AF correctly in an AF tester (when it's detected) but differently in games, etc. Picture quality is more important to me than FPS, and I hope AMD will realize that too. So what if they lose 10% performance by rendering things the right way; it's a lot better than the bad press they get over this.
    AMD Ryzen 9 5900X
    ASRock Radeon RX 7900 XTX Phantom Gaming OC
    Asus ROG Strix B550-F Gaming Motherboard
    Corsair RM1000x SHIFT PSU
    32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)

    1x WD Black SN850 1 TB
    1 x Samsung 960 250 GB
    2 x Samsung 860 1 TB
    1x Seagate 16 TB HDD

    Dell G3223Q 4K UHD Monitor
    Running Windows 11 Pro x64 Version 23H2 build 22631.2506

    Smartphone : Samsung Galaxy S22 Ultra

  7. #107
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by sutyi View Post
    We should believe this because they said so?
    No, we should believe you instead: a random guy on the internet without a shred of evidence.

  8. #108
    Xtreme Member
    Join Date
    Apr 2006
    Location
    Dresden
    Posts
    139
    Quote Originally Posted by Toysoldier View Post
    I was a bit surprised when I replaced my Radeon HD5870 with a GTX 580. Surprised at the difference in image quality. I went from a GTX 7900 ( I think ) --> Radeon 9800 --> 4870 --> 4870x2 --> 5870 and then a GTX 580.
    I switched to a GTX 580 due to the "lack" of general performance increase in the 6000 series and also due to the debate about image quality.
    So when I first fired up the GTX 580 to play some COD Black Ops I was really surprised at what I saw. I didn't expect any major difference going in, but boy was I surprised. The image is just so much sharper and the colors so much fuller. Maybe there is a better technical explanation, but I'm just describing what I see on the screen.
    That's just absolute nonsense and based on a placebo effect at best. There is no difference in colours and sharpness whatsoever.

    About the AF issue: Hardly anyone would notice a difference in a blind test that does not pick particular scenes. However, I do agree that AMD has to improve its AF quality.

  9. #109
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Budaors, Hungary.
    Posts
    143
    Quote Originally Posted by trinibwoy View Post
    No, we should believe you instead: a random guy on the internet without a shred of evidence.



    Joke aside, I'm still waiting for someone to do some testing on NVIDIA's FP16 demotion in current drivers, but nobody seems to care.

    "We are going to hell, so bring your sunblock..."

  10. #110
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Katzenschleuder View Post
    That's just absolute nonsense and based on a placebo effect at best. There is no difference in colours and sharpness whatsoever.
    You are incorrect.

    The default settings in the respective control panels for NVIDIA and AMD cause a difference in the overall color saturation, sharpness and gamma. AMD's settings tend to be on the cooler side of the spectrum while NVIDIA's are slightly warmer.

    You wouldn't see this if you went from one AMD product to another or one NVIDIA card to another. However, it is quite evident when going from one company to another.

    From my experience based on switching out cards on a regular basis, the statement you quoted is bang on.

  11. #111
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Quote Originally Posted by SKYMTL View Post
    You are incorrect.

    The default settings in the respective control panels for NVIDIA and AMD cause a difference in the overall color saturation, sharpness and gamma. AMD's settings tend to be on the cooler side of the spectrum while NVIDIA's are slightly warmer.

    You wouldn't see this if you went from one AMD product to another or one NVIDIA card to another. However, it is quite evident when going from one company to another.

    From my experience based on switching out cards on a regular basis, the statement you quoted is bang on.
    This is true.

    Though from my experience it's mostly saturation. Not sure about sharpness; that might have something to do with the quality settings on both brands, which differ in default mode as you mentioned.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  12. #112
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Quote Originally Posted by sutyi View Post
    Joke aside, I'm still waiting for someone to do some testing on NVIDIA's FP16 demotion in current drivers, but nobody seems to care.
    So someone should investigate a problem you've made up, on the basis that 'NVIDIA is evil, so it must be true'?

    Keep wondering why nobody cares.

  13. #113
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    Quote Originally Posted by Toysoldier View Post
    I was a bit surprised when I replaced my Radeon HD5870 with a GTX 580. Surprised at the difference in image quality. I went from a GTX 7900 ( I think ) --> Radeon 9800 --> 4870 --> 4870x2 --> 5870 and then a GTX 580.
    I switched to a GTX 580 due to the "lack" of general performance increase in the 6000 series and also due to the debate about image quality.
    So when I first fired up the GTX 580 to play some COD Black ops I was really surprised at what I saw. I didn't expect any major difference going in but boy was I surprised. The image is just so much sharper and the colors so much fuller. Maybe there is a better technical explanation, but I'm just describing what I see on he screen. Thinking that it could just be COD Black ops I tried another game ... Left 4 Dead 2. I play L4D2 a lot and since it's not a TWIMTBP title (and steam complains about my Graphics adapter being unknown) I didn't expect much difference to show. But boy was I wrong AGAIN !!! The same thing goes ... much much sharper picture, more vibrant colors etc. and I'm talking about a game I have logged hundreds and hundreds of hours in playing on high end Radeon Hardware. No matter what game I try it's the same story, much better image quality and more vibrant colors. You could argue that it's just a matter of adjusting settings in the control panel and it could be, but since I just got the GTX 580 I haven't changed any settings in the control panel and it's at default settings.
    I'm a bit surprised about this and also a bit disappointed, because since the GTX 7xxx days and the GeForce FX + 3DMark 2003 days ATI has always been a guarantee for image quality for me. I hope that AMD/ATI will find their way onto the right path again and not play these games of cat and mouse where they do AF correctly in AF tester (when it's detected) but differently in games etc.. Picture quality is more important to me than FPS and I hope AMD will realize that too. So what if they loose 10% performance by rendering things the right way, it's a lot better than the bad press they get on this.
    As SKYMTL said, this difference in color and sharpness would have everything to do with these companies using different color profiles (warmth, saturation and all that), so by touching a few sliders you could obtain either one's look.
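    As a rough illustration of what those sliders amount to (the values and file names here are purely made up, not either vendor's actual profile), something like this nudges saturation and gamma on a screenshot:

    ```python
    # Rough sketch, not a vendor profile: adjust saturation and gamma on a capture.
    from PIL import Image, ImageEnhance
    import numpy as np

    img = Image.open("screenshot.png").convert("RGB")

    # Saturation: >1.0 looks more "vibrant", <1.0 more muted.
    img = ImageEnhance.Color(img).enhance(1.10)

    # Gamma: <1.0 brightens mid-tones, >1.0 darkens them.
    gamma = 0.95
    arr = np.asarray(img, dtype=np.float32) / 255.0
    Image.fromarray((np.power(arr, gamma) * 255.0).astype(np.uint8)).save("adjusted.png")
    ```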

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  14. #114
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Budaors, Hungary.
    Posts
    143
    Quote Originally Posted by Vardant View Post
    So someone should investigate a problem you've made up, on the basis that 'NVIDIA is evil, so it must be true'?

    Keep wondering why nobody cares.
    Oh yeah nVIDIA is most definitely the Green Goblin of graphics.

    "We are going to hell, so bring your sunblock..."

  15. #115
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    As mentioned in the other topic:

    Quote Originally Posted by XSAlliN View Post
    Please don't post any "so called news" like this anymore...

    nVidia CEO says ATi sux

    AMD CEO says nVidia sux

    A CEO or other representative of one company bashing another company - that's not news... just a political strategy appreciated by haters.
    If you think there is some truth to this, XS has a section at Off-Topic -> Tech Talk... but don't trash the news section with their politics, intended for bashing each other by trash-talking products from an "adversary (as in competition)".

  16. #116
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    211
    Quote Originally Posted by XSAlliN View Post
    As mentioned in the other topic:



    If you think there is some truth to this, XS has a section at Off-Topic -> Tech Talk... but don't trash the news section with their politics, intended for bashing each other by trash-talking products from an "adversary (as in competition)".


    Spot on.

  17. #117
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,341
    Quote Originally Posted by Toysoldier View Post
    I was a bit surprised when I replaced my Radeon HD5870 with a GTX 580. Surprised at the difference in image quality. I went from a GTX 7900 ( I think ) --> Radeon 9800 --> 4870 --> 4870x2 --> 5870 and then a GTX 580.
    I switched to a GTX 580 due to the "lack" of general performance increase in the 6000 series and also due to the debate about image quality.

    So please tell me, what lack of performance increase are you referring to? You mean the update of the 57xx series, which is now the 68xx series... To me it looks like you just didn't read any decent review and didn't search enough threads here to know that you should wait a bit longer to purchase a card, certainly a 580, unless you are a real NV fanboy..... Within 2 weeks it will be $150 cheaper. Do these 2 smileys also look green and blue on your new NV 580, or do they have much brighter colors now.......

    Now to the point: yes, you could see a difference, but that has more to do with profile settings, drivers and cards than anything else, just like monitors have a big influence...
    Last edited by duploxxx; 11-23-2010 at 11:53 AM.
    Quote Originally Posted by Movieman View Post
    Fanboyitis..
    Comes in two variations and both deadly.
    There's the green strain and the blue strain on CPU.. There's the red strain and the green strain on GPU..

  18. #118
    Xtreme Member
    Join Date
    Apr 2006
    Location
    Dresden
    Posts
    139
    Quote Originally Posted by SKYMTL View Post
    You are incorrect.

    The default settings in the respective control panels for NVIDIA and AMD cause a difference in the overall color saturation, sharpness and gamma. AMD's settings tend to be on the cooler side of the spectrum while NVIDIA's are slightly warmer.

    You wouldn't see this if you went from one AMD product to another or one NVIDIA card to another. However, it is quite evident when going from one company to another.

    From my experience based on switching out cards on a regular basis, the statement you quoted is bang on.
    This discussion here is about anisotropic filtering quality. So it is just ridiculous to state in this topic that IHV X delivers "much sharper" images with "colors so much fuller" than IHV Y.

    Yeah, there are some differences in the default (uncalibrated) video signal profiles. SO WHAT?!

    Argh... I really should stop wasting my time here!

  19. #119
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Aarhus, Denmark
    Posts
    314
    Quote Originally Posted by duploxxx View Post
    So please tell me, what lack of performance increase are you referring to? You mean the update of the 57xx series, which is now the 68xx series... To me it looks like you just didn't read any decent review and didn't search enough threads here to know that you should wait a bit longer to purchase a card, certainly a 580, unless you are a real NV fanboy..... Within 2 weeks it will be $150 cheaper. Do these 2 smileys also look green and blue on your new NV 580, or do they have much brighter colors now.......

    Now to the point: yes, you could see a difference, but that has more to do with profile settings, drivers and cards than anything else, just like monitors have a big influence...
    Compared to my old HD5870 I don't think the 6870 was that big a deal and yes I did read reviews and lots of them. You can mock me all you want, but saying I'm a NV fanboy after all the ATI cards I have owned is just plain stupid on your part. And why should I wait ? If I have the money to burn why shouldn't I buy a new graphics card whenever I want to ?
    You can turn and twist it all you want, there is a difference and in some cases it's HUGE and judging by your comments you didn't read what I wrote.

    Fact of the matter is I didn't write what I did to taunt anyone or to mock anyone, but simply to report what I observed going from a long line of Radeon cards to an NVIDIA card.
    Last edited by Toysoldier; 11-23-2010 at 01:46 PM.
    AMD Ryzen 9 5900X
    ASRock Radeon RX 7900 XTX Phantom Gaming OC
    Asus ROG Strix B550-F Gaming Motherboard
    Corsair RM1000x SHIFT PSU
    32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)

    1x WD Black SN850 1 TB
    1 x Samsung 960 250 GB
    2 x Samsung 860 1 TB
    1x Seagate 16 TB HDD

    Dell G3223Q 4K UHD Monitor
    Running Windows 11 Pro x64 Version 23H2 build 22631.2506

    Smartphone : Samsung Galaxy S22 Ultra

  20. #120
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Aarhus, Denmark
    Posts
    314
    Quote Originally Posted by Katzenschleuder View Post
    That's just absolute nonsense and based on a placebo effect at best. There is no difference in colours and sharpness whatsoever.

    About the AF issue: Hardly anyone would notice a difference in a blind test that does not pick particular scenes. However, I do agree that AMD has to improve its AF quality.
    That's absolute nonsense. There is a difference in the default settings on an NV card vs an ATI card. You should try using a Radeon card for some weeks and then switch to an NV card and you would see for yourself. I'm not saying the ATI card can't be made to look like the NVIDIA card, but it doesn't by default, far from it. And when ATI starts to detect certain AF test programs and then uses a better AF method than the driver normally does, that's when things start to get out of control.
    AMD Ryzen 9 5900X
    ASRock Radeon RX 7900 XTX Phantom Gaming OC
    Asus ROG Strix B550-F Gaming Motherboard
    Corsair RM1000x SHIFT PSU
    32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)

    1x WD Black SN850 1 TB
    1 x Samsung 960 250 GB
    2 x Samsung 860 1 TB
    1x Seagate 16 TB HDD

    Dell G3223Q 4K UHD Monitor
    Running Windows 11 Pro x64 Version 23H2 build 22631.2506

    Smartphone : Samsung Galaxy S22 Ultra

  21. #121
    Xtreme Member
    Join Date
    Apr 2006
    Location
    Dresden
    Posts
    139


    Quote Originally Posted by Toysoldier View Post
    That's absolute nonsense. There is a difference in the default settings on an NV card vs an ATI card. You should try using a Radeon card for some weeks and then switch to an NV card and you would see for yourself. I'm not saying the ATI card can't be made to look like the NVIDIA card, but it doesn't by default, far from it. And when ATI starts to detect certain AF test programs and then uses a better AF method than the driver normally does, that's when things start to get out of control.
    All right. Then let us determine this in a controlled test.

    Which screenshot has been taken by an RV870 and which by a GF110?
    http://img94.imageshack.us/img94/6471/dirt2a.jpg
    http://img220.imageshack.us/img220/48/dirt2b.jpg
    http://img819.imageshack.us/img819/8263/metaf.jpg
    http://img254.imageshack.us/img254/4315/metb.jpg
    http://img38.imageshack.us/img38/2775/mwabz.jpg
    http://img202.imageshack.us/img202/7807/mwby.jpg
    http://img841.imageshack.us/img841/7946/vana.jpg
    http://img534.imageshack.us/img534/1826/vanbi.jpg

    Oh and please don't forget to show us where exactly "The image is just so much sharper and the colors so much fuller." based on these comparisons!

    PS.: Note that the image quality settings used here are equal to Radeon 5870 default and GeForce 580 default as you stated in your comparison.
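    For anyone who wants to run this as a proper blind test, here is a small sketch (file names are hypothetical) that randomizes which card's shot gets labelled A or B for each scene and keeps the answer key in a separate file:

    ```python
    # Sketch of a blind A/B setup: copy screenshot pairs under anonymous names.
    import csv
    import random
    import shutil

    # Hypothetical file names: each pair is the same scene captured on both cards.
    pairs = [("dirt2_rv870.png", "dirt2_gf110.png"),
             ("metro_rv870.png", "metro_gf110.png")]

    with open("answer_key.csv", "w", newline="") as key:
        writer = csv.writer(key)
        writer.writerow(["blind_name", "source"])
        for i, pair in enumerate(pairs):
            shuffled = list(pair)
            random.shuffle(shuffled)          # hide which card is A and which is B
            for label, src in zip("AB", shuffled):
                blind = f"scene{i}_{label}.png"
                shutil.copyfile(src, blind)   # viewers only ever see the blind names
                writer.writerow([blind, src])
    ```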
    Last edited by Katzenschleuder; 11-23-2010 at 03:44 PM.

  22. #122
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    First Dirt 2 shot is the 870, second the GTX 460 - comparison not possible because the 460 lacks tessellation.
    Second game - don't see a difference.
    MW - no clue (gamma difference?); can't compare IQ due to different brightness settings.
    Vantage: both screenshots show banding in certain areas...
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  23. #123
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,341
    Quote Originally Posted by Toysoldier View Post
    Compared to my old HD5870 I don't think the 6870 was that big a deal and yes I did read reviews and lots of them. You can mock me all you want, but saying I'm a NV fanboy after all the ATI cards I have owned is just plain stupid on your part. And why should I wait ? If I have the money to burn why shouldn't I buy a new graphics card whenever I want to ?
    You can turn and twist it all you want, there is a difference and in some cases it's HUGE and judging by your comments you didn't read what I wrote.

    Fact of the matter is I didn't write what I did to taunt anyone or to mock anyone, but simply to report what I observed going from a long line of Radeon cards to an NVIDIA card.
    The comment I provided was about you pointing out that there is no added value going from a 5870 to a 6870 ---> of course not, that is the whole point; the 6870 is a new price range in the market, it is not intended to replace the 58xx series. You still don't get it... and if you already own a 5870, I can't think of any reason to spend another $500+ on a new card without waiting 2-3 more weeks. That is what I call a consumer.
    Quote Originally Posted by Movieman View Post
    Fanboyitis..
    Comes in two variations and both deadly.
    There's the green strain and the blue strain on CPU.. There's the red strain and the green strain on GPU..

  24. #124
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Katzenschleuder View Post
    All right. Then let us determine this in a controlled test.

    Which screenshot has been taken by an RV870 and which by a GF110?
    http://img94.imageshack.us/img94/6471/dirt2a.jpg
    http://img220.imageshack.us/img220/48/dirt2b.jpg
    http://img819.imageshack.us/img819/8263/metaf.jpg
    http://img254.imageshack.us/img254/4315/metb.jpg
    http://img38.imageshack.us/img38/2775/mwabz.jpg
    http://img202.imageshack.us/img202/7807/mwby.jpg
    http://img841.imageshack.us/img841/7946/vana.jpg
    http://img534.imageshack.us/img534/1826/vanbi.jpg

    Oh and please don't forget to show us where exactly "The image is just so much sharper and the colors so much fuller." based on these comparisons!

    PS.: Note that the image quality settings used here are equal to Radeon 5870 default and GeForce 580 default as you stated in your comparison.
    I have to admit this is a very good, informative rebuttal. I now have to wonder how the IQ will look with MLAA+EQAA.
    Last edited by Eastcoasthandle; 11-23-2010 at 04:04 PM.

  25. #125
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    One complaint about ATI I've had for a while is that their games are substantially darker for the same in-game Source gamma setting. It made some dark, gritty HL2 mods hard to play.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.
