
Thread: DX10 to use software AA resolve

  1. #51
    Banned
    Join Date
    Oct 2006
    Posts
    963
    dil, you can't twist this in favour of the 8800s, lol. They performed poorly overall when shader-based AA was enabled. Your logic on this one is flawed. Here's what the reviewer of that benchmarking session concluded: "Here's the chart we really care about most: average frame rates. The peak value you're able to achieve for a split second and the slow stutter you suffer for perhaps just a moment aren't nearly as important as the frame rate you maintain over time. The HD 2900 XT gives the 8800 GTX a good run for its money. Costing about $150 less, ATI's new card manages to equal the larger, more expensive, yet more power-efficient DX10 offering from Nvidia. At 1280x720 with no AA applied, it's even a good deal faster." So the HD 2900 XT competes with the GTX in DX10 and is far cheaper. I know which card I would recommend, although on another certain forum all advice about the HD was smeared by Nvidia lovers, leading many to buy the GTS 640 thinking it was the best bang for the buck. Not any more... here comes DX10...
    Last edited by purecain; 07-01-2007 at 02:39 PM.

  2. #52
    Xtreme Member
    Join Date
    Jun 2005
    Location
    Bulgaria, Varna
    Posts
    447
    Thanks!
    I've repeated the test pattern from the site, and I got a noticeably lower hit from enabling AA than what is shown there. Maybe it's because of the newer driver.

    But either way, G80 has [much] more pixel/texture-pushing power to put to good use here and there... in contrast to its lack of math-crunching muscle.

  3. #53
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    dil, you can't twist this in favour of the 8800s, lol. They performed poorly overall when shader-based AA was enabled. Your logic on this one is flawed.
    The performance issue isn't due to shader-based AA; the low performance hit shows that directly. The issue lies elsewhere, as made obvious by the larger hit incurred by the R600.

    I'd really like to hear the logic behind you trying to say otherwise; it's pretty obvious my statement rings true, and that's going by your own favorite benchmark. Care to explain why, if you want to tell me it was because shader AA was enabled, the HD 2900 XT took the larger hit?

  4. #54
    Banned
    Join Date
    Oct 2006
    Posts
    963
    The HD 2900 XT was the leader overall, beating the GTX. Is that good enough for you?
    More math means the HD 2900 XT's floating-point power pulls ahead, while the GTX lags behind, its AA hardware useless in the new tech...
    Last edited by purecain; 07-01-2007 at 02:44 PM.

  5. #55
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Ireland
    Posts
    940
    http://www.extremetech.com/article2/...2147119,00.asp

    Hmm... the 2900 XT wins in nearly all the cases here, and it's still on immature drivers... the 8800s are nearly 9 months old... the 2900 has something going for it... Isn't COJ an Nvidia-sponsored game? Isn't it meant to be optimized for the 8800s?

  6. #56
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by DilTech View Post
    The performance issue isn't due to shader-based AA; the low performance hit shows that directly. The issue lies elsewhere, as made obvious by the larger hit incurred by the R600.

    I'd really like to hear the logic behind you trying to say otherwise; it's pretty obvious my statement rings true, and that's going by your own favorite benchmark. Care to explain why, if you want to tell me it was because shader AA was enabled, the HD 2900 XT took the larger hit?
    There are many reasons why the hit shows up in that test:
    - accessing the HD
    - some Vista-related program running in the background for a brief moment
    - driver issues
    - etc.

    As someone else just posted, results will vary, and it appears that the performance hit you mention might be part of that larger "results will vary" picture. I am sure that if those tests were run again you would get different min and max values.
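    As a rough illustration of that point (the numbers below are made up, not taken from any of the linked benchmarks): a single hitch frame, say the OS touching the disk for a moment, tanks the minimum FPS while barely moving the average, which is why min/max figures shift from run to run while the average stays put.

    # Toy numbers only: 299 smooth 16 ms frames plus one 160 ms hitch frame.
    frame_times_ms = [16.0] * 299 + [160.0]

    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000.0 / max(frame_times_ms)   # worst single frame
    max_fps = 1000.0 / min(frame_times_ms)   # best single frame

    print(round(avg_fps, 1))   # 60.7 -> barely moved by the hitch (62.5 without it)
    print(round(min_fps, 1))   # 6.2  -> dominated entirely by that one frame
    print(round(max_fps, 1))   # 62.5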
    Last edited by Eastcoasthandle; 07-01-2007 at 02:57 PM.

  7. #57
    Xtreme Member
    Join Date
    Jun 2005
    Location
    Bulgaria, Varna
    Posts
    447
    By the way, there is another piece of functionality that R600 now offloads to the ALU array: all the cube-map and texture-projection lookups. In previous generations there was dedicated hardware in the texture units that handled those essentially for free, in contrast to the current situation.
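    For anyone wondering what that means in practice, here is a rough sketch (plain Python standing in for shader code, nothing R600-specific): a projective lookup is just an ordinary lookup with a per-pixel divide in front of it, and that divide is the part that used to be handled for free by the texture unit and now costs shader ALU work.

    # Illustrative only: tex2Dproj(s, (u, v, q)) is equivalent to tex2D(s, (u/q, v/q)).
    def tex2d(texture, u, v):
        # Nearest-neighbour fetch from a tiny 2x2 "texture" (list of rows).
        h, w = len(texture), len(texture[0])
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        return texture[y][x]

    def tex2d_proj(texture, u, v, q):
        # The per-pixel divide below is the work that moves onto the shader ALUs
        # when the texture unit no longer performs the projection itself.
        return tex2d(texture, u / q, v / q)

    shadow_map = [[0.0, 1.0],
                  [1.0, 0.0]]
    print(tex2d_proj(shadow_map, 1.5, 0.5, 2.0))   # same as tex2d(..., 0.75, 0.25) -> 1.0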

  8. #58
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Papu View Post
    http://www.extremetech.com/article2/...2147119,00.asp

    Hmm... the 2900 XT wins in nearly all the cases here, and it's still on immature drivers... the 8800s are nearly 9 months old... the 2900 has something going for it... Isn't COJ an Nvidia-sponsored game? Isn't it meant to be optimized for the 8800s?
    COJ is an ATi game, not an NVidia one.

    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    Also, the NVidia driver wasn't optimized for COJ. Remember, they made these changes to the game right before release (it was still optimized for the old engine methods), making the situation similar to what happened to the R600 in the Lost Planet demo.
    Last edited by DilTech; 07-01-2007 at 03:19 PM.

  9. #59
    Banned
    Join Date
    Oct 2006
    Posts
    963
    What's your point, dil? It changes nothing. The HD 2900 XT shouldn't compete with the GTX, but it does, and then some. The GTX architecture isn't built for shader-based AA resolve. As you so kindly pointed out on numerous occasions, it has dedicated AA hardware, and plenty of it; unfortunately that will be useless in DX10.

  10. #60
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Ireland
    Posts
    940
    Quote Originally Posted by DilTech View Post
    COJ is an ATi game, not an NVidia one.

    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    Also, the NVidia driver wasn't optimized for COJ. Remember, they made these changes to the game right before release (it was still optimized for the old engine methods), making the situation similar to what happened to the R600 in the Lost Planet demo.
    sorry my bad :P

  11. #61
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    What's your point, dil? It changes nothing. The HD 2900 XT shouldn't compete with the GTX, but it does, and then some. The GTX architecture isn't built for shader-based AA resolve. As you so kindly pointed out on numerous occasions, it has dedicated AA hardware, and plenty of it; unfortunately that will be useless in DX10.
    Correction: in some DX10 titles. Currently, we have only 3 DX10 titles to go by, and 2 of the 3 use hardware AA resolves... The only title that uses a shader AA resolve is COJ, which happens to be an ATi title.

    Kind of blows your entire theory out of the water, doesn't it?

    P.S. The 8800 GTS takes a smaller hit with shader AA resolves than the HD 2900 XT as well, going by those benchmarks, so obviously the G80 can handle shader AA resolves just fine. That's the whole point!
    Last edited by DilTech; 07-01-2007 at 03:45 PM.

  12. #62
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Um, there is no information to suggest that COJ is GITG (ATI's "Get in the Game" program). However, there is information that suggests COJ belongs to TWIMTBP (Nvidia's "The Way It's Meant to be Played").

  13. #63
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Eastcoasthandle View Post
    Um, there is no information to suggest that COJ is GITG (ATI's "Get in the Game" program). However, there is information that suggests COJ belongs to TWIMTBP (Nvidia's "The Way It's Meant to be Played").
    Did you not see that screenshot?

    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    That's all the information one needs!

    The box clearly states ATi. There's no mention of NVidia anywhere. It was originally a TWIMTBP game, but then they switched to ATi. The NZone France site just never took it down; it's been off the American NZone site for quite some time.

  14. #64
    Banned
    Join Date
    Oct 2006
    Posts
    963
    We've already been through this. I can't be bothered arguing, but hardware-based AA resolve won't be used, as it has issues with HDR and doesn't offer the programmability obtained through shader-based AA resolve.
    Just Google it and read up, as we can't keep going round in circles. My theory, and that of ATi, is a sound one. Unless you know something the tech people at ATi don't...
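    For anyone who hasn't followed the HDR point: here is a toy sketch with made-up numbers and a simple Reinhard-style tone map, showing why a plain linear average of HDR samples (what a fixed-function resolve does) gives a different, washed-out answer compared with resolving after tone mapping, which a programmable shader resolve can do.

    # Made-up 4x MSAA samples on an edge between a bright HDR light (8.0)
    # and a dark background (0.1), with a simple Reinhard tone map x/(1+x).
    samples = [8.0, 8.0, 0.1, 0.1]

    def tonemap(x):
        return x / (1.0 + x)

    # Fixed-function "linear" resolve: average the raw HDR values, then tone map.
    linear_resolve = tonemap(sum(samples) / len(samples))

    # Custom/shader resolve: tone map each sample first, then average.
    custom_resolve = sum(tonemap(s) for s in samples) / len(samples)

    print(round(linear_resolve, 2))   # 0.8  -> edge pixel ends up nearly as bright as the light
    print(round(custom_resolve, 2))   # 0.49 -> edge pixel is a proper mix of light and dark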

  15. #65
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by DilTech View Post
    Did you not see that screenshot?

    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    That's all the information one needs!

    The box clearly states ATi. There's no mention of NVidia anywhere. It was originally a TWIMTBP game, but then they switched to ATi. The NZone France site just never took it down; it's been off the American NZone site for quite some time.
    I am sorry, but you provided nothing more than advertising.

    "NVIDIA has a long standing relationship with Techland and their publisher Ubisoft. In fact, the original European version of Call Of Juarez that was launched in September 2006 is part of the "The Way Its Meant To Be Played" program. As a result of the support Techland and Ubisoft receives for being part of the "The Way Its Meant To Be Played" program, NVIDIA discovered that the early build of the game that was distributed to the press has an application bug that violates DirectX 10 specifications by mishandling MSAA buffers, which causes DirectX 10 compliant hardware to crash. Our DevTech team has worked with Techland to fix that and other bugs in the Call of Juarez code...
    Source

    You post a pic of an ATi sticker that anyone can put on the box. However, it's not so easy to add Nvidia's logo as the game is loading.

    How do you explain an ATi sticker on the box vs Nvidia's logo as the game starts? If all you have is an ATi sticker, I am sorry, but that's not enough to confirm that TWIMTBP is not being used here to some extent. I do not want to get off topic here, but the game is clearly intended to use TWIMTBP. Were any tweaks added to get the game to work properly under DX10? It's possible. However, this only proves that TWIMTBP and DX10 really don't mix well together, IMO. If that's true there is really no foul play here. And as you noted in that one example, the G80 is not taking much of a hit.

    Quote Originally Posted by DilTech View Post
    I don't think you caught what I said... Allow me to repeat it.

    COJ was originally a TWIMTBP title; that changed earlier this year. It released there almost a year before it did in America.
    Also, that's not a sticker. Read the fine print on that box; you'll see mentions of ATi/AMD's logos and none of NVidia's. It's all right there. Therefore, the American version, and the DX10 version that came with it, is NOT TWIMTBP, but ATI GITG.

    p.s. click the link you posted, scroll to the bottom, and see this image...

    Again, there is no proof of GITG; however, there is plenty to show TWIMTBP. Also, the in-game logo clearly marks this game as Nvidia's, something you have not addressed. Regardless of where the game was released or who is selling it, it is clear it's a TWIMTBP game that may or may not have been tweaked to run DX10 properly.
    The only thing you have proven (if you read the article) is that a benchmarking program was added. Again, please read the article, as it discusses the benchmarking program, and Techland clearly indicates that AMD is distributing the benchmark, not developing the game.
    Last edited by Eastcoasthandle; 07-01-2007 at 04:24 PM.

  16. #66
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    We've already been through this. I can't be bothered arguing, but hardware-based AA resolve won't be used, as it has issues with HDR and doesn't offer the programmability obtained through shader-based AA resolve.
    Just Google it and read up, as we can't keep going round in circles. My theory, and that of ATi, is a sound one. Unless you know something the tech people at ATi don't...
    Something you seem to be forgetting: these people don't just create engines for one game, they want them to be licensed, and for that they need performance. They always look for good trade-offs; if the performance increase is greater than the quality loss, they'll usually take it. Prime examples were the textures in Doom 3 and the things missing in HL2 (doorknobs in the form of flat textures on the doors)... If you take unnecessary performance hits, you're less likely to have your engine licensed.

    So, in practice, shader AA is capable of more accurate AA, but only in certain situations. Because of this, it's unlikely to be used too often until DX10.1, as DX10.1 is set to update a LOT about this method of AA.

    You still haven't answered why you keep saying the GTX/GTS cards don't perform with shader AA when they take a smaller hit than the ATi R600.

    Quote Originally Posted by Eastcoasthandle View Post
    I am sorry, but you provided nothing more than advertising.


    Source

    You post a pic of an ATi sticker that anyone can put on the box. However, it's not so easy to add Nvidia's logo as the game is loading


    I do not want to get off topic here, but the game is clearly intended to use TWIMTBP. Were any tweaks added to get the game to work properly under DX10? It's possible. However, this only proves that TWIMTBP and DX10 really don't mix well together, IMO.
    I don't think you caught what I said... Allow me to repeat it.

    COJ was originally a TWIMTBP title; that changed earlier this year. It released there almost a year before it did in America.
    Also, that's not a sticker. Read the fine print on that box; you'll see mentions of ATi/AMD's logos and none of NVidia's. It's all right there. Therefore, the American version, and the DX10 version that came with it, is NOT TWIMTBP, but ATI GITG.

    p.s. click the link you posted, scroll to the bottom, and see this image...

    Last edited by DilTech; 07-01-2007 at 04:04 PM.

  17. #67
    Banned
    Join Date
    Oct 2006
    Posts
    963
    OK, you are just happily going to ignore all the information available from devs and instead come to your own conclusion. That's fine, just inaccurate... Linear AA resolve is a thing of the past... DX10.1 isn't going to bring old features and methods back, not when they have a new solution... Holy jeebus...

  18. #68
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    OK, you are just happily going to ignore all the information available from devs and instead come to your own conclusion. That's fine, just inaccurate... Linear AA resolve is a thing of the past... DX10.1 isn't going to bring old features and methods back, not when they have a new solution... Holy jeebus...
    Never said they were; now you're merely reading incorrectly.

    I stated that DX10.1 contains upgrades to shader AA, one of which is designed to heavily boost performance when using shader AA. In fact, a majority of DX10.1 is upgrades to AA. Thus, since hardware AA is faster, it makes more sense to stick with it until DX10.1 comes out and makes shader AA more viable.

  19. #69
    Banned
    Join Date
    Oct 2006
    Posts
    963
    Link please, then we can elaborate on this a little further...
    Until then, here's what ATi has to say:
    we asked Richard Huddy, Worldwide Developer Relations Manager of AMD's Graphics Products Group, to go into more detail about why the Radeon HD 2000-series architecture has been optimised for shader-based AA rather than traditional multi-sample AA. He told us that 'with the most recent generations of games we've seen an emphasis on shader complexity (mostly more maths) with less of the overall processing time spent on the final part of the rendering process which is "the AA resolve". The resolve still needs to happen, but it's becoming a smaller and smaller part of the overall load. Add to that the fact that HDR rendering requires a non-linear AA resolve and you can see that the old fashioned linear AA resolve hardware is becoming less and less significant.' Huddy also explained that traditional AA 'doesn't work correctly [in games with] HDR because pixel brightness is non-linear in HDR rendering.'

    While many reviews of the HD 2900XT have made unflattering comparisons between it and Nvidia's GeForce 8800-series, Huddy was upbeat about AMD's new chip. 'Even at high resolutions, geometry aliasing is a growing problem that can only really be addressed by shader-based anti-aliasing. You'll see that there is a trend of reducing importance for the standard linear AA resolve operation, and growing importance for custom resolves and shader-based AA. For all these reasons we've focused our hardware efforts on shader horsepower rather than the older fixed-function operations. That's why we have so much more pure floating point horsepower in the HD 2900XT GPU than NVIDIA has in its 8800 cards... There's more value in a future-proof design such as ours because it focuses on problems of increasing importance, rather than on problems of diminishing importance.'
    Here's a review of the actual game in DX10 form: http://www.geforce3d.net/index/node/35?page=0%2C5
    Last edited by purecain; 07-01-2007 at 05:04 PM.

  20. #70
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    Link please, then we can elaborate on this a little further...
    This is like the 4th time I've posted this for you; the other 3 were on EOCF.

    http://www.elitebastards.com/cms/ind...1&limitstart=2

    One of the main improvements touted by Microsoft in DirectX 10.1 is improved access to shader resources - In particular, this involves better control when reading back samples from multi-sample anti-aliasing. In conjunction with this, the ability to create customised downsampling filters will be available in DirectX 10.1.
    Again looking towards improvements on the image quality front, DirectX 10.1 will also see the introduction of full application control over anti-aliasing. This will allow applications to control the usage of both multi-sample and super-sample anti-aliasing, as well as giving them the ability to choose sample patterns to best suit the rendering scenario in a particular scene or title. Finally, these changes in DirectX 10.1 give the application control over the pixel coverage mask, a mask which is used to help to quickly approximate sampling for an area of pixels. This in particular should prove to be a boon when anti-aliasing particles, vegetation, scenes with motion blur and the like. All of this additional control handed to the application could allow for anti-aliasing to be used much more wisely and effectively, and controlled by game developers themselves, rather than the current 'all or nothing' implementation available, which basically amounts to a simple on-off switch.

    To add further to the additional focus on anti-aliasing in DirectX 10.1, support for a minimum of four samples per pixel (in other words, 4x anti-aliasing) is now required (Although this doesn't necessarily mean that support for 2x anti-aliasing in hardware and drivers is a thing of the past).
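    To make that description a bit more concrete, here is a small sketch (plain Python, not any real Direct3D API) of the difference between the fixed box-filter resolve the hardware applies today and the kind of application-controlled, per-sample-weighted resolve the quoted DX10.1 changes describe; the sample values and weights are made up.

    def box_resolve(samples):
        # What a fixed-function hardware resolve effectively does: equal weights.
        return sum(samples) / len(samples)

    def custom_resolve(samples, weights):
        # An application-controlled resolve: arbitrary per-sample weights,
        # e.g. a narrower filter or weights derived from a coverage mask.
        total = sum(weights)
        return sum(s * w for s, w in zip(samples, weights)) / total

    pixel_samples = [0.9, 0.8, 0.2, 0.1]                  # 4x MSAA colour samples (made up)
    print(box_resolve(pixel_samples))                      # 0.5
    print(custom_resolve(pixel_samples, [3, 3, 1, 1]))     # 0.675, weighting the first two samples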

  21. #71
    Banned
    Join Date
    Oct 2006
    Posts
    963
    Please feel free to point out any part of what you've posted that is in Nvidia's favour... for I can't see any...

  22. #72
    Banned
    Join Date
    Oct 2006
    Posts
    963
    Here are the details... wade through them: http://v3.espacenet.com/textclam?DB=...N=US2006188163
    Remember, ATi's HD 2900 has more floating-point power than the 8800 GTX...
    Last edited by purecain; 07-01-2007 at 05:27 PM.

  23. #73
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    Please feel free to point out any part of what you've posted that is in Nvidia's favour... for I can't see any...
    It wasn't supposed to be in favor of Nvidia; it was pointing out why we probably won't see shader AA go mainstream until DX10.1: DX10.1 improves access to shader resources, which will allow more to be done with it and will increase performance when using shader AA. For now, it's faster performance-wise to use hardware AA.

    Quote Originally Posted by purecain View Post
    Here are the details... wade through them: http://v3.espacenet.com/textclam?DB=...N=US2006188163
    Remember, ATi's HD 2900 has more floating-point power than the 8800 GTX...
    I don't need shader AA explained to me...

    Remember, the floating-point power on the R600 is only that high when it's at 100% efficiency. The problem with the R600 design is that it's rarely at peak efficiency, usually less than 60% shader-wise.

    We can see clear as day that in COJ, which uses shader AA, the 8800s take a smaller hit using shader AA than the R600. So, again I ask: where are you going with this?
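    Back-of-envelope numbers to show what that efficiency argument means (the peak figures are the rough ones commonly quoted at the time, and the utilization percentages are purely illustrative assumptions, not measurements):

    # Rough, commonly quoted peaks; treat them as ballpark figures only.
    r600_peak_gflops = 320 * 0.742 * 2    # 320 stream processors x 742 MHz x 2 FLOPs (MADD) ~ 475
    g80_peak_gflops  = 128 * 1.35 * 2     # 128 scalar SPs x 1350 MHz x 2 FLOPs (MADD)     ~ 346

    # Assumed utilization for illustration: ~60% for R600's 5-wide units
    # (per the post above) vs ~90% for G80's scalar units.
    print(round(r600_peak_gflops * 0.60))   # ~285 GFLOPS effective
    print(round(g80_peak_gflops * 0.90))    # ~311 GFLOPS effective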

  24. #74
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by DilTech View Post
    It wasn't supposed to be in favor of Nvidia; it was pointing out why we probably won't see shader AA go mainstream until DX10.1: DX10.1 improves access to shader resources, which will allow more to be done with it and will increase performance when using shader AA. For now, it's faster performance-wise to use hardware AA.



    I don't need shader AA explained to me...

    Remember, the floating-point power on the R600 is only that high when it's at 100% efficiency. The problem with the R600 design is that it's rarely at peak efficiency, usually less than 60% shader-wise.

    We can see clear as day that in COJ, which uses shader AA, the 8800s take a smaller hit using shader AA than the R600. So, again I ask: where are you going with this?
    Which doesn't make it a GITG-based game after all, now does it?
    Also, it makes sense why the Nvidia logo pops up when you start COJ. All that has been proven here is that:
    - COJ is TWIMTBP-based
    - some adjustments were made to allow for shader AA in DX10 (again, I believe TWIMTBP doesn't mix with DX10)
    - ATI was allowed to distribute the benchmark for COJ
    - the performance hit can be linked to HD access, a Vista program running in the background, etc.
    - the HD 2900 XT overall does better in COJ than the G80 in Vista.
    No foul play at all from the evidence presented.
    Last edited by Eastcoasthandle; 07-01-2007 at 05:53 PM.

  25. #75
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Eastcoasthandle View Post
    Which doesn't make it a GITG-based game after all, now does it?
    Also, it makes sense why the Nvidia logo pops up when you start COJ. All that has been proven here is that:
    - COJ is TWIMTBP-based
    - some adjustments were made to allow for shader AA in DX10 (again, I believe TWIMTBP doesn't mix with DX10)
    - ATI was allowed to distribute the benchmark for COJ
    - the performance hit can be linked to HD access, a Vista program running in the background, etc.
    - the HD 2900 XT overall does better in COJ than the G80 in Vista.
    No foul play at all from the evidence presented.
    The ATi and AMD logos on the US box, plus the lack of an NVidia logo on the box, make it not a TWIMTBP title. The French release was TWIMTBP, but the French release wasn't the DX10 version; it was DX9, and it came out back in September. The American version isn't TWIMTBP, and neither is the DX10 upgrade.

    Why would ATi be the only ones allowed to distribute a benchmark of an NVidia-sponsored game? Why isn't CoJ on the American NZone page? It's on the French page because the French release back in September was TWIMTBP.
    http://www.nzone.com/object/nzone_tw...gameslist.html

    Does anyone here have the American version of CoJ? I don't think the beginning has the TWIMTBP logo for the US version...
    Last edited by DilTech; 07-01-2007 at 06:03 PM.
