
Thread: Intel Q9450 vs Phenom 9850 - ATI HD3870 X2

  1. #176
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Hornet331 View Post
sorry for the bad phrasing, i was not laughing about his work. I was laughing about myself for doing more work than necessary.

btw, Jack, you really have two 4870 X2s incoming?
    Ummmmm, yup.
One hundred years from now it won't matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
-- from "Within My Power" by Forest Witcraft

  2. #177
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
Just test these new cards that you have ordered. I wouldn't be surprised if you see very similar FPS at 1680x1050; AMD would probably be a little higher at 1920x1200.
AMD would also have higher minimum FPS. I know it can differ depending on which race track is used; if it is a track that isn't heavy, this will favor Intel. If the memory in the computer is slow, this will hurt AMD more than Intel.

EDIT: You don't need to test two 4870X2 cards; one is enough. Why did you buy two when you only(?) play at 1920x1200?
Here is my prediction... these cards are strong enough that there will be no GPU bottleneck; AMD will be 10-30% behind in all games at 1920x1200...

Seriously, look at this link: http://www.xtremesystems.org/forums/...d.php?t=198126

Then go through each review and check what CPU they used to test this card... all of them, all of them use an Intel processor. Is this a marketing ploy by Intel? Or is it because when you review a GPU you want the fastest CPU you can get, to avoid any problems with the CPU holding back FPS?

    Here is an example past and present:

    Legit Reviews 4870 X2: http://www.legitreviews.com/article/766/3/
    Legit Reviews 1900XTX: http://www.legitreviews.com/article/293/3/

    Firingsquad 4870 X2: http://www.firingsquad.com/hardware/...view/page3.asp
    Firingsquad 1900XTX: http://www.firingsquad.com/hardware/..._xtx/page4.asp

In fact, google the Radeon 1900 XTX reviews from when it launched and check the 'test setup' sections of all the reviews; all of them are AMD... why is this? Is it some kind of marketing scam?

Nope... these guys are just doing their jobs, reviewing cards for their performance, and to do that the reference platform must use the fastest gaming CPU of the time.

Even Anandtech (whom many people try to discredit as Intel-paid pumpers) does the same thing...

    Anand's example: http://www.anandtech.com/video/showdoc.aspx?i=2695&p=7

All the indicators in the data I have seen on this card are that the Phenom will bottleneck a 4870X2... in some cases even as high as 2560x1600. This is not a jab at the Phenom (again, it is a fine CPU); it's just being pragmatic because, despite your concerns that I do not know what I am talking about, I do in fact understand how the hardware works... all too well.

    Jack

  3. #178
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by JumpingJack View Post
    Then go through each review and check what CPU they used to test this card....all of them, all of them use an Intel processor. Is this a marketing ploy by Intel? Or is it because when you review a GPU you want the fastest CPU you can use to avoid any problems with the CPU holding back FPS?
This is a very interesting discussion. Why are ALL of them using Intel?

When I read a test I want information that is usable for me. All these tests use almost exactly the same hardware (and there are some AMD users out there). Is this informative for the readers?
As far as the processor goes, if you have read one review you have read them all; the only differences between reviews are performance in different games and maybe how the card compares to other video cards.

Now, how do these sites finance their work? Most if not all of them carry advertising for hardware. In the EU, Intel is being sued for paying rebates to hardware sellers for not selling AMD processors. I don't think sites that test hardware want to upset the companies their income comes from.
Even if it were as you say, that Intel performs better at high res, it is very strange that ALL of them use Intel when testing games. They also use very fast processors, even though most of us know the processor doesn't have that big an impact on gaming.
Now, you may not buy this conclusion if you are an Intel fanboy. But there is another type of test that is VERY hard to find: one that tests how different processors perform with games at high settings, or that compares slow and fast processors at high res. Processor tests almost always run at low res or at game settings that aren't very high. You get the feeling that you need a very fast processor because the difference looks big.

This is a money thing; these sites need hardware sellers to advertise, and I think they want the reader of a review to come away feeling that he or she needs fast hardware. They also want to make the review interesting, and graphs where all the bars look almost the same aren't much fun to look at.
AMD with its AM2 socket isn't something hardware makers like (I think): you don't need to buy as much hardware. AMD is also the reason hardware prices have been lower. Now that this platform exists it is much easier to upgrade; before, I needed to buy about 5 computers every year, now I only need to buy 3, since I can shift hardware between computers.

If you look at these reviews from a consumer's perspective, they are bad.
    Last edited by gosh; 08-14-2008 at 05:58 AM.

  4. #179
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Gosh -- you missed the point.

Google Radeon 1900XTX, go to any review you want, and check the brand of the CPU they used for the review. The 1900XTX was launched and reviewed before Core 2 Duo was introduced. Back then it was the A64 vs the P4... we all know who wins that matchup. Why would every reviewer under the sun want to use the fastest CPU of the time for a graphics card review? This is the crux of the discussion. Did AMD pay them to do this? Doubtful. Did Intel have the money to pay them not to? Most definitely. Then why did they all use AMD CPUs?

This conspiracy theory is not healthy to maintain. It just isn't true.

Trust me, ATI's graphics division did not want them to use Phenom CPUs for a comparative review against nVidia; all it would have shown is that in most cases the 4870 runs neck and neck with 3 or 4 other cards, mostly nVidia. The 4870 X2 would have been a tie, not a win.

  5. #180
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by JumpingJack View Post
    Gosh -- you missed the point.

Google Radeon 1900XTX, go to any review you want, and check the brand of the CPU they used for the review. The 1900XTX was launched and reviewed before Core 2 Duo was introduced. Back then it was the A64 vs the P4... we all know who wins that matchup.
No, I didn't. You know AMD was very strong among those who built their own computers a couple of years back. They had a big market share among home users, and maybe that processor was the big money maker for hardware sellers.
The impact the internet had then wasn't as big as it is now. When Anandtech presented numbers for Nehalem, the response was enormous. More and more companies use the internet as their main marketing arena; I don't think Intel's marketing department skips this area.
It isn't hard to draw these conclusions if you think about it. Quad-core users are very few compared to other users. This market is also not comparable with other markets; it is hard to find a market with only two contenders worldwide. If Intel and AMD hadn't been US companies, I can assure you action would have been taken, because we all need competition.

It is also interesting that reviews in English differ somewhat from reviews in other languages. German sites seem to be much more positive toward AMD than US sites are.

  6. #181
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    178
    Quote Originally Posted by gosh View Post
It is also interesting that reviews in English differ somewhat from reviews in other languages. German sites seem to be much more positive toward AMD than US sites are.
Sorry, I can't share your opinion. German reviewers are not more or less positive than US sites. Maybe we put more weight on price/performance, and we don't take the attitude that a 5% performance penalty makes the difference between the worst piece of hardware around and the world's best super-duper silicon. Everything else is the same.

But I have found that some German software actually does very well on the Phenom compared to an Intel rig, especially the Nero suite (video encoding).

And at least some forum members (in different forums) still think there is an unfair penalty for AMD processors due to the Intel compiler in some benchmark and software cases.

    @Jack

Just wanted to remind you about clarifying the numbers given by Overclockers Club with your test systems:

    http://www.overclockersclub.com/revi...l_q9450/12.htm
    http://www.overclockersclub.com/revi...l_q9450/14.htm
    http://www.overclockersclub.com/revi...el_q9450/8.htm
    Last edited by Boschwanza; 08-14-2008 at 06:47 AM.

  7. #182
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by Boschwanza View Post
Sorry, I can't share your opinion. German reviewers are not more or less positive than US sites. Maybe we put more weight on price/performance, and we don't take the attitude that a 5% performance penalty makes the difference between the worst piece of hardware around and the world's best super-duper silicon.
Sorry, that is what I meant. I have only read a little in German forums, and the discussion there is much more grounded in reality.

Finding a review that uses AMD with high-performance video cards at high res is impossible. The Overclockers Club test is the only one I have found. Other than that I have seen some results from forum members; it is always difficult to know exactly, but they have all pointed in the same direction.

The tests that Jack did showed the same:
    http://www.xtremesystems.org/forums/...5&postcount=33
    http://www.xtremesystems.org/forums/...9&postcount=35
    http://www.xtremesystems.org/forums/...1&postcount=62
    http://www.xtremesystems.org/forums/...9&postcount=73
    Last edited by gosh; 08-14-2008 at 06:59 AM.

  8. #183
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
No, I didn't. You know AMD was very strong among those who built their own computers a couple of years back. They had a big market share among home users, and maybe that processor was the big money maker for hardware sellers.
The impact the internet had then wasn't as big as it is now. When Anandtech presented numbers for Nehalem, the response was enormous. More and more companies use the internet as their main marketing arena; I don't think Intel's marketing department skips this area.
It isn't hard to draw these conclusions if you think about it. Quad-core users are very few compared to other users. This market is also not comparable with other markets; it is hard to find a market with only two contenders worldwide. If Intel and AMD hadn't been US companies, I can assure you action would have been taken, because we all need competition.

It is also interesting that reviews in English differ somewhat from reviews in other languages. German sites seem to be much more positive toward AMD than US sites are.
Actually, AMD still has a huge market share in retail; they have historically averaged from 40% to as high as 80%.

I don't speak German, but does it surprise you that German sites would be more amenable to AMD products? I mean, if I were a German site reviewing AMD products I too would feel compelled to give positive impressions, if for anything, to help promote a friendly partner... but data is data is data. So long as it is reproducible, the truth is the truth; as long as they give all the info needed to reproduce the results, others should get similar results.

EDIT: Internet sites also have a degree of accountability. If they consistently fudge or over-inflate the numbers or skew the information, people soon catch on; readership drops off, hits go down, and advertising dollars/euros start to diminish.

The best way to judge for yourself is to do what I did: simply build two systems side by side... it is quite a nice way to learn more about computing. I bought two Phenoms myself, for many reasons; one reason is that the architecture is fascinating, and different. Comparing, contrasting, and experimenting is one way to dig deeper into the rigs.

Now I agree -- the internet is a commercial space, and it would be foolish for companies not to use it for such purposes. But I seriously doubt Intel planted the Nehalem samples with Anand... it is not good to put a work in progress out there. Intel stands to hurt itself more than anything by letting Nehalem out into the wild, through an Osborne effect.

Though I cannot prove they didn't, nor can you prove that they did. This is a very Scientia-like position to take.

Similarly, one could argue the leaked Deneb data was also planted by AMD... I don't think so, but I would make the same argument against that as well. I think what happened is that some tier-2 supplier/MB maker/OEM floated an engineering sample under the table in both cases.

Who knows... who cares... but without concrete proof I would hold off on accusing either AMD or Intel of using these people as puppets.

But again, go back before March 2006 and count the number of sites that used AMD CPUs to review video cards... why did they do that? AMD certainly did not have the money to pay them off, yet every last one of them used AMD CPUs to review video cards...
    Last edited by JumpingJack; 08-14-2008 at 04:03 PM.

  9. #184
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
JumpingJack: So you think there isn't any interest among readers in knowing how video cards perform with AMD processors?
One thing you can be certain of the next time nVidia or ATI releases a new card: there will be at least 10+ reviews using an Intel processor to test it. Good imagination among reviewers, don't you think?

  10. #185
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
JumpingJack: So you think there isn't any interest among readers in knowing how video cards perform with AMD processors?
One thing you can be certain of the next time nVidia or ATI releases a new card: there will be at least 10+ reviews using an Intel processor to test it. Good imagination among reviewers, don't you think?
No; there are plenty of people who would no doubt like to see it, me being one of them.

But the reviewers are reviewing video cards, not processors. If they used a Phenom, at best it would most likely show the 4870 X2 tied with the nVidia 280 and 260, since the games would all bunch up at the limit of the CPU.

They use an Intel CPU not to advertise Intel CPUs; they use the fastest available platform so that the cards can demonstrate their performance differences without any CPU bottlenecking. Even with Intel CPUs, some bottlenecks show up in the reviews.

Many sites have to use overclocked Intel quads to make sure the CPU does not rail every card to the same FPS. Even the nVidia 280 GTX can open up some bottlenecks.

I have two 4870 X2s on the way; I will provide you examples of this. At the same resolutions, settings, etc., the Phenom paired with the 4870 X2s will most likely produce lower scores in some (likely most, if not all) of the test runs compared to the Intel CPUs.

It isn't about imagination; it is about doing the right thing and reviewing the cards correctly. Just as before the C2D launched all the reviewers used AMD processors to review video cards -- not because they liked AMD or wanted to promote AMD, but because the Athlon 64 just b!tch-slapped the Pentium 4 up, down and sideways at gaming. Why would you use a P4 to run a high-end game with a high-end video card? It would be pointless... it is just that today the roles are reversed.
    Last edited by JumpingJack; 08-14-2008 at 05:10 PM.

  11. #186
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by JumpingJack View Post
But the reviewers are reviewing video cards, not processors. If they used a Phenom, at best it would most likely show the 4870 X2 tied with the nVidia 280 and 260, since the games would all bunch up at the limit of the CPU.
Do you think AMD isn't able to run an 8800GT? That was a fast card not that long ago.

If a game uses only one thread, then I agree with you: the CPU would bottleneck the system if the video card is very fast. But do you really think games that use two or more cores effectively are going to be the main bottleneck with advanced graphics?
And about this idea that the CPU or the GPU bottlenecks the system: when an application runs it does different types of work; sometimes it uses the CPU and sometimes it sends data to the GPU. It is very hard to balance workloads so that all the hardware is used and bottlenecks stay hidden. Threads need to be synchronized: if one thread runs very fast but has to wait for another thread, the speed doesn't matter; the slower thread will delay it (see the sketch below). With bigger threads (ones that do more work) it is harder to estimate how fast the work gets done; with one main thread it may be easier. I think Crysis and some other games are built like that: you see work on the other cores, but one core is going crazy, so the CPU (all four cores) isn't used that much. New games may use other techniques; they will need to in order to take advantage of all the cores.
How a game performs depends on its design. Maybe a game does all the calculation of how the "image" is displayed using four threads; when those threads are done, one thread may send the data to the video card, and while that thread is sending, the speed needs to be high. In between there may not be much traffic (only memory, and they are probably optimizing for cache hits). Games may use buffers etc. to speed things up, but when I look at how hard the CPU works it is very rare to see 100% on one core.
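To make that synchronization point concrete, here is a minimal Python sketch; the subsystem names and per-frame timings are invented for illustration, not taken from any real engine:

Code:
import threading
import time

def worker(name, seconds, results):
    # Stand-in for one subsystem update (physics, particles, AI, ...).
    time.sleep(seconds)
    results[name] = seconds

def frame(workloads):
    # Start every subsystem update in parallel, then wait for all of
    # them: the frame cannot finish before the slowest thread does.
    results = {}
    threads = [threading.Thread(target=worker, args=(name, secs, results))
               for name, secs in workloads.items()]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()   # the sync point: fast threads sit here waiting
    return time.perf_counter() - start

# Hypothetical per-frame costs in seconds; only the slow 9 ms thread matters.
elapsed = frame({"physics": 0.004, "particles": 0.002, "ai": 0.009})
print("frame time: %.1f ms" % (elapsed * 1000))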

EDIT: About Anand and Nehalem: OF COURSE Intel knew they had the processor; otherwise the journalist would have committed a crime, and some of the people responsible at Intel would probably have been kicked out for security being that bad.
    Last edited by gosh; 08-14-2008 at 05:41 PM.

  12. #187
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
Do you think AMD isn't able to run an 8800GT? That was a fast card not that long ago.

If a game uses only one thread, then I agree with you: the CPU would bottleneck the system if the video card is very fast. But do you really think games that use two or more cores effectively are going to be the main bottleneck with advanced graphics?
Wow, we are right back where we started, aren't we... A game using 4 threads effectively can still be bottlenecked by the CPU. Both the Phenom and the QX9650 show this in Lost Planet... Lost Planet is a multithreaded game and pushes all 4 cores. All the data I showed you earlier was multithreaded across all 4 cores for both the Phenom and the QX9650, and the cave scene demonstrates the bottleneck.

The Phenom will most likely bottleneck the 4870 X2 (I say most likely because there is no data yet; I will have it in a few days). Even a QX9770 is holding back the X2 in some scenarios, based on data around the web.

EDIT: Your favorite, GRID, is also CPU limited... badly. I dropped by the store today and grabbed a copy. This is one fascinating game.


Quote Originally Posted by gosh View Post
And about this idea that the CPU or the GPU bottlenecks the system: when an application runs it does different types of work; sometimes it uses the CPU and sometimes it sends data to the GPU. It is very hard to balance workloads so that all the hardware is used and bottlenecks stay hidden. Threads need to be synchronized: if one thread runs very fast but has to wait for another thread, the speed doesn't matter; the slower thread will delay it. With bigger threads (ones that do more work) it is harder to estimate how fast the work gets done; with one main thread it may be easier. I think Crysis and some other games are built like that: you see work on the other cores, but one core is going crazy, so the CPU (all four cores) isn't used that much. New games may use other techniques; they will need to in order to take advantage of all the cores.
How a game performs depends on its design. Maybe a game does all the calculation of how the "image" is displayed using four threads; when those threads are done, one thread may send the data to the video card, and while that thread is sending, the speed needs to be high. In between there may not be much traffic (only memory, and they are probably optimizing for cache hits). Games may use buffers etc. to speed things up, but when I look at how hard the CPU works it is very rare to see 100% on one core.

EDIT: About Anand and Nehalem: OF COURSE Intel knew they had the processor; otherwise the journalist would have committed a crime, and some of the people responsible at Intel would probably have been kicked out for security being that bad.
I am not sure why this is so hard to understand... the GPU is responsible for one set of the workloads; it only needs to know a small amount of information to complete its task. The CPU is responsible for completely different workloads, and it only needs to send its information to the GPU before the next frame is rendered.

If one waits on the other, then the hold-up is a bottleneck at that computational resource. It is easy to see this in the data... if you increase the resolution and the FPS does not change, the CPU is the culprit; conversely, if the FPS changes when you change the resolution, the GPU is the bottleneck.
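As a rule of thumb in code (a minimal sketch: the 5% tolerance is an arbitrary assumption, and the sample numbers come from the World in Conflict results later in this thread):

Code:
def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.05):
    # Raising the resolution only adds GPU work, so flat FPS points at
    # the CPU and falling FPS points at the GPU. The 5% tolerance is an
    # arbitrary cutoff for "did not change".
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "CPU-bound" if drop < tolerance else "GPU-bound"

# Average FPS from the WiC runs later in this thread:
print(likely_bottleneck(27, 28))   # Phenom 9850, 1024x768 vs 1680x1050 -> CPU-bound
print(likely_bottleneck(50, 46))   # QX9650, same settings -> GPU-bound (at least in part)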

You even pointed it out yourself when you linked the TechReport piece on Lost Planet.

Computationally, even with 4 threads, Intel runs gaming code faster... significantly faster. There are many good reasons for this. In fact, I am reviewing a paper for a study that will be published at RealWorldTech that goes through the reasons why Intel runs game code 20 to 50% faster than an equivalently clocked Phenom.
    Last edited by JumpingJack; 08-14-2008 at 07:28 PM.

  13. #188
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
I think you should read a bit about game programming (DirectX etc.) instead of reading papers about this and that. Then you will understand more.

  14. #189
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
Yeah, I will try to get to this tonight... if not, definitely tomorrow night.

  15. #190
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    California
    Posts
    1,461
I don't get what your argument is, gosh... Are you saying that bandwidth is the deciding factor in CPU performance? I doubt a 386 would be wicked fast (by today's standards) even if we got it running on an HT 3.0-equivalent bus. Intel must be pretty foolish to sell its CPUs at such a high price when they don't perform as well as the Phenom, eh?

If you can show me an example of a desktop PC being bottlenecked by the FSB, your argument might have some merit.
On the opposite side, there are plenty of examples of games being bottlenecked by the CPU... Try dropping in a 3000+ and a 4870X2 and running Crysis; you'll get the same FPS from 640x480 to 1920x1200. Your arguments about Intel's FSB problems look like a bunch of garbage because you have no concrete proof.

AMD must be so unlucky, because SOFTWARE IS PROGRAMMED TO FAVOR INTEL. Even with 100% load on all cores, I fail to see one example where the Phenom is faster.
    1.7%

  16. #191
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
I think you should read a bit about game programming (DirectX etc.) instead of reading papers about this and that. Then you will understand more.
I have read a lot of papers on this.... threading and the computational result are part of the CPU's duty.... and it doesn't change the fact that Intel produces a faster result on 4 threads overall.

However, why don't you cite your references (as I have done)... I am always curious to read up.

And if you are going to go into detail on thread locking and blocking, that is moot; I am well read on that, and it has nothing to do with whether the CPU or the GPU is the hold-up in the graphical output of 3D gaming.

    Jack
    Last edited by JumpingJack; 08-14-2008 at 07:29 PM.

  17. #192
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
Do you think AMD isn't able to run an 8800GT? That was a fast card not that long ago.
Ooops, just saw this.... and yes, when nVidia launched the G80 core, AMD CPUs bottlenecked that card in some cases, and in most cases with the GTX:

    http://www.firingsquad.com/hardware/...ling/page5.asp

As an example.... now, you seem to have a hard time connecting the observed FPS with what is limiting the FPS, be it the CPU or the GPU.

    Those of you with AMD CPUs who were planning on upgrading to GeForce 8800 will want to look over our performance results on the preceding pages carefully, especially if you planned on upgrading to the GeForce 8800 GTX. In many cases you’ll find that the GeForce 8800 GTX is so powerful that it is CPU-bound with AMD’s flagship FX-62 in games like Quake 4, and Source engine games like Dark Messiah and Half-Life 2 Lost Coast.
But again, Gosh, it is not hard to understand.

These are the basics of game processing on a platform today. The GPU has local memory with uber gobs of bandwidth to the GPU (because GPUs are such high-throughput beasts); all the data it needs to render the scene is located locally in video RAM (textures, vertices, etc.). The only information the GPU needs is the details of where objects are located in 3D space, particles, baddies, etc.

So a game runs, and the CPU is responsible for calculating the data needed for a frame as described above... the GPU then receives this data and processes the frame to render it; when the GPU finishes, it is ready to receive the next frame of data from the CPU, and so on...

The CPU calculates its frame information any way it wants -- single, double, quadruple threaded -- the GPU does not care; it simply needs the next frame of data from the CPU.

Two scenarios: the CPU finishes its calculation before the GPU finishes the prior frame, so the CPU waits. The other scenario is that the GPU finishes rendering the frame before the CPU is ready to send the next one, so the GPU waits.

    It is that simple.
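A toy model of those two scenarios (nothing more than max() dressed up; all numbers are hypothetical):

Code:
def fps(cpu_ms, gpu_ms):
    # Each frame the GPU needs the CPU's data before it can render, so
    # whichever side is slower sets the pace for both.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical timings: a CPU needing 20 ms per frame caps the game at
# 50 FPS even though the GPU could render its share in 8 ms (GPU waits).
print(fps(cpu_ms=20, gpu_ms=8))    # 50.0
# Raise the resolution and only the GPU cost grows; FPS does not move
# until the GPU becomes the slower side, and then it starts to fall.
print(fps(cpu_ms=20, gpu_ms=25))   # 40.0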

Before you go on about programming and game programming and stuff.... forget about the code. Just answer a simple question.

If I run a game at 1024x768 and measure the FPS, then run exactly the same game at 1600x1200 and measure the FPS again, should the FPS go up or down with resolution?

    jack
    Last edited by JumpingJack; 08-14-2008 at 07:57 PM.

  18. #193
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Wooop.... 4870 X2 is gonna be here tomorrow:
[Attached thumbnail: order.jpg -- order confirmation screenshot]

  19. #194
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Boschwanza View Post

    @Jack

Just wanted to remind you about clarifying the numbers given by Overclockers Club with your test systems:

    http://www.overclockersclub.com/revi...l_q9450/12.htm
    http://www.overclockersclub.com/revi...l_q9450/14.htm
    http://www.overclockersclub.com/revi...el_q9450/8.htm
OK.... so I did WiC tonight. This is not a good comparison; the following things differ:

OS: I am using XP, he is using Vista Ultimate (he does not specify 32- or 64-bit)
DX: I am using DX9, he is using DX10
Graphics card: I am using an 8800 GTX, he is using an 8800 GT -- drivers are most likely different too (I am using Forceware 169.21, an older version)
Monitor: my max is 1680x1050, so no 1920x1200 data (I can generate it, but I need to swap monitors)
Phenom clock speed: I left mine at 2.5 GHz (default for the 9850), he is using a B2 2.3 GHz Phenom; it doesn't matter much, since he is comparing all his processors in games that are GPU bound anyway

Also, if you are accustomed to my other screenshots: I set the taskbar to auto-hide so I could see the entire WiC screen in 1280x1024 mode.

I match his resolutions (except 1920x1200): very high graphics settings, 0x AA and 16x AF. After my first run at 2.5 GHz / DDR2-800 on the Phenom, I also ran DDR2-1067, so you will see two sets of screen dumps for the Phenom.


    QX9650 @ 2.67 GHz (333x8) DDR2-1067 8800 GTX in order of resolution



Phenom 9850 @ 2.5 GHz (200x12.5) DDR2-800 8800 GTX



Phenom 9850 @ 2.5 GHz (200x12.5) DDR2-1067 8800 GTX



QX9650 @ 2.67 GHz DDR2-1067
1024x768:  Max = 123, Ave = 50, Min = 21
1280x1024: Max = 104, Ave = 47, Min = 22
1680x1050: Max = 95, Ave = 46, Min = 21

Phenom @ 2.5 GHz DDR2-800
1024x768:  Max = 69, Ave = 27, Min = 10
1280x1024: Max = 69, Ave = 27, Min = 10 (this is odd -- exactly the same)
1680x1050: Max = 70, Ave = 28, Min = 10

Phenom @ 2.5 GHz DDR2-1067
1024x768:  Max = 67, Ave = 28, Min = 11
1280x1024: Max = 68, Ave = 27, Min = 9
1680x1050: Max = 72, Ave = 29, Min = 9


    Using DDR2-1067 vs DDR2-800 did not help the Phenom ... I will do a DDR2-800 run on the QX9650 tomorrow.

EDIT: If you are interested, here is WiC with all the GPU-restrictive assets removed.

This is a much less fair comparison.... all settings were set to low, and only the CPU-affecting items were enabled or set to high (physics quality, for example, is set to high). A full disclosure of settings is available upon request.

    QX9650 @ 2.5 GHz, DDR2-800 (in this case, all my baseline data is there)



    Phenom 9850 @ 2.5 GHz, DDR2-800



Second EDIT: He majorly fubar'ed his 9450 setup:

http://www.overclockersclub.com/revi...images/666.htm -- this probably would not affect his results, but I am surprised he got a stable system. He mixed PC2-6400 memory with PC2-8000 memory. Shockingly, he seems to have gotten it to run at 463 MHz, i.e. DDR2-926 speed. Wow. Obviously he relaxed the timings from factory spec to hit this.

    jack
    Last edited by JumpingJack; 08-14-2008 at 11:33 PM.

  20. #195
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Follow up on WIC...

Here is the Phenom 9850 @ 3.0 GHz DDR2-1067 with the Overclockers Club settings (i.e. very high settings, 0x AA, 16x AF); the same deltas apply as before (OS, graphics card, etc.)



Here is the QX9650 @ 3.0 GHz, all else the same as the Phenom:



Phenom @ 3.0 GHz DDR2-1067
1024x768:  Max = 81, Ave = 33, Min = 12
1280x1024: Max = 72, Ave = 32, Min = 13
1680x1050: Max = 75, Ave = 32, Min = 13

QX9650 @ 3.0 GHz DDR2-1067
1024x768:  Max = 112, Ave = 47, Min = 22
1280x1024: Max = 101, Ave = 46, Min = 21
1680x1050: Max = 92, Ave = 45, Min = 23

So even on an 8800 GTX, OCC's settings would never let you see any difference between CPUs... the Phenom shows about a 4 FPS improvement from the extra 500 MHz, so there is some CPU interaction going on.
    Last edited by JumpingJack; 08-14-2008 at 11:35 PM.

  21. #196
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
Wow... I didn't realize what a bottleneck a 2.5 GHz Phenom is for WiC... from 1024x768 to 1680x1050, totally CPU bound....

But it's a bit strange to see the Phenom get higher average FPS as the resolution goes up; sure, one FPS is within the margin of error, but nonetheless it's odd to see.

Jack, would you mind running this test several times, e.g. five times or so, and giving us the average of the runs? Only if it is not too much work. Thanks in advance
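(As an aside, averaging a handful of passes and looking at the spread is enough to judge whether a 1 FPS wiggle is noise. A minimal sketch; the run values below are invented, not measured:)

Code:
from statistics import mean, stdev

# Five hypothetical passes of the same benchmark at one resolution --
# invented numbers, purely to show the calculation.
runs = [27.8, 28.3, 27.5, 28.1, 27.9]
print("avg = %.1f FPS, stddev = %.2f FPS" % (mean(runs), stdev(runs)))
# A 1 FPS difference between configs is only meaningful if it sits
# well outside this run-to-run spread.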

  22. #197
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    178
The additional 1920x1200 settings would be nice; thanks a lot, I appreciate your work.

So far your scores are pretty close to what Overclockers Club found; they are just getting into GPU-bound territory earlier with the 8800 GT than you are.

  23. #198
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Boschwanza View Post
The additional 1920x1200 settings would be nice; thanks a lot, I appreciate your work.

So far your scores are pretty close to what Overclockers Club found; they are just getting into GPU-bound territory earlier with the 8800 GT than you are.
I think you can generalize this to a yes. The most curious one is the Company of Heroes run -- I do not have Opposing Fronts; if I can, I will stop by the store and pick up a copy.

I will repeat what I do on the 4870 X2; the UPS tracker is showing an afternoon delivery.
    Last edited by JumpingJack; 08-15-2008 at 06:22 AM.

  24. #199
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Hornet331 View Post
Wow... I didn't realize what a bottleneck a 2.5 GHz Phenom is for WiC... from 1024x768 to 1680x1050, totally CPU bound....

But it's a bit strange to see the Phenom get higher average FPS as the resolution goes up; sure, one FPS is within the margin of error, but nonetheless it's odd to see.

Jack, would you mind running this test several times, e.g. five times or so, and giving us the average of the runs? Only if it is not too much work. Thanks in advance
I have run the low-res stuff multiple times already; it takes a lot of work to generate screenshots, so for a first pass I will simply quote the numbers. I will do it later tonight.

    Jack

  25. #200
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    About World in Conflict

    http://www.yougamers.com/articles/45...terview-page6/

    YouGamers: Let's talk about high end. Do you have a threaded engine for multicore processor support?

    Westberg: Yes, we do. On the CPU side, Intel has been very supportive helping us out, because it's a big step moving to a threaded architecture. We've been working with them, and we scale quite well, but if you have a quad-core you won't run the game four times as fast, because it's really hard to reach that. Also, if you have a quad-core, each of those four cores is pretty fast, and we still have to scale down to this 2 GHz machine that's our low-end spec for everyone to be able to run the game.

So we have this 2GHz processor here, and then we have four CPUs that are twice as fast, so we have eight times more [processing power] over here. It's hard to scale all the way [across that]. It does scale, so a dual core runs faster than a single core, and a quad-core runs even faster.

    What we do thread is the entire physics update on a separate thread, we thread our shadow volume updates, [...] particle updates and tree updates. And then the obvious things like everybody is probably doing, like sound, voice over IP and things like that. But the four I mentioned are the [threaded processes we render] frame to frame.
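Reading between the lines, the per-frame dispatch Westberg describes looks roughly like the sketch below; the function names are hypothetical stand-ins, not the engine's actual code:

Code:
from concurrent.futures import ThreadPoolExecutor

# The four subsystems Westberg names as threaded frame to frame. The
# update bodies are hypothetical placeholders, not Massive's code.
def update_physics(dt): pass
def update_shadow_volumes(dt): pass
def update_particles(dt): pass
def update_trees(dt): pass

SUBSYSTEMS = (update_physics, update_shadow_volumes,
              update_particles, update_trees)

def tick(pool, dt):
    # Fan the per-frame updates out to worker threads, then wait for
    # all of them before the frame can be handed off to the renderer.
    for future in [pool.submit(fn, dt) for fn in SUBSYSTEMS]:
        future.result()

with ThreadPoolExecutor(max_workers=4) as pool:
    tick(pool, 1.0 / 60.0)   # one simulated 60 Hz frame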

