The dev kit is next to the PC. It's connected to the PC for debugging...
Intel i7 3770K @ 4.5GHz
Asus P8Z77-V
8GB Crucial 1866MHz CL9
AMD Sapphire Radeon HD 6970
Crucial RealSSD M4 128GB
2x WD Raptor X
Enermax Galaxy 1000W DXX
NEC LCD2690WUXi
Yamaha RX-V667 Receiver
Monitor Audio Vector 5.1
AMD high end: an 18-month-old 7970, plus a dual-GPU version of that same 18-month-old 7970, with buggy drivers that deliver half the frames too briefly to be observed.
NVIDIA high end: GTX690, GTX770, GTX780, GTX Titan
AMD discrete market share: ~35%
NVIDIA discrete market share: ~65%
AMD: lost $1.2B in 2012
NVIDIA: net income doubled to $581M in 2012 after record quarters
AMD + ATi market cap: $2.81B
NVIDIA market cap: $8.85B (and IIRC NVIDIA could purchase AMD with the cash in its bank)
GTX670: Outsold HD7970 and 7950 combined.
AMD professional graphics marketshare: None NVIDIA professional market share: All of it
I think I got it right.
Intel 990x/Corsair H80 /Asus Rampage III
Coolermaster HAF932 case
Patriot 3 X 2GB
EVGA GTX Titan SC
Dell 3008
Intel i7 3770K @ 4.5GHz
Asus P8Z77-V
8GB Crucial 1866MHz CL9
AMD Sapphire Radeon HD 6970
Crucial RealSSD M4 128GB
2x WD Raptor X
Enermax Galaxy 1000W DXX
NEC LCD2690WUXi
Yamaha RX-V667 Receiver
Monitor Audio Vector 5.1
This is really bizarre, actually. You guys do make a good point: if the game devs are supposedly optimizing for AMD hardware, why in the hell are these games running on NV GPUs? I think it may actually have more to do with Microsoft's PC emulator than their actual hardware readiness or anything like that. Their emulator probably runs better (for now) on NVIDIA's drivers.
Just wait until someone hacks the Xbox One and dumps the VHDs, and we can all run them on our PCs in Hyper-V.
Desktop
[Asus Rampage III Gene] [i7 920 D0] [12GB OCZ3B2000C9LV6GK] [HIS HD 5970] [SeaSonic X750 Gold ] [Windows 7 (64bit)] [OCZ Vertex 30GB x3 Raid0] [Koolance CPU 360] [XSPC Razer 5970] [TFC 360 rad, D5 w/ Koolance RP-450X2]
HTPC
[Origen AE S10V] [MSI H57M-ED65] [ i5-661 w/ Scythe Big Shuriken] [Kingston HyperX LoVo 4GB ] [ SeaSonic X650 Gold ] [ OCZ Vertex 30GB SSD ] [ SAMSUNG Spinpoint 640GB 7200 RPM 2.5"][Panasonic UJ-225 Blu-ray Slot Burner] [ Ceton InfiniTV4]
So in other words, both these devices should be super easy to hack. The Xbox One will run Windows 7; the Xbox One OS will run on a PC, and therefore on PS4 hardware, and Windows 7 will run there at the same time.
The only thing a bit more difficult would be the Unix OS filesystem, but if the Sony dev kits are on Windows, it shouldn't be difficult at all to make a PS4 that runs the Xbox One OS and games, plus Windows 7 (off an external hard drive).
MM Duality eZ modded horizon (microres bracket). AMD 8120 @ 4545MHz (303x15, HTT 2727), 1.512V load / 2121MHz 1.08V idle (48-hour Prime95 8K-32768, 28GB RAM). 32GB GeIL Corsa RAM @ 1212MHz 8-8-8. 4870x2 800/900 load, 200/200 idle. Intel NIC. Sabertooth 990FX. 4x 64GB Crucial M4 RAID 0. 128GB Samsung 840 Pro. 128GB OCZ Vertex 450. 6x 250GB Seagate 7200.10 RAID 0 (7+ years and still running strong), eSATA RAID across two 4-bay Sans Digital enclosures. CoolIT Boreas water chiller. CoolerMaster V1000. Fans: 3x 140mm rear, 1x 120x38mm rear, 2x 120x38mm front, 6x 120mm front, 2x 120mm side, 2x 120x38mm over RAM/PWM/VRM, SilverStone fan filters. Games: Steam, Desura, Origin. 2x 2TB WD Passport USB 3.0 ($39 hot-deal score). 55" Samsung 1080p TV @ 3 feet ($30/month equal payments, no interest, post-Xmas 2013 deal).
I think running dev kits on the show floor makes more sense than running actual unreleased hardware, in case someone brave wants to nick one. With a dev kit they would only be stealing PC hardware and some probably encrypted and locked piece of software. With a real XBONE they would have a lot of interesting silicon on hand.
Besides, MS had real hardware on stage running at least some of the games they showed during the presentation. Confirmed by one of the devs working on Ryse...
RiG1: Ryzen 7 1700 @ 4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SSD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W
RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingston HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU
SmartPhone Samsung Galaxy S7 EDGE
XBONE paired with 55'' Samsung LED 3D TV
Did anyone else see the PS4 crash on multiple occasions at E3? Knack and Assassin's Creed were the ones I saw go down in flames. I'm still confused about the screen it crashed to, because it looks to me like what crashed was a game being streamed, which would explain why Knack was lagging pretty badly at times even though it was only running at 30fps.
Now this is just speculation, but since Microsoft had to demo XB1 games on PC hardware, and there is a chance the PS4 was actually streaming its games... the logical explanation would be that AMD doesn't have the hardware ready for either Microsoft or Sony. Uh oh...
"If the representatives of the people betray their constituents, there is then no resource left but in the exertion of that original right of self-defense which is paramount to all positive forms of government"
-- Alexander Hamilton
2x Dual E5 2670, 32 GB, Transcend SSD 256 GB, 2xSeagate Constellation ES 2TB, 1KW PSU
HP Envy 17" - i7 2630 QM, HD6850, 8 GB.
i7 3770, GF 650, 8 GB, Transcend SSD 256 GB, 6x3 TB. 850W PSU
OK
GTX690> Launched a year before 7990, still a far better card to have. Market leader.
GTX Titan> Launched 4 months ago, no competition. Market leader.
GTX 780> Launched a month ago, no competition. Market leader.
So in the top three slots, NVIDIA has no competition at all. While I'm sure somewhere a fanboy has smoked enough to ignore the FCAT reviews, the games that just don't scale at all, and the $1000 price for two chips you can buy for under $400 apiece, I've never seen a forum post with 7990 owners talking about their cards (whereas the forums were full of 690 threads).
The only place AMD is in some "photo finish struggle" with NVIDIA is in Rory Read's Lunesta-fueled dreams. In the real world NVIDIA has been making the rules all year and pretty much owned last year as well.
The fact that their mid-range refresh turned out to trump AMD's high-end offering pretty much says it all. If they had opted to sell Titans in the consumer market last year instead of selling them for $5k each in the professional market, things would look even worse than they already do.
When the best you can say is "Yay, our flagship GPU is in a dead heat for 3rd place in the GPU market!", and you haven't launched a part in over 18 months, things aren't "too close to call" on who the "leader" is.
You could say they're in a close performance race with Intel too, if you don't look at the chips Intel has made since 2011.
Last edited by Rollo; 06-16-2013 at 04:38 AM.
Intel 990x/Corsair H80 /Asus Rampage III
Coolermaster HAF932 case
Patriot 3 X 2GB
EVGA GTX Titan SC
Dell 3008
*UPDATED* Michael Wilford, Studio Director of Twisted Pixel Games, said the hardware running his game at the show was the decision of his company, not Microsoft. Wilford also told Jonathan Blow to blow it out his a** for criticizing the hardware his team is using to demo the game. So, what we have here is simply a matter of a developer that chose to run his game on hardware / software of his choice.
Not entirely convinced yet. Not even sure what it means TBH!
Intel Core i7-3770K
ASUS P8Z77-I DELUXE
EVGA GTX 970 SC
Corsair 16GB (2x8GB) Vengeance LP 1600
Corsair H80
120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
Corsair RM650
Cooler Master Elite 120 Advanced
OC: 5GHz | +0.185 offset: 1.352V
Well no, Mr. Wilford, there is a story. You were at E3 basically representing Microsoft, and the fact that you decided not to use the dev kit or "comparable" hardware to showcase a system for Microsoft says something. All you would have to say is something to the effect of: "The game's code is extremely unoptimized and we needed that much power to brute-force it at this point." Without that kind of explanation, one could hypothesize that the game was being run on far superior hardware with the graphics settings "cranked" up, and that the final product will not look nearly as good as what was showcased at E3, which misrepresents both Microsoft and your own Xbox One product.
Yeah, because most of the sales are in the high-end segment of the market... *rolls eyes* Yeah, I saw tons of posts from 690 owners on various forums, mostly complaining that they couldn't get them to scale worth a damn, or at all, in various games. Funny enough, 7990 owners were rare, but those that did post were by and large not having those problems.
When the best you can say is "Yay, our flagship GPU is in a dead heat for 3rd place in the GPU market!", and you haven't launched a part in over 18 months, things aren't "too close to call" on who the "leader" is.
You should check where most of NVIDIA's numbers come from, or where most video card sales in general come from... I think you will find it's the low-to-mid-range segment; at least that was my experience for all the years I worked retail and computer repair/upgrade/build jobs. Overall, very few people bought high-end or even upper-mid-range cards, and those that did tended to be one of two types: loyal to a brand (like 3dfx... all the poor fools who were so upset when 3dfx committed seppuku...), or the kind who just want whatever is fastest for their buck.
Really, these days GPUs aren't the big problem with game performance; it's the fact that games are by and large really poor console ports rather than being optimized for modern hardware.
Yes, we get it: AMD sucks and doesn't make anything worth anybody owning. We can see you have a Titan and a 990X, and we are happy for you; if that makes you happy, great, more power to you. "You could say they're in a close performance race with intel too- if you don't look at the chips intel made since 2011."
I happen to own one of these systems that's worthless in your eyes, and funny enough, my 8350 @ 4.6 and dual 7870s have yet to let me down in any modern game. In fact, in Crysis 3 my system was noticeably smoother than my friend's 920 @ 4.1 with his dual 660 Tis. His benches better in most older titles, though (mine is faster in BF3, another game with an engine that likes more cores).
I understand your dislike of products you don't use and don't want to use, and I understand your need to belittle AMD and anybody who uses their products in your quest to justify the price you paid for your Titan. It's OK, you don't need to keep waving your e-peen around to show us all how big and shiny it is... we get it...
On a side note: a lot of the tasks I do run quite noticeably faster even on my other system, an 8120 @ 4.5, than they do on my friend's 920 system; stuff like encoding large batches of audio files with dBpoweramp (which doesn't like virtual cores/hyperthreading at all).
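The dBpoweramp observation generalizes: a batch encode is one independent encoder process per file, so the only real knob is how many you run at once, and SMT "cores" often add little for this kind of work. A minimal sketch of such a runner (the file names, the `flac` command line, and the halve-the-logical-cores guess are all illustrative assumptions, not dBpoweramp's actual behavior):

```python
# Sketch of a batch audio-encode runner (hypothetical paths and encoder).
# The encoder runs as an external process per file, so a thread pool is
# enough to keep N encodes in flight; cap N near the physical core count,
# since workloads like this often gain nothing from SMT/Hyper-Threading.
import os
from concurrent.futures import ThreadPoolExecutor

def physical_core_guess():
    # os.cpu_count() reports logical cores; halving it is only a rough
    # assumption for an SMT machine, not an exact hardware query.
    return max(1, (os.cpu_count() or 2) // 2)

def encode_command(path):
    """Build the (hypothetical) encoder command line for one file."""
    return ["flac", "--best", "-o", path.rsplit(".", 1)[0] + ".flac", path]

def encode_batch(paths, workers=None):
    # In a real run each worker would hand its command to subprocess.run;
    # returning the commands keeps the sketch self-contained.
    with ThreadPoolExecutor(max_workers=workers or physical_core_guess()) as pool:
        return list(pool.map(encode_command, paths))
```

Capping the pool at the physical core count is the whole point: adding more in-flight encodes than real cores just makes them fight for the same execution units.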
I still build systems, and I have built a good number of Sandy and Ivy based Intel rigs with both NVIDIA and AMD video cards. Though they are fun to play with, I just don't see the value in buying one for myself when I have two current-gen AMD rigs that I can simply drop new CPUs into when they launch, without replacing my board as well. With Intel, I already have a few people I built Ivy Bridge rigs for complaining that they have read they will need a new board to get a new chip... even though I warned them about that before they built... (Funny enough, none of the socket 2011 users I built for are complaining or looking to upgrade, probably because they are happy with the OC I set them up with...)
....
As for this story, as I said on another site, this is just stupid. I mean, they should have used AMD 8300-series CPUs with 7850 or 7870 video cards, backed by fast SSDs; you could emulate the XBONE experience using similar hardware even if you ran it at far higher clocks.
Also note: there's no real need to quad-channel the RAM; just toss 32GB in, set FancyCache up so it can cache the files being used to RAM, and away you go (FancyCache is great!!!).
Sure, a server platform could have been even better, but it would also have cost a lot more, and sometimes you run into issues with video cards in them. I have been doing this a long time, and I can't tell you how many high-end workstations I have built with server/workstation-grade hardware where I ran into issues you wouldn't see in a server, because a server would at most use the video built into the board, or more commonly the console or serial management ports, rather than even having a keyboard/mouse/monitor hooked up. (I once spent a month back and forth with a server board vendor getting a BIOS bug fixed that kept video cards in the board's slot from outputting video... that was "fun". To their credit, they did get it fixed, and they also unlocked some missing features we needed at no extra charge.)
Either way, I really can't wait for these new consoles to launch. I think it will be good for everybody, be it Intel/NVIDIA or AMD/AMD: DX11 will get used (finally), and multi-threading will get used (because devs have no other choice now!!!). That means perhaps companies like ArenaNet will fix their games to use more than 3 cores efficiently, thus making them run a lot better.
Can't tell you how many games I expect to see get serious patching and updates as they are ported to consoles this time around!!!
And yet the drivers since 306.xx have gotten consistently worse, to the point that the current 320.18 is almost complete garbage. The higher your frame rate, the more you're affected by stutter: 100% reproducible, and often at exactly the same spots in games.
I've tested this on:
Rampage II Extreme, 965X CPU, 6GB & 9GB RAM, 1 SSD with clean Windows 7 64bit install
Rampage IV Extreme, 3960X CPU, 16GB & 32GB RAM, 1 SSD with clean Windows 64bit install
GPUs: GTX 280, 580, 680, Titan
2-way SLI GTX 580, 680, Titan
3-way SLI GTX 680, Titan
PSUs: Seasonic X-1250W, Enermax Revolution 1250W, Platimax 1500W
Resolutions: 1920 x 1080 @60Hz, 120Hz, 130Hz and 2560 x 1600 @60Hz
To rule out hidden potential problems, everything was run without overclocks.
The bad part is that with the newer cards I can't go back further than driver version 314.09. If I could go back to 306.xx it wouldn't bother me so much. I've been using their products since the 8800 GTX, and I've never seen it this bad.
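For what it's worth, this kind of stutter hides behind average FPS and only shows up in per-frame times, which is exactly what FCAT-style analysis looks at. A minimal sketch of that idea, assuming nothing more than a list of frame times in milliseconds (the 2x-median spike threshold is my own illustrative choice, not any tool's actual metric):

```python
# Stutter check over a frame-time log (milliseconds per frame).
# Average FPS smooths spikes away; individual frame times reveal them.

def stutter_report(frame_times_ms):
    """Return (avg_fps, spike_count), counting frames over 2x the median."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    median = sorted(frame_times_ms)[n // 2]
    spikes = sum(1 for t in frame_times_ms if t > 2 * median)
    return avg_fps, spikes

# 100 frames: nine smooth 14 ms frames, then one 50 ms hitch, repeated.
log = ([14.0] * 9 + [50.0]) * 10
avg, spikes = stutter_report(log)  # avg ≈ 56.8 FPS, spikes == 10
```

A log that averages a healthy ~57 FPS can still contain ten 50 ms hitches, which is why the stutter is felt at the same spots every run while the FPS counter looks fine.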
And the AMD Defense League arrives to clear up any confusion caused by facts and those pesky professional reviews, to let us know it's all a hoax and AMD is producing "darn good parts" that are "good enough, if not better".
It would take too much time to address all the fallacies above, so I'll just say: "Anyone who believes AzureSky should buy some AMD parts, as they need the cash and have lots of old, outmatched parts in the retail channel."
Intel 990x/Corsair H80 /Asus Rampage III
Coolermaster HAF932 case
Patriot 3 X 2GB
EVGA GTX Titan SC
Dell 3008
Yeah, I have noticed a lot of kernel crashes on 320.18 with Quad SLI 295s.
I have just downgraded to 314.22 to see if there's a change.
-PB
-Project Sakura-
Intel i7 860 @ 4.0GHz, Asus Maximus III Formula, 8GB G-Skill Ripjaws X F3 (@ 1600MHz), 2x GTX 295 Quad SLI
2x 120GB OCZ Vertex 2 RAID 0, OCZ ZX 1000W, NZXT Phantom (Pink), Dell SX2210T Touch Screen, Windows 8.1 Pro
Koolance RP-401X2 1.1 (w/ Swiftech MCP35X), XSPC EX420, XSPC X-Flow 240, DT Sniper, EK-FC 295s (w/ RAM Blocks), Enzotech M3F Mosfet+NB/SB
Last edited by Chickenfeed; 06-17-2013 at 05:04 AM.
Feedanator 7.0
CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i
So, you're saying I can't game, that I must get microstuttering and poor FPS... funny, I know microstutter and can clear it up when it happens using RadeonPro...
I couldn't care less about FCAT, because games run fast and smooth for me 99% of the time. If all you care about is benching, then yup, avoid anything not NVIDIA/Intel, because AMD totally sucks for benching.
I know that according to probably 90+% of the people here on XS my system's a piece of trash; that's why I don't post often here. I don't want another TPU experience where, if you're not running what the majority runs, you're given nothing but grief, and your opinions and experiences are blown off as invalid.
I'm really sorry that I don't have horrible microstutter and FPS. I'm really sorry that I don't see a huge perf gain moving to Intel/NVIDIA. I know that's a horrible thing to say, since clearly the benchmarks show I am wrong and that my system is a steaming pile of trash nobody could possibly have any use for.
I'm happy with it. For me it's fast, stable, and lets me do what I want to do on a daily basis, be that gaming, encoding, or just watching videos and surfing (or sometimes all of them at once...).
In a way I do wish AMD would just drop out of the retail x86 market, since that would remove their trash products from the market and leave only the best products for everybody to use. Then there could be less bickering about brand, since there would be only one choice.
paulbagz: lies! All NVIDIA drivers and hardware are flawless, always have been, and always will be; any problems with their products are always the fault of game developers, or MS, or... (anybody but NVIDIA).
Well, none of those games are ready now; all the games shown (PS4 and Xbox) are still in development... The devkit versions were surely the most advanced and the easiest to debug in real time. (maybe the
As for the hardware they were running on: the guys (MS or the game studios) surely have a contract with HP to rent PCs, laptops, etc. for the show. They just added whatever HP could offer them for the game demos...
I like how people overreact about things like this. I would be a bit more worried if games were so badly optimized that you needed two Titans and a 3960X to make them run at 30fps, and they were running the PC versions instead of the Xbox devkits.
Last edited by Lanek; 06-18-2013 at 06:27 AM.
CPU: - I7 4930K (EK Supremacy )
GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
Motherboard: Asus x79 Deluxe
RAM: G.Skill Ares C9 2133MHz 16GB
Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0