Will become a standard for DisplayPort 1.2a
Hope this will be the end of G-Sync
Read more: http://wccftech.com/amd-freesync-adp...#ixzz2yFTpXWAA
Last edited by Heinz68; 04-07-2014 at 08:33 PM.
Core i7-4930K LGA 2011 Six-Core - Cooler Master Seidon 120XL - Push-Pull Liquid Water
ASUS Rampage IV Black Edition LGA2011 - G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 1866
Sapphire R9 290X 4GB TRI-X OC in CrossFire - ATI TV Wonder 650 PCIe
Intel X25-M 160GB G2 SSD - WD Black 2TB 7200 RPM 64MB Cache SATA 6
Corsair HX1000W PSU - Pioneer Blu-ray Burner 6X BD-R
Westinghouse LVM-37w3, 37-inch 1080p - Windows 7 64-bit Pro
Sennheiser RS 180 - Cooler Master Cosmos S Case
Big blow to G-Sync
Fold for XS!
You know you want to
It's nice that it's becoming a free standard, but points to Nvidia for getting the ball rolling on something that needed doing for a long time, even if we don't like the way they went about it.
TJ08-EW 6700k@4.7 1.375v - Z170-GENE - 2x8g 3866 16-16-16 - 1070@ 2088\9500MHz -Samsung 830 64G, Sandisk Ultra II 960G, WD Green 3tb - Seasonic XP1050 - Dell U2713 - Pioneer Todoroki 5.1 Apogee Drive II - EK VGA-HF Supreme - Phobia 200mm Rad - Silverstone AP181 Project Darkling
3770k vs 6700k RAM Scaling, HT vs RAM, Arma III CPU vs RAM, Thief CPU vs RAM
Nice investigation work by Hardware.fr again.
I hope G-Sync will disappear, just like Mantle.
Proprietary stuff is no good.
Nice!
Too bad most display manufacturers are unlikely to provide updated firmware, and will force consumers to buy new displays to be able to use FreeSync...
Nicely played AMD.
I bet someone at AMD is very smug with that name as well
I really do hope Mantle and G-Sync die off as gimmicks.
Bring on the motorbikes!
I don't get why people would care about Mantle or G-Sync. You get a perk for the side you choose.
Not a big deal to me, since eventually both sides WILL get the same thing, just under a different name.
That being said, I don't really care about either of those, as they really don't make much of a difference to me. All I care about is playable FPS, decent looks, and whether it will crunch.
Price will be different... that matters a lot.
And compatibility.
That "eventually" can take a long time depending on how much has already been invested in existing tech.
If you want some technology, and want it soon, you need the whole market to embrace it. Which involves dropping proprietary tech, and companies view that as lost investment. So the earlier it happens, the better.
Main Rig:
Processor & Motherboard:AMD Ryzen5 1400 ' Gigabyte B450M-DS3H
Random Access Memory Module:Adata XPG DDR4 3000 MHz 2x8GB
Graphic Card:XFX RX 580 4GB
Power Supply Unit:FSP AURUM 92+ Series PT-650M
Storage Unit:Crucial MX 500 240GB SATA III SSD
Processor Heatsink Fan:AMD Wraith Spire RGB
Chassis:Thermaltake Level 10GTS Black
Please explain how it is the same, and don't forget to indulge us on the performance change brought to minimum framerates and frame times as well, since they play a part in Mantle, not just who has the greatest FPS.
All along the watchtower the watchmen watch the eternal return.
This bums me out, I wanted G-Sync.
Reminds me of writing code for ATI's VESA 24-bit frame buffer..., which sucked.
Nvidia gets what they deserve: had they pushed a free, open solution they wouldn't get the finger now.
They will never learn.
I was at PAX East 2014 this weekend, and the Nvidia PR guy at the Future PC Gaming Panel was pushing G-Sync hard. I do not think they will just give up and let this go. The panel had Chris Roberts, Matt Higsby, and some other high-profile developers. They are all pro G-Sync/FreeSync because it lets them move away from the magical '60 fps standard': 24/30 fps with higher detail and G-Sync/FreeSync seems 'just as good'.
I am paraphrasing, but a guy in the audience asked, 'Isn't G-Sync dead now that VESA has adopted FreeSync?' The Nvidia PR guy of course deflected the question, saying that 'any rumors and comments on standards not even released yet are a waste of time'. He went on to talk about the 'here and now' and how Nvidia is doing the heavy lifting for the gaming industry and will be getting monitors out soon. BenQ, Asus, and a few other vendors will be releasing G-Sync-enabled monitors this quarter or next, going as high as 4K.
'Gaming' AMD FX-6300 @ 4.5GHz | Asus M5A97 | 16GB DDR3 2133MHz | GTX760 2GB + Antec Kuhler620 mod | Crucial m4 64GB + WD Blue 2x1TB Str
'HTPC' AMD A8-3820 @ 3.5GHz | Biostar TA75A+ | 4GB DDR3 | Momentus XT 500GB | Radeon 7950 3GB
'Twitch' AMD 720BE @ 3.5GHz | Gigabyte GA-78LMT-S2P | 4GB DDR3 | Avermedia Game Broadcaster
Desktop Audio: Optical Out > Matrix mini DAC > Virtue Audio ONE.2 > Tannoy Reveal Monitors + Energy Encore 8 Sub
HTPC: Optoma HD131XE Projector + Yamaha RX-V463 + 3.2 Speaker Setup
FreeSync doesn't kill G-Sync, it just makes the ridiculous price even less attractive.
That's right, it won't kill it! I personally can't wait for that Asus G-Sync 1440p panel...
My toys...
Asus X79 Deluxe | i7 4820K | Koolance CPU-380I w/Triple Rad/Swiftech Pump | RipjawsX 16GB 1866MHz | eVGA GTX 780 TRI-SLI | X-Fi Surround 5.1 Pro USB | Intel 530 120GB *2 RAID 0, Intel 510 250GB, Samsung 840 Pro 120GB, Samsung 840 500GB, Kingston V300 240GB | Corsair AX1200i | In Win D-Frame Orange | Win 8.1 Pro 64
Asus Sabertooth Z77 | i7 3770K | NH-C12P SE14 | Vengeance 32GB LP | eVGA GT 240 | X-Fi Titanium Fatality | LSI SAS 9211-4i | Intel 330 120GB, Seagate 500GB *2, Samsung 200GB, WD 320GB *4 RAID 10, 500GB, Raptor 74GB | Antec TPQ-1200W | Corsair 650D | Win 8.1 Pro 64
Asus Sabertooth P67 | i7 2600K | NH-U12P SE2 | Vengeance Pro 16GB 1866MHz | eVGA GTX 680 | Sound via HDMI | Intel 330 60GB, Samsung 840 Pro 120GB, WD VRaptor 300GB, 150GB *2 | Antec HCG-750W | Lian Li PC-60FNWB | Win 8.1 Pro 64
Asus P8H77-M/CSM | i3 3220 | Shuriken | Vengeance 16GB LP | eVGA GT 610 | Sound Blaster Play | Hauppauge WinTV-HVR-1600 & HD PVR | Asus PCE-AC66 | Kingston V100 128GB, WD 1GB, 500GB, Seagate 2TB | Enermax Liberty 500W | Fractal Design Core 1000 | Win 8 Pro 64 w/Media Center
Asus P8H77-M/CSM | i3 3220T | Hyper 212 Evo | Vengeance 8GB | eVGA 210 | Hauppauge WinTV-PVR-250 | Intel 330 60GB, WD 750GB, 250GB | Enermax Liberty 500W | Antec 300 | Win 7 Premium 32
Axial SCX10 2012 Jeep Wrangler Unlimited Rubicon Modified
-Project Sakura-
Intel i7 860 @ 4.0Ghz, Asus Maximus III Formula, 8GB G-Skill Ripjaws X F3 (@ 1600Mhz), 2x GTX 295 Quad SLI
2x 120GB OCZ Vertex 2 RAID 0, OCZ ZX 1000W, NZXT Phantom (Pink), Dell SX2210T Touch Screen, Windows 8.1 Pro
Koolance RP-401X2 1.1 (w/ Swiftech MCP35X), XSPC EX420, XSPC X-Flow 240, DT Sniper, EK-FC 295s (w/ RAM Blocks), Enzotech M3F Mosfet+NB/SB
That's already too much ^^
I don't understand why people want to buy something that will prevent them from buying another brand of graphics card after they've bought the expensive monitor.
They put an expensive FPGA in it, with a lot of power consumption, for tech that will be outdated with FreeSync. FreeSync is a VESA-adopted tech, so even Nvidia will support it. It's going to be universal. And there is no difference from FreeSync.
He should do just like us and wait for FreeSync. It's soon going to be mass-produced in all monitors => so a lot cheaper.
Why buy proprietary tech when there is free tech that does exactly the same thing?
It isn't the same thing: what AMD proposes uses the exact same monitors of today and your CPU to achieve a better sync than today's vsync.
What NV proposes uses hardware in "new" monitors and does a better job of it, with less overhead too.
What do you do when you need the absolute fastest performance you can get? You disable vsync, and the same goes for FreeSync.
And besides, maybe said person is a professional graphics person and doesn't want anything to do with AMD's GPUs/APUs. Ever think about that? ^^
Of course not, you're probably thinking FireGL, lmao...
It's not just professionals, the military doesn't use AMD either...
Some people will never have a real use for AMD's graphics cards. Will you AMD fanbois ever realize this?
Last edited by NEOAethyr; 04-16-2014 at 05:30 PM.
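The vsync trade-off argued over in this thread can be sketched with some simple timing arithmetic. This is a hypothetical illustration of fixed-interval vsync versus an adaptive refresh, assuming a 60 Hz panel with a 30-60 Hz adaptive range; the function names and numbers are my own assumptions, not either vendor's implementation:

```python
import math

REFRESH_MS = 1000 / 60  # fixed refresh interval of a 60 Hz panel (~16.7 ms)

def present_fixed_vsync(render_ms: float) -> float:
    """With classic vsync, a finished frame waits for the next fixed
    refresh boundary, so a slightly late frame slips a whole interval."""
    intervals = max(1, math.ceil(render_ms / REFRESH_MS))
    return intervals * REFRESH_MS  # time (ms) the frame actually appears

def present_adaptive(render_ms: float,
                     min_interval: float = REFRESH_MS,
                     max_interval: float = 1000 / 30) -> float:
    """With an adaptive refresh, the panel scans out as soon as the frame
    is ready, clamped to the panel's supported refresh range (here 30-60 Hz)."""
    return min(max(render_ms, min_interval), max_interval)

# A 20 ms frame just misses the ~16.7 ms deadline:
# fixed vsync holds it until ~33.3 ms, adaptive sync shows it at 20 ms.
late_fixed = present_fixed_vsync(20.0)
late_adaptive = present_adaptive(20.0)
```

Disabling vsync entirely, as the post above suggests for maximum performance, removes the wait at the cost of tearing; the point of both G-Sync and Adaptive-Sync is to stay tear-free while avoiding that whole-interval penalty on late frames.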