Hi, I haven't written anything here yet even though this account is quite old. I was just wondering: how would you describe the difference between an AMD Phenom II X6 1090T @ 3.85 GHz and an Intel Core i7 5960X @ 4 GHz, and in the other case between an AMD Phenom II X6 1090T @ 3.7 GHz and an Intel Core i7 8700K @ 4.8 GHz?

The first case was my PC upgrade and the other my brother's. In my case the biggest change was in games: in Napoleon: Total War, 40-50 FPS -> avg. 100 FPS. And in the latest Tomb Raider, the lag spikes disappeared, even though the FPS had been acceptable almost all of the time on the Phenom. Maybe memory channel bandwidth and capacity were right at the waterline: usually above the surface, but sometimes dropping below it (lag spikes)... The newest Deus Ex and Mirror's Edge worked fine on the Phenom but used 80-90% of its power. The same high usage happened in Assassin's Creed Unity and Syndicate (they work, but use 90%+ CPU).

In my brother's case, the reason for the change was BF1 and BF5. We were disappointed and surprised when we realized that upgrading the GPU from a GTX 660 to a GTX 1060 didn't help FPS much in BF1. And when BF5 came to Origin the problem got bigger: the CPU sat at 100% the whole time in game. Avg FPS was 45, and without the Phenom's overclock it was nearer 30-35 :/.

When we swapped the Phenom for the Core i7 8700K, the avg FPS in BF5 was 100. That is a huge difference, and now it seems we would need more GPU power if we want more FPS. In Dota 2 the difference was smaller, but in that game everything is very smooth on a 144 Hz screen.

So can I believe Cinebench, that the Core i7 8700K is 100% faster in single core (200 vs 100 pts in Cinebench R15) and 200% faster in multi-core (1500 vs 500 pts)? And the same on my PC, that the i7 5960X is 65% faster in single core (165 vs 100) and 200% faster in multi-core (1500 vs 500 pts)?
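For what it's worth, the percentages follow directly from the scores: "X% faster" is just (new - old) / old. Here is a quick sanity check in Python, using the rough Cinebench R15 numbers quoted above (the exact scores will of course vary by run and clocks):

```python
def percent_faster(new_score: float, old_score: float) -> float:
    """How much faster the new score is relative to the old one, in percent."""
    return (new_score - old_score) / old_score * 100

# Rough Cinebench R15 scores from the post (actual results vary by run and clocks)
print(percent_faster(200, 100))    # 8700K vs 1090T, single core -> 100.0
print(percent_faster(1500, 500))   # 8700K vs 1090T, multi core  -> 200.0
print(percent_faster(165, 100))    # 5960X vs 1090T, single core -> 65.0
```

So at least the arithmetic behind those percentages checks out; whether the Cinebench ratio translates into game FPS is a separate question.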

It is hard to find reviews that include the Phenom 1090T alongside the 5960X and the 8700K.