Review: Gigabyte G1.Sniper B6
We review the budget yet feature-rich G1.Sniper B6 board from Gigabyte, based on Intel's lower-cost B85 chipset. B85-based motherboards typically end up in business desktops and are normally locked out of overclocking. Well, that is just different with the Gigabyte G1.Sniper B6: a small price, nice features and tweakability?
Read the full review right here.
Noufel
Senior Member
Posts: 107
Joined: 2013-05-12
#5043453 Posted on: 04/03/2015 03:32 PM
Nice review, it's the perfect MB for the money!
schmidtbag
Senior Member
Posts: 7152
Joined: 2012-11-10
#5043455 Posted on: 04/03/2015 03:40 PM
Is it just me or is this the "cleanest" ATX motherboard in existence? Seriously, look at the PCB - there's almost NOTHING on it. Get rid of the white print and the heatsinks and it'd look almost barren. I'm not complaining, I just think it's crazy that you can fit in so much technology while being so "empty".
Can someone explain why CF/SLI with Gen3 x16 + Gen2 x4 is bad for a gaming rig?
Most tech sites show a 5% loss at most, while the majority of people report losses of 10%+. Technically the bandwidth should be sufficient for most mainstream cards, so maybe the CPU is choking?
Well, 2.0 at x4 has the performance of 1.0 at x8. When you get to speeds like that, most high-end GPUs from the past 3 or 4 years are going to start losing performance. Not much, but enough. It really all comes down to what you intend to CF/SLI. If you do anything that isn't enthusiast grade, you're probably fine.
Overall, it seems that PCIe 3.0 x16 is overkill for just about anything. DX12 or Vulkan could possibly change that though.
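The "2.0 at x4 has the performance of 1.0 at x8" equivalence follows straight from the link math: per-direction bandwidth is transfer rate × encoding efficiency × lane count. Here's a quick back-of-the-envelope Python sketch (the function name is mine; the rates and encoding figures are the standard PCIe spec numbers):

```python
# Theoretical per-direction PCIe bandwidth = transfer rate (GT/s) x encoding
# efficiency x lane count. Gens 1.x/2.0 use 8b/10b encoding (80% efficient);
# gen 3.0 moved to 128b/130b (~98.5%), which is why 3.0 nearly doubles 2.0
# per lane despite the raw rate only going from 5 to 8 GT/s.
GENS = {
    "1.0": (2.5, 8 / 10),
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
}

def bandwidth_gbs(gen: str, lanes: int) -> float:
    """Usable one-way bandwidth in GB/s for a PCIe generation and lane count."""
    rate_gt, eff = GENS[gen]
    # One transfer carries one bit per lane, so GT/s ~ Gb/s; divide by 8 for GB/s.
    return rate_gt * eff * lanes / 8

print(bandwidth_gbs("2.0", 4))   # 2.0 GB/s -- same as...
print(bandwidth_gbs("1.0", 8))   # 2.0 GB/s
print(bandwidth_gbs("3.0", 16))  # ~15.75 GB/s
```

So a Gen2 x4 slot really does give a 2015-era high-end card only 2 GB/s each way, about an eighth of what a Gen3 x16 slot can feed it.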
lanelor
Member
Posts: 36
Joined: 2014-07-29
#5043463 Posted on: 04/03/2015 04:02 PM
I agree that there is a loss of performance, but there are no recent GPU tests of x16+x4, which is what most of the lower-tier CF/SLI gaming mobos offer. Honestly I am a bit curious, yet not enough to buy a new mobo + CPU + maybe DDR4 + case just to test the performance.
Agonist
Senior Member
Posts: 4023
Joined: 2008-10-13
#5043482 Posted on: 04/03/2015 04:59 PM
Exactly why I am still on my X58 setup.
Just no point in upgrading. My board has two 2.0 x16 and two 2.0 x8 slots.
And 2.0 x16 is still not really slowing my 290 down at all.
Plus I have a 6-core Xeon. I'm not gonna go to a quad core for a few extra % of single-core performance when my CPU is @ 4.5GHz. Not worth the cost of a new motherboard, CPU and memory.
I may use my tax return next year and get a new 6-core, 16GB DDR4, and a nice motherboard. That system would last me a good 4-5 years.
I know this is a long ramble post guys lol.
I really love the Gigabyte Sniper boards. I've always wanted one.
My friend has the X58 G1 Killer.