extreme 4 / z97x killer and usb 3.1/c
hellazari
Newbie Joined: 11 Jul 2015 Status: Offline Points: 6
Posted: 11 Jul 2015 at 8:03pm
Just a small question to see if I have this right:
If I get the Extreme4 / Z97X Killer, which comes bundled with a PCIe x4 card carrying USB 3.1 and a USB Type-C port, does that mean my GPU will always run at x8 because that card is plugged into a PCIe x16 slot? And what about SLI? Will the GPUs run at x8 and x4 instead of x8 and x8 because all three PCIe x16 slots are used? I'd be glad for some clarification, thanks!
parsec
Moderator Group Joined: 04 May 2015 Location: USA Status: Offline Points: 4996
Yes, you are right about both situations.
All the socket 1150 processors you could use with those boards have 16 PCIe 3.0 lanes, a limitation we can blame on Intel. So strictly speaking it isn't the use of more than one PCIe 3.0 x16 slot that limits a single video card to x8 (although the result is the same), but the number of PCIe 3.0 lanes available. Either way, your description of the final results is accurate. Most gamers report that running a single video card at PCIe 3.0 x8 reduces performance little, if at all. The SLI situation is not the same.
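To put the lane budget in concrete terms, here is a minimal sketch of how those 16 CPU lanes end up divided as the CPU-connected slots are populated. The exact switching scheme shown is an assumption based on what is described in this thread; the board manual has the authoritative per-slot table.

```python
# Illustrative sketch of sharing 16 CPU PCIe 3.0 lanes across the
# CPU-connected x16 slots (assumed switching scheme; check the manual).
CPU_LANES = 16

def lane_split(populated_slots):
    if populated_slots == 1:
        return [16]        # a single card gets the full x16
    if populated_slots == 2:
        return [8, 8]      # two devices: x8/x8 (one GPU + USB 3.1 card, or SLI alone)
    return [8, 4, 4]       # three devices: x8/x4/x4 (SLI pair + USB 3.1 card)

print(lane_split(2))  # [8, 8]
print(lane_split(3))  # [8, 4, 4]
```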
hellazari
Newbie Joined: 11 Jul 2015 Status: Offline Points: 6
Yes, that's what I thought, thanks for clearing that up.
But what about the Extreme6 board? It has a PCIe 2.0 x16 slot that runs at x2, and from my understanding that slot does not affect the speed of the two PCIe 3.0 x16 slots. Can I fit the USB 3.1 card into the PCIe 2.0 x16 slot and still have the GPU run at x16 speed? If so, PCIe 2.0 at x2 speed is 0.8 Gbit, so I guess this slot will bottleneck the card.
Xaltar
Moderator Group Joined: 16 May 2015 Location: Europe Status: Offline Points: 25073
At present, there are no GPUs that can fully saturate PCIe 3.0 x8; at least, that is what benchmarks are showing. I think the closest to doing it is the Titan Z. PCIe 3.0 x8 is more or less equal to PCIe 2.0 x16 in bandwidth.
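For rough numbers behind that comparison, a sketch using only the nominal per-lane rates and line-encoding overhead (real-world throughput is a bit lower once protocol overhead is counted):

```python
# Effective per-lane bandwidth from nominal transfer rate and line encoding.
def gbit_per_lane(gen):
    rates = {
        2: 5.0 * 8 / 10,     # PCIe 2.0: 5 GT/s with 8b/10b encoding   -> 4.0 Gbit/s
        3: 8.0 * 128 / 130,  # PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~7.88 Gbit/s
    }
    return rates[gen]

def slot_gbit(gen, lanes):
    return gbit_per_lane(gen) * lanes

print(round(slot_gbit(3, 8), 1))   # ~63.0 Gbit/s for PCIe 3.0 x8
print(round(slot_gbit(2, 16), 1))  # 64.0 Gbit/s for PCIe 2.0 x16 -- effectively the same
```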
hellazari
Newbie Joined: 11 Jul 2015 Status: Offline Points: 6
That's mostly right, but not always; in some games it has a big impact even on a GTX 980, which I'm planning to get, as you can see here:
Anyway, I'm more worried about SLI. Do you have any info about what I asked regarding the Extreme6?
Edited by hellazari - 12 Jul 2015 at 2:07am
parsec
Moderator Group Joined: 04 May 2015 Location: USA Status: Offline Points: 4996
Again you are correct. The PCIe 2.0 x16 slot (x2 lanes) is provided by the Z97 chipset, not the CPU, so using that slot will not consume any of the CPU's PCIe 3.0 lanes, and all of them remain available to the video card(s).
Another thing to know is the exact spec of the USB 3.1 card, which can be seen here: http://www.asrock.com/mb/spec/card.asp?Model=USB%203.1/A%2bC Note that this card is physically x4 but x2 electrically, so it uses only two lanes.
I don't know why you said PCIe 2.0 x2 speed is 0.8 Gbit, since one PCIe 2.0 lane is supposed to be 4 Gbit/s maximum effective transfer rate, taking encoding overhead into account. The card will be slightly bottlenecked when used in a PCIe 2.0 x2 slot, since that would not provide the 10 Gbit/s (or more) that two PCIe 3.0 lanes would.
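For reference, the same arithmetic applied to the x2 (electrical) USB 3.1 card, again a sketch counting line-encoding overhead only, so actual throughput will be somewhat lower:

```python
# Slot bandwidth available to the x2 (electrical) USB 3.1 card versus the
# 10 Gbit/s USB 3.1 Gen2 link it has to feed.
PCIE2_LANE_GBIT = 5.0 * 8 / 10      # 4 Gbit/s effective per PCIe 2.0 lane
PCIE3_LANE_GBIT = 8.0 * 128 / 130   # ~7.88 Gbit/s effective per PCIe 3.0 lane
USB31_GEN2_GBIT = 10.0

pcie2_x2 = 2 * PCIE2_LANE_GBIT      # 8 Gbit/s      -> slightly below 10 Gbit/s
pcie3_x2 = 2 * PCIE3_LANE_GBIT      # ~15.75 Gbit/s -> comfortably above 10 Gbit/s

print(pcie2_x2 < USB31_GEN2_GBIT)   # True: mild bottleneck in the PCIe 2.0 x2 slot
print(pcie3_x2 > USB31_GEN2_GBIT)   # True: no bottleneck on two PCIe 3.0 lanes
```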
hellazari
Newbie Joined: 11 Jul 2015 Status: Offline Points: 6
First, thank you very much for all your help!
I don't know why I wrote 0.8; I guess I meant to write 8 Gbit. Thanks again!