No clearance issue AFAIK, but due to the soundproofing on the side panels, I'd worry about heatsinks taller than the EVO. Maybe take a look at a Macho 120 Rev. A.
Don't wait. I waited for the 300 series and it turned out to be rebranding... Your call, though, because the 280X is still pretty powerful today.
I had thought about getting an InWin 303, but I gave up when it arrived at my little corner shop... in a thousand pieces. I got refunded and spent the money on X-Plane 11.
I've got some Noctua fans lying around somewhere, the ones meant for an NH-D14 that doesn't fit in this case. I just don't want to wreck them with those terrible case screws.
Zip ties aren't really my thing. Especially since I don't stare at my PC all day and it sits under my desk.
A card's 8GB doesn't interest me for games that use 8GB (I don't do 4K anyway) but for games that use 4.5GB or 5GB. It's just a bit short for now; I remember when GTX 680 owners said there was no need to worry about their card's 2GB and that the 7970's 3GB was overkill.
This Fury is the same thing, factory OC'd at 1050MHz; I could downclock it and all that, but eh, I don't really feel like it. Besides, at 1080p there's always a gain. My previous graphics cards handled 1600x1024 (my monitor) perfectly, but in games like MGS V or GTA 5 there's a real gain. It's up to you whether it's worth it. In any case, the 1070 is still way too expensive.
Sorry for replying insanely late. I was hesitant about replying, but since I discovered that I was just using VSR, I think the time to comment has come.
If I understand monitors correctly, not having a scaler just means that if the input isn't the native resolution of the screen, it'll just nope out the input and display absolutely nothing.
My Cinema Display handles only 1600x1024. If Windows ***** itself and goes to 640x480, it's impossible to get it back to the normal resolution with this monitor. Also, since the iGPU is automatically deactivated while a dGPU is connected, it's impossible to reinstall GPU drivers by just ramming something into the VGA output of the board (the Fury doesn't output any analogue signal: its DVI is DVI-D, not DVI-I, and I haven't stumbled upon a DP-to-VGA converter in my local computer shop yet).
Again, sorry for the late answer.
If you went with NVIDIA cards because you thought AMD drivers were awful, it's not going to be much better on the "green" side.
-A former NVIDIA-card owner
The PSU would turn into an atomic bomb before that gentlemanshark could enjoy sunbathing in his office.
I've run The Witcher 3 on a 560 in a ThinkStation with a single Xeon E5420 (a downclocked C2Q Q9450), and I think that if you overclock your card and disable all the unnecessary stuff, you should be good to go. Also, you've got great friends giving you cards :D
Man, I've only heard good stuff about this series. I have to watch it.
That's pretty impressive. Good job, man!
What program are you using for your 3D renders?
I don't understand. Only a hardcore AMD fan would have this CPU, yet this SLI of "3.5 meme" cards contradicts that.
Will take a look, thanks.
Sweet G-Zus, that GPU cooler looks hefty enough to rip the PCIe slot
Yes it is! The S.C. Magi System-01 (the ORIGINAL, of course) that gets hacked by Ireul in the series and by the other Magis in the "End". All my computers are named following this scheme: desktops are Melchior-[Greek letter], workstations/servers are Balthasar-[Greek letter], and laptops are Casper-[Greek letter].
Of course, I changed my System panel so it has NERV and Magi stuff in it, and when I'm running Linux, this is my SSH greeting banner.
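(For anyone wanting to replicate it: a minimal sketch of a pre-login banner, assuming OpenSSH; the banner path here is just an example, and /etc/motd works instead if you'd rather show it after login.)

    # /etc/ssh/sshd_config -- have sshd print a file before login
    Banner /etc/ssh/magi_banner
    # then reload the daemon, e.g. on Debian: sudo systemctl reload ssh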
And yeah, the monitor is okay. It aged well for something that old.
Depends, man. What games do you want to play on your PC?
A single stick of RAM isn't really a good idea. You could put two sticks in there and enjoy dual channel, which doubles memory bandwidth. You could maybe go with a 480 if you plan to keep your GPU for a really long time and enjoy DX12, but if you play "NVIDIA-optimized titles" (like Project CARS), then stay with the 1060; it's a really good card.
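(Quick sketch of the bandwidth math, assuming a DDR4-2133 kit; plug in whatever speed the build actually runs at.)

    # Peak theoretical bandwidth: transfer rate x 8 bytes per 64-bit channel
    transfers = 2133e6                       # DDR4-2133 (assumed), transfers/s
    single_channel = transfers * 8           # ~17.1 GB/s on one channel
    dual_channel = 2 * single_channel        # ~34.1 GB/s with two sticks
    print(single_channel / 1e9, dual_channel / 1e9)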
I hope I didn't trigger you by using an old *** Apple monitor, running Windows due to no games and bad AMD driver support on Linux (I like myself some Debian and Gentoo), and by displaying a wallpaper associated with meme music culture.
I'll admit I only put this wallpaper because I thought it looked nice, not due to a particular affection for the whole vaporwave culture.
I'll send you some Fiji Water
I'm using an Apple DVI-to-ADC converter. Since those displays don't have a PSU of their own but get their power through the ADC cable, this box is basically a power supply that takes the DVI signal, adds power and USB to it (the display has a hub, USB 1.1 I presume), stirs the thing up, and sends it through the ADC cable to the monitor. It also seems to have some sort of ROM to remember settings, since some people can change the brightness of these monitors on one computer, then plug them into another computer and keep the brightness level.
Also it doesn't have a scaler.
It's actually in a pull configuration; the airflow is in the same direction for both the CPU fan and the exhaust fan. I thought about replacing the exhaust fan with one of those Corsair ML120s and removing the CPU fan to fit an air duct by Thermalright, so it'd be two birds with one stone.
And don't worry. I'm prepared.
I like to live dangerously
Must be my hard drives then. I can't really hear them anyway thanks to closed-back headphones
With a dank card to begin with, I could only do dank budgeting.
I understand the fear, but on paper it should be OK: a single Molex connector is rated for 131W, so two should be more than enough to supply the 150W the spec requires on a PCIe 8-pin connector.
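(Back-of-envelope version, assuming the commonly quoted ~11A per pin for Molex 8981-style connectors; only the 12V pin feeds the GPU.)

    # Power margin of a dual-molex -> PCIe 8-pin adapter
    per_molex = 12.0 * 11.0        # ~132 W from the 12 V pin (the ~131 W above)
    available = 2 * per_molex      # ~264 W from two molex plugs
    required = 150.0               # W the PCIe spec allots to an 8-pin
    print(available >= required)   # True, with plenty of headroom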
The GTX 1070 is best put into the bottommost PCIe slot.
Aren't you losing PCIe lanes though?
Neat Christmas tree you've got there!
Why didn't you go for a 2x4GB kit? You'd have gotten dual channel.
Why didn't you go for two Radeon Pro Duos at this point :D?
Why would you use an overclockable CPU on a non-overclocking board?
Oh. I see. Good point.
Why did you go with the 960 when the R9 380 or a Kepler GTX 760/670 is more powerful?