Hehe, well, you already know what my opinion on RGB is, buuuuut, I must say, your build is really pretty! Pretty and pink ^_^
I also like the braiding on the coolant lines~
The only niggles I have: you're running a 2x(6+2)-pin card off a single 6+2-pin PCIe power cable (probably not a problem unless you're OCing loads), and the GPU is plugged into the third PCIe slot, which on your motherboard hangs off the chipset and is only an x4 interface, whereas the top one is x16 directly to the CPU.
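To put rough numbers on the slot difference, here's a back-of-the-envelope sketch, assuming PCIe 3.0 and ~985 MB/s of usable bandwidth per lane after 128b/130b encoding overhead:

```python
# Rough PCIe 3.0 throughput comparison. 985 MB/s per lane is the
# usual usable figure for 8 GT/s after 128b/130b encoding.
PER_LANE_MBPS = 985  # MB/s per lane, PCIe 3.0 assumed

def slot_bandwidth_gbps(lanes):
    """Approximate usable bandwidth of a slot in GB/s."""
    return lanes * PER_LANE_MBPS / 1000

print(f"x4  (chipset): {slot_bandwidth_gbps(4):.2f} GB/s")
print(f"x16 (CPU):     {slot_bandwidth_gbps(16):.2f} GB/s")
```

Roughly a 4x difference in raw headroom, before you even count the chipset slot sharing its uplink with everything else on the board.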
If I may suggest a reason for a dual system build... one of them could be a FreeNAS system?
Thankya for the feedback, mal-kun. It means a lot, you know. ^~^
Good catch regarding the PCIe slot. I'd placed it down there ages ago to give the best platform for photographing Nendoroids, buut, given I've taken out the white fans I used to project lighting into the rest of the case, I'd reckon it's a moot point. I've also been having black-screen issues after trying to run triple monitors recently, so I wonder if that's related to the lower bandwidth of that final slot.
I'll also be adding a small expansion card to one of those x1 slots pretty soon. I'm not quite sure how, but the pins on one of my motherboard's two USB 3 headers got bent, and despite careful work with pliers, I can't quite straighten them out again. >~<'
No problem ^~^
I've heard about others having black screen issues with newer AMD cards as well, so it may well be a driver issue! Hopefully moving it to the top slot will fix it though. It's not unheard of for PCIe cards to run more reliably in a slot wired directly to the CPU, since chipset slots have to share the chipset's single uplink with everything else hanging off it.
Eugh, I have no idea how I haven't broken a USB 3.0 header yet. Those things are just terrible.
Right, though? I mean, I get needing more pins to deliver data or whatever, but could ya at least design a more robust connector? I'm really, really happy I went full ATX, otherwise I'd literally have to RMA a $200 motherboard for a component I got for twenty bucks as an expansion card. >_<'
As for the GPU... I ran a benchmark and got around a 500 point gain on Firestrike Extreme. That might have to do with the card being able to pull more air into its cooling fans, but it's still appreciated. ^~^
On another note... One thing that's surprised me is the difference in ambient temperature over time if I have all my RGB accessories turned on. LED lights might be a lot cooler than an incandescent bulb, but they still put out a tonna heat. Even without my receiver [which woulda turned this place into a veritable sauna], it still gets upwards of 81F/27C in here, a wee bit toasty for my liking.
Have you seen the USB Type-C header? Far superior.
I'm not really sure how much better 500 is, but it seems like a pretty big number~ But it is surprising how cooling can affect modern GPU performance. My 290X felt like a new card when I water cooled it.
LEDs can get a bit hot, but... a significant rise in ambient temps would surprise me a bit. It's not like they're putting out 1000 lumens of light. What is the ambient temperature delta?
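For a sanity check, a quick back-of-the-envelope (the wattages below are pure assumptions, not measurements of your actual gear); essentially all the electrical power the lighting draws ends up as heat in the room:

```python
# Back-of-envelope: nearly all electrical power an LED draws becomes
# heat in the room (even the emitted light gets absorbed eventually).
# These wattages are assumed for illustration, not real measurements.
lighting_watts = {
    "RGB case fans": 6.0,    # assumed
    "LED strips": 12.0,      # assumed
    "RAM/AIO lighting": 4.0, # assumed
}
total_w = sum(lighting_watts.values())
print(f"Total lighting load: {total_w:.0f} W")
# For scale: an incandescent desk bulb is ~60 W, a gaming GPU 200+ W.
```

If the real numbers are anywhere near that, ~20 W is a rounding error next to what the GPU dumps into the room under load, which is why the LEDs alone moving ambient temps would surprise me.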
Also, have you seen this? Looks like it might be worth trying out when it comes out!
I have~! The Luxe 2 has one, though sadly my motherboard doesn't have any headers. Although, four front USB A's are plenty enough I'd reckon.
One thing I don't like about my case, though, is the placement of the Reset button. Instead of being next to the flush power button on the top, it's a teeny-tiny little button in the front panel, right next to the audio jacks and integrated RGB controls. I actually disconnected it from the motherboard, precisely because the last thing I wanted was to fat-finger something in the middle of a crucial task. >_<'
As for temperatures from LEDs... I feel that comes down to a lot of different factors, on second thought. For example, I've felt quite hot today and been sweating a lot, but that's because I had a close brush with a migraine. My trusty alarm clock says it's 75F in here, and that's in spite of the GPU running rather warm, so I'll go by that...
I did~, actually! It's pretty amazing, I'll deffos check it out... ^~^
On another note [So much new hardware stuff to talk about!], did anyone see the reveal and demo of Nvidia's new Ampere GPUs? And more importantly, the price? If the benchmarks are even half of what the marketing charts say they are, AMD is gonna have to play some serious catch-up with RDNA2... ^^'