Back in 2019, I built my current gaming PC. Since then, the only modification I have made was to double the RAM to 32GB, which I did to facilitate hosting a few services on this PC before I built my server.
Both of these PCs have performed fantastically. However, after installing Plex on my server and attempting to transcode a 4K movie, I quickly realized… the quad-core $80 APU was not going to cut it.
Now, don’t get me wrong: this CPU has served its purpose perfectly. It currently runs over 40 containers, covering home automation, network administration, Git, game servers, backups, and more, and it also hosts my Blue Iris NVR, recording constantly from multiple 5MP cameras. It does all of this while averaging only 30% CPU utilization.
However, I think its time has come…
If you don’t like to read, and just want to see the benchmarks, and final results… CLICK HERE
Disclaimer: Amazon affiliate links are used in this article. For this site, I choose not to pester my audience with annoying advertisements, and instead rely only on affiliate links to support this hobby. By using an affiliate link, you pay the same price on Amazon as you would otherwise; however, a small percentage is given to me. To note: I DID buy all of the products shown with my own money, and did not receive any incentive to feature or utilize them.
To note on the above disclaimer: absolutely none of the content you are seeing was sponsored. 100% of everything shown below was purchased out of my own pocket. There is no outside influence involved in this build. As a matter of fact, as of the time of writing this introduction, I have no idea if this build will even pan out correctly.
Now- normally, you would toss in a better CPU, and call it a day, right?
I want to do something different. I want to consolidate my server with my gaming PC. To do this, I plan on leveraging Unraid: my gaming PC will run as a VM hosted on Unraid, with its physical SSDs passed directly through to the VM.
That way, if issues ever do crop up, I can remove the Unraid thumb drive and boot my PC normally, without any issues at all.
My only concerns with this approach are:
- How much will it affect my gaming performance?
- How reliable will this solution be?
In this article, I hope to answer both of those questions for you.
The old server build will have all of the extra hard disks and expansion cards removed, will be upgraded with the Ryzen 5 3600, and will be a birthday gift for somebody near the end of this month. With a 6c/12t processor, 32GB of RAM, and a brand-new SSD, it should excel at the tasks expected of its user (Photoshop, checking email, etc.). Well, it’s a bit overkill, actually.
- Motherboard: Gigabyte X570 AORUS MASTER
- CPU: AMD Ryzen 7 5800x – 8c / 16t
- CPU Cooler: Cooler Master Hyper 212 LED (replaced with a Corsair H150i after publishing this article)
- RAM: Corsair Vengeance LPX 16GB (2 X 8GB) DDR4 3600 (Only two of the sticks will be kept.)
- RAM: Corsair VENGEANCE RGB PRO 32GB (2x16GB) DDR4 3600 (I couldn’t find the above sticks at a reasonable price in 16GB DIMMs)
- GPU: GeForce RTX 2070
- SSD1 (Gaming VM OS): Samsung 970 EVO NVME M.2 500GB (Passed through to VM)
- SSD2 (Gaming VM Steam): Samsung 970 EVO NVME M.2 500GB (Passed through to VM)
- SSD3 (Via USB) (Staging): Samsung 970 EVO NVME M.2 500GB
- Connected via SSK Aluminum M.2 To USB Type-C (I ran out of PCI-e lanes…) Still performs at 600MB/s.
- SSD4,5: (Server Cache): Samsung 970 EVO 1TB NVMe (BTRFS Mirrored Cache Pool)
- HDD1-8 = 8x 8TB Seagate 7200 RPM ST8000NM0105 (Added after publishing this article)
- HDD9 – Random 3TB (Used for NVR)
- NVMe PCIe Card: ASUS Hyper M.2 X16 PCIe 3.0 X4 Expansion Card V2 (After-the-fact note: only enough PCIe lanes here for a single NVMe…)
- SATA HBA: LSI 9207-8i (Added after this article)
- Case: Fractal Design Define R6
- PSU: EVGA SuperNova 80+ GOLD 750w
- Keyboard: IKBC CD108 (This keyboard is a pleasure to use)
- Mouse: Logitech M150 Mouse (Don’t replace what isn’t broken!)
I would have loved a new Ryzen 9 5900x, but, those are a bit hard to come by currently. So, I will make do with a 5800x.
Overall, my Unraid server uses 16GB of RAM for applications and the remaining 16GB for cache. My gaming PC uses around 9–12GB on average while playing games. With that said, 48GB of total RAM should allow everything to keep its existing allocation.
For compute, comparing core by core: my server had 4c/4t, while my gaming PC has 6c/12t, for a combined 16 threads. The new CPU has 8c/16t, matching that total thread count.
Looking at CpuBenchmark’s Comparison, The 5800x is more powerful than BOTH of the old processors combined, in nearly every area. So- there should be plenty of compute to go around.
With those comparisons out of the way, it should be safe to assume this new build will have PLENTY of compute and memory to go around.
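The memory and thread arithmetic above can be sanity-checked in a few lines (all figures are taken from this article; nothing here is measured):

```python
# Napkin math: can one 5800X box replace both old machines?
server_ram_gb = 16 + 16   # old server: 16GB for apps + 16GB for cache
gaming_ram_gb = 12        # worst-case observed while gaming
total_ram_gb = 48         # new build's installed RAM

# 44GB of combined demand fits in 48GB with headroom to spare
assert server_ram_gb + gaming_ram_gb <= total_ram_gb

old_threads = 4 + 12      # server 4c/4t + gaming PC 6c/12t
new_threads = 16          # Ryzen 7 5800X: 8c/16t
assert old_threads == new_threads  # same total thread count
```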
For storage, here is the plan:
- SSD1: 500GB Gaming PC OS – Will be passed directly to the VM.
- SSD2: 500GB Gaming PC Steam – Will also be passed directly to my gaming VM.
- SSD3: 500GB Staging – Connected via USB. Used for “staging” from SABnzbd, Blue Iris, etc. I expect this one to wear out first.
- SSD4: 1TB: Mirrored Cache Pool.
- SSD5: 1TB: Mirrored Cache Pool.
- HDD1: 3TB: NVR storage. Redundancy is not required for this. Mounted directly in Blue Iris VM
- HDD2: 8TB: FreeNAS
- HDD3: 8TB: FreeNAS
- HDD4: 8TB: FreeNAS
- HDD5: 8TB: FreeNAS
- HDD6: 8TB: FreeNAS
- HDD7: 8TB: FreeNAS
- HDD8: 8TB: FreeNAS
- HDD9: 8TB: FreeNAS
SSD1, SSD4, and SSD5 are placed directly into the motherboard’s NVMe slots.
SSD2 is on the PCIe/NVMe expansion card.
SSD3 is connected via USB; I ran out of PCIe lanes.
The 8x 8TB HDDs are hosted off of my LSI 9207-8i, which is passed directly into FreeNAS. The zfs array is configured as striped mirrors. (Raid 10)
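For reference, the striped-mirror layout corresponds to a `zpool create` along these lines. This is only a sketch: the pool name and device names are placeholders, and in practice the pool was built through the FreeNAS UI rather than the command line.

```shell
# Four 2-way mirror vdevs striped together ("RAID 10"): 8x 8TB -> ~32TB usable.
# Pool and device names below are hypothetical.
zpool create tank \
  mirror /dev/da0 /dev/da1 \
  mirror /dev/da2 /dev/da3 \
  mirror /dev/da4 /dev/da5 \
  mirror /dev/da6 /dev/da7

zpool status tank   # should show four mirror vdevs
```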
The 3TB NVR drive is connected to my motherboard’s SATA ports.
My existing server has a total of four active gigabit Ethernet ports, thanks to a quad-port HP/Intel gigabit NIC. It is also connected directly to the “closet” switch in the center of my house. This keeps my NVR footage from having to hop through the entire network: the PoE cameras plug into that switch, and a dedicated Ethernet port on the server handles NVR traffic, keeping it away from the rest of my network.
Since my gaming PC will remain in the bedroom, this limits me to a maximum of two Ethernet ports total. Luckily, the motherboard DOES have two Ethernet ports.
Doing some napkin math on potential bandwidth usage:
- NVR Cameras – 20Mbit/s each. 5 Total = 100Mbit/s.
- Internet – 100Mbit/s max d/l.
- Streaming Media – Let’s assume 50Mbit/s.
- IOT/Server Traffic – Let’s assume an unrealistic number of 50Mbit/s. The actual number is < 1Mbit/s.
This totals 300Mbit/s, or 30% of a single gigabit connection. So even if the server is syncing at full speed from the internet (100Mbit/s), my living room TV is streaming a 4K movie at a SUPER high bitrate, and all of my IoT clients are downloading complete firmware from my server, we are still likely to use less than a single gigabit connection. The only potential exception: if all of the cellular devices in my home decided to do a full Nextcloud sync of the previous 10 years of photos, 802.11ac could potentially saturate the link. However, the weak link would be from the access point to the switch, so it would not matter where the server connects to the network.
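Summing those estimates (the same napkin math, just written out):

```python
# Worst-case concurrent bandwidth, per the estimates above (Mbit/s)
loads = {
    "NVR cameras (5 x 20Mbit/s)": 5 * 20,  # 100
    "Internet download":          100,
    "Streaming media":            50,
    "IoT/server traffic":         50,      # deliberately inflated; reality is < 1
}

total_mbit = sum(loads.values())
print(total_mbit, f"= {total_mbit / 1000:.0%} of a gigabit link")
# prints: 300 = 30% of a gigabit link
```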
With that said, a single gigabit Ethernet port should fulfill my needs. However, I plan on dedicating the Realtek 2.5GbE NIC to my gaming PC, and the Intel NIC to Unraid. I have had MUCH better luck with Intel NICs in server applications.
I am not going to post a full step by step guide on how to build a PC. But- I will post a few select pictures.
As well, you can view THIS article for a few of the tweaks I had to perform inside of Unraid to get everything working properly.
How benchmarks will be performed
Remember: the VM is ONLY allocated 4 physical cores and their 4 hyperthread siblings, for a total of HALF of the physical CPU. The bare metal results WILL BE MUCH HIGHER as a result! As well, the VM is only allocated 16GB of RAM, instead of the 48GB available to bare metal.
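For context, Unraid’s core assignment for a VM like this comes down to libvirt CPU pinning in the domain XML. A sketch of what that might look like when pinning four cores plus their hyperthread siblings; the host CPU numbers shown are assumptions about the 5800X’s topology (check `lscpu -e` on your own system), not values taken from my actual config:

```xml
<vcpu placement='static'>8</vcpu>
<cputune>
  <!-- One vCPU per host thread; 4/12, 5/13, 6/14, 7/15 are assumed
       core/hyperthread sibling pairs, leaving cores 0-3 for the host -->
  <vcpupin vcpu='0' cpuset='4'/>
  <vcpupin vcpu='1' cpuset='12'/>
  <vcpupin vcpu='2' cpuset='5'/>
  <vcpupin vcpu='3' cpuset='13'/>
  <vcpupin vcpu='4' cpuset='6'/>
  <vcpupin vcpu='5' cpuset='14'/>
  <vcpupin vcpu='6' cpuset='7'/>
  <vcpupin vcpu='7' cpuset='15'/>
</cputune>
```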
As well, the benchmarks will be executed WHILE my server is running its full workload. This includes processing NVR traffic, AI detection, Home automation, and even streaming video. This is NOT a clean-room test. This is a down and dirty REAL WORLD test to see what actual results will look like.
Tested with CPU-Z, Version 1.95.0 x64, with benchmark version 17.01.64
Link to bare metal test: https://valid.x86.fr/zaptm4
To note- There is a separate test for bare metal, using only 8 threads, to match the resources allocated inside of the VM.
Note: “VM 1” is the run as a VM before installing the H150i cooler; “VM 2” is after installing it.
| CPU-Z | Bare Metal – 16 Threads | Bare Metal – 8 Threads | VM 1 | VM 2 | % Diff from bare metal |
|---|---|---|---|---|---|
Summary: there is an expected amount of impact. The VM’s multi-threaded performance is slightly less than half of bare metal, which is expected, because only half of the CPU is allocated to the VM. My guess is that the 8-thread bare metal test ran against physical cores rather than hyperthread siblings. Single-threaded performance suffers a bit more than I would like; still, around 10 percent is within expectations.
Overall- this was a very successful test in my opinion.
Link to benchmarks performed when this PC was built. Using the same drive.
OS Drive Only – Bare Metal
Oddly enough, it’s actually quite a bit faster than when I originally benchmarked it. I guess Samsung SSDs age like fine wine.
OS Drive Only – As a VM (Updated 3/23)
I ended up manually editing the XML file to pass through the NVMe controllers using hostdev tags. I will have a write-up in the next few days in THIS ARTICLE.
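For illustration, a libvirt `hostdev` entry that passes an NVMe controller through as a PCI device looks roughly like the following. The PCI address here is a placeholder, not the actual address from my system (find yours with `lspci`):

```xml
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <!-- host PCI address of the NVMe controller (placeholder values) -->
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```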
After performing these corrections, here is the updated benchmark. The random speeds suffer a tad, but the sequential speeds are as good as bare metal. I noticed one of the metrics looked a bit odd, so I ran the benchmark again.
The performance penalty here is pretty large… 64% slower for large sequential reads, and 60% slower for large sequential writes. Keep in mind, this is a bare metal drive passed into a VM. This is not a .VMDK or QCOW virtual disk; it is a physical disk.
I will be trying to work with unraid to discover how this can be resolved.
At the current time, I would mark this test as a failure, as these results are not very good.
Far Cry New Dawn – Ultra Preset
Overall, extremely playable at 1440p, and playable at 4K. I am not quite sure why the average FPS inside of the VM is higher than on bare metal.
| Bare Min | Bare Avg | Bare Max | VM Min | VM Avg | VM Max |
|---|---|---|---|---|---|
Basically zero difference from bare metal performance. Very successful test.
Rise of the Tomb Raider
To note: running these benchmarks multiple times gave somewhat inconsistent results.
I also don’t understand how a VM could outperform the bare metal install. These benchmarks do not appear to be very consistent; take them with a grain of salt.
Also, I don’t understand how the Mountain Peak benchmark at 1440p actually did worse than the same benchmark at 4K.
1440p – Very High Preset
| FPS | Min | Avg | Max | VM Min | VM Avg | VM Max |
|---|---|---|---|---|---|---|
4k – Very High Preset
| FPS | Min | Avg | Max | VM Min | VM Avg | VM Max |
|---|---|---|---|---|---|---|
My benchmark results were too inconsistent to call a winner. Even running back-to-back benchmarks with no changes would produce varying results on either bare metal or the VM.
For the record: do NOT use UserBenchmark to determine which processor is better. Ever since the AMD fiasco, where they skewed their rankings so badly that an i3 was ranked above an i9 and an AMD Threadripper, I do not trust them one bit, whatsoever.
However, I do leverage the tool to see how well my hardware compares to other systems with the same hardware.
Bare Metal Run ID: 41087369
As a VM Run ID: 41161578
Note: there are purposely no links to the results. I refuse to support them by driving traffic their way after the obvious nerfs to AMD benchmarks, which caused them to report a quad-core i3 as faster than a Ryzen or an Intel i7/i9.
You can read about that story here.. if you are unfamiliar with it.
~4% impact to single-threaded CPU performance.
~10% impact to multi-threaded CPU performance in their 4- and 8-core tests.
~50% impact to the 64-core test, which is expected, since the VM is only allocated half of the cores.
The GPU benchmark actually did quite a bit better when virtualized.
HDD results are in line with the above HDD benchmarks.
Their RAM benchmark also performed better while virtualized.
Overall, I would say this is a very successful project. I now have a consolidated VM and server. While physically using the computer, you cannot tell in any way that it is virtualized.
The two 32″ 4k & 1440p displays, work exactly the same.
The USB ports, work exactly the same. You plug it in, and it pops up like normal.
Even wifi and bluetooth, work exactly the same. There is also a dedicated NIC I passed in.
The ONLY noticeable difference: you don’t want to click the power button… in its current state, it will turn off the server…
When the server does reboot, you will see the BIOS normally, followed by Unraid booting, then a blank screen, and the gaming VM a few seconds later.
I will note: during all of this testing, my CPU temps were hitting 90C, which IMO is quite hot, although it is within AMD’s rated temp range. The current cooler is a Cooler Master Hyper 212. However, I already have a new Corsair H150i on the way as of this writing, especially since, with Precision Boost Overdrive, more cooling usually translates to more performance. I expect this will actually help some of the benchmarks as well.
As an update: with the H150i, temps hardly ever go above 40 or 50C. Under full load, with all cores at 100%, the highest temp I have seen is 70C. This is a drastic improvement over the 90C temps I was seeing with the Hyper 212.
In the future, I will design and document an IoT project to make the computer’s power button “smart”, in the sense that clicking it will power the VM up or down instead of the entire server. I will also connect the power LED to the VM’s power state.
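As a rough sketch of that idea (entirely hypothetical; the VM name and how the button press is detected are assumptions), the button handler only needs to map a press to the right `virsh` command:

```python
# Hypothetical sketch: make the chassis power button drive the VM, not the host.
# VM_NAME and the button wiring (GPIO, ESP, etc.) are assumptions, not my config.

VM_NAME = "GamingVM"  # placeholder; use the VM's actual libvirt name

def command_for_press(vm_is_running: bool) -> list[str]:
    """Map a button press to a virsh command: graceful shutdown if the VM
    is running, start it if it is not. The host itself is never touched."""
    action = "shutdown" if vm_is_running else "start"
    return ["virsh", action, VM_NAME]

# A real handler would execute this with subprocess.run(command_for_press(...))
print(command_for_press(True))   # prints: ['virsh', 'shutdown', 'GamingVM']
```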
What I Would Change
Use Threadripper instead of Ryzen
While I technically still could return everything and do so: if I were to repeat this build, I would use a Threadripper instead of a Ryzen. I estimate it would cost around $400–500 more; HOWEVER, I would gain a few KEY features for this build.
- Quad-channel DDR4. This helps performance.
- PCI-E Lanes.
Seriously, 20 PCIe lanes is not enough. Ideally, your GPU gets a dedicated 16 lanes, although 8 lanes is still unlikely to hurt performance much.
Each NVMe drive uses 4 lanes. Remember the ASUS Hyper M.2 used in this build? Due to PCIe restrictions, I can only put a single SSD in it. I had to order another adapter to squeeze my last NVMe into the bottom slot, which gets its lanes from the chipset instead of the CPU.
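The lane crunch is easy to see once the numbers are written out (a sketch that counts only CPU-attached lanes, ignoring the chipset’s own downstream lanes):

```python
# Ryzen's usable CPU PCIe lanes, per the constraint described above
cpu_lanes = 20
gpu_lanes = 16            # GPU at full x16
lanes_per_nvme = 4

# CPU-attached NVMe drives that fit alongside the GPU
nvme_slots = (cpu_lanes - gpu_lanes) // lanes_per_nvme
print(nvme_slots)         # prints: 1
```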
Want to add another GPU for a second player? Good luck; lanes are very limited on Ryzen.
However, if you spent the same amount on a Threadripper 1920X, you would get a CPU with 12 cores and 24 threads (up from 8/16) and 64 PCIe lanes instead of 20. Granted, that processor performs quite a bit worse overall, and it only supports PCIe 3.0 instead of 4.0, but the additional lanes offer much greater room for expansion. I expect this is why Linus chose a server CPU/motherboard for this video.
The downside to a Threadripper: it would cost quite a bit more. The motherboard is considerably more expensive, as is the CPU. In my case, it would likely have been the better decision, given the expansion constraints I have run into.
Technically, I have already ordered a Corsair H150i; however, it is not mentioned until the end of this article. Better cooling usually means more performance, and a good cooler also means less noise.
Note: the H150i is installed. It successfully brought temps down from 195°F to 130°F (roughly 90C to 54C), and it improved the CPU benchmarks as well.
Install a mini-split into my office
This server/PC has been doing a very good job of heating my office. However, when summer comes, this is going to be an issue…
Q&A / FAQs
- What if the server stops working, and you need to access your PC?
- I remove the thumb drive and boot directly into windows, and everything works normally.
- Why didn’t you use a Ryzen 9 5950X?
- Because I am not made of money, and I cannot get my hands on one currently.