Last year I put together a new camera server using Blue Iris along with components from Silverstone, Intel, Noctua, Cooler Master, MSI, and Kingston. I skimped on hard drives at the time, hoping to find a deal down the road, and over the last year I've spotted a few areas where things could be improved. With that, it is time to revisit the Icyu build to give things a little polish and hopefully finish things off.

Product Name: Project Icyu – Blue Iris Build Part 5

Written by: Wes Compton

Build Sponsors/Supporters: Silverstone, Intel, Noctua, Cooler Master, MSI, Kingston, ServerPartDeals

Amazon Affiliate links: Coral USB Accelerator

Links to the rest of the project: Part 1 - Part 2 - Part 3 - Part 4 – Part 5

 

Before getting into the new components, I wanted to touch on how the system has performed over the last year. It was a huge upgrade from our compact ITX build that had been in full-time use for 6 years. The larger case finally let it be mounted in our rack, which cleaned things up, and combined with Silverstone's FS303 hot-swap bays it makes storage expansion possible in a way our old setup never could. It also allowed the build to have water cooling, which has kept temperatures down along with the noise; the old system ran hot all day long and was loud. The 12600K was a huge improvement over the i7-7700K, and the Z690 platform as a whole leaves us open to upgrades, with 13th and 14th Gen CPUs both supported if I need more processing power later.

In addition to handling Blue Iris, the Icyu server also runs a BitTorrent client, and for reviews it is sometimes used for network testing by running an OpenSpeedTest server. That brings me to one area I wanted to upgrade. The 2.5G Intel I225V NIC built into the MSI MAG Z690 Tomahawk Wifi is more than enough to handle the workload from Blue Iris, but I want a full 10-gigabit connection to our network, so I'm looking to upgrade that.

At the time of the build, I went with a few 6TB hard drives that I had around, but the plan was always to upgrade the storage. I'm going to address that today as well, which will go a long way in expanding how many days of recordings we keep. Last up, while the CPU has performed well, Blue Iris has still been slow in AI processing, which I use to tag notifications when there are animals, people, cars, and trucks and to eliminate the majority of false positives. The speed of those notifications can be very important. With the original build I left room to add a video card, which could help with processing, and I've even had one on hand, but in the last year support for the Coral Accelerator has improved and availability has gotten better. Last year, even if you wanted one, you would pay two or three times MSRP or wait 6 or more months; that isn't an issue now. So I'm going to give that a try and keep a GPU as the backup plan.

Beyond that, I have been improving our network, including a new PoE switch from EnGenius, and cleaning up and redoing most of the wiring on our rack. I have also been slowly upgrading the network cables that go out to our cameras. With the original installation I just used the Cat 5e cable we had on hand, but now, 6 years later, the runs outside are starting to show UV and water damage. In fact, one of our cameras went down just yesterday because of it. So I have been upgrading to TrueCable Cat 6 outdoor-rated cabling, which should hold up better.

image 1

image 2

image 15

 


Network Card Upgrade

The MSI MAG Z690 Tomahawk Wifi that I used in the original build has a 2.5G Intel I225V NIC, which is more than enough bandwidth for Blue Iris. The built-in NIC hasn't given me any issues, though I225V NICs have been known to have problems. But I have been using this build for a little more than just Blue Iris; when testing motherboards and other network equipment here in the office, I need a reliable PC to use as a server. With this build sitting in the rack, I decided an upgrade to a full 10G connection would come in handy, but I wasn't looking to spend a lot of money doing it. I went with an Intel X520-DA1 based network card and picked up the cheapest one available at the time on eBay, which was just $28.83 shipped, taxes included. This one came from China and was Inspur branded.

image 25

image 26

image 27

image 28

image 29

I would need to get it connected to our network, and keeping costs down did mean an SFP+ port rather than a copper 10G Ethernet connection. You could get SFP+ transceivers and run cable between them, but you can avoid compatibility worries and keep costs down even more with a DAC cable, aka a direct attach cable, which has SFP+ connections on both ends; as long as your server is nearby, this is the better option. I went with the RamboCables SFP+ DAC Twinax cable on Amazon because I have used them in the past and know they work well with my hardware. I did, however, get just a 1-meter cable, and looking back I should have gone slightly longer to leave enough slack to pull the server out. As it sits, I have to disconnect it before pulling the server out to work on it, so I may swap it out in the future.

image 30

image 31

image 32

image 33

There is one thing to keep in mind with this upgrade, however. Intel's Windows 11 support for the old X520-DA1 NICs is a little dodgy now. Scrolling through the optional driver list in Windows Update did bring one up, but you can no longer use the all-in-one driver package Intel provides for its newer NICs. This does mean you will need a secondary network connection until you get things working.

image 6

 


Coral Accelerator

When building this system I knew that in the future I might end up adding a GPU to help handle some of the processing, especially the filtering of false positives. At that time Blue Iris was using DeepStack, but it later switched over to CodeProject.AI. I planned for this by going with a full ATX motherboard and pre-running power cables in the build to be able to hook up a video card if needed. Later on, I was excited to see support for the Coral Accelerator added to CodeProject.AI. I had heard great things about its performance and power draw, but for a while they were hard to get unless you wanted to pay scalper prices. Availability finally cleared up and I ordered one. There are a few different connection options, but at the time I went with the USB model for simplicity of installation. I thought I would just be able to toss it in at some point and not have to dig inside of Icyu, but that went out the window with the network card upgrade; looking back, I should have gone with one of the PCIe or M.2 options.

The USB Accelerator came in a compact little box with the Coral branding on top, and on the bottom a sticker has a link to the setup documentation. Inside you get a small device that looks like a portable SSD along with a short USB cable that has a Type-C connector on one end and a standard Type-A connector on the other. You also get a small piece of paper with basic setup instructions.

image 16

image 17

image 18

The USB Accelerator is just 65 mm long and has a machined aluminum housing with grooves in the top for some basic cooling; it all sits inside a plastic tray that protects the bottom and sides, with one Type-C connection on the end. The bottom has a small sticker with the serial number, and the translucent plastic tray gives us a look at the white PCB, which is cut to fit the aluminum housing perfectly.

image 19

image 20

image 21

image 22

image 23

image 24

The USB Coral Accelerator was easy to hook up, but I ran into issues trying to get it to work right away, and when I did get it working it was shockingly slow. As it turns out, by default the USB model runs at half speed to keep thermals down. Even turning that off didn't help; it was still slow and would also shut down, which I assume was thermal. Frustrated, I ordered an M.2 model, but I didn't pay enough attention and ordered the A+E key version. It was less than half the price of the USB model, but because of the A+E key I ended up having to order an adapter as well. So if you are looking to go this direction, learn from my mistakes and buy the M.2 Accelerator B+M key from the start, or the Mini PCIe version if your system supports it. They now also have a dual-edge TPU that packs two TPUs; it is E keyed, so keep that in mind, but it is supported and can double up on performance if you have a lot of cameras. Below I have pictures of my attempt to install the A+E key model, the accelerator itself, and then the adapter I picked up on Amazon for just under $10 (it is now $6.69). The M.2 TPU has a metal cover over it, and given how much heat the USB version created I did wonder if a heatsink of some sort might be needed. But spoiler: I have been running it for a while now and haven't seen any issues that make me think it is overheating.

image 9

image 6

image 7

image 8

image 10

image 11

image 12

During all of the issues getting the TPU up and running, I did drop in a video card as well to use temporarily. I kept things simple with a Zotac Nvidia GTX 1650 SUPER. I would have preferred a card that didn't pull enough power to need a PCIe power connector, but this still works. In the end, I kept it in the system to handle decoding for a few cameras that didn't work well with Intel +VPP.

image 13

image 14

So the big question, and the main reason for digging back into this system: did the Coral Accelerator TPU improve performance? Without question, it was a huge improvement in processing time. As far as processing movement and figuring out if it is a human, animal, or something you don't want to see, each of these has its own quirks, but it has worked well for the most part. I do now sometimes get a false trigger from a human-shaped shadow every afternoon, but otherwise it picks people up reliably. The improved processing speed, which I will touch on in a second, also means it isn't overloaded and missing triggers or delivering them minutes late. I have noticed that nighttime triggers are missed more often; specifically, we have an area where cats eat and my wife likes getting notifications. At night a partially white cat just gets ignored now where it previously wasn't, but during the day or with the light on it isn't a problem. For raw numbers, below is the processing time before and after switching. I was seeing an average of 3024 ms to process each trigger, and that is now down to 330 ms on average. I should note that by default it was set to a small model size, and at that size it was in the 30 ms range, but going to the large model helped improve the reliability of it picking things up.

testing1

testing2
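For a quick sense of scale, the before and after averages quoted above work out like this (a minimal sketch; the 10-trigger backlog example is illustrative, not a measurement from our logs):

```python
# Average AI processing time per trigger, from the before/after numbers above.
before_ms = 3024  # CPU-only CodeProject.AI, large model
after_ms = 330    # Coral Edge TPU, large model

speedup = before_ms / after_ms
print(f"Speedup: {speedup:.1f}x")  # roughly 9.2x faster

# When several cameras trigger at once, the backlog clears much faster too,
# e.g. a hypothetical queue of 10 triggers:
print(f"10 triggers before: {10 * before_ms / 1000:.1f} s")  # ~30.2 s
print(f"10 triggers after:  {10 * after_ms / 1000:.1f} s")   # ~3.3 s
```

That backlog difference is why notifications stopped arriving minutes late.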

 


Hard Drive Upgrades

I touched on it already, but one area of the Icyu build I never finished was its total storage. I had planned on adding a few large-capacity hard drives but went with a few 6TB drives I had on hand at the time. This was enough to get things rolling and was an upgrade over our previous setup, which had no internal storage at all and ran off a USB hard drive. With future upgrades in mind, I installed a three-bay trayless hot-swap cage in the front of the system so I wouldn't have to pull the server out and open it up for upgrades. I didn't plan on diving back inside like I did with these other upgrades, but this is still easier to get to. I started off by moving what footage I could from one of the 6TB drives over to the larger 8TB drive, then installed the first of the two upgraded drives. That gave me room to move things off the second 6TB drive and then replace it as well. It took a LONG time to get everything moved over, but it was the way to do it without data loss.

image 3

image 4

So what drives did I go with? While hard drive prices are always dropping, two big-capacity drives are still expensive. I came across the enterprise hard drives from serverpartdeals.com; while researching them I saw recommendations on Reddit, but most importantly they offer enterprise drives that can hold up to 24-hour-a-day use in our server at significantly lower prices than most other places. Specifically, I was eyeing the Seagate Exos X20 ST18000NM003D 18TB. All of this was happening a while back, and their pricing and selection are always changing, but even now the ST18000NM003D is $189.99 for a manufacturer-recertified drive, whereas on Amazon a renewed Seagate IronWolf 18TB is $198.98 and the IronWolf Pro is $229.99. New 18TB drives are $100+ more, in the $300+ range. ServerPartDeals is where I would have been buying drives for this build either way, but I did reach out to them and they sent over two of the Seagate Exos X20 ST18000NM003D 18TB drives.

image 37

image 38

image 39

image 40

image 41

image 42

Being manufacturer recertified, I was curious how the drives would ship. ServerPartDeals sent them in a box with packing materials, and both drives came in their own brown boxes with thick air-filled padding as well. The drives were also sealed in static bags with the proper pull tab, just like a new drive. In fact, nothing about the drives indicated they were recertified other than the label on the drive itself, which has it printed just below the Seagate logo.

image 34

image 35

image 36

Before getting the drives up and running, I checked them out using CrystalDiskInfo. One of the drives had just two power-on counts and 15 hours of total use; the other had 7 power-ons and significantly more hours at 496. I also ran one of the drives through CrystalDiskMark to check out both IOPS and MB/s performance, because I don't get the chance to test many spinning drives these days with the focus being on SSDs. It is impressive that spinning drives are reaching SATA 3 SSD numbers for sequential read and write performance, though of course access times are still far behind. That matters, because the larger these drives get, the longer it takes to transfer all of the files. Even with our 6TB drives it took all day, and I don't look forward to the day when I have to transfer everything off one of these; 18TB isn't even at the top end of capacity anymore.

image 46

image 48

image 44

image 45
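Those transfer times only get worse as capacities grow, and a rough back-of-napkin calculation shows why (assuming a sustained ~250 MB/s sequential rate, which is in the ballpark for drives like these; real-world transfers full of small clip files will be slower):

```python
def transfer_hours(capacity_tb: float, rate_mb_s: float = 250.0) -> float:
    """Rough time to copy a full drive at a sustained sequential rate."""
    capacity_mb = capacity_tb * 1_000_000  # TB -> MB (decimal, as drives are marketed)
    return capacity_mb / rate_mb_s / 3600  # seconds -> hours

for tb in (6, 18):
    print(f"{tb} TB at 250 MB/s: ~{transfer_hours(tb):.1f} hours")
```

Even the best case for an 18TB drive is most of a day of copying, which lines up with the 6TB migration taking all day once real-world overhead is factored in.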

The capacity upgrade let me move our inside cameras, some of which are only used from time to time, to the original smaller 8TB drive and set them to record directly to it because those recordings aren't accessed as often. From there, all of the outside cameras record to our fast SSD for three days, allowing quick access to recent footage, and then move to the large-capacity drives for long-term storage. The end result is nearly 6 months of storage for our inside cameras and two full months for our outside cameras. My original goal was 30 to 40 days overall, and this surpasses that while leaving a little room for expansion if needed. The combination of all of the different upgrades has smoothed out performance on this server, and I'm no longer worried it might miss something important. That isn't to say we haven't had issues, but almost all have been with the cameras themselves or the wiring to them. With 6-7 years of use now, some of the network cables I originally ran to the outside cameras haven't held up to the UV and weather. I knew this would be an issue and picked up proper outdoor-rated Cat 6, and I have been upgrading them as we go.
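The retention math behind those numbers is just capacity divided by aggregate write rate. Here is a sketch with hypothetical figures (the camera count and per-camera bitrate below are assumptions for illustration, not our exact setup):

```python
def retention_days(capacity_tb: float, cameras: int, mbps_per_cam: float) -> float:
    """Days of continuous recording a given capacity holds."""
    # Mb/s -> bytes/day: divide by 8 for bytes, scale to MB, multiply by seconds/day
    bytes_per_day = cameras * (mbps_per_cam / 8) * 1_000_000 * 86_400
    return capacity_tb * 1_000_000_000_000 / bytes_per_day

# e.g. 36 TB (two 18 TB drives) shared by 8 outdoor cameras at ~6 Mb/s each
print(f"~{retention_days(36, 8, 6):.0f} days")  # ~69 days
```

With numbers in that range the estimate lands right around the two months of outdoor retention described above, and it makes it easy to see how adding cameras or raising bitrates eats into it.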

Author Bio
garfi3ld
Author: garfi3ld | Website: http://lanoc.org
Editor-in-chief
You might call him obsessed or just a hardcore geek. Wes's obsession with gaming hardware and gadgets isn't anything new; he could be found taking things apart even as a child. When not poking around in PCs he can be found playing League of Legends, Awesomenauts, or Civilization 5, or watching a wide variety of TV shows and movies. A car guy at heart, the same things that draw him into tweaking cars apply when building good-looking, fast computers. If you are interested in writing for Wes here at LanOC, you can reach out to him directly using our contact form.
