What I like about this dual Synology set-up is that you essentially have limitless cloud storage without any cost beyond the initial hardware. If you have two locations with decent enough bandwidth, you can make things work just as well as a subscription service.
Using Synology’s Hyper Backup on the source NAS, I did the initial data transfer of 3+ TB to the destination NAS on my local network. Once the initial backup was complete, I relocated the destination NAS to my mother’s house about 10 miles away. Because I connected the destination NAS to my Tailscale network, I just had to type in the remote NAS’s Tailscale IP address to reconnect the backup job. Subsequent backups are a lot faster as I just have to transfer new and changed files.
Tailscale really makes this entire process a lot easier as I don’t have to open up ports on my Mom’s router and/or configure a VPN server. It just works!
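For anyone replicating this, reconnecting the backup job is mostly a matter of pointing Hyper Backup at the destination NAS’s Tailscale address. A quick sanity check from any machine on the same tailnet looks something like this (the `backup-nas` hostname and the IP shown are hypothetical placeholders):

```shell
# List the devices on the tailnet and confirm the destination NAS appears
tailscale status

# Look up the remote NAS's Tailscale IPv4 address (hostname is hypothetical)
tailscale ip -4 backup-nas

# Verify the NAS is reachable before re-pointing the Hyper Backup job at it
ping -c 3 100.101.102.103   # substitute the address reported above
```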
Disclosure: The Synology NAS devices featured in this video were provided to the channel free of charge by Synology. Synology is an occasional sponsor on the channel, but they did not sponsor this video nor review or approve it before it was uploaded.
In my latest video we veer off into the nerdy weeds with a detailed step-by-step tutorial about how to spin up and manage complex Docker applications using the new Synology Container Manager that can be found in DSM 7.2.
As I mentioned in my previous video about my self-hosted projects, there are hundreds of amazing open source applications out there that offer similar functionality to popular cloud apps. I received so many questions and comments on that video about how I get them running via Docker on a Synology NAS, so that’s where this video comes in.
Because the Docker containers run in an isolated environment, they’re a little more secure than just running applications on the NAS directly. They’re also very easy to back up and move to another server if needed. Just copy the folder over to the new machine, rebuild the containers with a mouse click, and migration is done!
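The backup-and-move workflow described above boils down to just a few commands when done from a shell instead of the Synology UI. This is a rough sketch with hypothetical paths; the project folder here is assumed to contain both the compose file and the mapped data directories:

```shell
# Stop the running containers cleanly (project directory is hypothetical)
cd /volume1/docker/myapp
docker-compose down

# Copy the entire project folder -- compose file plus mapped data -- to the new host
rsync -a /volume1/docker/myapp/ newserver:/volume1/docker/myapp/

# On the new machine, rebuild and start the containers
ssh newserver "cd /volume1/docker/myapp && docker-compose up -d"
```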
In the video I demonstrate installing Wallabag, an open source “read later” application similar to Pocket and Instapaper. The way it works is that Wallabag will download an archive of a provided URL, transform the web page into a readable format with just the content, and make it available for offline reading via a web browser. The Wallabag app for Android and iOS can sync the Wallabag container’s data with a phone or tablet.
Wallabag runs on the NAS in a container and its data is stored locally there as well. Using Tailscale I’m able to connect back to the application from anywhere in the world securely without having to open up any ports on my router.
I chose Wallabag for this demonstration because it’s an example of a project that consists of multiple Docker containers working in concert with each other. In this case there’s the main Wallabag application in one container, a MySQL database server in another, and a third container running a Redis caching server.
In the past it was possible to get a project like this working but it had to be done outside Synology’s Docker app using the command line or another tool. Container Manager now makes it possible to build and run applications like this without having to use anything else.
In the tutorial I detail the steps of finding and editing Wallabag’s Docker Compose file and building the application as a “project” inside of Container Manager. One of the important things in this process is pointing the containers to a directory on the NAS for storing data. Containers are considered expendable with each update or rebuild, so user data has to be mapped to a persistent storage location on the NAS. After troubleshooting a few minor errors I was able to get the Wallabag project built and operating relatively quickly and reliably on the NAS.
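To give a rough idea of what a multi-container “project” like this looks like, here’s a trimmed-down Compose sketch in the spirit of Wallabag’s official file. Treat the image tags, environment variables, password, and the `/volume1/docker/...` paths as assumptions to adapt against Wallabag’s documentation, not a ready-to-run configuration:

```yaml
version: "3"
services:
  wallabag:
    image: wallabag/wallabag          # main application container
    ports:
      - "8080:80"
    environment:
      - SYMFONY__ENV__DATABASE_DRIVER=pdo_mysql
      - SYMFONY__ENV__DATABASE_HOST=db
    volumes:
      # Map user data to persistent storage on the NAS -- the containers
      # themselves are expendable with each rebuild, this folder is not
      - /volume1/docker/wallabag/images:/var/www/wallabag/web/assets/images
    depends_on:
      - db
      - redis
  db:
    image: mariadb                    # MySQL-compatible database server
    environment:
      - MYSQL_ROOT_PASSWORD=changeme  # placeholder, set your own
    volumes:
      - /volume1/docker/wallabag/db:/var/lib/mysql
  redis:
    image: redis:alpine               # caching server
```

Pasting a file like this into a Container Manager “project” is what lets all three containers build and start together with one click.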
While all of this might seem a bit daunting vs. finding an app and hitting the install button, containerized applications are in many ways the new standard for running open source software like this. While there is some up-front complexity, the advantage of having what are essentially portable versions of very robust server applications will save far more time in the future. Should something ever happen to my NAS, I just need to restore the backup files to a new location, click the build button, and I’m back exactly where I left off.
Let me know what you think in the video’s comments! Also be sure to share some of the containers you’ve found to be most useful.
Disclosure: Synology is an occasional sponsor here on the channel and they provided me with the NAS hardware used in the review free of charge. However, they did not sponsor this video nor did they provide any input or approval prior to publishing.
My latest video is a review of Synology’s new storage product called the BeeDrive.
At first glance, it might seem like any other external solid-state drive. However, it’s the software layer that sets it apart. This drive and software combination offers data synchronization, backup capabilities, and even a feature that allows you to transfer files from mobile devices, similar to how AirDrop works on the Mac.
The BeeDrive is priced at around $120 for the one terabyte version and $199 for the two terabyte version. From a hardware perspective it’s straightforward, with a single USB Type-C connector. It operates at USB 3.1 Gen 2 speeds, similar to other drives in this space.
The backup feature is quite simple. It monitors specified folders and updates the backup almost in real-time whenever a file changes. There’s also an option to keep file versions, so if a file changes, it will update the backup and store the previous version. It works with multiple computers, storing the backups in separate folders on the drive.
It also has a synchronization feature that allows for a two-way sync between the BeeDrive and a folder stored locally on the computer. It’s essentially a “sneakernet” Dropbox that can keep files in sync on multiple computers, with the BeeDrive acting as the master device. Syncing happens only when the drive is physically connected to the computer with the software running.
One of the standout features of the BeeDrive is its mobile transfer capability. With the BeeDrive app, available for both Android and iPhone, you can automatically back up your photo library, including videos, whenever the BeeDrive is connected to a host PC on the same network as the phone.
There’s also the BeeDrop feature, which lets you quickly transfer files and photos from your phone to your computer similar to how AirDrop works on the Mac. This feature works remarkably well, even over the Internet. Unfortunately it’s only a one-way trip at the time of this review. Files can’t be transferred back to the phone.
Beyond the software features it’ll also work as a regular external hard drive provided files are stored outside of the folders the BeeDrive software uses for backups and syncs. When I ran a CrystalDiskMark test, the BeeDrive’s performance was adequate but not groundbreaking.
Like most new products, there’s a lot missing here that will hopefully be added in future software updates. The first is the lack of hardware encryption on the drive, a feature found on most name brand external solid state drives these days. And although its backup feature is super simple to use, there currently isn’t a restoration feature to put files back in place. That has to be done manually.
Mac users can use the drive to store files, but the BeeDrive software was not yet available for the Mac at the time of my review. Synology does plan to release a Mac client shortly.
The BeeDrive offers a unique blend of hardware and software features. It’s worth keeping an eye on this product line, as I anticipate more software updates and improvements in the future.
Disclosure: Synology sent the BeeDrive to the channel free of charge, however they did not sponsor this review, provide any additional compensation, or review or approve this review before it was posted. Synology is also an occasional sponsor here on the channel but is not sponsoring this video.
My latest video involves some of the home networking projects I’ve been working on recently with my Synology NAS devices.
One of the projects I’ve been working on involves setting up a private network using Tailscale, a great (and free) personal VPN solution that allows you to connect remote devices together without having to expose ports on your router. I covered the basics of Tailscale in a previous video.
I’ve set up Tailscale on my primary NAS at home and another one on a Synology NAS at my mother’s house. Using Synology’s Hyper Backup software, I’ve been able to back up about 3 terabytes of data from my house to hers. This has provided me with a secure and efficient way to store a large amount of data off-site. Now that the initial 3TB is loaded, subsequent backups will be much smaller as just the changes will be sent over.
Another project I’ve been working on involves Docker, which runs on Synology’s plus series devices. Docker containers make it easy to host sophisticated self-hosted web apps with just a few clicks. I’ve been using Docker to host a few applications, including Pingvin, a self-hosted alternative to WeTransfer. This allows me to upload and share large files without having to rely on third-party services.
To ensure the security of my home network, I’ve been using Cloudflare’s Zero Trust Tunnel. This service allows me to expose certain services to the public internet without exposing my home IP address. It’s a safer alternative to opening up a port and provides an additional layer of security.
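Cloudflare’s tunnels are managed with the `cloudflared` command line tool, and the general flow looks roughly like this. The tunnel name, hostname, and local port here are hypothetical, so check them against Cloudflare’s Zero Trust documentation:

```shell
# Authenticate cloudflared with your Cloudflare account
cloudflared tunnel login

# Create a named tunnel (the name "pingvin" is hypothetical)
cloudflared tunnel create pingvin

# Route a public hostname through the tunnel via Cloudflare DNS
cloudflared tunnel route dns pingvin share.example.com

# Run the tunnel, forwarding traffic to the local container
cloudflared tunnel run --url http://localhost:3000 pingvin
```

The outbound-only connection from `cloudflared` is what keeps the home IP address hidden; nothing inbound ever hits the router.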
I’ve also been experimenting with PeerTube, an open-source application that allows you to create your own self-hosted version of YouTube. I’ve been able to host videos on my own server, which has given me a lot of control over my content. The software also uses a peer-to-peer system to distribute videos, which helps reduce bandwidth usage.
These projects have given me a deeper understanding of the potential of home networking for those lucky enough to have fast fiber optic connections. They’ve allowed me to explore new technologies, improve the security of my network, and gain more control over my data.
I’m excited to continue expanding my “home lab” and sharing my experiences with you. I believe that these projects can provide valuable insights for anyone interested in home networking, and I encourage you to explore these technologies for yourself!
Yes, this headline is a mouthful! But I stumbled across a great solution for Wyze camera users who want to keep their cameras up to date yet still connect them via RTSP to their own security NVRs. Setting this process up is the subject of my latest “how to” video.
With Wyze pulling their official RTSP firmware some super smart community members figured out a way to build a “bridge” that takes video out of the Wyze cameras and makes that video available as an RTSP, RTMP or HLS stream that can be used by any compatible security DVR/NVR. It does this through the use of a Docker container that can run on just about any compatible Linux based device.
Once installed and logged into your Wyze account, any compatible camera on the same network as the computer hosting the container will be available. Your security NVR will connect to the stream on the container which will in turn bridge the video from the camera. Since this process mostly passes a relatively low bandwidth video stream it’s not very resource intensive and even a Raspberry Pi can get the job done.
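For reference, the community bridge container is typically stood up with a short Compose file along these lines. This is a sketch based on the docker-wyze-bridge project’s documented usage; the credentials are placeholders and the ports should be checked against the project’s README:

```yaml
version: "2.4"
services:
  wyze-bridge:
    image: mrlt8/wyze-bridge        # community-built bridge container
    restart: unless-stopped
    ports:
      - "8554:8554"                 # RTSP streams for the NVR
      - "1935:1935"                 # RTMP
      - "8888:8888"                 # HLS
    environment:
      # Wyze account credentials -- the bridge logs in on your behalf
      - WYZE_EMAIL=you@example.com
      - WYZE_PASSWORD=changeme
```

Once it’s running, the NVR is pointed at an RTSP URL on the bridge host rather than at the cameras themselves.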
As of this writing it’s compatible with most Wyze cameras, with the exception of their new “OG” cameras and their Video Doorbell Pro. It’s likely Wyze is closing whatever loophole existed in their older hardware to prevent this circumvention of their subscription services on newer devices. You can learn more about their push to subscriptions in my recent video on the topic.
Docker is something I’ve been learning about over the last year or two and this is a great first project to play with if you’re interested in dipping your toes into containerizing applications. Synology has a great graphical Docker interface that helped me wrap my head around how it all works.
Synology addressed some of the feature requests users had for a smaller, more affordable plus series device, but not everyone will be happy with how they were implemented. First, they added 10 gigabit ethernet support, but you’ll need to purchase an additional $150 Synology-manufactured adapter for that.
This unit also includes dual NVMe SSD slots at the bottom for caching or for use as a separate volume. Volume use, however, requires official Synology-branded NVMe drives that cost considerably more than third-party ones. I tried using a WD-branded drive and was presented with this message:
The new 723+ NAS includes 2 GB of RAM, which is expandable to a whopping 32 GB. However, Synology only recommends using their branded ECC memory and will not support configurations using off-brand RAM.
Performance-wise this is a big step up over previous models when using the NVMe storage and 10 gig network adapter. In my testing we were seeing transfer speeds easily 7-8x what a typical 1 gigabit NAS can achieve off of the NVMe volume. We saw slightly faster speeds when we configured the NVMe drives as a striped RAID 0, with read speeds topping 1 gigabyte per second. From a practical standpoint, I was able to edit a 4K Final Cut Pro project completely over the network.
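Those multipliers line up with the raw link math. A quick back-of-the-envelope check (ignoring protocol overhead, which shaves real-world numbers down a bit):

```python
# Back-of-the-envelope check on the quoted speed multipliers.
GIGABIT_MBPS = 1000 / 8          # 1 gigabit/s is ~125 MB/s theoretical

# 7-8x a saturated 1 GbE link:
low, high = 7 * GIGABIT_MBPS, 8 * GIGABIT_MBPS
print(f"7-8x gigabit: {low:.0f}-{high:.0f} MB/s")    # 875-1000 MB/s

# Ceiling of the optional 10 GbE adapter:
print(f"10 GbE ceiling: {10 * GIGABIT_MBPS:.0f} MB/s")  # 1250 MB/s
```

So a RAID 0 read topping 1 gigabyte per second is right about where a 10 GbE link starts to become the bottleneck.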
The biggest problem here is the processor Synology chose for the 723+. After years of exclusively using Intel processors they switched to an AMD Ryzen R1600. While the processor is adequate for the small and medium sized business workloads this unit targets, it lacks the built-in video encoder found on Intel processors. The result is that it will not work well as a Plex server because it’s not able to do any hardware transcoding of video. It’ll be fine for in-home streaming, but any out-of-home streaming requiring a transcode will grind its processor to a halt.
That issue aside, the 723+ delivers a deep set of features. This class of Synology NAS gets you access to the bulk of their enterprise apps, including the advanced backup solutions we looked at in a recent tutorial series. It also has a nice Docker client, a virtual machine manager for booting up other OSes, and even an office suite that replicates many of the features of Google Workspace. You can see more about all of the features here.
In summation, this is a solidly performing unit, but long-time customers will be disappointed with the processor choice and the limitation of having to use only Synology-branded RAM and NVMe storage. I hope Synology will rethink their decision to limit RAM and NVMe choices, as these restrictions could very easily be lifted in a software update.
The last two weeks on the channel could best be described as the “not for everyone” series. The ioSafe 220+ is another product not for most people but those who need one will appreciate that it exists. You can see my review here.
The ioSafe 220+ has all the guts of an Intel-powered Synology 220+ NAS device inside of a fireproof and waterproof casing. It’s designed to survive a 1,550 degree Fahrenheit fire for 30 minutes and the subsequent water dousing it’ll take to put the fire out. The electronics won’t survive, but the drives inside the fireproof enclosure should.
It works thanks to an endothermic material that is built into the casing. Water molecules are trapped inside of the material and will turn into steam when placed in a high temperature environment. That steam draws heat away from the center portion where the drives are stored. The drive enclosure is hermetically sealed to prevent water intrusion. You can hear more about how it works in this interview I did with the founder of the company back in 2015.
One of the improvements in this version is a much quieter fan. Previous versions had super loud fans that made it difficult to locate the device in an office environment. This one is about as quiet as a regular Synology NAS.
Performance otherwise is on par with a regular Synology NAS.
Why is this not for everyone? Price. A regular diskless Synology 220+ NAS sells for $300. This one starts at $940. But there are often corporate and government requirements for data storage that call for flood and fire protection for mission critical data.