ATSC 3.0 Update: The CTA Opposes Broadcasters’ Request for Tuner Mandate

I’ve been following the rough rollout of ATSC 3.0—also known as NextGenTV—for a while now, and this week the transition hit another bump in the road. A dispute over tuner mandates has surfaced between two key players in the process: the Consumer Technology Association (CTA), which represents electronics manufacturers, and the National Association of Broadcasters (NAB), which represents TV broadcasters. I dive into this in my latest video.

The disagreement is notable because these two organizations have worked closely to get this new standard off the ground. Even the NextGenTV logo consumers see on compatible equipment is a registered trademark of the CTA, not the NAB.

Recently, the NAB asked the FCC to push the transition forward, proposing a 2028 cutoff for the current standard in major markets. That proposal included several desired mandates. One, which I mentioned previously, would require manufacturers to include ATSC 3.0 tuners in TVs well before that deadline. But there were a few other items tucked into the request. For instance, the NAB wants the FCC to require that remotes with buttons for services like Netflix also have buttons for broadcast TV. They also want broadcast content to be featured prominently in on-screen menus—right up there with paid placements from streaming platforms.

This is where the CTA pushed back. Gary Shapiro, CTA’s CEO, took to LinkedIn with a public response. He accused the NAB of trying to force an unpopular product on consumers and manufacturers. He noted that less than 10% of Americans rely on antennas for TV and argued that these mandates would increase costs for everyone, especially at a time when affordability is a concern.

The CTA also began lobbying FCC commissioners directly. They brought along cost comparisons, pointing out that TVs with ATSC 3.0 tuners are significantly more expensive. They argue that additional costs—like those tied to licensing and DRM requirements—are part of why manufacturers are reluctant to include these tuners in their products.

And that’s been a sticking point all along. The tuners are pricey. They’re expensive to make and expensive to buy, largely because of how difficult it is to meet all the DRM requirements that come with ATSC 3.0. These restrictions make it tough for smaller companies to enter the market, which in turn limits consumer choice.

A good example is something like the HDTV Mate, a sub $60 tuner that doesn’t meet the DRM standards. It’s more affordable than the few certified options, but because it doesn’t comply with the DRM, it’s not really part of the formal ecosystem. Without the DRM roadblock, I believe we’d already see a wider selection of tuners at better price points.

Broadcasters don’t seem likely to budge on DRM. The CTA seems less focused on that issue than on the broader economic impact of the mandates. Still, the lack of tuners—and the obstacles to building them—is at the heart of why this transition has been so slow.

Looking ahead, I don’t expect the FCC to go along with any of the mandates the NAB is pushing for. It’s hard to imagine this FCC chairman telling manufacturers how to design their remotes or menu layouts. But the broader transition to ATSC 3.0 is probably going to keep moving forward. If nothing changes, over-the-air TV might become even harder to access, which could lead to its gradual disappearance. That might suit some interests, especially if the valuable spectrum currently used by broadcasters gets reallocated or repurposed.

It didn’t have to go this way. With more affordable tuners and fewer restrictions, we might have had a more vibrant market by now—even if it was a small one. But instead, we’re left with a limited selection of costly devices and a standard that’s tough for both consumers and developers to embrace.

I’m not giving up on the DRM issue, and if you’re concerned too, there’s a way to weigh in. You can visit my instructions here to file a public comment with the FCC. I’ll be following this docket closely, and I expect more developments as the FCC begins formalizing its approval process for the transition. Public comment periods and even field hearings are likely on the horizon. I’ll keep watching.

See more of this saga in this playlist.

Amazon Haul 9!

I recently uploaded another Amazon haul, covering about two and a half months’ worth of gear—most of it affordable, a mix of items sent via the Amazon Vine Program, a few from manufacturers, and some that I bought myself. You can see the full list of items at this compensated Amazon affiliate link.

Nothing here was sponsored or pre-approved, just a rundown of what caught my attention and ended up being useful or interesting enough to feature.

One of the highlights was a 10-port USB-C charger from Plugable. It doesn’t come with a power supply, but if you’ve got one that puts out 100 watts or so, it’ll charge multiple devices efficiently by prioritizing the ports from left to right. Great for overnight charging, especially in places like classrooms where you need to juice up a bunch of iPads or devices at once.

Along the same lines, I tried out a universal travel adapter from Minix with a built-in GaN power supply. It delivers up to 170 watts through its USB ports and has built-in plugs for various regions. It’s not a voltage converter, but as a compact travel solution, it packs a lot of utility.

Another interesting find was a clip-on Bluetooth speaker from SuperOne. It’s lightweight, wearable, and doubles as a personal voice amplifier if paired with the right app. It’s not going to fill a large room, but in smaller settings, it could come in handy.

For streaming stick users, I checked out a couple of inexpensive but useful accessories. One was an HDMI elbow adapter that lets your device sit flush with the back of a wall-mounted TV. The other was a 100 Mbps Ethernet adapter that works with Fire TV and other compatible devices. It’s not the fastest option out there—Wi-Fi 6 built into recent low cost streaming sticks is quicker—but it can offer a more stable connection.

I picked up a Bell & Howell power hub from the Flip app because it tickled some nostalgic memories of the heavy-duty AV equipment my school used in the '80s. The design hints at the old-school gear they used to make, but the build quality doesn't quite live up to that legacy. That may be because the Bell & Howell brand is now owned by a private equity firm. Still, it offers USB ports on three sides and a retractable power cord, though the latter was a bit clunky in use.

I also tried out some Apple-focused chargers from Belkin. These included a MagSafe-compatible 5,000 mAh battery with a kickstand, a foldable 3-in-1 travel charging pad, and a smaller 2-in-1 version. Each charger handled phones, AirPods, and Apple Watches to varying degrees, depending on the model. They’re simple, well-built, and compact enough to throw in a travel bag.

There was also an electric candle lighter—one of those arc-style models you charge over USB. It worked, but the narrow arc gap makes it easy to gunk up with wax. Practical outdoors, maybe, but definitely not something I’d leave around kids.

Finally, I spent some time with a surprisingly decent digital magnifying glass from a company called HIDEA. It has adjustable zoom, takes photos, and even comes with sample slides. It’s not a professional microscope, but it does a good job for its price, and I found it fun to play around with.

That wraps up this batch. I tend to go through a lot of gear, but only a fraction makes it to the table. The rest doesn’t pass the sniff test. Once I’ve got enough new stuff worth showing, I’ll put together another round. These videos are always fun to make, and it’s good to see that people still enjoy watching them.

Plugable 5G USB-C Ethernet Adapter Review – USBC-E5000

My latest video review is of the Plugable USBC-E5000 5 gigabit Ethernet adapter—something that's still relatively uncommon compared to the more widely available 2.5 gigabit options. The unit supports 5 Gbps speeds when plugged into a 10 Gbps USB 3.2 port, meaning you don't need Thunderbolt or USB 4 to hit those higher transfer rates. You can see it in action here.

You can find these on Amazon at a pretty reasonable price (compensated affiliate link). Be sure to look for coupon options that might be available.

It’s powered by the Realtek RTL8157 chipset, which made setup a smooth process on macOS and Linux. Windows was a bit different. It recognized the device without needing a manual driver install, but initial download speeds didn’t meet expectations. Installing the drivers directly from Plugable’s site resolved that issue. I’d expect Windows to eventually update with better out-of-the-box support.

That chipset choice makes a difference. A few years back, I tried similar 5 gig adapters using less reliable chipsets, and the experience wasn't great. This one worked consistently across all three major operating systems. It also worked with a few of my smartphones, although I found performance better on iOS than on Android.

It’s worth noting that while this is a 5 Gbps adapter, it also scales down to 2.5 Gbps, 1 Gbps, and even 100 Mbps depending on the network switch it’s connected to. However, to get the full 5 Gbps performance, the USB port has to support 10 Gbps throughput. Plug it into a slower port, and you won’t get top speeds.

Once I had it connected to my Mac, I ran a speed test using my 10 Gbps internet connection. The results were in line with what I expected from a 5 gigabit connection—downloads and uploads both performed well, taking into account the usual network overhead. I saw similar performance on my Windows and Linux machines.
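If you want to sanity-check throughput on your own network, here's a minimal sketch (not the exact test from the video) that times a large download from a local server and reports megabits per second. The URL is a placeholder for whatever fast machine you have on your LAN.

```python
import time
import urllib.request

TEST_URL = "http://192.168.1.10:8000/bigfile.bin"  # hypothetical file server on the LAN

start = time.monotonic()
bytes_read = 0
with urllib.request.urlopen(TEST_URL) as resp:
    while True:
        chunk = resp.read(1024 * 1024)  # read in 1 MB chunks
        if not chunk:
            break
        bytes_read += len(chunk)
elapsed = time.monotonic() - start

mbps = (bytes_read * 8) / elapsed / 1_000_000
print(f"Transferred {bytes_read / 1e9:.2f} GB in {elapsed:.1f} s (~{mbps:.0f} Mbps)")
# On a healthy 5 Gbps link, expect roughly 4,000-4,700 Mbps after protocol overhead,
# assuming the server, its disk, and the USB port can all keep up.
```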

There’s not much else to the product. It does what it says. It’s compact, has indicator lights for link status, and so far it’s been reliable. Plugable is also a U.S.-based company with domestic support, which might be a consideration for those who like knowing there’s someone they can reach out to if anything goes wrong. Most of their products, including this one, come with a two year warranty.

If you’re looking to move beyond 2.5 Gbps over USB and want a relatively straightforward upgrade, this might be something to keep on your radar.

Asus ExpertBook P5 P5405CSA Review

I recently spent some time with the ASUS ExpertBook P5 series laptop—specifically, the P5405CSA model—in my latest video review.

The version I tested is configured with an Intel Core Ultra 7 258V processor, 32GB of non-upgradeable RAM, and a 1TB NVMe SSD. There's also an extra slot for storage—2230-sized if you're looking to expand or do something like a dual boot setup with Linux. The price as tested comes in at around $1,200 (compensated affiliate link), though there's a lower-tier version with a Core Ultra 5 and less RAM for roughly $1,000. Prices will likely shift as the year progresses, so it's worth shopping around. You can also find them at Amazon, where the price is always changing (compensated affiliate link).

The P5 has a 14-inch LED display with a 2560×1600 resolution and a 144Hz refresh rate, which was set to 60Hz by default but easy to switch. The screen brightness tops out at 400 nits—decent enough for a business-oriented machine but not incredibly bright. Color accuracy is also solid with 100% sRGB coverage, which should work fine for light creative tasks.

The build feels light at 2.8 pounds, and while the chassis is slim and portable, it comes at the expense of some flex in the keyboard deck. That said, the keyboard itself is well-sized, backlit, and pleasant to type on. The trackpad tracked well and felt solid—no complaints there.

In terms of ports, you get two Thunderbolt 4 ports which also work with USB-C devices, a full-size HDMI port, two 10Gbps USB-A ports, a headphone/mic combo jack, and a Kensington lock slot. The laptop doesn’t include Wi-Fi 7 but does support Wi-Fi 6E, which was more than sufficient during testing. The speakers are downward-firing and fine for casual use—especially calls and voice content—though headphones are still preferable for richer audio.

Biometrics are handled through both the webcam, which supports Windows Hello, and a fingerprint sensor embedded in the power button. The webcam is 1080p and includes some AI-driven enhancements through ASUS’s software suite. It also has a physical privacy shutter.

Battery life was solid. I was able to get close to 10 hours with light productivity tasks and lower screen brightness. It’s possible to squeeze out even more longevity depending on the workload. More intensive tasks like video editing or gaming will drain it faster, but the battery held up well throughout a full workday when used conservatively.

Speaking of AI features, ASUS includes its AI Expert Meet software, which can transcribe and summarize meetings directly on the device. The transcription worked offline using the NPU, and the summarization ran on the Intel processor’s GPU. It wasn’t particularly fast or accurate, especially when multiple speakers were involved, but it’s a useful tool that doesn’t rely on cloud access or subscriptions.

Performance-wise, web browsing was smooth with responsive page loads. YouTube playback at 4K/60fps dropped a few frames early on, but nothing disruptive. Benchmark scores in line with similar laptops confirmed that it holds up for general tasks. Basic video editing is also possible—simple projects like stringing clips together ran without issue, though more demanding workflows would require a more powerful PC or an external GPU via Thunderbolt.

Gaming was possible at lower settings. Cyberpunk 2077 ran between 25 and 35 FPS at 1080p on low settings, and 720p ran a lot better. Still, given the lack of a discrete GPU, it's amazing how far integrated graphics have come. Benchmark scores were comparable to a discrete GTX 1650 Ti from just a few years ago.

Thermal performance held up under load. The system passed a 3DMark stress test with a 98.5% score and stayed impressively quiet. The fan only kicked in during intensive tasks like gaming and otherwise stayed silent.

One area where the laptop didn’t perform well was Linux compatibility. I booted into Ubuntu 24.1 and found that Wi-Fi, Bluetooth, and audio didn’t work. That was a surprise given that a similar ASUS VivoBook had no issues. It’s most likely a driver situation, so expect some troubleshooting if you’re thinking about switching to or dual booting Linux.

Overall, this laptop doesn’t stand out visually, but it offers reliable performance and some features that business users might appreciate—like the three-year warranty and nice display. Depending on what you’re looking for, this one might be worth keeping an eye on as prices shift.

Disclaimer: The laptop was provided on loan from Asus. No compensation was received for this review, and no one reviewed or approved this post or my video before it was uploaded.

Free App Hidden Gem: UTM – Run Windows, Linux and more on the Mac!

In my latest video, I take a look at UTM, a free and open-source virtualization app for macOS that allows users to boot up Windows, Linux distributions, retro operating systems and even other instances of macOS. UTM provides an efficient solution without the licensing constraints and bloat of commercial alternatives like Parallels or VMware Fusion. You can find it for free on their website.

UTM is built on QEMU, an open-source emulation framework, and supports both virtualization and emulation. When running ARM-compatible operating systems in virtualization, such as Windows 11 ARM or an ARM-based Linux distribution, the performance is close to native. Emulating x86 based and other operating systems is slower but still functional.

I tested UTM on my M2 MacBook Air, which I purchased about two and a half years ago. This machine remains powerful enough for my needs on both macOS and in virtual environments. If you're considering one of these machines, there have been some great deals lately, with prices dropping as low as $700 in some cases (compensated affiliate link).

Setting up Windows 11 ARM in UTM was straightforward. The software doesn’t provide the operating system itself, but with tools like Crystal Fetch, downloading the necessary installation files from Microsoft was simple. Once installed, Windows 11 ARM supports running both 32-bit and 64-bit x86/x64 apps through Microsoft’s built-in translation layer. This allows for smooth execution of many legacy applications, such as Microsoft FoxPro, which I demoed in the video. However, gaming performance is a different story—Windows in UTM doesn’t have GPU passthrough support, so graphically demanding applications won’t run well.

On the Linux side, UTM provides pre-configured images for quick setup. With GPU acceleration enabled for Linux, some applications run more efficiently than on Windows. File sharing between macOS and the virtualized system is also simple through the use of a shared folder, though not as seamless as drag-and-drop functionality in commercial alternatives.

UTM also allows users to emulate older operating systems designed for different processors, including Windows 95 and classic PowerPC macOS versions. Running a fully configured Windows 95 installation on a modern Mac was a fun exercise in nostalgia, complete with old files and applications from a backup of my college laptop from 1998.

The customization options in UTM are extensive. Users can tweak system configurations down to CPU architecture, RAM allocation, network adapters, and sound drivers. While this level of control can be overwhelming, many UTM users are sharing pre-built system images that offer a great starting point.

There's also a version that runs on iPhones and iPads, although Apple's App Store restrictions significantly impact the performance of the mobile version. It runs better on jailbroken devices, where the UTM app can run with the same compilers it uses on the Mac. More information on iOS and iPadOS can be found on the UTM website.

For anyone looking for a lightweight, cost-free virtualization tool on a Mac, UTM is worth trying. Whether you need occasional access to Windows, a Linux development environment, or even a retro computing setup, UTM provides a flexible and powerful option without the cost or complexity of commercial alternatives.

See more app hidden gems here!

Attaching an eGPU to a Low Cost Mini PC Without Thunderbolt!

I decided to try something unconventional: attaching an eGPU to the system bus of a budget GMKTec G3 Plus mini PC (compensated affiliate link), curious to see if I could push the limits of what these sub $200 PCs are designed for.

You can see if it works in my latest video.

The G3 Plus lacks the typical external graphics connections like USB4 or Thunderbolt. Instead, I used a workaround—an NVMe to OCuLink adapter (affiliate link)—to see if this approach could effectively attach the eGPU to the system bus.

The setup process was straightforward. The G3 Plus has two storage slots: one for its included NVMe system drive, and a second slot for M.2 SATA disks. I imaged the existing storage drive onto an M.2 SATA drive to free up the NVMe slot, then installed the OCuLink adapter. The external GPU, a GMKTec AD-GP1, has an AMD RX 7600M XT along with OCuLink and USB4/Thunderbolt connection options.

Other eGPUs can work too if you get an OCuLink PCIe slot adapter like the one I demoed in this video a few weeks ago. Many of the OCuLink NVMe kits come with the PCIe adapter. The cool thing about this is that you can interface just about any PCIe card with the PC.

Installation was simple, with the OCuLink adapter inserting just like any NVMe drive would. The G3 Plus's slots are accessible on the top of the PC, making it very easy for all of this to work.

It's always a little nerve-wracking when the "moment of truth" arrives. To my surprise, the system recognized the GPU right at boot and Windows 11 loaded without issue. Installing AMD's Adrenalin drivers confirmed the GPU was fully detected, and from there, it was time to see how well it performed.

Cyberpunk 2077 was the first test, running at 1080p with ray tracing set to low. The system delivered frame rates in the range of 45 to 50 frames per second, but the CPU quickly became the bottleneck. The GPU utilization never exceeded 62%, demonstrating the limitations of pairing higher-end graphics hardware with a budget processor. Disabling ray tracing barely improved performance.

Red Dead Redemption 2 told a similar story. At 1080p on the lowest settings, frame rates fluctuated between 30 and 45 fps, depending on the complexity of the environment. The CPU remained fully maxed out, while the GPU hovered at just 30% utilization. This was a clear example of how throwing a powerful GPU into a low-end system doesn’t always yield massive gains.

Doom Eternal, however, showed a different side of the experiment. Running at 1080p with the lowest settings, the game reached 128 frames per second, dropping to the 90s in more demanding scenes. The GPU was significantly more engaged in this title, reaching 70% utilization. Turning on ray tracing caused a minor performance drop but still delivered a smooth experience, proving that certain games benefit much more from a powerful GPU than others.

Benchmarking with 3DMark’s Time Spy test revealed a significant GPU-driven boost. Without the external GPU, the mini PC scored 450 points. With the eGPU attached, the score jumped to 6,449, a stark difference that reinforced the impact of the external GPU—when the workload allowed it.

Beyond gaming, I tested an AI large language model with Ollama to see how well the setup could handle AI-based tasks. Running an 8-billion-parameter model, the GPU took full control, rapidly generating text while utilizing its compute power and 8GB of video memory to do it.
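For anyone curious what that kind of test looks like in practice, here's a minimal sketch that sends a prompt to Ollama's local HTTP API using only the Python standard library. The model tag is an assumption on my part; the video only mentions an 8-billion-parameter model.

```python
import json
import urllib.request

# The model tag below is an assumption; any ~8B model pulled with `ollama pull` will do.
payload = json.dumps({
    "model": "llama3:8b",
    "prompt": "Explain what an eGPU is in two sentences.",
    "stream": False,          # ask for a single JSON response instead of a stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(result["response"])
# With the RX 7600M XT attached over OCuLink, Ollama offloads the model layers to the
# GPU's 8GB of VRAM, which is why generation runs so much faster than on the CPU alone.
```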

While this is not the most practical configuration, the experiment demonstrated the versatility of mini PCs when expanded through OCuLink. Despite some limitations, it was surprising to see how plug-and-play the process turned out to be.

Check out my review of the G3 Plus here.

Disclosure: I purchased the Mini PC with my own funds and GMKTec provided the eGPU to the channel free of charge for my prior review. No other compensation was received and they did not review or approve my content before it was uploaded. All opinions are my own.

2nd Generation Chromecast DRM Certs Expired – Google issues a fix but many still bricked

UPDATE: Google has now rolled out a fix for Chromecast users who reset their devices. From their latest blog post:

For users who have performed a factory reset, you will need to update your Google Home app to the latest version (3.30.1.6 for Android and 3.30.106 for iOS) to set up your Chromecast (2nd gen) or Chromecast Audio device again. The app roll out has begun and may take a few days to roll out to everyone. We’ll post a confirmation once the roll out to all users is complete.

Users who did not reset their devices were updated a little earlier this week with an over the air firmware update. Below is the background on the situation.

I recently noticed an unexpected wave of comments on my Chromecast video from a few months ago about the second-generation Chromecast devices suddenly failing to stream content. Mine stopped working too, displaying an error message that the device wasn’t trusted.

The underlying issue turned out to be an expired security certificate that was built into the hardware and set to expire after 10 years. When that certificate lapsed, it effectively halted the casting functionality for this older generation of the device.

In my latest video, we take a look at this issue and what it might mean for other useful long-lifespan devices.

I was surprised by how many of these decade-old Chromecast dongles are still in use, although perhaps I shouldn’t be. Even the 1st generation Chromecasts handle 1080p output, support popular apps, and offer a simple interface that many consumers never felt a need to replace.

Google did respond quickly to the issue and posted a brief statement on its support pages urging users not to factory reset their Chromecasts. They later pushed out a fix that updated all of those non-reset Chromecasts, presumably with a new security certificate, including mine. But there is still no solution for those who did a factory reset prior to Google's support guidance.

I’ve also been following similar concerns in the broadcast television industry, where the upcoming ATSC 3.0 standard allows for signal encryption that requires hardware-based certificates. Many of those certificates carry extended expiration dates, but the Chromecast situation serves as a reminder that even a 10-year window can seem short when a device is still perfectly functional. It would be unfortunate if these devices become e-waste simply because a DRM certificate lapses and can’t be renewed.

While the fix has given relief to those who didn’t reset their units, a portion of owners still have to wait for a workable solution. This case stands as a reminder of how dependent many gadgets are on ongoing support for restrictive DRM even when the hardware itself remains perfectly capable.

8Bitdo Ultimate 2C Budget Game Controller Review

In my latest video review, I try out 8BitDo’s latest budget controllers, the Ultimate 2C series, which includes options for both PC and Nintendo Switch. You can find these at a super reasonable price over at Amazon (compensated affiliate link).

As a follow-up to last year's Ultimate C model, these new controllers introduce some upgrades, particularly the inclusion of Hall effect sticks and, on the PC version, Hall effect triggers, a feature previously reserved for higher-end models. The Switch doesn't support analog triggers, so the Switch variant of the 2C has standard digital triggers. Both models have reliability improvements to prevent wear and tear from heavy gaming sessions.

One aspect that immediately stood out to me is how solid these controllers feel despite their budget-friendly price. They are well-balanced, with sturdy plastics that don’t feel cheaply made.

Latency performance was impressive. I ran my usual 240 frames per second slow motion video test, and the PC version connected via USB performed exceptionally well, matching the fastest controllers I've tested that cost far more. Wireless connectivity on the PC version is flexible, offering both dongle and Bluetooth options, although the dongle provided the lowest latency. Latency is higher on the Switch version in both wired and wireless configurations due to the Switch's USB and Bluetooth controller interfaces.
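For context, here's the arithmetic behind that kind of slow-motion latency test, sketched with made-up frame counts (the raw numbers aren't published in the post): each captured frame at 240 fps represents roughly 4.2 milliseconds of real time.

```python
# Count the frames between the button press and the on-screen response in the
# slow-motion clip, then convert frames to milliseconds.
# The frame counts below are placeholders, not measurements from the video.
FRAME_RATE = 240                      # slow-motion capture rate (frames per second)
MS_PER_FRAME = 1000 / FRAME_RATE      # ~4.17 ms of real time per captured frame

samples = {
    "PC, USB wired": 12,
    "PC, 2.4 GHz dongle": 14,
    "PC, Bluetooth": 20,
}

for config, frames in samples.items():
    print(f"{config}: {frames} frames ~= {frames * MS_PER_FRAME:.1f} ms")
```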

But retro game fans will be disappointed with the directional pad. 8BitDo has refined their d-pad designs considerably over the years, but the Ultimate 2C feels like a regression. While its smooth rolling might appeal to fighting game enthusiasts, I found it problematic in titles like Zelda, as it introduced errant diagonals. It was hard to keep Link walking in a straight line.

There are some limited customization options on the 2C. Two additional buttons on the controller’s back allow mapping individual or multiple simultaneous button presses, though without the ability to save profiles or deeper software adjustments seen in other 8BitDo models. Most of the buttons can also have turbo functionality.

In modern gaming scenarios, the controllers performed well, with smooth analog controls and decent rumble feedback on both the PC and Switch. The Switch version also supports motion controls!

Overall, the Ultimate 2C controllers deliver considerable value for gamers who might need to buy a bunch of controllers for couch co-op, or for kids who might be a little too rough on a more expensive controller. While retro gamers will find the d-pad limitations frustrating, these controllers offer a reliable and cost-effective choice for modern games.

See more 8bitdo reviews here.

Disclosure: 8bitdo distributor AKNES sent these to the channel free of charge. They did not review or approve the video or this review before publishing, no other compensation was received and all opinions are my own.

Free App Hidden Gem: LibreOffice – an open-source alternative to Microsoft Office

In the latest edition of my “Free App Hidden Gems” series, we look at LibreOffice, an open-source office suite that runs on Windows, Mac, Linux, and even Chromebooks. Check it out in my latest video here.

LibreOffice might be familiar to tech enthusiasts since it comes preinstalled in many Linux distributions, but it's likely not as widely known to the general public. Unlike subscription-based office suites, LibreOffice allows full ownership and control of your files without requiring an internet connection.

Installation is straightforward. Users can head to libreoffice.org, download the appropriate version, and get started. In addition to supporting most operating systems, LibreOffice also has native support for Apple Silicon and ARM-based Windows devices. The interface has a classic look reminiscent of Microsoft Office before the introduction of the ribbon menu (although a ribbon-style interface is available as an option). It feels intuitive, with essential features easily accessible without extra layers of complexity.

The suite includes a word processor (Writer), a spreadsheet application (Calc), and a presentation tool (Impress), all of which offer compatibility with Microsoft file formats. Documents, spreadsheets, and slides created in Word, Excel, and PowerPoint open in LibreOffice with minimal formatting issues. However, some complex documents may require adjustments. LibreOffice also includes Base, a database application that supports ODBC but does not fully replace Microsoft Access. Other tools like Draw, for vector graphics, and Math, for creating complex formulas, round out the suite.

LibreOffice handles older files exceptionally well. Files created in early versions of Microsoft Office that are no longer supported by modern software can often be opened without issue. This makes it a valuable tool for those who need access to archives of older documents.

One key limitation of LibreOffice is its lack of real-time collaboration. Unlike Google Docs or Microsoft 365, it does not allow multiple users to edit a document simultaneously. There is a basic collaboration feature in Calc, but changes only appear after saving, rather than in real time. Additionally, mobile integration is not as seamless. While apps like Collabora Office enable mobile editing, the experience is limited compared to cloud-based office suites.

Chromebook users can install LibreOffice through the Linux development environment. The process involves enabling Linux in Chrome OS settings and running a few simple command-line instructions to set up the suite. Once set up, LibreOffice runs locally, allowing offline document creation and editing without reliance on Google Drive or other cloud services.

LibreOffice provides a functional, no-cost alternative to mainstream office software. It offers full control over files without requiring cloud storage or monthly fees. While it lacks some modern collaboration features, it compensates with reliability, compatibility, and an interface that feels familiar to long-time office software users. For those who prefer working offline or want to avoid subscriptions, LibreOffice is definitely worth a try.

Beelink SER9 Pro with Ryzen AI 9 365 Review

We might be in the "golden era" of mini PCs, with many high-performance options hitting the market, like the amazing Mac Mini and a number of PC alternatives that offer comparable performance.

In my latest review, we take a look at a new high-end unit from Beelink called the SER9 Pro.

Equipped with an AMD Ryzen AI 9 365 processor, this compact machine aims to deliver solid performance for everyday tasks, media consumption, and even gaming. It carries a high price tag, but it offers a blend of power and efficiency, with some trade-offs to consider. You can find the current price on Amazon here (affiliate link).

The Ryzen AI 9 365 inside the Beelink SER9 is a 10-core CPU with a 12-core GPU. While it carries the AI branding, it's important to keep expectations in check. This machine won't be running large-scale AI models locally, but it does provide some AI-assisted enhancements in video and photo editing applications. The system also features 32GB of LPDDR5X-8000 RAM, though it's soldered onto the motherboard per AMD requirements, making future upgrades impossible. On the storage side, it includes 1TB of NVMe storage with an additional NVMe slot for expansion.

The front panel includes a USB-C 3.2 port, a USB 3.0 port, and a headphone/microphone jack. The rear panel offers additional ports, including two USB-A 2.0 ports, one USB-A 3.2 port, dual video outputs (DisplayPort and HDMI), and a USB 4.0 port that supports 40 Gbps Thunderbolt devices like eGPUs. There's also a 2.5Gb Ethernet port for high-speed wired networking. Its Wi-Fi 6 performance falls slightly below expectations compared to other devices tested on the same access point.

In real-world use, the Beelink SER9 handles everyday tasks smoothly. Web browsing at 4K resolution feels snappy, and even demanding applications like DaVinci Resolve can run well for basic video editing. Heavier tasks, such as applying complex effects, do slow down playback slightly, but for general content creation, this system performs well.

Gaming on the Beelink SER9 is surprisingly decent for a Mini PC. Running Cyberpunk 2077 at 1080p on the lowest settings yields nearly 60 frames per second, with smooth performance in other demanding games like Red Dead Redemption 2. Benchmarks put its performance in line with older GPUs like the GTX 1060, but with significantly less power consumption. Under full load, the system draws around 95 watts, and fan noise remains minimal, making it a quiet option even during intensive tasks. Idle consumption is around 10-11 watts.

For those considering alternative operating systems, the SER9 runs Linux without issues. Everything from audio to Bluetooth and Wi-Fi is detected properly, making dual-booting a viable option.

While the Beelink SER9 is an impressive performer in the Mini PC category, potential buyers should weigh their options. A comparably priced laptop with similar specs could offer the same performance with added portability. However, for those who need a small, quiet, and efficient desktop alternative, the SER9 delivers on many fronts. Its sleek design, solid performance, and energy efficiency make it a strong contender in the Mini PC market, despite a few limitations.

The 2013 “Trashcan” Mac Pro is Cheap and Surprisingly Relevant in 2025

I never really considered myself a collector of retro computers, but somehow, they seem to be accumulating around me. My childhood Apple IIgs and a Mac SE/30—nostalgic relics of past computing eras—sit behind me in every video. Offstage I have a bunch of other retro Macs and PCs that I’ll get around to showing you some day.

The other day, I came across a late 2013 Mac Pro for just $169 at OWC (compensated affiliate link) and couldn't pass it up. Back when these machines first hit the market, they were priced at over $3,000, and I never found a good enough reason to justify owning one. But now, with the cost so low, I finally had my chance. The question was: what could I actually do with it in 2025? Surprisingly, quite a bit. Take a look in my latest video!

The Mac Pro I picked up is the lowest-end model from its generation, featuring an Intel Xeon E5 processor, dual AMD FirePro D300 GPUs with 2GB of VRAM each, and 16GB of DDR3 memory. It did not come with the original packaging, just a simple brown box with bubble wrap—but cosmetically, the Mac barely looked used.

This design, often referred to as the "trash can" Mac Pro, still looks pretty cool IMHO. It's compact, upgradable, and eerily quiet, thanks to its innovative cooling system that pulls air through the bottom and exhausts it through the top.

One of the first things I did was install macOS Sequoia 15.3.1 (the most current version of macOS at the time of this writing) using the OpenCore Legacy Patcher. Apple officially stopped supporting this machine at macOS Monterey (version 12), but OpenCore extends its lifespan by enabling newer macOS versions to run. The installation was surprisingly straightforward, though I disabled automatic updates to avoid potential compatibility issues. Even a small point release can break OpenCore’s boot loader, so it’s best to wait until OpenCore updates to run the macOS updater.

1440p feels like the sweet spot for this machine, as performance at 4K 60Hz is a little sluggish but far better than expected for browsing, office tasks etc. Playing 1440p/60 videos on YouTube is seamless, but pushing it to 4K 60fps results in dropped frames and choppiness.

For productivity, Apple’s Keynote runs well at 4k60, handling animations and transitions without issue. Even Final Cut Pro, thanks to OpenCore, is functional—though rendering and previewing at 4K 60fps is sluggish compared to modern Apple Silicon machines. Editing at 1080p or 1440p is more practical, but it’s clear that this machine isn’t optimized for high-end video production anymore.

Running the Speedometer 3.0 benchmark, the Mac Pro scored 10.7—comparable to some budget mini PCs we look at today. However, its power consumption is significantly higher, sitting around 95 watts at idle and spiking much higher under load.

Windows 10 installation via Boot Camp is surprisingly still supported under Sequoia. I opted to boot from an external Thunderbolt SSD rather than partitioning the internal storage, following instructions I posted way back in 2016.

Windows feels snappier than macOS. Basic web browsing and productivity tasks perform well, but modern gaming struggles. No Man’s Sky, running at 720p on the lowest settings, hovered between 25-30 fps—playable but not ideal. Updating the AMD drivers helped, but support for these GPUs ended in 2021, limiting future compatibility.

I also explored Linux, specifically Linux Mint LMDE 6, which was recommended for this hardware. It detected both GPUs, the CPU, and the network interfaces without issue. Performance was decent, though not as fluid as Windows. ChromeOS Flex, however, was a no-go due to compatibility issues with the graphics hardware.

Macs running Apple Silicon M-series chips, like the M4 Mac Mini we looked at recently, outperform this Mac Pro in every metric while consuming a fraction of the power. But twelve years later, this Mac is still able to run the most recent version of macOS reliably, offers great Windows support, and remains compatible with Linux. If you're looking for a fun project or a retro computing experience, a late 2013 Mac Pro might be worth picking up if you find one cheap like I did.

YouTube is Testing a “Lite” Version of their Premium Subscription Service

I’ve been a YouTube Premium subscriber since the days when it was called YouTube Red. YouTube Premium provides an ad free YouTube experience, access to the complete YouTube Music catalog, and a number of other features.

Over the years, the price has crept up from about $10 a month to $13.99, likely to keep up with music industry demands for higher streaming royalties. Given that not everybody wants a music subscription, YouTube is looking at a new tier called “Premium Lite” that will offer an ad-free experience only on core YouTube content at a lower monthly cost.

In my latest video, we explore what this new tier will offer along with some insight on how YouTube Premium works for both viewers and creators.

The biggest draw of YouTube Premium has always been the ad-free experience. Without ads, videos start instantly, making the whole experience feel faster. YouTube Premium also includes some smaller conveniences, like the ability to queue up videos quickly and even jump ahead past baked-in sponsor segments. The service costs $13.99 for individuals, $22.99 for families (up to six accounts), and students can get it for $8 a month for up to four years.

The new Premium Lite tier is expected to cost around the same as the student rate but will come with trade-offs. YouTube Music won’t be included, and ads might still appear on music videos while being removed from other content. Features like background playback and offline downloads may also be excluded.

YouTube has been ramping up efforts to counter ad blockers, often preventing users from watching videos until they either disable their blockers or subscribe to Premium. The introduction of a lower-cost plan suggests YouTube is looking for a price point that will convince more people to pay rather than deal with the inconvenience of keeping up with the cat and mouse game of blocking the ad blockers. For some, $13.99 is too much, but $7.99 might be reasonable enough for some to make the switch.

For creators, Premium’s revenue share works differently than ad revenue. Instead of paying creators for a portion of an ad view, YouTube pools all the minutes watched by Premium subscribers and distributes 55% of the subscription revenue based on watch time. This means that if a subscriber watches nothing but one channel, the creator of that channel still only gets a fraction of the total subscription fee, depending on their share of overall Premium watch time.
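To make that pooling model concrete, here's a toy calculation with invented numbers showing how 55% of the subscription pool gets split by watch-time share; none of these figures come from YouTube.

```python
# Toy illustration of the pooled Premium payout model: 55% of subscription revenue
# is divided among channels in proportion to the Premium watch time they received.
# Every number here is invented for illustration.
SUBSCRIPTION_REVENUE = 1000.00   # hypothetical Premium revenue for the period
CREATOR_SHARE = 0.55             # YouTube's stated 55% creator share

premium_watch_hours = {          # hypothetical Premium watch time by channel
    "Channel A": 300,
    "Channel B": 150,
    "Channel C": 50,
}

pool = SUBSCRIPTION_REVENUE * CREATOR_SHARE
total_hours = sum(premium_watch_hours.values())

for channel, hours in premium_watch_hours.items():
    payout = pool * hours / total_hours
    print(f"{channel}: {hours} h -> ${payout:.2f}")
# Even a subscriber who watches only Channel A doesn't send their whole fee there;
# Channel A just receives a share of the pool proportional to its slice of watch time.
```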

In recent years, Premium revenue per watch hour has declined. In 2016, I earned five cents per hour from Premium viewers, compared to three cents per hour from ad-watching viewers. By 2024, Premium watch time earnings had dropped to around two cents per hour, while ad-supported watch time was earning around eight cents per hour. Part of this could be due to the ever increasing amount of YouTube content being added every day, leading to more competition for watch time.

But I'm still a bit skeptical. Back in 2016, YouTube was experimenting with exclusive content on the Premium subscription tier, which presumably was funded out of those subscription dollars. One of the few successes from that era was Cobra Kai, which later moved to Netflix. Since then, YouTube has largely abandoned the idea of exclusive content, which should have freed up more subscription dollars for Premium revenue share. Unfortunately, YouTube is not all that transparent about how they distribute the creator share of subscription revenue.

YouTube is transparent, however, about how much advertisers pay per thousand views on a creator’s channel. And the numbers are significant. For example, the average ad rate YouTube collects on my boring tech channel is $20 per thousand views before YouTube takes their cut. Just imagine what some of the more hip and popular channels bring in.

Given those lucrative rates, it's no wonder that YouTube seems hesitant to aggressively market Premium. In just the last quarter of 2024, YouTube pulled in $10.4 billion from ad sales, a figure that would be difficult to replace with subscription revenue.

With the possible rollout of Premium Lite, YouTube may be trying to strike a balance—offering an affordable alternative to attract ad blocking viewers while still keeping its ad-driven model intact. For those who prefer a seamless viewing experience without playing the ad-blocking cat-and-mouse game, Premium in some form will continue to be an appealing option.

GMKTec G9 Compact NVMe NAS Review

For nearly a decade, I relied on a WD MyCloud PR2100 NAS device as the backbone of my home media setup. It served as my Plex server, managed my HDHomeRun DVR, and generally functioned as the central hub for all my media needs.

While it continued to perform reliably, the hardware was starting to show its age. It was limited to gigabit Ethernet despite my upgraded multi-gig network, and its processor struggled with newer video formats, particularly for hardware-based transcoding in Plex. I decided it was time to explore an alternative that offered more flexibility, particularly for installing Docker containers and other self-hosted applications.

Enter my latest experiment: the GMKtec G9 and a Wavlink USB drive enclosure, which is the subject of my latest review.

This compact Intel N150-powered mini PC is marketed as a NAS device, thanks to its four NVMe slots that allow for a solid-state storage array. It also features a USB-C 3.2 port capable of 10 Gbps speeds, which enabled me to integrate a Wavlink four-bay USB drive enclosure. With a combination of SSDs and spinning drives, I set out to see if this unconventional setup could handle my media server needs.

For the operating system, I opted for Unraid. Having explored Unraid in the past, I was already familiar with its capabilities, particularly its flexibility with storage and Docker applications. While Unraid doesn’t yet support the N150 processor’s GPU for hardware transcoding, that feature is expected in version 7.1 next month. For now, that means this setup isn’t ideal for Plex transcoding, but it works fine for direct streaming and other media-related tasks which is what I typically do at home.

One of the main considerations with this setup was cost. The GMKtec G9, priced around $239 with a 512GB SSD pre-installed (compensated affiliate link), offers a relatively affordable entry point for a NAS-like system. The Wavlink drive array, at about $115 without disks, provides a budget-friendly option for additional storage, though it lacks some of the conveniences of higher-end NAS enclosures, such as hot-swappable bays. Instead, drives are secured in trays that require screws, making swaps more labor-intensive. One other important note is that the Wavlink device doesn’t support software RAID on Windows – it’s basically a JBOD array which makes it a good fit for Unraid.

From a hardware perspective, the GMKtec G9 is a compact but capable device. It has 12GB of soldered RAM, which isn’t upgradeable—a potential limitation for users running multiple self-hosted applications. However, in my use case, RAM hasn’t been an issue; even with Plex and a few Docker containers running, memory usage remains low. The back panel offers two 2.5-gigabit Ethernet ports, multiple USB ports, and HDMI outputs, allowing it to function as a compact desktop PC if needed. In fact it comes with a fully licensed version of Windows 11 Pro preinstalled!

One concern I encountered early on was heat management. The NVMe slots lack active cooling, and while I added heat sinks to mitigate the issue, temperatures are still running higher than I would like. Selecting lower-powered NVMe drives may help, but it’s something to keep in mind when configuring this setup especially if you plan to work the array heavily.

With Unraid up and running, I configured my storage into an array that includes four 4TB hard drives, one of which serves as the parity drive for data protection. I also designated an NVMe drive as a cache to improve performance, particularly for media applications. One of the key lessons from this project was the importance of caching in Unraid. Initially, I configured my media share to write directly to the spinning drives, but this significantly slowed write speeds due to the parity configuration. Enabling the cache drive drastically improved performance, allowing for smooth file transfers without the mid-transfer slowdowns I initially encountered.

For applications, I set up Plex, the HDHomeRun DVR, and Immich, an open-source photo organization tool. Plex has been responsive, particularly when browsing the library, thanks to the NVMe storage. However, without Unraid’s GPU support for hardware transcoding on the N150, it’s not yet an ideal solution for remote streaming of high-bitrate content. Once Unraid 7.1 is released, I plan to revisit the transcoding capabilities.

As a whole, this setup has been surprisingly functional. While it lacks the polish of a dedicated NAS, the combination of a mini PC with external storage provides a flexible and cost-effective alternative. It’s not the most elegant solution—there are cables everywhere—but it works. Unraid’s flexibility means that if I decide to transition to a different hardware setup in the future, I can easily migrate my storage and applications without major disruptions.

I’ll be keeping this system running for a while until I come across a better solution. One of the advantages of Unraid is the ability to pick up the drives and plop them into another PC without any need for reconfiguration. More to come on this project!

Disclosure: the NAS Box came in free of charge from GMKTec and the Wavlink SATA Array came in free of charge through the Amazon Vine program. No additional compensation was received nor did anyone review or approve this review before it was uploaded. See more on my disclosures here.

I filed a response to the NAB’s ATSC 3.0 transition request

As I reported the other day, the nation's broadcasters are hoping the FCC will finally set a date for the transition to the new ATSC 3.0 standard. This, of course, comes with restrictive DRM that makes it difficult for consumers to tune into over-the-air television the way they do today.

Here’s what I filed in response:

Dear FCC Commissioners,

It is clear from both the Future of Television report and the recent request for rulemaking from the NAB that the availability of ATSC 3.0 tuners is the major barrier to this transition. Broadcasters seem to believe that setting a firm transition date while simultaneously pulling most of their programming off ATSC 1.0 will magically create a market for tuning devices.

The real reason for this tuner availability problem is that broadcasters have implemented a broken DRM encryption standard that barely works—even for early adopters like me. Before this encryption experiment, it was possible to tune and decode ATSC 3.0 signals on a variety of hardware and operating systems.

With encryption, however, broadcasters now limit tuning boxes to pre-approved tuners running Google’s operating system and encryption technologies. The NAB’s claims that Google is destroying their industry ring hollow when they have essentially created a monopoly for Google in tuning over-the-air signals.

Currently, I have an HDHomeRun “gateway” tuner connected to a single antenna, which delivers ATSC 1.0 and unencrypted ATSC 3.0 content to all of my televisions, computers, and other devices on my home network—regardless of manufacturer. DRM, however, will require consumers to make individual antenna connections to each television and purchase a Google-powered standalone tuner. How is this progress?

Broadcasters will argue that their streaming competitors also use DRM. This is true. However, those streaming services ensure that their apps are available across multiple platforms. Even Apple—well known for its closed ecosystem—makes its Apple TV+ app available on nearly all streaming devices.

A look at Amazon sales data for “ATSC 3.0 tuners” shows that consumers are choosing gateway products at a rate of 20 to 1 over standalone tuners. Why? Because purchasing a single device that integrates with their existing TVs and set top boxes just makes sense. Yet broadcasters want to restrict this option, forcing consumers to either buy more hardware or push them back into a subscription service where they can collect retransmission fees.

While broadcasters have assured this commission that their self-imposed “broadcast encoding rules” allow for in-home recording and gateway use, what they didn’t disclose is that these encoding rules—created entirely by them—only apply to ATSC 3.0 broadcasts that are simulcast on ATSC 1.0.

The broadcast industry’s reliance on retransmission fees will ultimately bankrupt them, as this business model defies basic economic principles. As demand for their product declines, they continue to raise prices. Comcast was charging me $36 per month in pass-through retransmission fees right before I cut the cord!

You will hear from many major corporations in the coming weeks, but I believe it is equally important to listen to the thousands of consumers who have filed on this docket. The truth is that DRM is harmful. The tuning solutions that broadcasters have approved are subpar, expensive, and have stifled innovation—preventing more tuners from reaching consumers by now. ATSC 3.0 offers significant improvements in signal quality that we should all be able to benefit from. But allowing broadcasters to encrypt their signals—on our publicly owned airwaves—in an effort to keep consumers locked into predatory retransmission fees is not the right path for this transition.

Broadcasters already have the full weight and power of the U.S. government to combat and prevent signal piracy. DRM does nothing to prevent piracy but significantly restricts law-abiding consumers from accessing the airwaves that we have granted broadcasters to use for free.

If they want a set transmission date, give it to them—with the requirement that these signals be delivered to the public without encryption or restriction, just as they are now.

For more, see my prior reports here. You can add your voice to this effort by filing with the FCC yourself! Instructions are here.

Broadcasters Ask the FCC for a 2028 ATSC 3.0 / NextGenTV Transition Date

The nation's broadcasters are making a push for the Federal Communications Commission (FCC) to lock in a firm February 2028 date for the transition to the NextGen TV standard, ATSC 3.0.

I take a look at their filing in my latest video.

The broadcaster proposal includes setting the February 2028 date for the top 55 television markets to fully switch over, with smaller markets following by February 2030. Along with that date request, they're asking the FCC to make a number of policy changes to accelerate the transition.

One ask is for the FCC to lift the simulcasting rules that exist under the current ATSC 3.0 rule. Right now, stations are required to offer “substantially similar” broadcasts in both the current ATSC 1.0 and newer ATSC 3.0 formats. Broadcasters want to move their higher value programming to ATSC 3.0 to push more viewers to upgrade their televisions or tuners.

Last month's long-awaited "Future of Television" report indicated significant adoption issues centered around ATSC 3.0 tuner availability. At the moment, only higher-end TV sets have the new tuners built in, and standalone tuners are expensive and lousy. Broadcasters are asking the FCC to mandate the inclusion of ATSC 3.0 tuners on all new televisions as soon as possible to get more of them out to consumers.

One hurdle to this request is an ongoing legal dispute over patents related to the ATSC 3.0 tuning technology. A company has already won a lawsuit against LG, requiring the manufacturer to pay excessive licensing fees on every television sold with an ATSC 3.0 tuner. The case is currently before an appeals court, but it will no doubt make a mandate difficult to put in place right now.

Another contentious issue is digital rights management (DRM) encryption that broadcasters are building into the new standard. Broadcasters acknowledge the concerns raised by consumers, but tell the FCC that their existing “encoding rules” allow unlimited recording and storage of TV broadcasts. They fail to mention that these rules only apply to simulcasts of ATSC 1.0 content, not dedicated ATSC 3.0 broadcasts. If simulcasting is phased out, broadcasters would have more control over how content is recorded and accessed. And on top of that there are significant compatibility issues that limit how consumers can access the broadcasts and record them.

Currently, the only tuners capable of decrypting these broadcasts rely on Google’s Android TV operating system and Google’s DRM technology. This means broadcasters, who argue they need regulatory relief to compete with Big Tech, are indirectly reliant on Google’s ecosystem to distribute their content. Additionally, consumers have expressed a strong preference for networked tuner solutions—such as gateway devices that connect to a home network—yet broadcasters have struggled to deliver on their promise to support them.

Cable providers are likely to push back against this transition timeline due to the costs involved in upgrading their infrastructure to support ATSC 3.0's DRM along with its new video and audio codecs. Broadcasters argue that setting firm deadlines will give cable companies enough time to prepare and budget, but they make no offer to help cover cable providers' transition expenses.

Alongside this requested transition, broadcasters are also asking for policy changes that could impact local station ownership rules and streaming services like YouTube TV.

They asked the FCC to lift restrictions on station ownership, claiming they need the ability to scale up their businesses in order to compete for advertising revenue. Unlike digital platforms that can expand without regulatory barriers, broadcasters face limitations on how many TV and radio stations they can own, both nationally and within local markets.

Another significant request involves treating streaming services that carry local stations — such as YouTube TV and Hulu — the same as cable providers when it comes to retransmission negotiations. Currently, national networks negotiate these deals for streaming platforms on behalf of their locally owned affiliates, whereas cable companies must negotiate with each individual station. If the rule changes, it could drive up the cost of streaming services as local broadcasters gain leverage to negotiate their own carriage fees.

The broadcast industry’s current business model defies basic economic principles: they continually raise prices even as demand for their product declines, while simultaneously making it more difficult for cord-cutters to tune in over the air due to the industry’s insistence on broadcast DRM. This FCC chair has already indicated that there are better uses for TV spectrum, so I predict he will approve broadcasters’ request just to hasten their demise.

Plex Adds HEVC Transcoding (sponsored post)

I spent some time experimenting with a new feature in Plex’s hardware transcoder that allows for HEVC transcoding of media. This means high-quality 1080p streams can be sent remotely at the same bit rate (or less) as a 720p H.264 stream. You can see it in action in my latest monthly sponsored Plex video.

The goal was to see how well this feature performs in terms of efficiency and quality and how easy it is to set up on a Plex server. My test system was a low-cost GMKTec G3 Plus mini PC running Linux, equipped with an Intel N150 processor.

Setting up the feature was straightforward. In the Plex web interface, under the server settings, I enabled the experimental HEVC video encoding option. It was also necessary to ensure that hardware acceleration was turned on. Additionally, Plex provides an option for HEVC optimization, which pre-encodes videos for better playback on low-powered servers.

To test performance, I loaded a 4K HDR Blu-ray movie onto the Plex server and played it back on my laptop. Initially, the video was streamed at full 4K resolution, but I then dropped the quality setting to 720p at 2 Mbps to force a transcode. The server responded quickly, and the video quality remained impressive. Due to copyright restrictions, I couldn’t share a direct visual comparison, but the results were noticeably better than standard H.264 encoding.
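Plex handles all of this through its own bundled transcoder, so there’s nothing to script yourself. But for those curious about what’s happening under the hood, here is a rough sketch of a comparable hardware-accelerated HEVC transcode using a stock ffmpeg build with Intel’s VA-API driver on Linux. The file names, render node path, and 2 Mbps target are illustrative assumptions on my part, not Plex’s actual internals.

# Illustrative only: approximates the kind of GPU decode -> downscale -> HEVC encode
# pipeline a hardware transcode performs. Requires ffmpeg built with VA-API support
# and Intel's media driver; paths and bitrates are example values.
import subprocess

SOURCE = "movie-4k-hdr.mkv"      # hypothetical input file
OUTPUT = "movie-720p-hevc.mkv"   # hypothetical output file

cmd = [
    "ffmpeg",
    "-vaapi_device", "/dev/dri/renderD128",  # Intel iGPU render node on Linux
    "-hwaccel", "vaapi",                     # decode in hardware
    "-hwaccel_output_format", "vaapi",       # keep decoded frames on the GPU
    "-i", SOURCE,
    "-vf", "scale_vaapi=w=1280:h=720",       # downscale to 720p on the GPU
    "-c:v", "hevc_vaapi",                    # hardware HEVC encoder
    "-b:v", "2M",                            # match the 2 Mbps target used in the test
    "-c:a", "aac", "-b:a", "128k",           # re-encode audio for streaming
    OUTPUT,
]
subprocess.run(cmd, check=True)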

Checking the Plex dashboard, I confirmed that both decoding and encoding were being handled in hardware, with the output using HEVC. The CPU usage remained relatively low, hovering between 25% and 36%, which was similar to what I had observed with H.264 encoding. This suggests that enabling HEVC does not significantly increase the processing load, at least on a modern Intel processor like the one in my test setup. With this level of efficiency, I estimate that the system could handle three or four simultaneous transcodes without much issue.

For those considering enabling this feature, you’ll need at least a 7th-generation Intel Core i3, i5, or i7 processor; lower-end hardware needs Jasper Lake or a newer architecture to be fully supported. Even if a system supports hardware transcoding, that doesn’t necessarily mean it will support HEVC encoding, as some older Intel chips lack the necessary features.

Playback device compatibility also plays a role in whether a client can receive an HEVC stream. On Apple and Android devices, including Apple TV and Android TV-based systems, the automatic quality adjustment feature defaults to H.264; to ensure HEVC transcoding is used, the resolution and bitrate must be selected manually. Additionally, HEVC playback requires a Chromium-based browser on Windows, macOS, or Linux, or Safari on macOS. Other browsers like Firefox and Opera won’t work. The Xbox One S also doesn’t support HEVC playback, but it will automatically fall back to H.264 when necessary.

The improved efficiency and quality of HEVC make it a useful addition to Plex’s transcoding capabilities. It’s worth experimenting with if you have the right hardware.

Disclosure: This was a paid sponsorship by Plex; however, they did not review or approve this content before it was uploaded.

Survey: Half of Americans Still Use Physical Media

Physical media is still going strong according to a recent survey from Consumer Reports. Despite the shift toward digital downloads and streaming services, a significant number of consumers continue to hold on to tangible media, whether out of nostalgia, preference, or practicality. While we typically look at sales data to determine format preferences, this survey reveals what consumers are actually using on a regular basis.

In my latest video, we dive into the survey results and also interview the Consumer Reports journalist who initiated the survey.

The survey, which included over 2,000 respondents weighted to reflect the American population, found that 45% of Americans still listen to CDs. That puts CD usage ahead of vinyl records, which have outsold CDs in recent years but don’t necessarily see as much regular use. Even cassette tapes have a notable presence, with 15% of respondents saying they still use them. Surprisingly, 5% of Americans still listen to eight-track tapes, a format that largely disappeared decades ago.

On the video side, DVDs and Blu-rays remain in use by almost half of Americans. Even as streaming services dominate entertainment consumption, many consumers still rely on physical copies, whether for better quality, affordability, or simply because they own large collections. VHS tapes, once considered obsolete, are still watched by 15% of respondents. Even laser discs, a niche format from the 1990s, still have a small but dedicated following, with 3% of Americans reporting they still watch them.

Consumer-generated media, meanwhile, has seen a more dramatic shift away from older formats. Only 9% of respondents say they use a dedicated camcorder, a sharp decline from past decades when handheld video cameras were common in households. The rise of smartphones with high-quality video capabilities has made camcorders largely redundant. DVR usage has also declined, with only 4% of Americans still relying on devices like TiVo.

Classic video game systems remain popular, however, with 14% of Americans still using older consoles. While this number may seem lower than expected given the strong online retro gaming community, it reflects the difference between casual users and dedicated collectors. Many small businesses and conventions continue to thrive around vintage gaming, and many enthusiasts like myself have even returned to using CRT televisions for a more authentic experience. I think we may see this number actually increase over time.

Legacy home office equipment also persists in some households. About a quarter of Americans still use landline telephones, though many of these are now VoIP-based rather than traditional copper-line connections. Fax machines continue to be used by 11% of respondents, and even Rolodexes and floppy disks still have their niche users, with 5% and 4% respectively.

The journalist behind the Consumer Reports article, Jim Willcox, joined me in the video to discuss how he personally added the questions about legacy technology to the survey out of curiosity. He noted that the longevity of physical media often defies industry expectations. While new formats tend to be predicted as the downfall of older ones, the transition is rarely immediate. Communities continue to form around niche formats, and the appeal of tangible media has proven resilient.

Willcox also highlighted the changing landscape of content ownership. With the rise of streaming, consumers have become increasingly aware of the drawbacks—such as the unpredictability of content availability and the necessity of multiple subscriptions to access favorite shows or movies. In contrast, physical media ensures long-term ownership without concerns over shifting licensing agreements or digital rights management.

While digital convenience is undeniable, the enduring appeal of physical media suggests that many consumers still value having something they can hold, play, and collect. Whether it’s a preference for higher-quality audio and video, a sense of nostalgia, or simply wanting control over their media, this survey shows us that physical formats are far from extinct.

GMKTec AD-GP1 External GPU (eGPU) Review

The GMKTec AD-GP1 is a compact external GPU that houses an AMD RX 7600M XT graphics card with 8GB of video memory. Designed for portability, it connects over USB 4, Thunderbolt, or OCuLink. It’s a good option for boosting the graphics capabilities of an ultrabook while maintaining the flexibility of a lightweight laptop. You can check it out in my latest review.

It is important to note that while the GPU works with Thunderbolt-enabled devices, it does not function with Apple’s silicon-based Macs, limiting its compatibility to certain Intel-based Macs and to Windows machines with Thunderbolt, USB 4, or OCuLink connections.

It’s priced at approximately $469 on GMKTec’s website. Depending on sales, you might find it for less on Amazon (compensated affiliate links).

The AD-GP1 features two HDMI 2.1 outputs and two DisplayPort 2.0 outputs, allowing for up to four external displays with resolutions up to 8K at 60Hz. However, despite its compact form factor, the GPU requires an external 240W power supply, which is roughly the same size as the unit itself. This power supply not only supports the GPU but also provides up to 100W of power back to the host device.

In testing, the GPU demonstrated solid performance when paired with an Asus Vivobook S 14 ultrabook with an Intel Core Ultra 7 258V. Running No Man’s Sky at 1080p on high settings, the system maintained a consistent 60 frames per second (fps). At ultra settings, performance fluctuated between 45 and 60 fps. However, in Red Dead Redemption 2, performance gains were negligible due to CPU bottlenecks, highlighting the fact that the GPU’s benefits will depend on how graphically demanding a game is relative to the processor’s capabilities.

Benchmark testing using 3DMark Time Spy revealed a significant increase in graphical performance with the external GPU attached. The laptop’s base score of 4,385 more than doubled to 9,421 when the AD-GP1 was connected, though the improvement was primarily in GPU-intensive tasks, with CPU performance remaining unchanged.

Additional testing was conducted using a GMKTec Evo X1 mini PC (compensated affiliate link) equipped with a Ryzen AI 9 HX-370 processor. When connected via OCuLink, the external GPU delivered a performance score of 10,026, which was nearly identical to its performance over USB 4, suggesting that the GPU was not pushing beyond the bandwidth limitations of the connection.
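As a back-of-envelope sanity check on that conclusion, the nominal link rates line up. Assuming the mini PC’s OCuLink port carries a PCIe 4.0 x4 link (common on recent mini PCs, though I haven’t confirmed the exact spec here) and using published nominal figures for USB 4, the two connections have very different ceilings, so matching scores suggest neither link was the bottleneck.

# Rough, nominal-rate arithmetic only, not a measurement. Assumes a PCIe 4.0 x4
# OCuLink link and standard USB4 figures; real eGPU throughput is lower after overhead.

PCIE4_GT_PER_LANE = 16          # PCIe 4.0 raw signaling rate per lane (GT/s)
PCIE_ENCODING = 128 / 130       # 128b/130b line-coding efficiency

oculink_gbps = PCIE4_GT_PER_LANE * 4 * PCIE_ENCODING   # ~63 Gbit/s usable
usb4_link_gbps = 40                                    # USB4 nominal link rate
usb4_pcie_tunnel_gbps = 32                             # typical ceiling for PCIe data over USB4

print(f"OCuLink (PCIe 4.0 x4): ~{oculink_gbps:.0f} Gbit/s")
print(f"USB 4 link: {usb4_link_gbps} Gbit/s (PCIe tunnel roughly {usb4_pcie_tunnel_gbps} Gbit/s)")
# Nearly identical Time Spy scores over both links suggest the RX 7600M XT wasn't
# saturating even the narrower USB 4 path in this workload.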

Beyond gaming, the external GPU proved beneficial for tasks like local AI processing. Running a distilled version of DeepSeek 8B using the GPU significantly outperformed CPU-only processing.

Fan noise is minimal, even when running at full blast for extended periods of time. The 3DMark stress test came in at 99.2%, indicating there won’t be much thermal throttling under sustained loads.

While external GPUs like this remain a niche product, they offer a good solution for users who need enhanced graphical power with a lightweight laptop. For those with compatible hardware, it’s an option worth considering for boosting graphics performance at home or in the office.

Kodak Slide N Scan Review – Rapid photo negative scanner

Scanning and digitizing old film negatives and slides is often a daunting task, requiring expensive equipment and meticulous effort. The Kodak Slide N Scan simplifies this process, providing a rapid and accessible way to convert old photo negatives and slides into digital images. I took a close look at this device in my latest review to see how well it performs and whether it’s a viable solution for casual users looking to preserve their film-based photos.

The Slide N Scan is found at many retailers including Amazon and Best Buy (compensated affiliate links) so shop around for the best price.

Unlike traditional scanners that require software and complex settings, the Slide N Scan operates without a PC. Negatives or slides are inserted into the film tray, the device automatically converts them into positives, and a simple push of a button saves the image onto an SD card (not included). It supports several film formats, including 35mm negatives and slides as well as 110 and 126.

Quality, however, is where expectations need to be tempered. The Slide N Scan is not an archival-quality scanner. The 13-megapixel sensor interpolates images up to 22 megapixels, but the results lack fine detail and sharpness. Color reproduction is also inconsistent, as the device attempts to determine the correct balance automatically. There are some limited color and exposure adjustments on the device, but enthusiasts will be looking for a lot more, and additional editing is often necessary after scanning to achieve accurate colors and exposure. For snapshots, though, the automatic settings will usually be good enough.

In practical use, scanning is incredibly fast. The device writes images directly to an SD card, which can then be transferred to a computer, phone, or tablet. The process is similar to using a digital camera: pop in the SD card, and the images are readily accessible. The Slide N Scan also features an HDMI output, allowing users to mirror its 5-inch display on a larger television.

Examining the scanned images, it’s clear this is not a professional-grade solution. Compression artifacts are visible, and while the device outputs large JPEGs, there is no option for saving uncompressed formats like TIFF. The upscaling process to 22 megapixels does little to enhance detail. I found that black-and-white negatives tend to look better when using the lower 14-megapixel resolution setting, especially since popular films like Tri-X are quite grainy and can interfere with the upscaling and image compression process at the 22-megapixel option.
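A quick bit of arithmetic, using only the megapixel figures above, shows why that interpolation can’t add much: going from 13 to 22 megapixels stretches each linear dimension by only about 30%, and no new optical detail is captured in the process.

# Derived purely from the stated sensor (13 MP) and output (22 MP) resolutions.
import math

native_mp = 13
upscaled_mp = 22

linear_factor = math.sqrt(upscaled_mp / native_mp)
print(f"Per-axis upscale factor: {linear_factor:.2f}x")   # ~1.30x in each dimension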

Despite these shortcomings, the Kodak Slide N Scan serves a purpose. For casual users looking to quickly digitize old photos for sharing on social media or archiving in a non-professional capacity, it provides a convenient solution. The speed and ease of use make it appealing, especially for those with a large number of negatives or slides to process. However, users seeking high-quality digital preservation of film-based images will need to explore more advanced scanning options.

A device in the $500–$600 range with a better sensor and uncompressed file-saving capabilities would fill a gap in the market for those who want high-quality results without the time investment of professional scanning solutions. While the Slide N Scan doesn’t meet that standard, it represents progress in the space, providing an affordable and efficient way to convert old film into digital format.

See more products like this here.

A Nifty Smartphone SSD with Real-time Diagnostics! Twopan SSD review

This smartphone SSD was originally going to be part of my next Amazon haul, but its unique features made it worth a dedicated look in a standalone video.

This SSD, from a company called Twopan (compensated affiliate link), offers some interesting functionality. It connects directly to an iPhone or Android phone via USB-C and includes a built-in real-time diagnostic display, which lets users monitor power consumption, read and write speeds, and even temperature in real time. The drive also features a built-in single-port USB 2.0 hub for plugging in additional devices, as well as MagSafe compatibility for easy attachment to the back of an iPhone.

One drawback: plugging power into that USB port to charge the phone causes the drive to reset, potentially disrupting ongoing work, so be sure to connect power before you start recording.

One of the most crucial aspects of using an external SSD with an iPhone is power consumption. iPhones cut off devices that draw more than 4.5 watts through the USB-C port, but this SSD consistently operates at around 2 watts, making it a safe option for ProRes video recording. In testing, it handled recording ProRes 4K video at 60 frames per second without issue, maintaining a steady data rate of about 180MB per second.
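To put those figures in perspective, here’s some quick arithmetic based on the numbers observed in the test; the 1TB capacity below is my own assumption for illustration, since capacities vary by model.

# Quick arithmetic from the observed figures: ~180 MB/s sustained ProRes 4K60 writes
# and ~2 W of draw against the iPhone's ~4.5 W USB-C accessory budget.
# The 1 TB capacity is an assumption for illustration.

write_rate_mb_s = 180        # observed sustained data rate
drive_power_w = 2.0          # observed SSD power draw
iphone_limit_w = 4.5         # draw at which the iPhone cuts off an accessory

gb_per_minute = write_rate_mb_s * 60 / 1000
minutes_per_tb = 1_000_000 / (write_rate_mb_s * 60)

print(f"ProRes 4K60 consumes roughly {gb_per_minute:.1f} GB per minute")
print(f"A 1 TB drive holds roughly {minutes_per_tb:.0f} minutes of footage")
print(f"Power headroom under the cutoff: about {iphone_limit_w - drive_power_w:.1f} W")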

One of the standout features is the ability to plug a USB microphone into the drive’s USB 2.0 port while recording. When testing with a DJI wireless microphone, the SSD continued to function smoothly, though power consumption increased slightly. This could be particularly useful for mobile video creators who need external storage and high-quality audio input simultaneously.

The drive’s MagSafe compatibility is another convenient feature. With the included angled USB-C cables, it attaches magnetically to the back of an iPhone, providing a more secure connection than just the SSD hanging off the port. However, the package does not include a cable for connecting the SSD to a computer. When plugged directly into a MacBook, it blocked all other ports, making a USB-C extension cable necessary for practical use.

Performance testing on a MacBook using Blackmagic Disk Speed Test showed read speeds close to advertised numbers but write speeds that fell short, averaging around 600MB per second instead of the promised 960MB per second. While this may be due to power-saving measures, it still delivers sufficient performance for ProRes video recording.

Overall, this SSD presents an interesting solution for those looking to record high-quality video on an iPhone. It addresses several pain points associated with external drives, including power management, real-time performance monitoring, and USB accessory support. While having two USB ports—one for power and one for peripherals—would have been ideal, the drive still manages to offer a solid, functional experience. A niche product, but one that solves a very specific problem effectively.

Disclosure: This product came in free of charge through the Amazon Vine program. However, nobody reviewed or approved this content before it was uploaded and no other compensation was received. All opinions are my own.