Legion Go S with SteamOS Review

In my latest review, I take a look at Lenovo’s Legion Go S – a handheld gaming PC that ships with Valve’s Linux-based SteamOS installed. It has a similar look and feel to the Steam Deck, but there are some key differences in the hardware that set it apart. Check out the review here.

The model I tried is the entry-level version with an AMD Ryzen Z2 Go processor, 16GB of soldered RAM, and 512GB of storage. These sell for about $599 over at Best Buy (compensated affiliate link). There are also Windows variants available at a higher price point, which come with more RAM to accommodate the heavier OS. The entry-level version comes in at about $50 more than a comparably equipped OLED-based Steam Deck.

What you get for that extra $50 is a more advanced processor and a larger, higher-resolution display. The Legion Go S has an 8-inch 1920×1200 IPS screen running at 120Hz, compared to the 7-inch 720p display on the Steam Deck. While the OLED on Valve’s device offers deeper contrast, this IPS panel still looks sharp and is definitely an upgrade over the original Steam Deck LCD.

The form factor is roughly the same, though the Legion is slightly heavier and thicker. I found it comfortable to hold thanks to its rounded edges. It also offers two USB4 Type-C ports on the top—and they’re fully functional for power, data, and video. The Steam Deck has only a single USB 3.2 USB-C port. SteamOS doesn’t yet support external GPUs, but with this hardware, it’s at least possible down the line.

One of the more interesting features here is the ability to switch the analog triggers into digital ones using a small physical lock. The Hall effect analog sticks are a nice improvement too, helping to reduce drift and offering smoother, more accurate control. I was able to customize the dead zones easily through settings. There are customizable LEDs around the sticks, though after a system update I wasn’t able to find the option to enable them.

The D-pad stood out to me—it’s responsive and easy to rock, with a feel reminiscent of some older Sega designs. The face buttons are slightly larger than those on the Steam Deck, and while there’s a tiny trackpad, it’s nowhere near as usable as the larger ones on Valve’s device. If you rely on trackpad input for certain games, that’s something to keep in mind.

As for performance, the Legion Go S compares well to the Steam Deck. In my tests, games that hit a playable frame rate on the Steam Deck at 720p were able to achieve similar performance at 1080p on this device. Cyberpunk 2077 ran around 40–45 FPS at native resolution on the Steam Deck preset. When I dropped it to 720p, the framerate climbed to 63 FPS. Doom Eternal and Ace Combat 7 also ran well, hovering around 60 FPS with modest settings.
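To put that in perspective, here’s a rough pixel-count comparison using the two test resolutions mentioned above (a back-of-the-envelope sketch, not a benchmark):

```python
# Rough pixel-throughput comparison between the two test resolutions.
deck_pixels = 1280 * 720     # 720p, as tested on the Steam Deck
legion_pixels = 1920 * 1080  # 1080p, as tested on the Legion Go S

ratio = legion_pixels / deck_pixels
print(f"1080p pushes {ratio:.2f}x the pixels of 720p")  # 2.25x
```

Matching the Steam Deck’s 720p frame rates while pushing 2.25× the pixels is a meaningful uplift for the Z2 Go chip.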

The downside here is fan noise. When running intensive titles in performance mode, the fan ramps up and is noticeably louder than what you get on a Steam Deck. The system draws a lot of power—up to 46 watts—which drains the battery quickly. You’re looking at about an hour to 90 minutes of gameplay under those conditions, so carrying a charger is a must.

There was one audio issue worth pointing out. I ran into scratchy, distorted sound in louder games like Doom Eternal. A reboot helped temporarily, but the issue returned, and it seems to be a software or driver problem others have experienced too. Hopefully it’s something that gets patched soon.

Emulation worked great. The EmuDeck project installed without a hitch, and I was able to run Outrun 2006 for PS2 at a solid 60 FPS with 1080p upscaling. For anyone looking to use a handheld primarily for retro emulation but who wants a sharper screen than what the Steam Deck offers, this might be a good fit. Xbox 360 games didn’t fare well here, though—about the same as on the Steam Deck—so expect the best results with consoles from the PS2 era and earlier.

The collaboration between Lenovo and Valve really shows in how smoothly SteamOS runs on this hardware. It’s a polished experience overall, and while the performance gains over the Steam Deck aren’t massive, the upgraded screen and processor do make a difference—especially if you’re running games at 720p and want to squeeze out higher frame rates. For those considering their first handheld gaming PC, the Legion Go S is a solid contender.

NVME Six Pack: Beelink ME Mini Server / NAS

I recently got a look at a compact mini PC from Beelink called the ME, and what makes it stand out is its ability to hold six NVMe drives internally. This device is built with network-attached storage in mind, and while I’m demoing it here with Unraid, it also supports other NAS operating systems and Linux distributions. It even ships with a licensed copy of Windows if you want to go that route.

You can see it in action in my latest review.

Inside, it runs on an Intel N150 processor—definitely on the lower end—but well-suited for light server tasks and Docker containers. You can find it on Amazon, or direct from Beelink’s website with a few more configuration options (compensated affiliate links).

My review unit included a Crucial-branded NVMe drive pre-installed in slot 4. All the bundled storage options appear to use Crucial, which I’ve been using myself for years.

The drives insert vertically and make contact with a heat pad that connects to a large central heatsink. That design does a noticeably better job of keeping drives cool than other compact NAS units I’ve tested recently. The slots themselves are mostly PCIe x1 interfaces, with slot 4 being the fastest thanks to its x2 link. Even so, it maxed out around 1.3 GB/s with the Crucial PCIe 4.0 SSD in that slot. The rest are slower, but the bottleneck in most NAS applications will be the network, not the drive speeds.

This unit includes two 2.5Gb Ethernet ports, which gave me around 200–250 MB/s throughput over the network during my tests. It’s unlikely you’ll saturate even the slowest drive slot with this kind of networking. Internally, the device has 12GB of soldered Crucial RAM. That’s not expandable, but for NAS and home server purposes, it’s enough. There’s also an Intel AX101 Wi-Fi 6 card if you’d rather go wireless.

Ports include two USB 3.2 Gen 2 ports (one USB-A, one USB-C), HDMI, USB 2.0, and a power jack—no external power brick here, just a built-in 45W supply. The casing is plastic but feels solid and clean, especially for a device that may sit out in the open. Video output supports 4K60, and I tested it with Ubuntu and Windows 11 Pro, both of which ran without issues. The hardware was properly recognized under Linux, and the preinstalled Windows license activated without a problem.

To test Unraid, I simply took the drives out of a GMKtec NAS I had been using and inserted them into this one. Everything came up immediately, including my external USB drive array. The only hiccup came from the USB-C port not playing nicely with my drive array; switching to the USB-A port resolved it, but I did lose my parity drive in the process. That seems more like a controller compatibility issue than a fatal flaw, though it’s something to be aware of.

I’m now considering moving entirely to solid-state storage, especially since this device gives me two more NVMe slots than the GMKtec box did. With Unraid’s parity setup, five slots can be used for storage and one for parity, giving me up to 20TB of usable space if I install 4TB drives across the board. I’ve only got about 9TB of data right now, so it’s feasible. 4TB NVMe drives are pretty pricey at the moment, though, so I’ll probably piece it together with smaller drives.
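The capacity math, sketched out (Unraid dedicates one drive—which must be at least as large as the others—to parity; the six 4TB drives here are the hypothetical all-4TB scenario):

```python
# Unraid-style single-parity array: one drive holds parity,
# the rest contribute their full capacity as usable space.
def usable_tb(drive_sizes_tb):
    parity = max(drive_sizes_tb)  # parity drive must be the largest
    return sum(drive_sizes_tb) - parity

slots = [4, 4, 4, 4, 4, 4]        # six 4TB drives
print(usable_tb(slots))           # 20 TB usable, 4 TB for parity
```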

Power consumption is low—about 18–20 watts idle with five NVMe drives installed and a couple of Docker containers running. Under load, like when writing large files or playing back a Plex stream with hardware-accelerated 4K HDR tone mapping, it edged up to around 26–30 watts. Hardware transcoding works just fine in Unraid as long as you remember to add /dev/dri to your container configuration. I detail that in the video.
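For reference, passing the GPU’s render device into a container looks like this in plain Docker terms (a sketch: in Unraid’s template UI the equivalent goes in the container’s Extra Parameters field, and the image name and media path below are just examples):

```shell
# Expose the iGPU's render device (/dev/dri) to a Plex container
# so hardware-accelerated transcoding and tone mapping work.
docker run -d \
  --name plex \
  --device /dev/dri:/dev/dri \
  -v /mnt/user/media:/media \
  plexinc/pms-docker
```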

Temperatures on the drives were impressive. A WD cache drive that previously idled at 69°C in the GMKtec unit now hovers around 50–51°C in this one. Under load, those numbers go up a bit, but they’re still dramatically better than before. It’s a testament to the improved passive cooling inside this unit. The fan is also whisper-quiet—much less noticeable than my spinning external drives.

One downside is thermal throttling under extended CPU load. A 3DMark Time Spy stress test resulted in a fail grade, with performance dropping around 16%. That shouldn’t impact most NAS workloads, but I wouldn’t use this for anything that demands sustained CPU performance.

Overall, this mini PC has proven to be a capable, efficient little box for self-hosting in tight spaces. I’ve got some reconfiguring to do now—time to dig through my parts bin and see which higher-capacity NVMe drives I can consolidate onto this unit. It feels like there’s real potential to go all solid-state here and simplify the setup.

Here’s Why Your Cable or Streaming TV Bill is So Expensive..

If you’ve ever looked at your cable or streaming TV bill and wondered why it keeps climbing, there’s a good chance it has something to do with retransmission consent disputes like the one playing out between Altafiber and Nexstar. This case gives us a rare look inside the kinds of negotiations that usually happen in private and might help explain some of the hidden costs passed along to subscribers.

I take a look at the complaint in my latest video.

Altafiber, formerly known as Cincinnati Bell, filed a complaint with the FCC accusing Nexstar of negotiating in bad faith. At the heart of the complaint is Nexstar’s demand that Altafiber carry its cable news network, NewsNation, as a condition for continuing to retransmit one of its local broadcast stations. Altafiber alleges this violates the FCC’s good-faith negotiation rules, since it forces a cable channel to be bundled with a local broadcast station.

What’s more, Altafiber says that only about 900 of its 87,000 subscribers live in the market where Nexstar’s broadcast station is located. Yet they’re being asked to pay for NewsNation across their entire subscriber base. Altafiber says viewership of NewsNation is extremely low, adding that only about 30 people complained when NewsNation was dropped. They argue that the proposed increase in NewsNation’s renewal fee is 15 times the rate of inflation.

This situation is part of a larger trend. Broadcasters used to be guaranteed carriage on cable systems through must-carry rules, but those were ruled unconstitutional in the 1980s. The Cable Act of 1992 replaced that with a system where broadcasters can either demand free carriage or negotiate “retransmission consent” which requires cable operators to pay to carry the station. Most broadcasters chose the latter, and the result is a steady increase in retransmission fees as advertising revenues decline. In my area, Comcast’s local broadcast TV fee recently jumped from $32.75 to $37.50 per month at the start of 2025. And that’s on top of the regular monthly bill for cable and internet service.
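For scale, that broadcast TV fee jump works out to roughly 14.5% in a single increase (simple arithmetic on the figures above):

```python
# Comcast's local broadcast TV fee, per month.
old_fee, new_fee = 32.75, 37.50
pct = (new_fee - old_fee) / old_fee * 100
print(f"{pct:.1f}% increase")  # 14.5% in one jump
```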

This kind of cost creep was what finally pushed me to cut the cord. These fees tend to sit outside of long-term contracts, so they can be increased at any time. The added frustration is that you’re often paying for channels you don’t watch or want, but have no choice in the matter. Altafiber claims NewsNation is profitable not because of viewership, but because of these kinds of forced bundling tactics.

In 2023, Nexstar made $2.57 billion from retransmission fees—far outpacing their ad revenue. In 2024 that number rose to $2.9 billion. The business model seems less about attracting viewers and more about collecting fees from cable and streaming companies, who in turn collect them from you.

The National Association of Broadcasters is pushing for even more deregulation, including relaxed ownership rules and changes that would let them negotiate directly with streaming services like YouTube TV and Hulu in the same way they do with traditional cable companies. That means the $83 monthly bill you’re paying for streaming could go even higher if these efforts succeed.

Some people (like me) try to bypass all this nonsense with an antenna, but that’s becoming harder too. The new ATSC 3.0 broadcast standard is encrypted using DRM that relies on Google and Amazon infrastructure. To watch free over-the-air TV, you often need a “certified” Android box connected to the internet to download decryption keys. The whole system is positioned as protection from “big tech,” yet it can’t function without it.

It’s not often we get this level of detail about how the sausage is made. But based on how things are trending across the industry, the next price hike is probably already on its way.

SanDisk Creator Phone SSD Review

I’ve been testing out an external SSD from SanDisk that’s designed for smartphones. It’s called the Creator Phone SSD, and it attaches magnetically to the back of your MagSafe-compatible phone. If you’re not using an iPhone with MagSafe, there’s a ring included in the box to help with the mounting. The drive connects via USB-C, so it’s compatible with just about any device that has a USB-C port, including Android phones.

You can see it in action in my latest review.

Once connected via USB, your phone can record video directly to the drive using apps like Final Cut Camera on iOS or Blackmagic’s Camera app, which works on both platforms.

You can find it on Amazon and see the latest prices at this compensated affiliate link.

Physically, there’s not much to it—just a USB port and the magnetic mount point. One limitation is the lack of a pass-through port, which means the phone’s USB-C port is completely occupied when the drive is connected. That rules out charging your phone or connecting something like an external microphone while recording. It’s a tradeoff that could matter for certain workflows.

There is a five-year warranty on the product, which adds some peace of mind for professional users. Out of the box, it’s formatted with the exFAT file system, which works across most devices—iPhones, iPads, Windows, Macs, and Linux systems. Some older Android phones might not mount the drive properly due to lack of exFAT support, but reformatting to FAT32 can help with compatibility in those cases.

One frustration I had was with the bundled app, SanDisk Memory Zone. If installed, it auto-launches every time the drive is plugged in. It’s useful for things like photo and contact backups, but it also tends to interfere with other apps, particularly Final Cut Camera. Even worse, if you don’t install it, your iPhone will keep prompting you to do so. The only workaround I found was to install the app and then uninstall it, which stops the prompts and lets other video apps access the drive properly.

Once I removed the app, Final Cut Camera immediately recognized the drive, and I was able to record without issues. Using HEVC compression, the 1TB drive can store a lot—up to 18 hours of 4K 120 fps footage, or 36 hours at 60 fps. With ProRes, that drops to about 1 hour and 9 minutes at 4K 60 fps, which is still respectable for the format.

SanDisk seems to have done a good job managing the drive’s power draw, which is important for iPhones. The iPhone will cut off power if a connected accessory pulls more than 4.5 watts, and I didn’t encounter that problem during extended testing at 4K 60 fps or 120 fps.

But to manage that power draw, the drive throttles its write performance. While I was able to record 4K 60 fps ProRes without any dropped frames, I did see some frame drops shooting 120 fps with the ProRes codec. Shooting 120 fps with the more compressed HEVC codec, however, was problem-free.

I ran the Blackmagic Disk Speed Test on my MacBook Air to get a sense of the drive’s performance. It clocked write speeds around 917 MB/s and read speeds near 881 MB/s. Those are solid numbers, but I did notice some variation in earlier tests, with write speeds occasionally dipping to around 400–500 MB/s. Apple says ProRes 4K 120 needs a minimum of 440 megabytes per second in sustained write speeds.
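The recording-time figures reported earlier imply the sustained write rate the drive has to hold; working backwards from them (a rough sketch using 1 TB = 10^12 bytes and the capacity and record time stated above) shows why dips below Apple’s 440 MB/s floor would matter at 120 fps:

```python
def required_mbs(capacity_tb, record_minutes):
    """Average write rate implied by filling the drive in the stated time."""
    total_mb = capacity_tb * 1_000_000  # 1 TB = 1,000,000 MB
    return total_mb / (record_minutes * 60)

# ProRes 4K 60: the 1TB drive fills in about 69 minutes.
print(f"{required_mbs(1, 69):.0f} MB/s sustained")  # ~242 MB/s
# That's comfortably below the dips I saw (400-500 MB/s), which is
# consistent with 4K 60 working fine while 4K 120 (440 MB/s minimum,
# per Apple) occasionally dropped frames.
```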

For users who need consistent, reliable performance at 4K 60 fps using ProRes, I think this drive holds up. It didn’t overdraw power, and I didn’t see dropped frames during long recordings. I’d like to see a future version of this drive with pass-through power and maybe a USB hub for audio gear, and it would be a big improvement if the software didn’t interfere so much. Still, the hardware itself seems reliable, and that counts for a lot if you’re shooting professionally with a phone.

Disclosure: SanDisk sent the drive to the channel free of charge. However, no other compensation was received and they did not review or approve this post or my video prior to publication.

8BitDo Controller Compatibility with the Switch 2!

I noted last week that I was able to pick up a Nintendo Switch 2 on launch day, and one of the first things I wanted to check out was how well 8BitDo controllers work with it. If you’re not familiar with 8BitDo, they make a mix of retro-styled and modern wireless controllers that have been popular with Switch owners for years. They are excellent budget replacements for the first-party Switch Pro Controller, costing substantially less.

When the Switch 2 launched, their controllers didn’t work right out of the box, but 8BitDo recently pushed out firmware updates that bring compatibility to some of their newer controllers. In my latest video, I take a look at a few and see how they perform.

8BitDo recently posted an update on their X account detailing which of their controllers now support the Switch 2 via firmware updates. Some older models are still out of luck, but a good number of the more recent ones—including newer versions of the SN30 Pro, their translucent editions with Hall effect sticks, and others—can now connect and work properly once updated.

I took a handful of these controllers, installed the latest firmware, and tested them with Super Mario Odyssey to see how they held up with basic controls, motion input, and rumble feedback.

The update process itself was a little bumpy. The SN30 Pro uses a different tool than the Ultimate and 2C controllers, and I ran into some hiccups—especially with the Ultimate 2C controller not connecting properly on macOS. Switching to Windows solved the issue, and once I got the firmware installed on all three, it was time to test them out.

Pairing the SN30 Pro was straightforward. After holding down Y and Start to enter Switch mode, the console recognized it as a Pro Controller. In Mario Odyssey, the buttons, analog sticks, and motion controls worked as expected. The same held true for the 2C and Ultimate controllers—everything was responsive and mapped correctly, including motion gestures like flicking Mario’s cap.

One thing I did have to tweak was the vibration setting. It was off by default, and none of the controllers rumbled until I went into the system settings and turned vibration back on. Once enabled, rumble worked normally, although it’s the standard type—not the HD Rumble you’d get from Nintendo’s Joy-Cons.

I also tested input latency using a 240fps camera to measure button response time. All three controllers, when connected via Bluetooth, performed identically to the Joy-Cons in terms of latency.
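With a 240 fps camera, latency resolves to whole multiples of one frame time, so the measurement granularity is about 4.2 ms (the 15-frame figure below is hypothetical, just to show the conversion):

```python
CAMERA_FPS = 240
frame_ms = 1000 / CAMERA_FPS  # ~4.17 ms of real time per captured frame

def latency_ms(frames_counted):
    """Convert frames between button press and on-screen response to ms."""
    return frames_counted * frame_ms

print(f"{frame_ms:.2f} ms measurement granularity")
print(f"e.g. 15 frames -> {latency_ms(15):.1f} ms")
```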

At the moment, Switch 2 compatibility is limited to specific 8BitDo models, as outlined in their recent post. Support for older controllers isn’t here yet, but they’ve indicated that more updates are on the way. For now, if you have one of their newer models and install the latest firmware, you should be in good shape.

Disclosure: 8BitDo and their distributor AKNES sent the controllers to the channel free of charge. No other compensation was received and they did not review or approve this post or my video before publication.

Android 16 Beta Turns Pixel Phones into a Desktop—Sorta..

Android 16 is now out for Google Pixel phones, and a new beta of Android 16 has a (very) early desktop mode feature. While this kind of functionality isn’t new—Samsung’s DeX has been around for a while—this is the first time we’re seeing Google itself build something like this directly into Android. You can see it in action in my latest video.

The idea is to turn a phone into something that more closely resembles a desktop computer, complete with windowed apps and external display support.

Performance on mobile devices is getting to a point where this sort of thing actually makes some sense. Apple’s iPads, for instance, use the same chips as their MacBook Airs. There’s no technical reason a tablet couldn’t run a full desktop OS at this point. So while desktop modes might have felt like a gimmick in the past, they’re starting to feel like a real alternative—or at least a supplement—to a traditional PC.

To get this working, I had to enroll the Pixel 8 Pro in the Android 16 beta program. This particular phone had not yet received the Android 16 update, so after opting in, the update was immediately available. That wasn’t the case with a Pixel 8a I tested on a different account, which had already received Android 16—it took a few days longer for the beta to show up there.

Only Pixel 8 and newer models support HDMI output via USB-C, so older devices won’t be compatible. I enabled developer mode, scrolled down to the “window management” section, and turned on the desktop mode features. After a reboot, I plugged the phone into a dock that provides HDMI, power, Ethernet, and USB input. I used that to connect a keyboard and trackpoint combo and sent video to an external display.

The result was a desktop-style interface on my display. Apps appeared in movable and resizable windows, and I could open and interact with multiple apps at once—like Google Docs and my blog—side by side. That said, the experience was clearly still in the early stages. Visual quality was disappointing, with text appearing blurry even though the display was set to 1080p. I also didn’t see any built-in options to adjust resolution or text scaling.

I tried running it on a 4K display as well, but everything was too small to be usable. Sticking to 1080p was more manageable. App support was inconsistent. YouTube, for example, didn’t scale well and maintained a layout more suited to a phone screen, even in a resized window.

There’s clearly a lot of work to be done. It doesn’t feel like something that’s close to release-ready, even as a beta. Still, I’m glad Google is exploring it. I’ll be keeping an eye on how this develops and plan to revisit it as the feature matures. There’s real potential here, even if it’s a little rough around the edges for now.

The Switch 2 Launch Was Nintendo’s Most Successful and Most Boring..

I picked up a Switch 2 (compensated affiliate link) the other day—not because I had planned on it, but because I noticed GameStop had them in stock, so I grabbed one. I’ve been playing with it since, but what really stood out to me wasn’t the console itself—it was the nature of the launch. This might be the most low-key console release I’ve ever seen. My kids, who are big Nintendo fans, didn’t even know it was happening. None of their friends were talking about it either. It felt like the Switch 2 just kind of… appeared. And I think that was by design.

See more in my latest video.

That said, the launch was a success for the Big N. They manufactured enough inventory to get units into the hands of most early adopters who wanted one. Nintendo says it’s their most successful console launch to date, selling 3.5 million units in its first four days on the market. Scalpers are not making much money this cycle as a result.

The Switch 2 feels like a slightly better version of the original Switch. It feels faster while navigating the interface, and it now has 4K output when docked, though most games won’t take advantage of that. The handheld now sports a larger, higher-resolution 1080p screen at 120Hz with variable refresh rate.

There are some tweaks to the hardware: magnetic Joy-Cons that attach securely (but preclude non-drifting Hall effect sticks), dual USB-C ports, and a sturdier kickstand. Docking works smoothly, and the whole thing feels very similar to the original Switch. That seems intentional. Nintendo didn’t want to reinvent the wheel—they just wanted to refine it. The result is a console that’s very recognizably a Switch, just with some extra capabilities and polish.

Backward compatibility has been seamless in my experience. Some older games even seem to run a little better. Nintendo is also offering paid upgrades for certain titles—I spent $10 to upgrade my copy of Zelda Tears of the Kingdom, for instance.

As for new games, there’s not much to talk about. Mario Kart World is the marquee launch title, along with Fast Fusion, a sequel to an F-Zero style racing game that launched on the first Switch. There are three remakes/remasters of older games exclusive to the Switch 2: Survival Kids, Bravely Default HD, and Yakuza 0 Director’s Cut. Aside from that, there’s Nintendo Welcome Tour, which is more of a tutorial than a game. The rest of the lineup is a bunch of ports of games that have been out for a while on other systems, including Cyberpunk 2077 and No Man’s Sky.

Price-wise, it’s not cheap. $449 for the console and dock, or $499 if you want the Mario Kart World bundle (which comes as a digital download). Nintendo has also introduced a new kind of cartridge—digital key cards that don’t contain the game itself, just an embedded code to download it. On the plus side, these can be resold, unlike non-physical digital titles. On the downside, they rely on Nintendo’s servers, which raises questions about long-term access.

Battery life is about on par with the original Switch: two hours or so when running demanding titles like Mario Kart, and a bit more for lighter games.

What stood out to me most about this launch was how quiet it was. Nintendo made a deliberate choice to ease into this. After all, they’ve been here before. The Wii sold over 100 million units, but its successor, the Wii U, sold only 13.5 million. That was a hard lesson in how quickly things can go south when the mainstream consumer base gets confused or alienated. The Switch reversed that trend and became a runaway success. Now, Nintendo’s being cautious, and I can’t blame them.

What I think we’re seeing here is the continued commoditization of video game hardware. Consoles no longer have unique, defining traits. The PlayStation and Xbox are essentially the same inside—PCs in console shells. Microsoft isn’t even making its own handheld—it’s letting ASUS handle that with a Windows-based Xbox-branded device. Nintendo’s sticking to ARM architecture with Nvidia chips, but even that feels like a holdout against an inevitable shift.

It’s starting to feel like we’re heading into a hardware-agnostic future. Where you play might soon matter less than what you play, and the idea of console exclusivity might not hold much weight when the hardware differences vanish. That raises some big questions for Nintendo. Do they eventually pivot fully into software? They resisted that move before, but as more consumers expect access across devices, the pressure might mount again.

For now, the Switch 2 is what it looks like: a slightly nicer Switch. And that might be enough to get through the rest of this decade and into the next.

GTBox G-Dock Review – Oculink/USB 4/Thunderbolt eGPU Enclosure with Built-in Power Supply

I’ve been experimenting lately with external GPUs on the channel, especially now that Oculink ports are showing up in more mini PCs. One of my ongoing frustrations, though, is that a lot of the budget Oculink gear looks like a science fair project when you set it up—there are power supplies and cables all over the place.

The other day a company called GTBox reached out and sent over their G-Dock, which aims to clean things up a bit. You can see it in action in my latest video review.

The G-Dock integrates an 800-watt power supply, which not only makes things neater but also more convenient. What’s nice about the G-Dock is that it’s more versatile than most setups I’ve tried. It supports both Oculink and Thunderbolt/USB 4 connections, and the USB/Thunderbolt port also provides power delivery, so it can charge a laptop with up to 100 watts while providing external GPU support all through one connection. You don’t get a traditional enclosure with this—your card mounts on top, exposed—but it does make for a more compact and affordable option.

The unit sells for $249 on GTBox’s site (compensated affiliate link), and they provided a coupon code—LON10—for an extra $10 off. It’s also on Amazon. Just keep in mind, if you’re connecting over USB 4, you’ll need a full 40 Gbps port for it to work. A lot of USB-C ports look the same, but older USB 3.2 ports won’t cut it. Oculink, on the other hand, requires your PC to have an Oculink port or an adapter that adds it. Some mini PCs have them built in now, and I’ve tested some of those here in the past.
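The reason the connection type matters comes down to link bandwidth. Here’s a rough sketch, assuming the common Oculink configuration of PCIe 4.0 x4 and USB4’s typical ~32 Gbps ceiling for tunneled PCIe traffic (both assumptions on my part, since the exact link specs aren’t spelled out for every host):

```python
# Approximate usable bandwidth of each eGPU link type, in Gbps.
links = {
    "Oculink (PCIe 4.0 x4)": 64 * 0.985,  # 16 GT/s x 4 lanes, minus
                                          # 128b/130b encoding overhead
    "USB4 (PCIe tunnel)":    32,          # typical tunneled-PCIe ceiling
                                          # on a 40 Gbps USB4 link
}
for name, gbps in links.items():
    print(f"{name}: ~{gbps:.0f} Gbps ({gbps / 8:.1f} GB/s)")
```

That roughly 2x headroom on Oculink lines up with the small performance dip I saw when running the same card over Thunderbolt later in the testing.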

For the G-Dock test, I hooked up a 4060 GPU to the dock and connected it to a MinisForum mini PC using the included Oculink cable. It’s important to note that Oculink isn’t hot-swappable, so you need to boot the system with the connection already in place. I also made sure to connect my HDMI cable directly to the GPU rather than the mini PC for best performance—routing through the system’s onboard video usually results in lower performance.

The G-Dock powered up just fine. The GPU’s fan spun up, Windows detected the card, and after installing the latest NVIDIA drivers, I fired up Cyberpunk at 1080p with medium settings, which ran great. The mini PC featured in this video has strong CPU performance but weak integrated graphics, so the external GPU really gave it a boost.

Next, I tried it with a laptop over Thunderbolt. Everything worked as expected, though there was a small performance dip compared to the Oculink connection. I also made sure to disable the laptop’s internal display and run everything through the external monitor connected to the GPU, which helps avoid further performance losses.

Overall, the G-Dock felt solid. The fan noise was minimal, and the integrated power made it a lot less cluttered than the other Oculink setups I’ve worked with. Still, I’d like to see companies revisit the more protective enclosures we see in the Thunderbolt world—something that completely houses the card and power supply for better durability and aesthetics. But all in all, this is one of the better Oculink solutions I’ve used.

Disclosure: GTBox sent the eGPU enclosure to the channel free of charge. I purchased the 4060 GPU with my own funds. No other compensation was received and no one reviewed or approved this post or video before it was uploaded.

DRM and Your Rights: Interview with John Bergmayer from Public Knowledge

In a recent interview, John Bergmayer, Legal Director at Public Knowledge, provided me detailed insight into the ongoing FCC debate surrounding DRM (Digital Rights Management) and ATSC 3.0, also known as NextGen TV.

You can watch the full interview here.

Bergmayer’s organization, alongside the Electronic Frontier Foundation (EFF), Consumer Reports and other organizations, submitted a comprehensive FCC filing strongly opposing the DRM implementation proposal from the National Association of Broadcasters (NAB).

Public Knowledge, a Washington D.C.-based consumer rights advocacy group, champions balanced digital rights, net neutrality, intellectual property reform, and media policy reforms that benefit diversity of voices and consumer interests. Bergmayer, who has been with the organization for over 12 years, emphasized their proactive role: “We do interface with government directly and participate in regulatory proceedings like this one at the FCC.”

Despite engaging in working groups aimed at consensus-building for the future of television, Bergmayer identified substantial disagreements among stakeholders. He explained, “There was consensus on the sort of issues that don’t really matter all that much… but on fundamental questions about DRM and encryption issues, there was not a lot of agreement.” Bergmayer highlighted that within broadcaster groups, positions significantly diverged, citing smaller broadcasters like Weigel Broadcasting, who see limited benefits in transitioning to ATSC 3.0.

A central point of contention involves DRM implementation, which Bergmayer argued severely threatens fair use rights and consumer freedoms. He emphasized the inherent conflict: “DRM interferes with things that are legal… it prevents you from accessing the content to do things that are fair uses.” According to Bergmayer, DRM undermines established consumer rights, specifically referencing landmark fair use cases such as the Sony case, which secured the right to record and privately use broadcast content at home.

Bergmayer pointed out the paradox created by DRM regulations, noting that the Digital Millennium Copyright Act (DMCA) makes circumventing DRM illegal, even if the underlying action, such as recording television programs for personal use, is legally protected fair use. He explained that this contradiction effectively criminalizes legitimate First Amendment activities.

The chilling effect of DRM was another significant concern raised by Bergmayer. He indicated that DRM requirements could severely limit innovation and device availability. Specifically, he mentioned popular devices like the HDHomeRun, which significantly outsell DRM-compatible devices precisely because of their flexibility and consumer-friendly nature.

Bergmayer also underscored the unique obligations of broadcasters, emphasizing their responsibilities given their free access to valuable public spectrum. “Free public airwaves should not be turned into a private playground for these companies,” Bergmayer said.

Regarding consumer engagement, Bergmayer praised the active participation of thousands of individual commenters in the FCC docket, noting its unusual depth for such technical issues: “It’s really impressive that there’s people out there who are willing to spend the time to make their voice heard.”

Looking forward, Bergmayer predicted inevitable legal challenges regardless of the FCC’s decision, referencing previous influential cases like the Broadcast Flag litigation, which Public Knowledge successfully led. He believes further court battles are likely due to persistent conflicts between DRM implementation and established individual rights.

Bergmayer strongly encouraged continued public awareness and advocacy as the FCC is obligated to process and acknowledge consumer feedback in making its decisions.

I will have more on this topic as news develops!

Public Knowledge, The EFF, Consumer Reports and Other Organizations Oppose DRM in a New FCC Filing

A major filing was submitted just before the ATSC 3.0 public comment deadline by a coalition including Public Knowledge, the Electronic Frontier Foundation, Consumer Reports, and several other organizations. Their message to the FCC is clear: DRM has no place in public broadcast spectrum. You can read the document here and watch my analysis piece here.

Their argument centers around the idea that mandatory encryption under ATSC 3.0 fundamentally conflicts with the legal and constitutional frameworks that have long governed broadcast TV.

One case they point to is American Library Association v. FCC, where a rule that would have forced devices to honor a broadcast flag was overturned. The court concluded that the FCC had no authority to regulate what happens inside consumer devices once a signal is received. That precedent is particularly relevant as we now face a situation where encryption could prevent people from exercising their long-established right to record broadcasts.

The filing emphasizes that public spectrum isn’t a private asset—it’s a shared, collectively owned resource managed under a mandate to serve the public interest. That’s different from how spectrum is handled in industries like mobile phones, where companies purchase and control allocated spectrum. Here, broadcasters are allowed to profit, but only as trustees serving the public.

What stood out in this filing was how thoroughly it outlined the risks to consumers. Many certified ATSC 3.0 devices are already showing their flaws—most require Internet access just to tune in channels, others run outdated software, and few give users any meaningful flexibility. If encryption becomes the norm, gateway devices, DIY DVRs, open-source solutions, and even basic home recording could vanish.

A central point made by the filing is that DRM turns broadcasters into gatekeepers—not just over content, but also over the devices people can use. It also creates a strange contradiction in the law. On one hand, it’s legal to record a broadcast under the American Library decision and the 1980s Sony Betamax case; on the other, it’s illegal to bypass encryption under the DMCA. So even if you have the right to record something, exercising that right would mean breaking the law in practice.

They also call out the ATSC 3.0 Security Authority, or A3SA, for setting private rules that aren’t subject to public oversight. Even the encoding guidelines broadcasters have touted are limited—they only apply to ATSC 1.0 simulcasts, not future ATSC 3.0-only broadcasts.

The process by which A3SA licenses devices is also under scrutiny. Developers have to sign NDAs, the terms aren’t transparent, and consumers have no voice in the process. This kind of structure, the filing argues, runs counter to the FCC’s mandate to ensure open and nondiscriminatory access to public airwaves.

Interestingly, the document even questions whether encrypted broadcasts still qualify as “broadcasting” under the law, since they require a privately licensed decoder to access them.

So what happens next? It’s going to be a waiting game. The FCC is about to be short on commissioners, with two stepping down and replacements not yet confirmed. Until the commission has a quorum, it won’t be able to vote on anything substantial—including ATSC 3.0 rules.

On Monday we’ll have an interview with John Bergmayer from Public Knowledge, the lead author of the filing, to dive into this topic further.

Until then, this conversation around DRM is going to slow down a bit as we wait for the FCC to get back to full strength. But I’ll keep tracking the story and will have more updates when the next phase begins.

Unifi U7 Lite Review: $99 Wi-Fi 7 Access Point Breaks the Gigabit Barrier

I’ve been gradually upgrading the Wi-Fi setup in my house, and the latest step in that process is swapping out my UniFi Wi-Fi 6 access points for the new Wi-Fi 7 models. First up is the U7 Lite, their entry-level Wi-Fi 7 device priced at $99 (compensated affiliate link).

You can see it in action in my latest video.

I started with the one in my studio since this is where I’ll likely have the most Wi-Fi 7 clients to experiment with. It’s a good test case for seeing how much of a real-world bump I can get from upgrading to Wi-Fi 7.

Physically, the U7 Lite is nearly identical to previous “Lite” models from UniFi. It uses the same mounting bracket as the AC Lite and U6 Lite, which made installation a 30-second job—twist out the old one, twist in the new one, and that was it. It requires PoE (Power over Ethernet), and I’m powering it through the UniFi Flex 2.5 PoE switch I reviewed recently. The U7 Lite, like the prior model, doesn’t include a PoE injector.

Specs-wise, the U7 Lite is a 2×2 access point for 2.4 GHz and 5 GHz only—it doesn’t include 6 GHz support. For my environment, which is a rural home with minimal RF interference and modest usage, that’s fine. The jump to 2.5 Gbps Ethernet from the 1 Gbps found on older units opens up some potential bandwidth gains, and I was curious to see just how much improvement I’d get on my Wi-Fi 7 devices.

Before upgrading, I ran some benchmarks using an iPhone 16 Pro Max connected to my U6 Lite prior to its decommissioning. Downstream speeds hovered just under 500 Mbps, and upstream was a bit better, close to 600 Mbps. Those were solid numbers for a mid-range access point, and I saw similar results on my Windows PC as well.

Once the U7 Lite was installed and adopted by the UniFi Controller, I didn’t change any settings initially—just let it run with the defaults to see if the upgrade alone made a difference. And it did. Download speeds immediately jumped to around 700 Mbps. Upload stayed in the same ballpark as before, but the increased downstream bandwidth was a good early sign.

Next, I tried enabling a wider channel width. The U7 Lite allows up to 240 MHz, but that depends on client compatibility and has the potential for channel overlap and interference with other access points. I set it to 240 just to see what would happen, and my iPhone connected at 160 MHz—likely its hardware limit. Still, that change alone brought my download speeds right up to a gigabit, with upload seeing an improvement as well.

Then I tested out Multi-Link Operation (MLO), a new feature in Wi-Fi 7 that allows simultaneous connections across multiple frequency bands—in this case, 2.4 and 5 GHz. I created a new SSID and enabled MLO in the UniFi Controller, but the results weren’t impressive. Downloads dropped a bit compared to the single 160 MHz channel, and upload didn’t see much change either. Latency was slightly worse as well, with occasional packet drops during ping tests. For now, MLO seems like a feature that still needs some maturing—both in terms of firmware and client device support.

When I reverted back to the standard 160 MHz Wi-Fi 7 configuration, latency improved and speeds returned to peak levels. I’ll continue to keep an eye on MLO as I bring in new test devices with stronger radios, but it’s not quite ready for prime time in my setup.

The takeaway so far is that Wi-Fi 7, even on an entry-level access point like the U7 Lite, can deliver meaningful performance gains—especially on the downstream side. It’s a simple, affordable upgrade that integrates easily into existing UniFi networks. I’ll likely pick up another, more robust unit for my upstairs area where traffic is heavier and keep testing from there. As always, more to come!

Disclosure: I purchased the U7 Lite myself. The router I’m using, the UniFi Dream Machine Pro, was sent to the channel five years ago, but all opinions are mine and the video was not sponsored or pre-reviewed.

The Lenovo Yoga Tab Plus Packs a Lot of Value – Full Review

Lenovo’s Yoga Tab Plus is a large Android tablet that packs a fair amount into a single package: the tablet itself, a keyboard, and a pen—all for a price that’s often around $700. You can find it directly at Lenovo or at Best Buy (compensated affiliate links); shop around and you may land an even better price when it goes on sale.

You can see it in operation in my latest video review.

The tablet comes equipped with a nice 12.7-inch LTPS display. It’s not OLED, but the 3K resolution and 144Hz refresh rate made it look sharp and feel very responsive. It supports Dolby Vision HDR with a peak brightness reaching up to 900 nits. The display was quite visible outdoors even under direct sunlight. Colors are accurate too, with full DCI-P3 coverage.

Performance is solid thanks to the Snapdragon 8 Gen 3 processor and 16GB of LPDDR5X RAM. It’s responsive for everyday use, can run Android games well, and even handles emulation up to the PlayStation 2 era reasonably smoothly. There’s 256GB of internal storage, though there’s no SD card slot, which might be a limitation for some. You can expand storage via USB-C, but there’s not much in the way of ports otherwise—just a USB-C port and a power button that doubles as a fingerprint reader.

The included keyboard, while not backlit, feels solid—similar to Lenovo’s laptop keyboards with good spacing and travel. It attaches magnetically and folds around the back when not in use. One weak point is the kickstand, which doesn’t sit flush when folded up and feels a bit awkward. Still, the keyboard gives the tablet a laptop-like experience, especially when used with Lenovo’s optional “PC Mode” that lets apps float in windows rather than running full screen.

The included pen introduces something a little different. It offers subtle haptic feedback and a paper-like writing sound that made the experience feel more natural. It also supports pressure sensitivity and charges magnetically on the top of the tablet. The build quality of both the pen and the tablet feels premium, with a metal body and sturdy design.

For media consumption, the tablet supports Widevine L1 DRM, so Netflix and Disney+ stream at the full resolution of the display. Audio is decent with a quad-speaker setup that includes what Lenovo says are subwoofers. There’s not much deep bass, but the sound is balanced and immersive enough, especially in landscape orientation.

The camera system is pretty good, especially for conference calls. The front-facing 13MP sensor supports 4K at 30fps and looks better than many laptop webcams I’ve tested. Around back, there’s a second 13MP camera and a 2MP macro camera. Rear video also records in 4K, but there’s no stabilization, so handheld footage can get shaky.

Battery life held up well in testing. Lighter tasks like web browsing and media playback stretched past 11 hours, while more demanding apps will naturally pull that number down. The tablet ships with Android 15 and will get security updates through 2029—less than what Chromebooks or PCs usually get.

There’s also a local AI feature onboard called AI Now. It works completely on device and will analyze attached documents to answer questions. It’s not perfect, but it worked reasonably well in my tests, and it’s entirely self-contained on the device. For more accurate or nuanced answers, you’ll still want to turn to cloud-based tools like ChatGPT or Google Gemini.

All told, the Yoga Tab Plus offers a lot for the price—especially with the pen and keyboard included. It’s not a top-tier device, but for those looking for a larger Android tablet that can handle a bit of everything without breaking the bank, it’s worth a look—particularly if you can catch it on sale.

Make Your Own Streaming TV Channels with Plex and ErsatzTV (sponsored post)

For my latest monthly sponsored Plex video, I took on a fun project that turned my Plex library into a fully programmed, always-on TV channel. Using an open-source tool called ErsatzTV, I set out to recreate the experience of traditional broadcast television—with scheduled shows, filler ads, and a sense of timing you just don’t get from on-demand shuffling.

I step through how to get it up and running in my latest video.

The idea behind ErsatzTV is pretty straightforward: it links into your Plex server and plays back episodes from your media library on a set schedule. It even keeps track of what episode aired last, so it will step through a season of a show each day or week. It can also shuffle episodes each time. If you tune in halfway through, that’s where you start watching—just like the old days.

To make it work with Plex, you do need a Plex Pass since it ties into the live TV and DVR features.

I installed ErsatzTV on a Windows machine for demonstration purposes, but it’s cross-platform, and you can run it on Linux, macOS, or via Docker. After downloading and extracting the app, I launched the server and configured it through the web interface. The first tweak was enabling hardware acceleration for better performance, which in my case meant selecting the VAAPI option for Intel graphics.
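For those who’d rather run it under Docker, a minimal compose file looks something like the sketch below. The image name, port, and volume paths here are based on my reading of the ErsatzTV documentation, so verify them against the current docs before use:

```yaml
services:
  ersatztv:
    image: jasongdove/ersatztv:latest     # check the docs for the current image/tag
    ports:
      - "8409:8409"                       # ErsatzTV's default web UI port
    volumes:
      - ./ersatztv-config:/root/.local/share/ersatztv   # persistent app config
      - /path/to/media:/media:ro          # mount media read-only, ideally at the same path Plex uses
    restart: unless-stopped
```

Mounting the media at the same path Plex sees avoids having to remap file locations when the two servers are linked.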

From there, I connected ErsatzTV to my Plex server and synced my TV show library. My mix included old episodes of David Letterman and Johnny Carson, Star Trek: The Next Generation, some 80s cartoons, and a healthy dose of Bluey for the kids. I also added a folder of vintage commercials and PSAs as filler content to help round out the schedule to clean half-hour or hour blocks.

ErsatzTV doesn’t let you slot individual files—everything has to be bundled into collections. So I grouped the ads into a “filler” collection and set up presets for midroll and fallback padding. This way, the system could drop in the right number of ads to stretch shorter content to the next block precisely.
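The padding arithmetic behind this is simple enough to sketch. This is my own illustration of the idea, not ErsatzTV’s actual code: given a show’s runtime, you need enough filler to reach the next half-hour boundary.

```python
def filler_needed(content_minutes: int, block_minutes: int = 30) -> int:
    """Minutes of filler required to pad content to the next block boundary."""
    remainder = content_minutes % block_minutes
    return 0 if remainder == 0 else block_minutes - remainder

# A 22-minute cartoon episode needs 8 minutes of vintage ads to fill a half-hour slot.
print(filler_needed(22))   # → 8
# An hour-long talk show already lands on a boundary, so no filler is added.
print(filler_needed(60))   # → 0
```

ErsatzTV’s midroll and fallback filler presets effectively make this calculation for you, pulling clips from the designated collection until the gap is closed.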

Then came the fun part: building out the channel. I created a schedule starting at 6 a.m. with “Star Blazers” and “He-Man,” “Bluey” from 9 a.m. to 1 p.m., some afternoon Star Trek, and nighttime talk shows beginning at 10 p.m. with Johnny Carson. I used a mix of fixed and dynamic scheduling depending on the content length. ErsatzTV handled the logic to round everything off nicely with filler content when needed.

Once the channel was ready, I registered ErsatzTV as a tuner device inside Plex using its HDHomeRun emulation feature. That let me pull the guide data from ErsatzTV’s built-in XMLTV feed, and just like that, my custom channel showed up alongside my antenna broadcasts. Everything worked as expected: metadata, descriptions, proper timing—it all lined up. If I tuned in late, I caught shows mid-episode, and the transition between shows and filler felt natural.
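For reference, the registration boils down to two URLs entered in Plex’s Live TV & DVR setup. These paths reflect ErsatzTV’s defaults as I understand them; double-check them against its documentation for your version:

```
http://<ersatztv-host>:8409                   # tuner address (HDHomeRun emulation)
http://<ersatztv-host>:8409/iptv/xmltv.xml    # XMLTV guide data
```

Plex treats the first URL like any networked HDHomeRun tuner, and the second feeds it the channel lineup and program metadata.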

There’s definitely something satisfying about turning Plex into a virtual broadcast network. It’s more work than hitting “shuffle,” but the end result feels more alive. There’s structure, nostalgia, and the bonus of always having something ready to play, exactly when and where I want it. Now that the framework is in place, I can add more shows, create additional channels, or even bring back “Tuesday Night Movies.” The only real limit is how much media I can cram onto my server.

See more of my Plex videos here.

Disclosure: This was a sponsored video from Plex; however, they did not review or approve this post or video before it was uploaded.