Over the past couple of weeks, I’ve been checking out some of the new USB4 10-gigabit Ethernet adapters that have hit the market. My latest review looks at one from UniFi, a company known more for its network infrastructure gear than for individual adapters.
At $200, it sits a bit higher on the price scale than the others, but this one stands out because it doesn’t have a noisy fan. The outer case gets a bit warm, but I noticed no performance degradation under sustained loads.
This adapter is built around the same Marvell AQC113 chipset as some of the others we’ve reviewed recently, and setup was simple: on macOS, Linux (Ubuntu), and Windows, it was plug-and-play in my testing. UniFi also provides a direct link to Marvell’s driver download page for those who need additional support. That’s a refreshing change from vendors who push downloads through sketchy sites.
Performance was consistent across the board. I ran iPerf tests on all three platforms—Mac, Windows, and Linux—and consistently saw results in the 9.4 to 9.47 Gbps range in both the upload and download directions. The adapter held those speeds reliably with minimal variation.
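If you want to replicate my methodology, the test is easy to run yourself with the iperf3 command line tool. Here's a rough sketch (the server address and stream count are placeholders, so adjust for your own network):

```bash
# On a machine at the far end of the 10 gig link:
iperf3 -s

# On the machine with the adapter: upload test with four parallel streams...
iperf3 -c 192.168.1.10 -P 4

# ...then the same test in reverse (-R) to measure the download direction.
iperf3 -c 192.168.1.10 -P 4 -R
```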
To get full performance you’ll need to connect it to a USB4 or Thunderbolt port. In addition to 10 gigabit speeds, it will also negotiate 5, 2.5, and 1 gigabit links, along with 100 megabit. I’m sure it’d drop down to 10 megabits too if you plugged it into a ’90s-era hub.
Overall, this one feels like a solid option for anyone looking for reliable 10 gig connectivity, and it’s something I’m comfortable recommending.
I’ve been recovering from laryngitis, but I’m back at it with a look at a new TV tuner from ADTH. This one comes with a lot of buzz from the broadcast industry, which is pitching it as a reliable solution for tuning encrypted ATSC 3.0 TV signals. After spending time with it, though, I found it falls painfully short of expectations. See more in my latest review.
The ADTH tuner will likely cost more than the device you’re plugging it into. It’s also imported from China, so there’s a chance future shipments might cost more due to tariffs. You can find it here on Amazon (compensated affiliate link).
It connects via USB to Android TV or Fire TV devices. Out of all the hardware I tested, the only one it fully worked with was the Onn 4K Pro box. Everything else—like the Nvidia Shield, Onn stick, and Fire TV Stick 4K Max—ran into trouble with encrypted channels. On the Shield, encrypted ATSC 3.0 channels froze after showing a single frame. The Fire TV Stick displayed an error saying DRM wasn’t supported. In each case, unencrypted channels were fine, but the whole point of this tuner is to handle encryption, and that’s where it stumbled. My friend Elias Saba of AFTVNews.com tested twenty supposedly compatible devices and found only two worked as advertised.
To make things more complicated, ATSC 3.0 broadcasts also require AC-4 audio support on the host device. Unfortunately, most device spec sheets don’t say whether it’s there, so users are left guessing.
Setting it up was relatively simple. The app is available on the Android and Fire TV app stores. After I granted permission for USB access (something I had to do each time I launched the app), it walked me through a channel scan and a dongle firmware update. It found channels quickly and offered a decent guide, with both a quick overlay and a more detailed grid. The app also lets you pause live TV and jump back to the live broadcast, but there’s no recording or rewinding.
One feature that stood out was the stats screen. It’s the most detailed I’ve seen for ATSC 3.0 tuning and could be useful for those trying to troubleshoot signal issues or understand what’s coming through the airwaves.
It’s worth noting that the app only works on Android TV and Fire TV, not phones or tablets. I checked some APK sites to see if there was an unofficial workaround for mobile but couldn’t find anything that worked. And it will never work with PCs or anything Apple- or Roku-based. That’s a big limitation for a device that’s supposed to represent the future of TV tuning.
All of this brings me to DRM and its cost. Right now, the ADTH tuner is one of the few options that’s officially sanctioned to handle encrypted ATSC 3.0. But the GT Media USB tuner we looked at last year, which doesn’t support encryption but works on a much wider range of Android devices—including mobile—sells for as little as $30 on AliExpress. It even has DVR support via an SD card. Despite being cheaper and more versatile, it’s being held back by the same DRM restrictions that limit broader innovation in the space.
As broadcasters continue to push the FCC to accelerate the ATSC 3.0 transition, we’re left with hardware that still doesn’t deliver on the promise. Two years into this DRM rollout, basic functionality still isn’t guaranteed. There’s more to come this week as the FCC opens public comments on the DRM issue, and I’ll be sharing how to get involved. For now, this is where things stand—and it’s not a great place to be.
My latest video takes a look at RetroAchievements, a free online service that adds Xbox-style achievements to classic games played through supported emulators. Think of it as a gamified layer on top of your retro library, with score tracking, leaderboards, and a whole community of players competing to earn bragging rights.
In the video I demo an achievement I earned playing the Sega Master System version of Choplifter, awarded for gaining an extra life without losing any lives. When an achievement is reached, an on-screen badge pops up and points get logged on the RetroAchievements website. That bumped me to rank 100,800 out of around 111,000 players. It’s not exactly elite status, but it’s a start. :)
Setting it up was straightforward. After creating an account on retroachievements.org, I linked it up with my emulators. There’s a wide range of compatible emulators, including RetroArch, Dolphin, DuckStation, and PCSX2. On my Steam Deck, I use EmuDeck, which simplifies the process even further and logs you in across all your installed emulators.
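For those configuring RetroArch by hand rather than through EmuDeck, the achievement settings can also be set directly in retroarch.cfg. This is a sketch based on my understanding of RetroArch’s “cheevos” options (the same toggles live under Settings > Achievements in the UI); double-check the names against your own config, since options can shift between versions:

```bash
# retroarch.cfg (RetroArch stores RetroAchievements settings under "cheevos")
cheevos_enable = "true"
cheevos_username = "YourRAUsername"
cheevos_password = "YourRAPassword"
# "true" enables hardcore mode, which disables save states, rewind, etc.
cheevos_hardcore_mode_enable = "false"
```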
RetroAchievements has two play modes: “hardcore,” which disables cheats, save states, rewinds, and slow motion, and “softcore,” which allows those conveniences. There’s a separate ranking system for each mode, so your score stays relevant no matter how you like to play. The community enforces rules against cheating, so even in softcore, the competition feels fair.
What makes this system interesting is how the achievements are actually built. They’re tied to the emulator’s memory and look for specific values or in-game events. When the right conditions are met, the emulator triggers the achievement and reports it back to the website. You can even follow other users and compare your scores directly.
Another nice feature is the in-game feedback indicating how close you’re getting to your next achievement. While playing After Burner II on the Sega Saturn, I went after a particularly tricky achievement called “Too Close for Missiles,” which requires destroying 150 enemies with guns on normal difficulty or higher. A counter appeared on screen, incrementing after every plane I downed with my cannons. I haven’t cracked that one yet—it resets if you game over and continue—but it’s kept me coming back to the game long after I completed it.
The achievements themselves are created by the RetroAchievements community, and becoming an achievement developer involves learning how emulators and game memory work. It’s more than just coming up with fun challenges—you actually have to build them into the game logic without breaking anything. There’s a whole process for ensuring that achievements trigger properly and don’t interfere with the game’s performance.
FPGA-based systems like the MiSTer and Analogue consoles aren’t supported, at least not yet. I use a MiSTer for a lot of my retro gaming on a CRT upstairs, and it would be great to get achievements while playing there. Hopefully, support for those systems is on the horizon.
I recently came across a post on Android Authority that a viewer, James Randolph, flagged for me, and it highlights an experimental shift YouTube is making to its notification system. It looks like YouTube is quietly testing changes that could result in fewer notifications being sent—even if you’ve clicked the bell for a channel. The gist is that if you’re not actively engaging with a channel you’ve subscribed to, YouTube might just stop sending you notifications from it altogether.
The details come from a March 26th entry on YouTube’s experiment page. Initially this is a small test targeting people who’ve clicked the bell for “all” notifications but haven’t been watching those channels lately. YouTube’s reasoning? A lot of people, overwhelmed by the flood of notifications, either stop engaging or turn off all notifications at the app level.
From YouTube’s perspective, this is about cleaning things up. But to me, it seems like a fix for a problem that was never really handled well in the first place.
For years, creators like me have been dealing with how unreliable the bell icon has become. According to my analytics, only about 10% of my subscribers have clicked the bell, and of those, just 3.9% actually enabled YouTube notifications on their device.
One of my longstanding issues with the bell is how rigid it is. It pings right when a video goes live, regardless of whether it’s a good time for the viewer. Most people aren’t going to stop what they’re doing and watch immediately. For my part, I started using YouTube’s scheduled digest feature. I have all my notifications come in at 7:00 p.m. every day. That way I can review everything at once and decide what I actually want to watch. Most users likely don’t use that feature.
Another issue is the inability to add a video to the “Watch Later” list directly from a notification. Personally, I rely heavily on Watch Later. I’ll often find something I want to check out later in the evening on the TV, and being able to queue it up is a big part of how I use YouTube. That functionality is just missing from the notifications tab.
When YouTube first introduced the bell, it was a compromise to give viewers more control after the algorithm started more aggressively recommending videos. People were frustrated that they weren’t seeing videos from creators they cared about. The bell was meant to restore that connection. But the way it was implemented—again, notifying you at the exact moment of publication—just doesn’t work for most people.
There are also little things about how the system works that don’t help. Take turning off notifications for a specific channel from the notification bell menu in the YouTube apps: doing so doesn’t revert to the default algorithmic notification setting, it stops ALL future notifications from that channel.
These are relatively small usability issues, but they add up. When YouTube says people aren’t using notifications or the subscription tab, my view is that it’s not because they don’t want to—it’s because the features don’t work well for them. I’ve looked at my own analytics and have seen the same trend. Even though my subscriber count has nearly doubled since 2018, the thumbnail impressions I get from the subscription tab continue to decline. It’s not for lack of interest. It’s just that the tool itself hasn’t evolved in a way that supports how people actually watch videos.
YouTube’s algorithm clearly does a good job keeping people on the platform. But some of us prefer a little more manual control over what we see and when. I for one like to see what I’m choosing not to watch. A few thoughtful changes—both to how notifications behave and to the usability of the subscription tab—could go a long way for viewers who use YouTube as their primary video platform.
I’ve been following the developments in over-the-air television closely, and something interesting is brewing—what looks like the early stages of a format war. Right now, the major U.S. broadcasters are backing ATSC 3.0, but it hasn’t been smooth sailing.
Now, a company called HC2 Broadcasting Holdings, which owns about 60 low-power TV stations across the country, is asking the FCC for permission to use a different technology entirely: 5G TV. Instead of sending out ATSC 3 signals, they want to use the same spectrum to transmit using mobile phone standards, essentially turning TV broadcasting into a data service compatible with 5G phones and presumably set-top boxes. I dive more into what 5G TV is all about in my latest video.
If the FCC allows low-power stations to use 5G technology, we might be looking at two separate approaches to the future of free over-the-air TV. 5G TV’s backers hope the technology can be built cheaply into mobile phones, which are in far more hands than the antennas people attach to televisions for over-the-air viewing.
The way 5G TV works is relatively straightforward. It uses the same modulation as 5G mobile data but is designed for one-way communication—broadcasting data like video streams without requiring a return signal. It would operate in the 470 to 698 MHz range, which is the same frequency band currently used by low-power TV stations. That makes the transition more feasible from a technical perspective, assuming the FCC gives the green light.
What HC2 seems to be banking on is a future where phones can tune into TV signals without a mobile phone service subscription. In theory, a $20 prepaid phone from a big-box store could be enough to access live television and emergency broadcasts. That’s a significant departure from ATSC 3.0, which currently restricts playback to approved TV boxes.
This 5G-based approach also offers flexibility beyond just video. Since it’s essentially an IP data stream, broadcasters could use it to push all kinds of content. Speeds wouldn’t be blazing fast—maybe 10 to 25 Mbps—but that’s more than enough for several video channels. There’s also potential for emergency communication. In a stadium, for example, people could receive live camera angles or evacuation instructions without clogging up traditional mobile networks.
At the moment, there are no consumer devices that can tune into these 5G TV signals. So if a format war is on the horizon, 5G TV’s backers will have a lot of catching up to do with ATSC 3.0, which, while flawed, has been shipping in higher-end televisions for a while now. But given that 5G TV runs over the same technology mobile phones already use, it shouldn’t be a heavy lift to add it to next-generation handsets.
Most of HC2’s stations aren’t broadcasting high-quality content today; many just loop infomercials or retransmit cable channels, often in standard definition. Pivoting to datacasting might offer them a more profitable path forward with a larger potential audience, especially if they can license out access or offer value-added services through the new format.
It’s worth watching how the FCC responds. If they approve the request, these broadcasters would have the option to pursue 5G TV instead of sticking with ATSC standards. That kind of flexibility could open the door to innovation, or at least force a new conversation about what over-the-air TV should look like moving forward.
My daughter Ellie really wanted to try her hand at product reviewing. So I let her take over my YouTube channel for April Fool’s Day! Ellie wanted to do her own review of the Hideal magnifier we looked at in my most recent Amazon haul video.
Plex officially released its new mobile app, implementing the UI changes from the beta version I previewed last November. Along with this release, Plex made some significant changes affecting personal media users running servers on the free tier.
I found the new app is mostly the same as the beta we looked at previously. One important addition is improved support for TV tuners for live TV: users can now easily switch between their antenna channels and Plex’s free streaming channels directly within the app. Recordings can be set for over-the-air channels too, but they can’t yet be scheduled the way they could in the previous version of the app.
Another big change is moving music and photos out of the main app. Plexamp now handles personal music libraries, and Plex Photos manages photo libraries. Both are standalone, free apps. Plexamp has additional features for Plex Pass users.
With this new app come changes to how free tier users stream or share media outside the home, along with the first price increase for Plex Pass in nearly a decade. Starting April 29, the cost of a lifetime Plex Pass goes up to $249, a yearly pass to $69.99, and monthly to $6.99. Users can get a Plex Pass at the old price before April 29 using my affiliate link.
For users on the free tier, a new “Remote Watch Pass” is required for remote streaming or for sharing with others outside their home network. This pass costs $2 per month or $20 annually. If the server owner has a Plex Pass, free tier users can continue accessing that server for free. Likewise, a Plex Pass user won’t incur fees when accessing a server that is on the free tier. Plex did eliminate the mobile unlock fee previously required for Android and iOS users to stream remotely without limitations.
I have definitely heard from users who are having trouble with the new app or don’t like the changes. But this is the direction Plex is taking to keep the product sustainable into the future. I was a Plex user long before they were a sponsor on the channel, and I’m still happy with it. But there are alternatives for those who disagree with the direction they’re taking.
Disclosure: The video attached to this post was a paid sponsorship from Plex. However they did not review or approve it before it was uploaded. All opinions are my own.
I recently picked up a pasta pot from Gotham Steel on the Flip app, mostly because I liked the convenience factor. My kids go through a lot of pasta, and this pot has a built-in straining lid that locks in place with a couple of twistable dials. No need for a separate colander, which means fewer dishes for me to wash. Sounded like a win.
But when I opened the box, I found something unexpected: an arbitration “agreement.” It wasn’t buried in fine print either—it was front and center, making it clear that by using the pot, I was giving up my right to sue the company. If I got injured or had any kind of problem that might lead to legal action, I’d be required to go through binding arbitration. That means no judge, no jury, and no chance to join a class action suit.
I didn’t see any obvious way to opt out of it, either. Sure, I could return the pot to Flip, but there didn’t seem to be any way to reject the agreement and still keep the product. And from what I could tell, simply using the pot amounts to accepting those terms. It should be noted that the arbitration requirement isn’t visible until the product is unpackaged, as the agreement was resting inside the pot. There was also no mention of the arbitration requirement on the Flip product page prior to sale.
Naturally, I started digging to understand why a pasta pot would come with this kind of legal baggage. Turns out Gotham Steel was the target of a class action lawsuit a few years ago. Customers alleged that their non-stick cookware didn’t live up to the marketing, and while I don’t know how much the company had to pay out, it seems to have been enough to prompt this arbitration requirement. Now, instead of addressing future customer grievances in court, they’re steering everyone into arbitration from the start.
I’ve never come across something this blatant in kitchenware before. Maybe other companies have similar clauses tucked away in their manuals, but this was right there when I opened the box. It raises the question of how many more everyday consumer products are starting to do this, and what it means for consumers in the long run.
I’m curious what people with legal backgrounds make of this kind of clause—whether it’s enforceable or just intimidation. Either way, it’s a small but telling example of how consumer rights seem to keep getting chipped away, even when we’re just trying to make a pot of pasta.
I’ve been following Framework’s hardware ecosystem for a while now, ever since they sent over their Chromebook a couple of years back. It’s easily the most upgradeable Chromebook I’ve used—I brought its initial 8GB of RAM up to 64GB, and storage is just as easy to swap. Every single part is available for purchase, and users can upgrade or repair the laptop with a single included tool.
Recently, while browsing their website, I noticed they had a sale on replacement mainboards. That got me thinking: could I turn this Chromebook into a Windows laptop and make a desktop “Chromebox” out of the Chromebook’s mainboard? That’s what we set out to do in my latest video, the first of a two-part series.
I picked up Framework’s Cooler Master case, along with the necessary components for the Windows laptop: a Core Ultra 5 mainboard, RAM, and a Wi-Fi card.
You can see the full process in the video. From start to finish, it took roughly an hour. Most of the project went off without a hitch, with the only challenge being the Wi-Fi components.
While shopping for parts, I discovered that the Chromebook’s Wi-Fi antenna assembly is different from the Windows version: the antenna cables exit from the opposite side. That made routing them tricky in the desktop case, which expects the cables to come from the other direction, but the assembly itself fit the case perfectly.
Once everything was in the desktop case, I decided to power up before sealing things shut. Having read about Andrew Myrick of Android Central’s experience with a similar project, I was expecting some extra steps, but to my surprise the system came to life without any.
The desktop case is thin and wide, and due to its cooling system design it can’t lie flat. But it does come with a stand for propping it up on a desk, along with VESA mount hardware stored inside the stand that lets you attach the case to the back of a display.
In part 2 I’ll be converting the Chromebook to a Windows laptop. The big question mark will be whether or not the Chromebook keyboard will work on Windows or if I’ll have to order a replacement keyboard. Stay tuned!
Disclosure: The Framework Chromebook was provided to the channel free of charge. The replacement parts I ordered for this project I paid for with my own funds. This is not a sponsored review and no one reviewed or approved this video and post before they were uploaded. All opinions are my own.
I’ve been spending some time lately testing out high-speed Ethernet adapters, and the latest one to land on my desk is a 10-gigabit model from a company called Maiwo. It connects over Thunderbolt or USB 4, and is plug and play on the Mac and Linux. Windows users may want to approach with caution due to the sketchy nature of the driver installation process. More on that below or in my latest video review.
At around $139, it’s priced similarly to the OWC 10-gig adapter (compensated affiliate link), which comes from a more well-known brand. That model has been my go-to for years—quiet, fanless, and consistently reliable. By comparison, the Maiwo unit feels more like a standard, lower-cost alternative. It does have a fan, which only spins up when the adapter is under load, but when it does kick on, it’s noticeably loud. During testing, it was even louder than the mini PC I had it hooked up to.
Performance-wise, it holds its own. I tested it on a Mac with a 10-gig internet connection from Comcast. The download speeds were in line with what you’d expect from a 10-gig setup. Uploads were a bit slower, likely due to network variables, but local network tests using iPerf showed solid speeds in both directions, close to the full 10 gigs.
It runs a Marvell AQC113 chipset, also known as the Aquantia chipset. For Windows, the only way the manufacturer offers the required driver is by messaging them through Amazon for a download link. That link leads to a non-secure site in Chinese, and even then, the download often fails. The error translates to a simple timeout, but frankly, I’m not sure I’d be comfortable installing a driver from that source anyway.
Usually, if a product doesn’t impress, I skip making a video altogether. But in this case, the way the driver distribution is handled for Windows users is problematic enough that I felt it warranted a closer look—and a warning. If you’re on Mac or Linux, it might be a workable budget option. For Windows users, though, it’s probably best to look elsewhere.
Disclosure: Maiwo sent me the adapter free of charge, however they did not review or approve this content before uploading and no other content was received. All opinions are my own.
My latest video review is of the new GMKTec EVO-X1. It’s built around AMD’s Ryzen AI 9 HX 370 processor and is clearly aiming for the higher end of the mini PC market—not just in terms of performance, but price as well. At the time of testing, this unit retailed for about $892. It’s not cheap, but high-performance mini PCs generally don’t come with low price tags.
The unit I looked at came with 32GB of LPDDR5X RAM running at 7500 MHz. That memory is soldered on and not upgradeable, a tradeoff this AMD platform makes for the best memory performance. There is a 64GB version available (compensated affiliate link), so I’d recommend picking the configuration that best suits your needs up front.
Physically, the EVO-X1 is nicely compact and has a clean design with subtle RGB lighting. The lighting is barely noticeable unless you’re in a dark room.
On the storage side, there are two NVMe slots—both PCIe 4.0—which allow for some flexibility. The one I tested came with a 1TB drive, and you can have a maximum of 4TB in each slot for a total of 8TB. Great for dual booting operating systems.
Up front, there’s an OCuLink port, which allows attaching a desktop GPU or any PCIe card with the right breakout board. There’s also a 40 gigabit USB4 port that supports Thunderbolt 3 devices, including GPUs. In a previous video, I tested the system running both an OCuLink and a Thunderbolt GPU simultaneously, which is an interesting capability for a device this size.
Additionally, it has two 10Gbps USB-A ports on the back, two more on the front, HDMI and DisplayPort outputs, and dual 2.5Gb Ethernet ports, both powered by Intel controllers. That said, Wi-Fi performance was not great: the unit supports Wi-Fi 6, but I saw significantly lower throughput compared to other Wi-Fi 6 devices in the same physical location. Wired Ethernet is definitely the better option here.
Booting into Windows 11 Pro (which comes pre-activated on most GMKtec systems), the system idled at around 8.4 watts of power consumption—pretty efficient. But the fan was active even at idle. Under load, the fan noise ramps up noticeably. Cooling is aggressive, which helps prevent thermal throttling, but it comes at the cost of constant fan noise. If quiet operation is a priority, this may not be the best choice.
Web browsing was smooth, as expected. The system handled 4K 60fps YouTube playback with a handful of dropped frames. Video editing in DaVinci Resolve is doable, especially for simpler tasks like cross dissolves and basic effects. Once I started layering on more intensive effects, some lag was noticeable, but for basic YouTube-style content creation, the performance was quite serviceable.
Since AMD is positioning this processor as an “AI” chip, I ran a local large language model, DeepSeek’s 8B-parameter model. It worked well enough, though it relied solely on the CPU rather than the integrated GPU or NPU, so performance was a bit slower than on systems with dedicated GPU acceleration. Still, for light AI workloads, it’s passable.
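If you want to check how a model is running on a machine like this, Ollama’s CLI makes it easy to see. A quick sketch, assuming Ollama is installed and using the deepseek-r1:8b tag from the Ollama library (the exact tag I used may differ):

```bash
# Run the model with timing stats; --verbose prints the eval rate in tokens/sec.
ollama run deepseek-r1:8b --verbose

# In a second terminal, confirm where the model is loaded.
# On a machine like this one, the PROCESSOR column reports CPU rather than GPU.
ollama ps
```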
Gaming was a surprisingly solid experience. Cyberpunk 2077 ran at around 55 fps on low settings at 1080p. No Man’s Sky managed to hit 60 fps most of the time, also at 1080p and low settings. These results are particularly impressive considering everything was running on integrated graphics. Advanced retro emulation should also be well within its wheelhouse.
I also gave Linux a spin using the latest version of Ubuntu. Everything worked right out of the box—video, audio, Bluetooth, and Wi-Fi. Performance was consistent with what I saw on the Windows side, and with the two NVMe slots, dual-boot setups are easy to configure.
So overall, I walked away impressed with the performance and expandability of the EVO-X1, even if the fan noise was hard to ignore. It’s not for everyone, especially given the price, but it has a lot to offer for those who want serious performance in a small form factor—and don’t mind a little whirring in the background.
I’ve been following the rough rollout of ATSC 3.0—also known as NextGenTV—for a while now, and this week the transition hit another bump in the road. A dispute over tuner mandates has surfaced between two key players in the process: the Consumer Technology Association (CTA), which represents electronics manufacturers, and the National Association of Broadcasters (NAB), which represents TV broadcasters. I dive into this in my latest video.
The disagreement is notable because these two organizations have worked closely to get this new standard off the ground. Even the NextGenTV logo consumers see on compatible equipment is a registered trademark of the CTA, not the NAB.
Recently, the NAB asked the FCC to push the transition forward, proposing a 2028 cutoff for the current standard in major markets. That proposal included several desired mandates. One, which I mentioned previously, would require manufacturers to include ATSC 3.0 tuners in TVs well before that deadline. But there were a few other items tucked into the request. For instance, the NAB wants the FCC to require that remotes with buttons for services like Netflix also have buttons for broadcast TV. They also want broadcast content to be featured prominently in on-screen menus—right up there with paid placements from streaming platforms.
This is where the CTA pushed back. Gary Shapiro, CTA’s CEO, took to LinkedIn with a public response. He accused the NAB of trying to force an unpopular product on consumers and manufacturers. He noted that less than 10% of Americans rely on antennas for TV and argued that these mandates would increase costs for everyone, especially at a time when affordability is a concern.
The CTA also began lobbying FCC commissioners directly. They brought along cost comparisons, pointing out that TVs with ATSC 3.0 tuners are significantly more expensive. They argue that additional costs—like those tied to licensing and DRM requirements—are part of why manufacturers are reluctant to include these tuners in their products.
And that’s been a sticking point all along. The tuners are pricey. They’re expensive to make and expensive to buy, largely because of how difficult it is to meet all the DRM requirements that come with ATSC 3.0. These restrictions make it tough for smaller companies to enter the market, which in turn limits consumer choice.
A good example is something like the HDTV Mate, a sub-$60 tuner that doesn’t meet the DRM standards. It’s more affordable than the few certified options, but because it doesn’t comply with the DRM, it’s not really part of the formal ecosystem. Without the DRM roadblock, I believe we’d already see a wider selection of tuners at better price points.
Broadcasters don’t seem likely to budge on DRM. The CTA seems less focused on that issue than on the broader economic impact of the mandates. Still, the lack of tuners—and the obstacles to building them—is at the heart of why this transition has been so slow.
Looking ahead, I don’t expect the FCC to go along with any of the mandates the NAB is pushing for. It’s hard to imagine this FCC chairman telling manufacturers how to design their remotes or menu layouts. But the broader transition to ATSC 3.0 is probably going to keep moving forward. If nothing changes, over-the-air TV might become even harder to access, which could lead to its gradual disappearance. That might suit some interests, especially if the valuable spectrum currently used by broadcasters gets reallocated or repurposed.
It didn’t have to go this way. With more affordable tuners and fewer restrictions, we might have had a more vibrant market by now—even if it was a small one. But instead, we’re left with a limited selection of costly devices and a standard that’s tough for both consumers and developers to embrace.
I’m not giving up on the DRM issue, and if you’re concerned too, there’s a way to weigh in. You can visit my instructions here to file a public comment with the FCC. I’ll be following this docket closely, and I expect more developments as the FCC begins formalizing its approval process for the transition. Public comment periods and even field hearings are likely on the horizon. I’ll keep watching.
I recently uploaded another Amazon haul, covering about two and a half months’ worth of gear—most of it affordable, a mix of items sent via the Amazon Vine Program, a few from manufacturers, and some that I bought myself. You can see the full list of items at this compensated Amazon affiliate link.
Nothing here was sponsored or pre-approved, just a rundown of what caught my attention and ended up being useful or interesting enough to feature.
One of the highlights was a 10-port USB-C charger from Plugable. It doesn’t come with a power supply, but if you’ve got one that puts out 100 watts or so, it’ll charge multiple devices efficiently by prioritizing the ports from left to right. Great for overnight charging, especially in places like classrooms where you need to juice up a bunch of iPads or devices at once.
Along the same lines, I tried out a universal travel adapter from Minix with a built-in GaN power supply. It delivers up to 170 watts through its USB ports and has built-in plugs for various regions. It’s not a voltage converter, but as a compact travel solution, it packs a lot of utility.
Another interesting find was a clip-on Bluetooth speaker from SuperOne. It’s lightweight, wearable, and doubles as a personal voice amplifier if paired with the right app. It’s not going to fill a large room, but in smaller settings, it could come in handy.
For streaming stick users, I checked out a couple of inexpensive but useful accessories. One was an HDMI elbow adapter that lets your device sit flush with the back of a wall-mounted TV. The other was a 100 Mbps Ethernet adapter that works with Fire TV and other compatible devices. It’s not the fastest option out there—Wi-Fi 6 built into recent low cost streaming sticks is quicker—but it can offer a more stable connection.
I picked up a Bell & Howell power hub from the Flip app because it tickled some nostalgic memories of the heavy-duty AV equipment my school used in the ’80s. The design hints at the old-school gear they used to make, but the build quality doesn’t quite live up to that legacy, perhaps because the Bell & Howell brand is now owned by a private equity firm. Still, it offers USB ports on three sides and a retractable power cord, though the latter was a bit clunky in use.
I also tried out some Apple-focused chargers from Belkin. These included a MagSafe-compatible 5,000 mAh battery with a kickstand, a foldable 3-in-1 travel charging pad, and a smaller 2-in-1 version. Each charger handled phones, AirPods, and Apple Watches to varying degrees, depending on the model. They’re simple, well-built, and compact enough to throw in a travel bag.
There was also an electric candle lighter—one of those arc-style models you charge over USB. It worked, but the narrow arc gap makes it easy to gunk up with wax. Practical outdoors, maybe, but definitely not something I’d leave around kids.
Finally, I spent some time with a surprisingly decent digital magnifying glass from a company called HIDEA. It has adjustable zoom, takes photos, and even comes with sample slides. It’s not a professional microscope, but it does a good job for its price, and I found it fun to play around with.
That wraps up this batch. I tend to go through a lot of gear, but only a fraction makes it to the table. The rest doesn’t pass the sniff test. Once I’ve got enough new stuff worth showing, I’ll put together another round. These videos are always fun to make, and it’s good to see that people still enjoy watching them.
My latest video review is of the Plugable USBC-E5000 5 gigabit Ethernet adapter—something that’s still relatively uncommon compared to the more widely available 2.5 gigabit options. The unit supports 5 Gbps speeds when plugged into a 10 Gbps USB 3.2 port, meaning you don’t need Thunderbolt or USB4 to hit those higher transfer rates. You can see it in action here.
It’s powered by the Realtek RTL8157 chipset, which made setup a smooth process on macOS and Linux. Windows was a bit different. It recognized the device without needing a manual driver install, but initial download speeds didn’t meet expectations. Installing the drivers directly from Plugable’s site resolved that issue. I’d expect Windows to eventually update with better out-of-the-box support.
That chipset choice makes a difference. A few years back, I tried similar 5 gig adapters using less reliable chipsets, and the experience wasn’t great. This one worked consistently across all three major operating systems. It also worked with a few of my smartphones, although I found performance better on iOS vs. Android.
It’s worth noting that while this is a 5 Gbps adapter, it also scales down to 2.5 Gbps, 1 Gbps, and even 100 Mbps depending on the network switch it’s connected to. However, to get the full 5 Gbps performance, the USB port has to support 10 Gbps throughput. Plug it into a slower port, and you won’t get top speeds.
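On Linux there’s an easy way to verify that both links negotiated where you expect. A quick sketch (the interface name below is a placeholder; yours will differ):

```bash
# Confirm the adapter attached at 10Gbps USB speeds (look for "10000M" on its entry):
lsusb -t

# Confirm the Ethernet side linked at 5Gbps (expect "Speed: 5000Mb/s"):
sudo ethtool enx00e04c680001 | grep Speed
```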
Once I had it connected to my Mac, I ran a speed test using my 10 Gbps internet connection. The results were in line with what I expected from a 5 gigabit connection—downloads and uploads both performed well, taking into account the usual network overhead. I saw similar performance on my Windows and Linux machines.
There’s not much else to the product. It does what it says. It’s compact, has indicator lights for link status, and so far it’s been reliable. Plugable is also a U.S.-based company with domestic support, which might be a consideration for those who like knowing there’s someone they can reach out to if anything goes wrong. Most of their products, including this one, come with a two-year warranty.
If you’re looking to move beyond 2.5 Gbps over USB and want a relatively straightforward upgrade, this might be something to keep on your radar.
My latest laptop review is of the ASUS P5, a business-focused machine. The version I tested is configured with an Intel Core Ultra 7 258V processor, 32GB of non-upgradeable RAM, and a 1TB NVMe SSD. There’s also an extra 2230-sized slot if you’re looking to expand storage or do something like a dual boot setup with Linux. The price as tested comes in at around $1,200 (compensated affiliate link), though there’s a lower-tier version with a Core Ultra 5 and less RAM for roughly $1,000. Prices will likely shift as the year progresses, so it’s worth shopping around. You can also find them at Amazon, where the price is always varying (compensated affiliate link).
The P5 has a 14-inch LED display with a 2560×1600 resolution and a 144Hz refresh rate, which was set to 60Hz by default but easy to switch. The screen brightness tops out at 400 nits—decent enough for a business-oriented machine but not incredibly bright. Color accuracy is also solid with 100% sRGB coverage, which should work fine for light creative tasks.
The build feels light at 2.8 pounds, and while the chassis is slim and portable, it comes at the expense of some flex in the keyboard deck. That said, the keyboard itself is well-sized, backlit, and pleasant to type on. The trackpad tracked well and felt solid—no complaints there.
In terms of ports, you get two Thunderbolt 4 ports which also work with USB-C devices, a full-size HDMI port, two 10Gbps USB-A ports, a headphone/mic combo jack, and a Kensington lock slot. The laptop doesn’t include Wi-Fi 7 but does support Wi-Fi 6E, which was more than sufficient during testing. The speakers are downward-firing and fine for casual use—especially calls and voice content—though headphones are still preferable for richer audio.
Biometrics are handled through both the webcam, which supports Windows Hello, and a fingerprint sensor embedded in the power button. The webcam is 1080p and includes some AI-driven enhancements through ASUS’s software suite. It also has a physical privacy shutter.
Battery life was solid. I was able to get close to 10 hours with light productivity tasks and lower screen brightness. It’s possible to squeeze out even more longevity depending on the workload. More intensive tasks like video editing or gaming will drain it faster, but the battery held up well throughout a full workday when used conservatively.
Speaking of AI features, ASUS includes its AI Expert Meet software, which can transcribe and summarize meetings directly on the device. The transcription worked offline using the NPU, and the summarization ran on the Intel processor’s GPU. It wasn’t particularly fast or accurate, especially when multiple speakers were involved, but it’s a useful tool that doesn’t rely on cloud access or subscriptions.
Performance-wise, web browsing was smooth with responsive page loads. YouTube playback at 4K/60fps dropped a few frames early on, but nothing disruptive. Benchmark scores in line with similar laptops confirmed that it holds up for general tasks. Basic video editing is also possible—simple projects like stringing clips together ran without issue, though more demanding workflows would require a more powerful PC or an external GPU via Thunderbolt.
Gaming was possible at lower settings. Cyberpunk 2077 ran between 25 and 35 FPS at 1080p on low settings, and 720p ran a lot better. Given the lack of a discrete GPU, it’s amazing how far integrated graphics have come: benchmark scores were comparable to a discrete GTX 1650 Ti from just a few years ago.
Thermal performance held up under load. The system passed a 3DMark stress test with a 98.5% score and stayed impressively quiet: the fan only kicked in during intensive tasks like gaming and was otherwise silent.
One area where the laptop didn’t perform well was Linux compatibility. I booted into Ubuntu 24.10 and found that Wi-Fi, Bluetooth, and audio didn’t work. That was a surprise given that a similar ASUS VivoBook had no issues. It’s most likely a driver situation, so expect some troubleshooting if you’re thinking about switching to or dual booting Linux.
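For anyone attempting that troubleshooting, these are the first places I’d look on any Linux install with dead Wi-Fi or audio. This is a generic sketch, not a confirmed fix for this particular laptop:

```bash
# See which network and audio hardware is present and whether a kernel driver claimed it:
lspci -nnk | grep -iA3 -E 'network|audio'

# Check the kernel log for missing-firmware complaints:
sudo dmesg | grep -i firmware
```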
Overall, this laptop doesn’t stand out visually, but it offers reliable performance and some features that business users might appreciate—like the three-year warranty and nice display. Depending on what you’re looking for, this one might be worth keeping an eye on as prices shift.
Disclaimer: The laptop was provided on loan from Asus. No compensation was received for this review, and no one reviewed or approved this post or my video before it was uploaded.
In my latest video, I take a look at UTM, a free and open-source virtualization app for macOS that lets users boot Windows, Linux distributions, retro operating systems, and even other instances of macOS. UTM provides an efficient solution without the licensing constraints and bloat of commercial alternatives like Parallels or VMware Fusion. You can find it for free on their website.
UTM is built on QEMU, an open-source emulation framework, and supports both virtualization and emulation. When running ARM-compatible operating systems in virtualization, such as Windows 11 ARM or an ARM-based Linux distribution, the performance is close to native. Emulating x86 based and other operating systems is slower but still functional.
I tested UTM on the M2 MacBook Air I purchased about two and a half years ago. This machine remains powerful enough for my needs both in macOS and in virtual environments. If you’re considering one of these machines, there have been some great deals lately, with prices dropping as low as $700 in some cases (compensated affiliate link).
Setting up Windows 11 ARM in UTM was straightforward. The software doesn’t provide the operating system itself, but with tools like CrystalFetch, downloading the necessary installation files from Microsoft was simple. Once installed, Windows 11 ARM supports running both 32-bit and 64-bit x86/x64 apps through Microsoft’s built-in translation layer. This allows for smooth execution of many legacy applications, such as Microsoft FoxPro, which I demoed in the video. However, gaming performance is a different story: Windows in UTM doesn’t have GPU passthrough support, so graphically demanding applications won’t run well.
On the Linux side, UTM provides pre-configured images for quick setup. With GPU acceleration enabled for Linux, some applications run more efficiently than on Windows. File sharing between macOS and the virtualized system is also simple through the use of a shared folder, though not as seamless as drag-and-drop functionality in commercial alternatives.
UTM also allows users to emulate older operating systems designed for different processors, including Windows 95 and classic PowerPC macOS versions. Running a fully configured Windows 95 installation on a modern Mac was a fun exercise in nostalgia, complete with old files and applications from a backup of my college laptop from 1998.
The customization options in UTM are extensive. Users can tweak system configurations down to CPU architecture, RAM allocation, network adapters, and sound drivers. While this level of control can be overwhelming, many UTM users are sharing pre-built system images that offer a great starting point.
For anyone looking for a lightweight, cost-free virtualization tool on a Mac, UTM is worth trying. Whether you need occasional access to Windows, a Linux development environment, or even a retro computing setup, UTM provides a flexible and powerful option without the cost or complexity of commercial alternatives.
I decided to try something unconventional: attaching an eGPU to the system bus of a budget GMKTec G3 Plus mini PC (compensated affiliate link), curious to see if I could push the limits of what these sub $200 PCs are designed for.
The G3 Plus lacks the typical external graphics connections like USB4 or Thunderbolt. Instead, I used a workaround, an NVMe to OCuLink adapter (affiliate link), to see if this approach could effectively attach the eGPU to the system bus.
The setup process was straightforward. The G3 Plus has two storage slots: one for its included NVMe system drive and a second for M.2 SATA disks. I imaged the existing system drive onto an M.2 SATA drive to free up the NVMe slot, then installed the OCuLink adapter. The external GPU, a GMKTec AD-GP1, has an AMD RX 7600M XT along with OCuLink and USB4/Thunderbolt connection options.
Other eGPUs can work too if you get an OCuLink PCIe slot adapter like the one I demoed in this video a few weeks ago. Many of the OCuLink NVMe kits come with the PCIe adapter. The cool thing about this is that you can interface just about any PCIe card with the PC.
Installation was simple, with the OCuLink adapter inserting just like any NVMe drive would. The G3 Plus’ slots are accessible from the top of the PC, which made all of this easy.
It’s always a little nerve-wracking when the “moment of truth” arrives. To my surprise, the system recognized the GPU right at boot and Windows 11 loaded without issue. Installing AMD’s Adrenalin drivers confirmed the GPU was fully detected, and from there, it was time to see how well it performed.
Cyberpunk 2077 was the first test, running at 1080p with ray tracing set to low. The system delivered frame rates in the range of 45 to 50 frames per second, but the CPU quickly became the bottleneck. The GPU utilization never exceeded 62%, demonstrating the limitations of pairing higher-end graphics hardware with a budget processor. Disabling ray tracing barely improved performance.
Red Dead Redemption 2 told a similar story. At 1080p on the lowest settings, frame rates fluctuated between 30 and 45 fps, depending on the complexity of the environment. The CPU remained fully maxed out, while the GPU hovered at just 30% utilization. This was a clear example of how throwing a powerful GPU into a low-end system doesn’t always yield massive gains.
Doom Eternal, however, showed a different side of the experiment. Running at 1080p with the lowest settings, the game reached 128 frames per second, dropping to the 90s in more demanding scenes. The GPU was significantly more engaged in this title, reaching 70% utilization. Turning on ray tracing caused a minor performance drop but still delivered a smooth experience, proving that certain games benefit much more from a powerful GPU than others.
Benchmarking with 3DMark’s Time Spy test revealed a significant GPU-driven boost. Without the external GPU, the mini PC scored 450 points. With the eGPU attached, the score jumped to 6,449, a stark difference that reinforced the impact of the external GPU—when the workload allowed it.
Beyond gaming, I tested an AI large language model with Ollama to see how well the setup could handle AI-based tasks. Running an 8-billion-parameter model, the GPU took over completely, rapidly generating text using its compute power and 8GB of video memory.
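Ollama also exposes a local HTTP API, which is handy if you want to script tests like this rather than type at the interactive prompt. A hedged sketch, with the deepseek-r1:8b tag standing in for whichever 8B model you’ve pulled:

```bash
# Send a single prompt to the local Ollama server and get one JSON response back:
curl http://localhost:11434/api/generate \
  -d '{"model": "deepseek-r1:8b", "prompt": "Why is the sky blue?", "stream": false}'
```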
While this is not the most practical configuration, the experiment demonstrated the versatility of mini PCs when expanded through OCuLink. Despite some limitations, it was surprising to see how plug-and-play the process turned out to be.
Disclosure: I purchased the Mini PC with my own funds and GMKTec provided the eGPU to the channel free of charge for my prior review. No other compensation was received and they did not review or approve my content before it was uploaded. All opinions are my own.
UPDATE: Google has now rolled out a fix for Chromecast users who reset their devices. From their latest blog post:
For users who have performed a factory reset, you will need to update your Google Home app to the latest version (3.30.1.6 for Android and 3.30.106 for iOS) to set up your Chromecast (2nd gen) or Chromecast Audio device again. The app roll out has begun and may take a few days to roll out to everyone. We’ll post a confirmation once the roll out to all users is complete.
Users who did not reset their devices were updated a little earlier this week with an over the air firmware update. Below is the background on the situation.
I recently noticed an unexpected wave of comments on my Chromecast video from a few months ago about the second-generation Chromecast devices suddenly failing to stream content. Mine stopped working too, displaying an error message that the device wasn’t trusted.
In my latest video, we take a look at this issue and what it might mean for other useful long-lifespan devices.
I was surprised by how many of these decade-old Chromecast dongles are still in use, although perhaps I shouldn’t be. Even the 1st generation Chromecasts handle 1080p output, support popular apps, and offer a simple interface that many consumers never felt a need to replace.
Google did respond quickly to the issue, posting a brief statement on its support pages urging users not to factory reset their Chromecasts. They later pushed out a fix that updated all of those non-reset Chromecasts, including mine, with what is presumably a new security certificate. But there is still no solution for those who did a factory reset prior to Google’s support guidance.
I’ve also been following similar concerns in the broadcast television industry, where the upcoming ATSC 3.0 standard allows for signal encryption that requires hardware-based certificates. Many of those certificates carry extended expiration dates, but the Chromecast situation serves as a reminder that even a 10-year window can seem short when a device is still perfectly functional. It would be unfortunate if these devices become e-waste simply because a DRM certificate lapses and can’t be renewed.
While the fix has given relief to those who didn’t reset their units, a portion of owners still have to wait for a workable solution. This case stands as a reminder of how dependent many gadgets are on ongoing support for restrictive DRM even when the hardware itself remains perfectly capable.
My latest review covers 8BitDo’s new Ultimate 2C controllers. As a follow-up to last year’s Ultimate C model, they introduce some upgrades, particularly Hall effect sticks and, on the PC version, Hall effect triggers, features previously reserved for higher-end models. The Switch doesn’t support analog triggers, so the Switch variant of the 2C has standard digital triggers. Both models have reliability improvements to prevent wear and tear from heavy gaming sessions.
One aspect that immediately stood out to me is how solid these controllers feel despite their budget-friendly price. They are well-balanced, with sturdy plastics that don’t feel cheaply made.
Latency performance was impressive. I ran my usual 240 frames per second slow motion video test, and the PC version connected via USB performed exceptionally well, matching the fastest controllers I’ve tested that cost far more. Wireless connectivity on the PC version is flexible, offering both dongle and Bluetooth options, although the dongle provided the lowest latency. Latency is higher on the Switch version in both wired and wireless configurations due to the Switch’s USB and Bluetooth controller interfaces.
But retro game fans will be disappointed with the directional pad. 8BitDo has refined their d-pad designs considerably over the years, but the Ultimate 2C feels like a regression. While its smooth rolling might appeal to fighting game enthusiasts, I found it problematic in titles like Zelda, where it introduced errant diagonals. It was hard to keep Link walking in a straight line.
There are some limited customization options on the 2C. Two additional buttons on the controller’s back allow mapping individual or multiple simultaneous button presses, though without the ability to save profiles or deeper software adjustments seen in other 8BitDo models. Most of the buttons can also have turbo functionality.
In modern gaming scenarios, the controllers performed well, with smooth analog controls and decent rumble feedback on both the PC and Switch. The Switch version also supports motion controls!
Overall, the Ultimate 2C controllers deliver considerable value for gamers who need to buy a bunch of controllers for couch co-op, or for kids who might be a little too rough on a more expensive gamepad. While retro gamers will find the d-pad limitations frustrating, these controllers offer a reliable and cost-effective choice for modern games.
Disclosure: 8bitdo distributor AKNES sent these to the channel free of charge. They did not review or approve the video or this review before publishing, no other compensation was received and all opinions are my own.
LibreOffice might be familiar to tech enthusiasts since it comes preinstalled in many Linux distributions, but it’s likely not as widely known to the general public. Unlike subscription-based office suites, LibreOffice allows full ownership and control of your files without requiring an internet connection.
Installation is straightforward. Users can head to libreoffice.org, download the appropriate version, and get started. In addition to supporting most operating systems, LibreOffice also has native support for Apple Silicon and ARM-based Windows devices. The interface has a classic look reminiscent of Microsoft Office before the introduction of the ribbon menu (although that interface is an option). It feels intuitive, with essential features easily accessible without extra layers of complexity.
The suite includes a word processor (Writer), a spreadsheet application (Calc), and a presentation tool (Impress), all of which offer compatibility with Microsoft file formats. Documents, spreadsheets, and slides created in Word, Excel, and PowerPoint open in LibreOffice with minimal formatting issues. However, some complex documents may require adjustments. LibreOffice also includes Base, a database application that supports ODBC but does not fully replace Microsoft Access. Other tools like Draw, for vector graphics, and Math, for creating complex formulas, round out the suite.
LibreOffice handles older files exceptionally well. Files created in early versions of Microsoft Office that are no longer supported by modern software can often be opened without issue. This makes it a valuable tool for those with archives of older documents they still need to access.
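It can even do those conversions in bulk from the command line, no GUI required. A quick sketch (the binary may be named libreoffice rather than soffice depending on your install, and the file names here are just examples):

```bash
# Convert one legacy Word file to ODF:
soffice --headless --convert-to odt --outdir converted/ old-report.doc

# Batch-convert a folder of old Excel files to .xlsx:
soffice --headless --convert-to xlsx --outdir converted/ *.xls
```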
One key limitation of LibreOffice is its lack of real-time collaboration. Unlike Google Docs or Microsoft 365, it does not allow multiple users to edit a document simultaneously. There is a basic collaboration feature in Calc, but changes appear only after saving, rather than in real time. Additionally, mobile integration is not as seamless. While apps like Collabora Office enable mobile editing, the experience is limited compared to cloud-based office suites.
Chromebook users can install LibreOffice through the Linux development environment. The process involves enabling Linux in Chrome OS settings and running a few simple command-line instructions to set up the suite. Once set up, LibreOffice runs locally, allowing offline document creation and editing without reliance on Google Drive or other cloud services.
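On a Debian-based Crostini container, those commands look roughly like this, run from the Chromebook’s Linux terminal after enabling the Linux environment in settings:

```bash
# Refresh the package list, then install the full LibreOffice suite:
sudo apt update
sudo apt install -y libreoffice
```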
LibreOffice provides a functional, no-cost alternative to mainstream office software. It offers full control over files without requiring cloud storage or monthly fees. While it lacks some modern collaboration features, it compensates with reliability, compatibility, and an interface that feels familiar to long-time office software users. For those who prefer working offline or want to avoid subscriptions, LibreOffice is definitely worth a try.