GeForce Now Gets a Linux Client

In my latest video, we revisit GeForce Now and take a look at the new official Linux client for Nvidia’s game streaming service.

This release is not as feature-heavy as some previous updates, but it represents a meaningful change for Linux users, who until now have primarily relied on browser-based access to the service. It follows the Steam Deck client I looked at recently.

GeForce Now is a subscription-based service that streams games users have already purchased from platforms such as Steam, GOG, Epic Games Store, Ubisoft, EA, and certain Xbox titles with PC versions. Xbox PC Game Pass titles can also be accessed if the user has an active subscription. Not every game in a user’s library is supported, as developers must opt in to cloud streaming, but the catalog covers many well-known titles. In my previous update we also looked at Nvidia’s new feature that allows users to install unsupported games too.

Because GeForce Now runs games remotely, client-side hardware requirements are relatively modest, making systems like this a practical test case. To test the client, I ran it on a very low-end system: an inexpensive mini PC powered by an Intel N100 processor. This is the type of super-low-cost hardware that simply can’t run modern games, which makes it useful for evaluating how much of a value-add a game streaming service can provide. Before the RAM crisis, this PC was selling for well under $200.

The service is offered in multiple tiers. The free tier supports up to 1080p at 60 frames per second, includes advertisements, limits sessions to one hour, and places users in a queue for access. The Performance tier increases resolution to 1440p at 60 frames per second, while the Ultimate tier offers access to higher-end GPUs in the cloud, enabling resolutions up to 5K and frame rates as high as 240 frames per second on supported games. Both paid tiers include a monthly cap of 100 hours, with additional time available for purchase once that limit is reached.

The install process for the new Linux client is functional but still feels a bit rough around the edges, particularly compared to more polished platform-native installers. The client is designed for x64-based PCs and currently targets Ubuntu 24.04 LTS. Installation is handled through a Flatpak package downloaded directly from Nvidia rather than through a distribution’s package manager. After downloading, the installer needs to be marked as executable before it can be run.
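For readers new to this step, the "mark executable, then run" sequence looks roughly like the sketch below. The installer filename is hypothetical; use whatever Nvidia's download page actually provides. The runnable portion demonstrates the same pattern on a stub script:

```shell
# Hypothetical filenames -- substitute the actual file downloaded from Nvidia:
#   chmod +x GeForceNOWSetup.flatpak
#   flatpak install --user ./GeForceNOWSetup.flatpak
# The same mark-executable-then-run pattern, shown on a stub script:
printf '#!/bin/sh\necho installed\n' > ./installer-demo.sh
chmod +x ./installer-demo.sh   # grant execute permission
./installer-demo.sh            # runs the stub, printing "installed"
```

On most desktop file managers the equivalent is a checkbox under the file's Properties > Permissions dialog.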

After installing the Linux client, I launched No Man’s Sky from my Steam library. As with other GeForce Now clients, the service spins up a remote PC instance running Steam, which allows cloud-synced save files to load automatically. In this case, my existing save was available without any additional steps.

Running at 4K and 60 frames per second, the game performed smoothly on the low-cost mini PC. Network statistics showed a latency of around 11 milliseconds from my home in Connecticut to Nvidia’s New Jersey data center. The system was connected via Ethernet, which remains the recommended way to use the service given the bandwidth demands of high-resolution game streaming, but a decent Wi-Fi 6 or Wi-Fi 7 access point should also deliver adequate performance.

I also tested the client earlier on a 1080p, 144 Hz display and was able to exceed 60 frames per second without issue, despite the limited client hardware. While the Linux client currently lacks support for features such as HDR and cloud-based G-Sync, it does support server-side options available to higher-tier subscribers, including DLSS and hardware ray tracing for compatible games.

There are some usability issues to note. Display scaling was not respected on a 4K desktop set to 200 percent scaling, resulting in very small interface elements. The interface also felt a bit slow and clunky on my low-end hardware, but thankfully the sluggishness went away once a game was loaded.

Overall, the Linux client delivers a more consistent experience than running GeForce Now in a browser and makes the service more accessible to users who have adopted Linux as their primary operating system. For those with lower-end hardware, it provides a way to run demanding games using remote resources, with performance that is largely dictated by network quality rather than local specifications.

ADTH’s ATSC 3.0 Box Woes Kill the Industry’s Arguments Regarding Over-the-Air TV Encryption

I’ve been spending the last few days reading through the filings in the FCC’s ATSC 3.0 docket now that the comment period has closed, trying to understand how broadcasters, device makers, and industry groups are framing the next phase of the over-the-air television transition.

While I was doing that, I went upstairs to check on my own ADTH tuner, a device that’s supposed to handle encrypted ATSC 3.0 channels without needing an internet connection. It wasn’t working. Encrypted channels wouldn’t tune at all, and the box was throwing content protection errors that hadn’t been there before.

In my latest analysis piece, I talk about how widespread problems tuning encrypted channels on this box popped up just as the industry was insisting there were no concerns with DRM.

That problem sent me down a familiar path. ATSC 3.0 is the planned successor to today’s ATSC 1.0 broadcast standard, and on paper it brings technical improvements. In practice, the transition has been complicated by broadcasters choosing to encrypt free, over-the-air signals. That decision has narrowed consumer choice and added layers of complexity that simply didn’t exist before. The industry’s assurances that this system is mature and reliable don’t line up with what I’m seeing in my own home.

One of the filings I reviewed came from ADTH itself. The company strongly supports the transition and argues that there are no real technical barriers to consumer devices receiving encrypted broadcasts. Encryption and digital rights management, they say, are routine in modern electronics.

That’s hard to square with my experience. After repeated errors, I tried a factory reset. Instead of fixing anything, the device dropped into a boot loop, endlessly scanning channels and rebooting. Even with an internet connection restored, it refused to recover. At that point it stopped being a TV tuner and effectively became a brick.

What made this more than a minor inconvenience was timing. We were in the middle of a significant snowstorm, the kind of situation where over-the-air television has historically been a reliable source of local information. Because the encrypted channels wouldn’t tune, that information simply wasn’t available on this device. And this doesn’t appear to be an isolated issue. I’ve heard from viewers and seen reports on Reddit and AVS Forum from people around the country whose boxes stopped working around the same time. Some even reported that disconnecting the internet made their tuners work again, which raises uncomfortable questions about how these systems are actually operating.

At the same moment consumer devices were failing, the group that oversees the encryption system, the A3SA, told the FCC it has seen no evidence of approved devices failing to work with encryption. They also suggested that any reported issues are generally resolved with firmware updates. That response glosses over a basic problem: firmware updates require an internet connection. Requiring internet access just to watch free, over-the-air television undermines one of broadcast TV’s core purposes, while adding cost and fragility.

The A3SA also describes itself as a “neutral, standards-based administrator.” From what I’ve seen, that neutrality is questionable. The group is made up of major broadcasters and has effectively decided which manufacturers can and can’t participate. SiliconDust’s HDHomeRun, a widely used network tuner, has been denied approval, while other devices with similar technical characteristics have been cleared.

Another theme running through the filings is piracy. Broadcasters cite tens of billions of dollars in losses and argue that encryption is necessary to protect their content. When you dig into the examples they reference, though, the picture changes. One high-profile piracy case they cite involved stealing encrypted signals from cable and satellite providers, not rebroadcasting free over-the-air signals.

Encryption, it appears, inconveniences only those who are viewing content lawfully – not the pirates.

Broadcasters also warn that without encryption they risk losing premium sports programming. Yet recent rights deals tell a different story. The NFL, NBA, MLB, NASCAR, and major college conferences have all committed to long-term agreements that keep marquee events on broadcast television for years to come. These deals were struck without any guarantee that over-the-air signals would be encrypted, which undercuts the argument that encryption is essential to retaining top-tier content.

The FCC has also raised questions in this filing round about consumer rights, particularly the long-standing right for consumers to record broadcasts at home for personal use. That right was established decades ago, but encryption complicates it. Circumventing DRM, even for lawful personal recording, can be illegal. The A3SA argues that internal rules already protect home recording, but those assurances are tied to current simulcasting requirements that may disappear. Once they do, the only remaining safeguards would be voluntary commitments from broadcasters whose financial incentives don’t necessarily align with consumer flexibility.

Underlying all of this is a business reality that the National Association of Broadcasters acknowledged more directly in its own filing. Encryption is about protecting retransmission fees, the charges cable and streaming providers pay to carry broadcast channels. Those fees have risen sharply over the years, and making free reception less convenient creates pressure to return to paid services. That strategy may make sense from an industry perspective, but it runs counter to the idea of broadcast spectrum as a public resource.

There’s also nothing in the current framework that limits encryption to a single system. The ATSC admits in its filing that multiple, incompatible schemes could emerge, adding yet another layer of confusion for viewers and device makers alike. At that point, the promise of ATSC 3.0 as a straightforward upgrade starts to look like something else entirely.

After reading the docket and dealing with a tuner that worked one day and failed the next, I’m left with the sense that encryption over the public airwaves is creating problems faster than it’s solving them. Broadcasters were granted access to spectrum at no cost, with the understanding that they would serve the public interest. Turning free television into a fragile, tightly controlled experience doesn’t seem consistent with that mission. I plan to file a reply in the FCC proceeding during the response window, and there’s more in these filings worth unpacking.

Stay tuned for more and see my full ATSC playlist here!

Nine Reviews in 24 Minutes – My Latest Amazon Tech Haul!

It took me six months but I finally pulled together enough random gadgets for my next Amazon Gadget Haul “lightning round” of product reviews!

Check it out here!

This time I have nine different devices to check out! A majority came in free of charge from their manufacturers, but this is not a sponsored review nor has anyone reviewed or approved this video prior to uploading. All product links below are compensated affiliate links.

The first item I looked at was the Ostation 2 Pro, a battery charging system designed for AAA and AA cells. It accepts nickel metal hydride batteries as well as Ostation’s own rechargeable lithium options, which provide a full 1.5 volts for devices that expect alkaline batteries. Batteries drop into the top and the unit mechanically feeds them into the charging bays, displaying charge status on a small screen. Once charged, batteries are deposited into a drawer at the bottom, making it easy to grab fresh ones. It can only charge two AA and two AAA batteries at a time, which limits throughput, and it does make some motor noise while operating, but it functions as a kind of battery inventory system that keeps everything in one place. The Pro version also includes a magnetic charging pad for Ostation flashlights, though the display features themselves don’t add much beyond status information.

Staying on the theme of power, I then moved to desktop chargers. One was the Joyroom Podix, a 140-watt GaN charger with two retractable USB-C cables built into the unit. It’s fairly large, which makes it less ideal for travel, but convenient for a desk setup where cables often go missing. A small display shows total power draw, and while it comes with a base and strong adhesive option to keep it in place, that adhesive feels aggressive enough to warrant caution on finished surfaces.

I also tried Anker’s new 160-watt Prime charger, which packs three USB-C ports and a built-in display into a wall charger. What sets it apart is app integration over Bluetooth, allowing real-time monitoring of power delivery and per-port configuration, including priority modes and wattage limits. It doesn’t offer remote access over Wi-Fi, but standing near the charger you can see detailed data about what each device is drawing. The physical design holds more securely in the outlet than some older Anker models I’ve used, and it’s likely to replace my existing everyday charger.

From power to input devices, I next looked at the Retro Fighters Hunter 360 controllers. These are modern replacements for the Xbox 360 controller, complete with Hall effect analog sticks and mechanical D-pad switches. They work on PCs and with the Xbox 360 itself, though the console requires a dedicated 2.4 GHz dongle per controller, as the controllers won’t connect to the 360’s built-in wireless system. Input lag was minimal both wired and wireless, and the build quality felt solid. Voice chat isn’t supported through the controller, which is one limitation for anyone still using those features on original hardware.

Next up is a Thunderbolt 5 dock from WaveLink. With Thunderbolt 5 docks now priced similarly to Thunderbolt 4 models, it makes sense to consider the newer standard even if your current computer doesn’t fully support it. You’ll be ready to go when upgrading your hardware to a Thunderbolt 5 model. The increased bandwidth allows for more demanding multi-display setups, and the dock offers multiple Thunderbolt passthrough ports along with USB-A, SD card, and audio connections. Ethernet performance was unfortunately typical of what I’ve seen on similar docks, with slightly reduced downstream speeds on macOS despite having a 2.5-gigabit port. Like most docks in this class, it relies on a large external power supply to deliver up to 140 watts to a connected computer.

Audio came up next with the latest Amazon Echo Studio which I purchased with my own funds. It’s smaller than earlier versions but still produces a wide, bass-heavy sound that feels substantial for its size. Beyond audio, it now serves as an entry point to Amazon’s Alexa Plus features, which include more conversational responses and, more interestingly, the ability to create smart home routines using plain language. I was able to set up a lighting routine simply by asking for it, without navigating menus in the app. While the assistant tends to be more verbose than earlier versions, the routine creation alone could save time for people who struggle with smart home configuration.

Another device aimed at productivity was the Plaud Note Pro, an ultra-thin voice recorder designed to live on the back of a phone. It records phone calls or ambient audio, stores hours of recordings locally, and syncs them to a phone app. From there, recordings can be transcribed and processed into meeting notes using built-in AI templates. While the subscription model and upselling are hard to ignore, the hardware itself is compact and practical, and the all-in-one workflow may appeal to people who want transcription and summaries without juggling multiple tools or learning AI prompt optimization.

The final item was more of a preview: the RØDECaster Video S. It’s a compact video switcher with multiple HDMI inputs, audio inputs including XLR, and the promise of features like NDI support. I didn’t review it in depth yet, but unboxing it gave a sense of how it might fit into lower-cost video production setups, especially compared to older switchers that haven’t seen updates in a while.

These haul videos don’t run on a fixed schedule—they happen when enough interesting items pile up—and this batch covered a wide range of everyday tech problems, from keeping batteries charged to simplifying workflows at a desk. You can check out prior editions here!

Abbott Lingo Review: Over-the-Counter Glucose Biosensor for Non-Diabetics

I like data. I spend a lot of time looking at analytics from my YouTube channel, telemetry from devices around my house, and usage stats from the services I rely on every day. What I don’t usually have access to is real-time data about what’s going on inside my own body. But at CES this year, I ran into Abbott and was provided with a two-week trial of their over-the-counter Lingo continuous glucose monitor for non-diabetics.

Check it out in my latest review!

The sensor sits on the back of my arm and sends blood sugar readings to my phone every few minutes. Abbott has long made glucose monitors for people managing diabetes, but Lingo is positioned differently. It’s aimed at people without a diabetes diagnosis who want more insight into how food, exercise, and daily habits affect their blood sugar.

The hardware itself was easier to live with than I expected. The applicator looks intimidating at first glance, but the actual installation was painless for me. I didn’t feel a prick or sting—just a click, and it was done. One small omission in my box was an alcohol wipe, so you’ll want to have one handy before applying it. Once attached, the sensor stayed firmly in place through showers and daily activity, to the point where I mostly forgot it was there.

After pairing it with my phone over Bluetooth, it took about an hour for the first reading to appear. From there, the app updates roughly every five minutes. The real value comes from seeing how those numbers change in response to everyday choices. Eat lunch, and you can watch the curve start to rise. Go for a walk, and you can see how even light exercise affects the slope and duration of that spike.

Lingo tries to make this approachable by translating glucose spikes into what it calls a “Lingo score.” The score reflects how high your blood sugar rises and how long it stays elevated. One evening, I had leftover sausage and pepper pizza. The resulting spike was sharp, dipped, then rose again as digestion continued, earning me a high score for that meal. Earlier in the day, a healthier and more protein-heavy turkey sandwich produced a much smaller, shorter-lived rise.

What surprised me most was how quickly this started influencing my behavior. Knowing that a certain food would likely generate a bigger spike made me think about timing—whether I could follow it with a walk—or whether it made more sense to choose something else. The app reinforces this by suggesting simple mitigations, like light exercise after eating, and by offering challenges focused on habits rather than calorie counting.

Logging matters too. The app isn’t asking you to obsess over nutrition labels, but it does want you to note when you eat, when you exercise, and even when you’re feeling stressed. If you use a smartwatch, some of that happens automatically. In my case, dog walks detected by my watch showed up in the app without any extra effort on my part.

There’s also an educational side, with recipes, short videos, and explanations designed to help you interpret what you’re seeing. The content feels more like guidance than instruction, which fits the overall tone of the product. This isn’t positioned as a medical device for diagnosis, but rather as a feedback tool. If you did see readings that looked concerning, that would be a conversation to have with a doctor.

Abbott sells Lingo as a two-week kit, along with options for longer durations. I found that two weeks yields enough data, making a longer-duration purchase unnecessary. The data doesn’t disappear when the sensor comes off, and you can export it or keep it in your health app for reference later.

After a few days of use, I had a much clearer picture of how my body responds to foods I already thought I understood. That awareness alone was enough to start nudging my choices in a different direction. For something that measures just one variable, it ended up saying a lot about daily habits I don’t usually think twice about.

The RAM Crisis Explained: An Interview with Framework’s Nirav Patel

The price of memory is climbing, and it’s not just a problem for people building a new PC. RAM for laptops, desktops, phones, and tablets is getting more expensive as AI data centers absorb an increasing share of global supply. To better understand what’s happening behind the scenes, I called up Nirav Patel, CEO of PC maker Framework, to talk through how this shortage developed and what it means for consumers over the next several months.

Check out the interview in my latest video!

Patel described the current situation as a classic supply-and-demand imbalance, but on a scale the consumer market hasn’t seen before. Only a handful of companies—Micron, SK hynix, and Samsung—manufacture most of the world’s DRAM, and expanding capacity requires massive capital investment.

“What we’re seeing right now is just a massive excess of demand relative to the supply available,” Patel said.

With AI servers commanding higher margins, manufacturers are prioritizing those customers, leaving consumer products with tighter allocations. That imbalance has been building quietly for years, but it became much more visible when Micron announced it was shutting down its Crucial consumer memory brand last month. For PC builders, Crucial had long been a reliable option. Patel said the decision made sense given current conditions.

“When memory is in allocation, it doesn’t make sense to compete with your own customers,” he explained, noting that Micron supplies chips not only to large OEMs like Dell and HP, but also to other consumer memory brands.

One reason Framework has been able to navigate repeated supply disruptions—from pandemic shortages to GPU crunches and now memory—is its modular design philosophy. Patel credited flexibility as a survival tool.

“We built the product to be modular, and that gives us a lot of flexibility to navigate these kinds of environments,” he said.

Because many Framework systems are sold as DIY editions, customers can source their own memory and storage when shortages hit, sharing some of the burden rather than leaving the company entirely exposed.

The uncertainty isn’t limited to pricing. Patel described a market filled with overlapping orders, canceled allocations, and even hoarding.

“It is actually very unclear to anyone what the true ground truth is in the market when it comes to the supply and the demand,” he said.

Companies are placing duplicate orders with multiple suppliers, unsure which ones will be fulfilled. That behavior, he noted, can make shortages appear worse than they ultimately are, at least until reality catches up.

Geopolitics are also playing a role. Chinese memory maker CXMT has historically been avoided by many U.S. companies due to sanctions and long-term sourcing concerns, but Patel said that’s starting to change. “If you’re not sure where you’re going to be able to get your memory in two months, you better go and qualify every possible source,” he said, adding that some major OEMs are now testing and approving parts they previously wouldn’t have considered.

For consumers, the immediate concern is quality as prices rise and supply tightens. Patel’s advice was straightforward: stick with established brands. He doesn’t expect major manufacturers to compromise their reputations to chase short-term gains.

“Those brands are not going to torch all of their credibility in this short window of time,” he said, though he acknowledged that lesser-known vendors may try to take advantage of the situation.

While memory is the biggest constraint right now, Patel doesn’t believe every component will remain scarce long term. If memory remains the bottleneck, other parts like GPUs and storage should eventually stabilize because they can’t be deployed without sufficient RAM. In the near term, however, he expects continued volatility as the market works through excess orders and misaligned expectations.

Looking further ahead, Patel pushed back on the idea that soldered or unified memory is a solution to shortages. Even systems that place memory on the same package as the processor often rely on separately sourced components. For Framework, modular memory remains central to its roadmap, especially during periods like this. “Buy what you can afford today,” he said, “and buy solutions that let you upgrade in the future.”

Patel emphasized uncertainty as the defining market feature of the moment. AI demand has reshaped how memory is allocated, and the consumer market is now competing in a space it no longer dominates.

Curious about Framework? Check out my Framework videos here and my other interviews here!

More DOS Game Fun: Unlocking the Potential of GOG’s DRM-Free DOS Games

While the ExoDOS project serves as a comprehensive effort to preserve nearly every DOS game ever created, its massive 638-gigabyte archive can present a significant barrier to entry for users seeking just a few specific titles. For those interested in acquiring only a few games without managing large downloads, the GOG platform offers a practical alternative.

In my latest retro video, I explore how the platform’s DRM-free policy allows users to extract game data from the default installation wrapper and migrate it to other environments, such as Linux, custom DOSBox configurations, or even original retro hardware!

To demonstrate this process, I picked the classic Wing Commander 2, which is available on the platform bundled along with the first game and expansion packs for approximately three dollars—a significant reduction from its original retail cost in 1990.

The extraction process varies slightly by operating system. On a Macintosh, rather than using the Galaxy client, I downloaded the offline backup game installer. After bypassing standard security prompts to install the legacy software, the game files are typically contained within the application package. By right-clicking the executable and selecting “Show Package Contents,” then navigating to the “Resources” and “Game” subfolders, users can locate the raw game data and executables. Moving them is as easy as copying them to a new directory or drive.
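For those who prefer the terminal, the same extraction can be done with a copy command. The bundle name and folder layout below are illustrative only; the real names depend on the game. This sketch builds a mock bundle so the copy step itself can be demonstrated end to end:

```shell
# Mock .app bundle standing in for the real GOG install (names are
# hypothetical -- inspect your own bundle with "Show Package Contents"):
mkdir -p "Wing Commander 2.app/Contents/Resources/Game/gamedat"
touch "Wing Commander 2.app/Contents/Resources/Game/WC2.EXE"

# The actual extraction step: copy the DOS game data out of the bundle
# into a standalone directory usable by any DOSBox variant or real hardware.
mkdir -p ./dosgames/wc2
cp -R "Wing Commander 2.app/Contents/Resources/Game/." ./dosgames/wc2/
```

Because the files are plain DOS data with no DRM, the copied directory is fully self-contained.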

I copied these files to a separate directory to test them with Boxer, an open-source DOSBox port for macOS that has been forked for compatibility with Apple Silicon. One functional advantage of this manual extraction is the ability to enable features not active in the default wrapper. In the case of Wing Commander II, the default installation uses Sound Blaster audio; migrating the files allowed me to configure Boxer to support the Roland MT-32 soundtrack.
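In Boxer the MIDI device is selected through its UI, but in a plain DOSBox-derived setup the equivalent lives in the configuration file. A sketch of the relevant sections, assuming a DOSBox Staging-style build with built-in MT-32 emulation (key names vary between forks, and the MT-32 ROM images must be supplied separately):

```ini
[midi]
# Route game MIDI to the emulated Roland MT-32 instead of the default synth
mididevice = mt32

[mt32]
# Directory containing the user-supplied MT-32 control/PCM ROM images
# (path is an example -- point this at wherever your ROMs live)
romdir = ~/mt32-roms
```

With that in place, a game configured for Roland sound during its own setup program will play its MT-32 soundtrack.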

The procedure on Windows is equally straightforward. After running the offline installer, the necessary game data—specifically Wing Commander II’s “gamedat” folder and root files—can be found directly in the installation directory, usually located on the C drive. While the folder may contain modern cloud save data or platform-specific wrappers, these are not required for the game to function in other environments. Just like the Mac version, the game files can simply be copied out of the installation directory.

To verify the portability of these DRM-free files, I transferred the extracted Wing Commander 2 data onto a Compact Flash adapter and loaded it into my 26-year-old college laptop running Windows 98. This test confirms that the software sold through GOG remains independent of the delivery mechanism, granting users the flexibility to execute the code on the hardware or emulator of their choice. It’s a rare example of true digital ownership!

BuzzTV Powerstation P6 Review: It’s Not an Nvidia Shield

I review a lot of TV streaming boxes, and for enthusiasts the Nvidia Shield has long been the reference point. It has been around since 2015 and remains a capable device for people running their own media servers with support for full 4K Blu-ray rips, including support for Dolby Vision and lossless audio formats. It also shares its core hardware lineage with the Nintendo Switch, which gives it enough performance headroom for gaming and emulation.

While attending CES, I came across a company called BuzzTV showing a device called the Power Station 6 that they said is more powerful than the Shield. I decided to purchase one to see how it performed and whether it could serve as a realistic alternative. Spoiler alert: it doesn’t.

Check it out in my latest review!

The model I chose was the least expensive version, which includes 8 GB of RAM and sells for just under $300. There are higher-end variants with 16 GB and even 32 GB of DDR5 RAM, with the most expensive version priced at around $500. At that level, it starts competing directly with compact Ryzen-based mini PCs, which generally offer more flexibility and stronger overall performance for similar money.

All versions of the Power Station 6 use the same Rockchip RK3588 processor. Storage on the base and mid-tier models is 128 GB, while the highest-end version includes 256 GB. There is also an SD card slot and an internal bay for an NVMe SSD, allowing for quiet, solid-state expansion. Physically, the unit looks appealing but feels lightweight and somewhat hollow. The port selection is reasonable, with USB 3.0, USB 2.0, USB-C, optical audio out, gigabit Ethernet, and HDMI output rated for up to 8K. In practice, however, its usefulness as a home theater device quickly runs into limitations.

In my testing, this was not a strong Plex client. Strangely, when I opened the box I was greeted with a warning that the Powerstation 6 is not to be plugged into a home theater receiver.

Dolby Vision was not supported, and lossless audio passthrough to my receiver did not work. While it can output 4K and 8K video, the lack of video and audio passthrough features means it doesn’t support the enthusiast-grade playback experience that the Shield is known for. That was disappointing given the price category this device occupies.

Performance is one of the few areas where the Power Station 6 shows some promise. In the 3DMark Wildlife benchmark, it slightly outperformed the Nvidia Shield, though not by a wide margin. That extra headroom shows up in emulation. GameCube titles like Wave Race using the Dolphin emulator ran at full speed in my testing. PlayStation 2 emulation was more mixed. Using NetherSX2 at minimum settings, demanding games like Burnout Revenge struggled to maintain full speed when there was a lot happening on screen. At this price point, a mini PC generally handles this workload better.

The software experience reinforces that concern. This is not an officially certified Android TV or Google TV device. Buzz TV uses its own interface, and while the Google Play Store is present, many mainstream streaming apps either cannot be installed or do not function properly. Disney+ would not play content at 4K and repeatedly errored out after only a minute or two of playback. Netflix was available only in a tablet-style version, with the TV version failing to launch entirely. Features like Dolby Vision, Dolby Atmos, and reliable HDR support were absent across these apps.

Although the device reports Widevine L1 certification, which should allow for high-resolution HDR streaming, real-world results did not reflect that capability. Compounding this is an outdated security posture. The box runs Android 13, but its most recent security patch dates back to August 2023. Google Play Protect was disabled by default, and there are numerous preinstalled apps of unclear origin. Taken together, this raises both usability and security concerns.

One area where Buzz TV clearly invested effort is the remote control. It feels solid, is backlit, and avoids the advertising buttons common on many streaming remotes. The programmable color buttons and the accompanying configuration app are well executed, and HDMI-CEC controls are easy to access. The remote ends up being the best-designed part of the product, even though it cannot compensate for the broader software and compatibility issues.

After spending time with the Power Station 6, what stood out most was how poorly integrated the overall experience felt. The interface itself is not cluttered with ads, but many of the things enthusiasts expect simply do not work. Between limited app compatibility, missing audio and video features, outdated security updates, and unusual hardware restrictions, the device falls short of what its high pricing suggests. There is some performance potential here, but in its current form, it is difficult to justify as an enthusiast-grade streaming box, especially when more capable and flexible Mini PCs exist at similar prices.

My First Cord Cutting / ATSC 3 Update of 2026

In the days leading up to the CES show and throughout the week in Las Vegas, several cord-cutting news items related to the ATSC 3.0 over-the-air TV standard were announced. In my latest video, I provide a more in-depth overview of these developments that I touched on briefly during my CES Dispatch series.

Watch the update here!

As a recap, a central issue remains DRM encryption over the new ATSC 3.0 broadcast standard. Broadcasters are pushing to lock down over-the-air signals, limiting how viewers can receive and use content that has traditionally been freely accessible. While broadcasters say this is to prevent piracy, the real outcome is that it pushes consumers toward expensive cable and streaming plans to maintain the recording and time-shifting features they enjoy today.

This matters because retransmission fees charged by broadcasters continue to rise at a nearly exponential rate. In my area, the Broadcast TV fee is now $48.30 per month – and that’s before other cable charges. Even the most basic cable subscription will now cost consumers more than $60 monthly. Of course, using an antenna to receive television is completely free.

Shortly after I began asking viewers to download and share their Comcast rate cards, Comcast removed the broadcast TV fee line item from their published rates entirely. The company says this was done to simplify pricing, but the effect is reduced transparency. The costs haven’t disappeared; they’ve simply been folded into higher base prices.

At CES, Pearl TV announced what it described as an affordable ATSC 3.0 converter box program. This is positioned as a way to lower the barrier to entry for consumers and manufacturers, but it closely resembles a similar failed effort announced in 2022 that never materialized.

The root cause of Pearl’s troubles with consumer adoption hasn’t changed. Encryption and certification requirements add cost and complexity in a market that is already small. Even the proposed “affordable” devices are expected to cost under sixty dollars, still roughly double the price of many ATSC 1.0 tuners (compensated affiliate link) that include DVR functionality.

The certification process itself remains a problem. Pearl TV and the A3SA encryption body are private entities made up of the same major broadcasters, effectively controlling which devices are allowed to receive encrypted signals and ultimately be sold to consumers. This introduces a layer of private regulation on top of what has traditionally been governed by FCC standards alone.

Another announcement hinted at some movement on gateway devices, which take an antenna signal and distribute it across a home network. After years of delays, A3SA says encrypted gateway functionality is now working on a limited number of products, including the ZapperBox and an upcoming ADTH device. These solutions, however, are expensive and tightly constrained. ZapperBox requires multiple expensive proprietary devices for multi-TV households, and the ADTH approach is limited to Android and Fire TV platforms, excluding market leader Roku.

Visiting the ATSC booth at the CES show made it clear how confusing this ecosystem has become. Devices carried different combinations of NextGen TV and A3SA certifications, each with different implications for compatibility and functionality. By contrast, current ATSC 1.0 devices work simply because they can receive the signal, without needing approval from a private consortium.

There may be signs of easing tensions. An interview with SiliconDust CEO Nick Kelsey suggested that support for encrypted ATSC 3.0 signals could eventually come to HDHomeRun devices without additional hardware. That would be a meaningful shift, though it still leaves unanswered questions about support on non-Android platforms and the broader role of DRM on public airwaves.

FCC Chairman Brendan Carr addressed these issues during a CES discussion, emphasizing the public interest obligations tied to broadcast licenses. He noted that broadcasters unwilling to meet those obligations have other distribution options, from cable to online platforms, and raised the possibility of revisiting how spectrum is allocated if public interest standards are not upheld. Those comments echo questions raised by the FCC in its current ATSC 3.0 docket, particularly around whether encryption serves consumers or primarily protects broadcaster revenue.

That docket remains open for public comment, with additional opportunities to respond once broadcasters file their answers. The outcome is still uncertain, but it’s clear the FCC has heard our concerns and is waiting for the broadcasters to make their case as to why restricting access to the public airwaves will better serve the public.

CES 2026 Is a Wrap!

I’m back from a whirlwind trip to Las Vegas for CES 2026! Like last year, I managed to crank out four dispatch videos along with a bonus episode that I posted last night.

You can see the full playlist here.

Like the last couple of years, this show felt very incremental insofar as new innovations were concerned. Robotics were plentiful, but very few were useful beyond doing some visually impressive demos. For example, this robot struggled with the relatively simple task of folding laundry.

Another theme was concern over memory pricing. Deep-pocketed AI companies are gobbling up all of the silicon they can find, which is dramatically increasing prices for consumers and manufacturers alike. Micron recently shuttered its 30-year-old Crucial consumer memory line. Every company I spoke with, large and small, is very concerned about this issue, and some note that the worst has yet to be felt by consumers. What’s worse is that there is no end in sight.

This year’s coverage wasn’t sponsored. Everything you saw from CES this year was made possible by all of you! That includes everyone who watched and subscribed, along with those of you who have contributed to the channel. That support mattered more than ever, because CES is expensive, time-consuming, and physically demanding, especially when you’re doing it solo.

By my math, CES 2026 coverage cost me around $2,500. That includes flights, hotel, food, and a lot of Uber and Lyft rides around Las Vegas. I stay at one of the lower-cost CES hotels and take advantage of the free CES shuttle when I can, but there are still plenty of events and off-site locations that require Uber or Lyft. Going alone keeps those costs manageable, and between ad revenue, affiliate links, and viewer support, I should roughly break even. Bringing additional people would change that equation dramatically, which is why many larger channels rely on sponsorships.

Sponsorships are a tricky subject at CES. I’m not opposed to them in principle, but most of the offers aimed at smaller creators come with editorial strings attached—covering specific booths, submitting footage for approval, or spending time on things that aren’t particularly interesting to viewers. I’d consider a sponsor if it genuinely improved the quality of the reporting without compromising independence, but those opportunities are rare.

From a workflow standpoint, my goal at CES is all about quick hits and efficiency. I shoot everything live, on location with my iPhone, and try to keep editing to an absolute minimum so I can get videos out quickly without staying up all night in a hotel room. That approach drives nearly every decision I make, from how I shoot to what gear I carry.

Audio is the biggest factor. The show floor is loud—often overwhelmingly so—and clean sound matters more than anything else. That’s why I stick with a handheld microphone, even though it’s not fashionable and some viewers wish my hands were free. There’s a reason TV news crews still use stick mics: they work. They isolate voices naturally, and they save time in post. For my purposes, reliability beats aesthetics every time.

The rest of the setup is intentionally simple. Fewer components mean fewer things to forget, lose, or troubleshoot while running between halls and events. The phone-based workflow continues to hold up well, and storage hasn’t been an issue on my 256GB iPhone.

This year also reinforced how much planning matters. When I was deliberate about which areas and events to cover, the reporting felt stronger and more focused. When I wandered without a plan, the results were mixed. Viewer suggestions were incredibly helpful, but relying on comment threads alone made it easy to miss good ideas. For next year, I’m planning to set up a more structured way to collect booth and topic suggestions from viewers so I can reference them quickly on the floor.

I also want to do more advance research before arriving—especially in the larger exhibit halls of the show. CES rewards preparation, and the better the groundwork, the more efficient the days become once you’re there.

As always, feedback is welcome. Hearing what worked and what didn’t helps shape how I approach future coverage. And if you’re new here, stick around—CES may be a big moment each year, but the rest of the calendar is filled with reviews, nerdy experiments, and tech commentary the same way it’s always been.

CES 2026 Dispatches Have Begun!

I’m at CES 2026 for the start of my latest Dispatch series, beginning with CES Unveiled, one of the early showcase events ahead of the main show floor. These Unveiled events are dense and efficient, with dozens of companies packed into a single room, which makes them well suited to fast-paced coverage.

This year’s Dispatch videos are intentionally lightweight in production: just me, an iPhone, and a backpack, with no sponsors influencing what gets covered.

Check out my first dispatch here!

Walking the floor, I ran into a wide mix of products, ranging from practical home tech to more experimental ideas. There were new security cameras and smart displays tied into network-attached storage systems, emphasizing local recording and the absence of subscription fees, though often requiring higher-end hardware. Music-focused gadgets showed up as well, including a second-generation “Guitar Hero”-like instrument designed less around learning technique and more around casual, stress-free music creation, with no musical experience required.

Mini PCs continue to evolve, and one of the more interesting concepts was a modular system that can dock into different enclosures, including a GPU base or a portable laptop shell, depending on how and where it’s being used. On the more whimsical side, I also came across a water-based drone meant for pools, capable of lighting effects and coordinated movement, clearly aimed at a niche (and wealthy) audience but at least something new and different.

Battery charging solutions were also on display, including systems that remove most of the decision-making by automatically handling orientation, charge state, and battery type.

One standout from my first night was a compact, portable ice maker that cranks out ice cubes in five minutes. Other products targeted professional and industrial users: a new FLIR thermal imaging device running Android positioned itself as a full workflow platform rather than a single-purpose camera, with industry-specific apps and built-in collaboration tools.

There were also familiar brands making a return. Pebble watches are back, focusing on long battery life and simple notifications rather than health tracking, alongside a new recording ring concept that lets users capture voice notes. In the outdoor automation category, autonomous yard robots continue to expand beyond mowing, with attachments for leaf collection and snow plowing, including models designed to be more affordable than earlier versions.

My dispatches are meant to be a snapshot rather than a deep dive, showcasing the range of ideas at the show. More Dispatches are coming as the show continues, so be sure to subscribe to my YouTube channel and follow my CES 2026 playlist here!

Plex Q&A: Optimizing Video vs. Live Transcoding (sponsored post)

I took a look at a Plex feature that has been around for a while but hasn’t come up much in my previous coverage: media optimization. The idea is straightforward: instead of relying on your Plex server to transcode video on the fly every time someone watches something on a limited connection, you can have the server create alternate, smaller versions of your media in advance. Those optimized files sit alongside the original and can be played back directly, reducing the load on the server.

My latest video dives into this feature here.

In normal use, Plex’s transcoder does a good job adapting high-bitrate files for bandwidth-challenged connections, but that assumes the server has enough CPU or hardware acceleration available at the moment someone presses play. If it doesn’t, or if multiple users are competing for those resources, performance can suffer. Optimization shifts that work to the background, letting the server do the heavy lifting ahead of time rather than in real time.

The feature lives in the Plex web app. From the menu attached to a movie or episode, there’s an option to optimize the media. You can also access it from the title’s detail page. When you start an optimization job, Plex lets you give the optimized version a custom name and choose a quality preset. The built-in options are aimed at common use cases like mobile viewing or TV playback, with defined resolutions and bitrates, but there’s also a custom mode if you want more control.

Custom optimization profiles allow you to pick specific resolutions and bitrates, ranging from 1080p down to very low-bandwidth 480p options. These profiles can be named and reused, which makes it easier to target particular devices or scenarios. Once an optimization job is started, the server processes it in the background and uses hardware video encoding if it’s available, which can significantly speed things up.

There are also some useful controls when optimizing TV series. You can limit optimization to unwatched episodes and cap the number of optimized files Plex keeps around. As episodes are watched, Plex can automatically delete the older optimized versions and generate new ones for upcoming episodes. That adds a layer of automation that helps keep storage usage in check.
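To get a rough sense of the storage tradeoff involved in keeping optimized copies around, here’s a quick back-of-the-envelope sketch. The bitrates used below are hypothetical examples for illustration, not Plex’s actual preset values:

```python
# Rough storage estimate for optimized versions: average file size in GB
# is approximately bitrate (Mbps) x duration (seconds) / 8 / 1000.
# The 20 Mbps and 4 Mbps figures are hypothetical examples, not Plex presets.

def size_gb(bitrate_mbps: float, seconds: int) -> float:
    """Approximate file size in gigabytes for a given average bitrate."""
    return bitrate_mbps * seconds / 8 / 1000

movie_seconds = 2 * 60 * 60  # a two-hour movie

original = size_gb(20, movie_seconds)   # high-bitrate original
optimized = size_gb(4, movie_seconds)   # smaller mobile-friendly version

print(f"original:  {original:.1f} GB")   # prints "original:  18.0 GB"
print(f"optimized: {optimized:.1f} GB")  # prints "optimized: 3.6 GB"
```

In other words, an optimized copy at a fifth of the original bitrate takes roughly a fifth of the space, which is why capping the number of optimized episodes Plex keeps on hand makes a real difference for larger libraries.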

When it comes time to watch something, Plex exposes the different versions through a “watch version” option. From there, you can explicitly choose between the original file and any optimized versions that exist.

Management of these files is centralized in the server settings under optimized versions. From that screen, you can see what’s currently being processed, what’s finished, and delete optimized media you no longer need. There’s also the ability to reorder optimization jobs if you want one item to be processed before others in the queue.

One additional setting ties optimization to the transcoder configuration. If HEVC optimization is enabled, Plex can create optimized files using HEVC where possible, which can deliver better quality at lower bitrates. That can be especially useful if storage and bandwidth efficiency is a priority.

Overall, optimization feels most useful for servers with limited processing power, older hardware, or libraries full of high-bitrate content that’s frequently accessed remotely. By preparing alternate versions in advance, the server can support more simultaneous viewers without transcoding in real time.

Disclosure: This video was a paid sponsorship by Plex. They did not review or approve this content prior to uploading and all opinions are my own.