ATSC 3.0 DRM Opponents Make Their Case to the FCC

The transition from the current over-the-air television standard to NextGenTV, or ATSC 3.0, continues to generate significant debate, particularly regarding the decision by many broadcasters to encrypt their signals.

In my latest video, I take a look at the filings from organizations and individuals opposing the implementation of Digital Rights Management (DRM) on the public airwaves.

This issue moved from theoretical to practical for me recently during the Super Bowl. I was unable to tune into the game over the air because my local NBC affiliate had encrypted their channel, and the legacy ATSC 1.0 signal was unreliable at my location, forcing me to stream the event instead.

I submitted my own filing to the FCC docket, effectively mirroring the arguments I raised in my prior video on this topic regarding the industry’s justification for encryption. To work within the docket’s 100-megabyte file size limit, I attached a PowerPoint presentation with embedded video evidence, a method that allows multimedia documentation to be submitted alongside written comments. This approach is useful for anyone wishing to demonstrate the real-world impact of these restrictions, such as devices failing to decrypt channels they are theoretically certified to receive.

One of the most comprehensive filings came from Public Knowledge, a consumer advocacy group. They commended the FCC for scrutinizing the issue but raised substantial concerns about the A3SA, the authority managing the encryption program. Public Knowledge argued that the A3SA operates without meaningful external oversight, maintaining confidential licensing terms and opaque decision-making processes. They contend this entity acts as a private gatekeeper to the public airwaves without accountability to consumers or public interest stakeholders.

Public Knowledge also highlighted the potential for consumer confusion arising from the current certification regime. There are now two distinct logos for consumers to navigate: the NextGen TV logo and the A3SA logo. A device might carry the NextGen TV certification, like the HDHomeRun gateway I use, yet lack the ability to decrypt content. Conversely, a device like the Zapperbox may have A3SA certification for decryption but lack the NextGen TV designation. During a recent visit to a major electronics retailer, I observed that neither logo was displayed on television sets that support the new standard, suggesting that this certification system has yet to effectively reach the consumer marketplace.
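For anyone gathering evidence of this kind of gap, HDHomeRun tuners publish their channel lineup as a JSON document over HTTP (at `/lineup.json` on the tuner’s local address), and entries for protected channels carry a DRM flag. The sketch below uses a made-up sample lineup with placeholder addresses to show how one might list the channels a tuner can see but reports as encrypted:

```python
import json

# Hypothetical sample of a tuner's /lineup.json response. Real HDHomeRun
# tuners serve this document at http://<tuner-ip>/lineup.json; entries for
# protected ATSC 3.0 channels carry a "DRM": 1 field.
SAMPLE_LINEUP = """
[
  {"GuideNumber": "4.1",  "GuideName": "WXYZ-DT",
   "URL": "http://192.168.1.50:5004/auto/v4.1"},
  {"GuideNumber": "30.1", "GuideName": "WNBC NextGen", "DRM": 1,
   "URL": "http://192.168.1.50:5004/auto/v30.1"}
]
"""

def drm_channels(lineup_json: str) -> list[str]:
    """Return guide numbers of channels the tuner flags as DRM-protected."""
    lineup = json.loads(lineup_json)
    return [ch["GuideNumber"] for ch in lineup if ch.get("DRM") == 1]

# Channels the tuner can tune but may not be able to decrypt:
print(drm_channels(SAMPLE_LINEUP))
```

In a real check you would fetch the document from the tuner itself rather than a hardcoded string; the output makes a concrete exhibit for a docket filing.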

Furthermore, Public Knowledge drew a parallel between the current situation and the “broadcast flag” rule from the previous digital transition. They argued that the A3SA certification requirements essentially function as a new, more sophisticated broadcast flag, allowing broadcasters to dictate which devices can receive programming and potentially restricting recording capabilities. They also reminded the Commission that the FCC’s 2017 order to begin the ATSC 3.0 transition emphasized that encrypted programming should not require special equipment supplied by the broadcaster, a standard the current regime may be failing to meet.

Opposition also came from within the broadcast industry itself. Weigel Broadcasting, which operates stations reaching a vast majority of US households, filed comments expressing concern over the direction taken by larger broadcasting consortiums. Weigel presented evidence suggesting that some competitors view the new standard primarily as a vehicle for monetization, such as integrating gambling platforms or treating the spectrum as a financial asset rather than a public service. They acknowledged that the current implementation of DRM has created adoption hurdles and suggested that if encryption must exist, it should not require a persistent internet connection—a requirement that has already caused functionality issues with some commercially available tuners as noted in my prior video.

The Consumer Technology Association (CTA), which represents device manufacturers, also weighed in. While their filing focused largely on opposing a mandate for ATSC 3.0 tuners in all televisions, they acknowledged the friction caused by DRM. This is a complex position for the CTA, as the encryption technology being used is owned by Google, a major industry player and CTA member, yet its implementation is harming other member companies like SiliconDust. Their filing recommends that the Commission continue to monitor the intersection of DRM and the new standard, a notable admission from an organization that typically advocates against government intervention in their industry.

Similarly, the NCTA, representing cable and internet providers, cited encryption as a complicating factor that adds cost and technical challenges to the transition. They argued that these complexities support their stance against a forced transition to the new standard, noting that the need to support new audio and interactive formats is already a heavy burden without the additional layer of decryption requirements.

For those who have experienced issues with encrypted channels or malfunctioning hardware, the opportunity to place these experiences on the record is closing. The reply deadline for this docket is February 18. Under FCC rules, new filings at this stage must be in direct response to arguments already present in the record. This provides a narrow window for consumers to submit evidence countering the claims made by broadcasters, such as documenting instances where “offline” DRM failed to function as advertised. The record is currently being shaped by these final arguments, and the volume and specificity of these replies may influence the Commission’s next steps.

You can get more information about how to file here. I also did a video on the topic here.

Is the 2022-2026 MacBook Air the Greatest Laptop of All Time?

Typically, purchasing a laptop involves a compromise. If the budget is limited, one usually has to sacrifice performance, battery life, or portability. Finding a machine that adequately addresses all three requirements is rare, yet over the last few years, my 2022 MacBook Air M2 has largely managed to balance these competing needs. Despite the release of newer models, this device remains a significant benchmark for what a portable computer can achieve – and new versions cost less than the one I bought almost four years ago. Check out current offerings on Amazon (compensated affiliate link).

I take a deeper dive in my latest video.

Looking back at the hardware after nearly four years of daily use, the durability is notable. While there is some minor cosmetic wear—specifically some color rubbing off on the sides and the accumulation of oil on the keyboard—the metal chassis has held up against standard knocks and bumps. The display has maintained its brightness without flickering, and the keyboard, a welcome departure from Apple’s troubled butterfly mechanism, remains fully functional with no stuck keys. Weighing in at roughly 2.7 pounds, the device is balanced enough to be handled with one hand, a feature that aids its portability.

From a port standpoint, the inclusion of the MagSafe charging connector was a practical decision. It frees up the two Thunderbolt ports for peripherals and prevents the laptop from being pulled off a surface if the cable is snagged. While the computer side of the MagSafe cable is proprietary, the other end is standard USB-C. The Thunderbolt ports will still charge the laptop when using a desktop docking station.

The primary limitation regarding connectivity remains the inability to natively drive two external displays, a feature reserved for the “Pro” tier devices. However, for a single-monitor setup, the clamshell mode functions effectively as a desktop replacement.

When I originally purchased this unit, I opted for the 16GB RAM configuration rather than the base 8GB, a decision that appears to have contributed significantly to the machine’s longevity. Interestingly, a comparable configuration today—equipped with the newer M4 chip—actually costs approximately $400 less than what this M2 model cost in 2022. While the new chips offer performance gains, the 10-core GPU in this older model still handles demanding tasks competently.

Battery performance has been perhaps the most consistent aspect of the ownership experience. Across extensive travel and full days of conferences, I have yet to encounter a low-battery notification during standard operational hours. Even after approximately three and a half years and 364 charge cycles, the battery has retained about 89% of its original health. This endurance persists even when the machine is subjected to heavier workloads that typically drain portable devices quickly.

Regarding those workloads, the machine handles 4K video editing at 60 frames per second without significant friction. Using Final Cut Pro, scrubbing through footage and rendering effects happens almost instantaneously. It is a level of responsiveness often absent in lower-end Windows laptops running similar software like DaVinci Resolve. While I did not purchase this machine specifically for video production, it has proven capable of serving as a mobile editing station when I need to travel light.

The architecture also supports robust virtualization. Using UTM, I have been able to run the ARM version of Windows 11 alongside Ubuntu Linux, and even emulate older environments like Mac OS 9 and Windows 95 simultaneously. The performance is stable enough to browse the web within the virtualized Windows environment or run office applications in Linux without noticeable slowdowns.

Gaming on Apple Silicon has also evolved. With titles ported to the native architecture, performance on a fanless laptop is surprisingly viable. Running Cyberpunk 2077 on low settings yields a steady 30 frames per second. While it doesn’t reach the high frame rates of a dedicated gaming rig, it offers a playable experience for casual sessions. The lack of active cooling means the system might throttle under sustained load, but I have not observed significant performance drops during use.

Finally, the device shows promise with local AI workloads. In the video I demoed the Locally app that connects to open-source models like Gemma. My aging laptop, which was released a few months before the commercial introduction of ChatGPT, processes queries with reasonable speed. While newer chips are optimized further for these tasks, the unified memory architecture allows this older model to handle basic language models and light automation without excessive memory or processing penalties.

Given its sustained performance across varied tasks—from virtualization to media creation—I see no urgency to upgrade to the M4 generation. The M2 MacBook Air continues to function as a reliable, well-constructed tool that meets daily professional demands. For those who can find this model on the secondary market or on sale, it represents a hardware investment that still offers substantial utility years after its initial release.

GMKTec K15 Mini PC Review

I recently received the new GMKTec K15, marking my first mini PC review of 2026. If I had to characterize this device with a single analogy, I would describe it as the Toyota Camry of its category: it is neither a stripped-down budget device nor a high-end powerhouse; rather, it occupies a functional middle ground. You can find it on Amazon here (compensated affiliate link).

See it in action in my latest review!

The system is built around the Intel Core Ultra 125U processor from the Meteor Lake family. This chip features a 12-core architecture—comprising two performance cores, eight efficiency cores, and two low-power efficiency cores—delivering a total of 14 threads. My unit arrived equipped with 32 GB of DDR5-4800 RAM and a 1 TB NVMe SSD. The current price sits higher than it otherwise would due to the volatility of memory prices; if those prices ease, it should sell for less.

Despite the cost, the expandability is notable; the system supports up to 96 GB of RAM and features three NVMe slots, which is generous for a device of this footprint.

Connectivity is a strong suit for the K15. The front panel includes a 10Gbps USB-C port and three USB-A ports. The rear I/O offers a 40Gbps USB4 port that is Thunderbolt compatible, dual 2.5GbE Ethernet ports, and an OCuLink port. The OCuLink addition is particularly useful for those interested in external GPUs, as it connects directly to the PCIe bus, offering superior bandwidth compared to USB4. During my tests, the Wi-Fi 6 chipset performed well, maintaining speeds close to gigabit levels, and the variety of ports suggests this unit could easily be repurposed as a home server.

In terms of daily performance, the K15 handles standard desktop workloads efficiently. Web navigation is snappy, and 4K video streaming presented no issues aside from the expected minor frame drops upon initial loading. Content creation capabilities, however, have a clear ceiling. When editing 4K video in DaVinci Resolve, simple cuts and transitions were smooth, but the system bogged down significantly when attempting complex color grading or heavy effects. It is serviceable for basic edits, but anything more demanding would necessitate an external graphics solution.

Gaming performance aligns with the limitations of the integrated graphics and the reduced GPU performance of the 125U versus the higher-end 125H. Testing Cyberpunk 2077 at 1080p with the lowest settings resulted in frame rates hovering between 25 and 30 frames per second. It’s certainly playable, but it lags behind some higher-end mini PCs. While it struggles with modern, graphically intensive titles, it is perfectly adequate for older games or emulation. Thermals were well-managed throughout these stress tests; the CPU temperature stayed around 43°C, and the fan noise was minimal, likely due to a larger fan design that moves air efficiently at lower RPMs.

The device arguably shines brightest when running Linux. My experience with the OS was seamless, with all hardware—including Wi-Fi and Bluetooth—detected immediately. The system felt more responsive on Linux than on Windows, which has become increasingly bloated. Between the stable performance, the quiet operation, and the extensive storage options, the K15 stands out as a sensible, if modest, choice for a reliable workstation.

Disclosure: GMKTec sent the K15 to the channel free of charge, but no other compensation was received. They did not review or approve the content prior to publication, and all opinions are my own.

This Was the Best Selling Game Console of 1976

To commemorate my upcoming 50th birthday, I acquired a piece of technology that shares my birth year: the Coleco Telstar, a video game console released in 1976. It’s the subject of my latest retro video!

I purchased this device for a local historical society project celebrating the United States’ 250th year, intended to demonstrate to younger generations what home entertainment looked like when the country turned 200. The unit, a Pong clone, was manufactured by Coleco, formerly known as the Connecticut Leather Company, making this quite relevant for a local Connecticut historical society!

This specific model, the 6040, was the first edition released by Coleco. Its market success was largely due to its price point; while competitors like the Magnavox Odyssey and Atari’s Pong console retailed for approximately $100, the Telstar launched at just $50. Adjusted for inflation, that $50 price tag is roughly $290 today. This aggressive pricing strategy helped the company sell over a million units, a figure surpassed only by a Nintendo Pong clone sold exclusively in the Japanese market.
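The inflation adjustment above is easy to sanity-check with the consumer price index. The sketch below uses rough CPI-U annual averages (assumed values; consult BLS tables for exact figures) and lands in the same ballpark as the figure quoted above:

```python
# Rough CPI-U annual averages. These are assumed approximations for
# illustration; check official BLS data for precise values.
CPI_1976 = 56.9
CPI_2025 = 322.0

def adjust_for_inflation(price: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical price by the ratio of the two CPI levels."""
    return price * cpi_now / cpi_then

telstar_today = adjust_for_inflation(50, CPI_1976, CPI_2025)
print(f"${telstar_today:.0f}")  # in the ballpark of the ~$290 figure above
```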

Internally, the device is distinct from modern consoles as it lacks a central processing unit. Instead, it operates using a dedicated chip, the General Instrument AY-3-8500, which has the game logic hardwired directly into its circuitry. Because the software is fixed on the chip, the system is not programmable. It generates sound through a built-in speaker rather than the television set and connects to displays via an analog RF connector, originally designed to work with a switch box on the VHF band’s channel 3. While a power connector was available as an add-on, the device was primarily intended to run on six C batteries.
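To make “game logic hardwired into circuitry” concrete: the chip’s ball-and-paddle behavior is the kind of fixed rule that takes only a few lines to express in software. The sketch below is purely illustrative (the court height and units are invented for the example); the AY-3-8500 realizes this sort of logic in dedicated counters and comparators, not code:

```python
# Illustrative only: the AY-3-8500 implements this kind of fixed
# bounce rule in silicon, not as a program that can be changed.
COURT_H = 100  # arbitrary court height in scan lines for this example

def step_ball(y: int, dy: int) -> tuple[int, int]:
    """Advance the ball one tick, reversing direction at the court edges."""
    y += dy
    if y <= 0 or y >= COURT_H:
        dy = -dy       # bounce off the top or bottom wall
        y += 2 * dy    # step back inside the court
    return y, dy

print(step_ball(99, 1))  # ball near the wall reverses direction
```

Because that rule is etched into the chip, the Telstar could never be updated or expanded the way a CPU-based console can.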

The gameplay experience is controlled by knobs that move paddles on the screen, with a difficulty slider available to adjust the game mechanics. The console features three variations: a standard tennis-style Pong game, a single-player handball mode, and a hockey game where players control both a goalie and a forward. Upon testing this specific unit, I noted several functional issues consistent with its age, including a stuck game selector switch and a malfunctioning difficulty slider that fails to resize the paddles correctly on the “pro” setting.

This device represents the entry of Coleco into the video game market, a venture that eventually led to the release of the legendary ColecoVision console and the less successful ADAM personal computer. The Telstar remained on the market for approximately two years before the company shifted focus to handheld games and programmable consoles. It serves as a historical marker for home gaming in 1976, predating the significant technological leap that occurred just a decade later with the introduction of titles like The Legend of Zelda.

How I’m Using Plex in 2026 (sponsored post)

I’ve been using Plex for well over a decade now, long before any sponsorships entered the picture, and it remains the backbone of how I manage and watch my media at home and on the road. As a point of disclosure, this video and the transcript it’s based on are part of a paid sponsorship with Plex, but they did not review or approve the content beforehand.

My current Plex server runs on Unraid, which has proven to be a flexible choice that makes installing the Docker version of Plex super easy. Right now, the server itself is a small Beelink ME Mini NAS/PC paired with a USB-connected multi-bay SATA enclosure. It’s not a particularly elegant setup in terms of cabling, but it’s been reliable.

One of the reasons I’ve stuck with Unraid is how easy it is to migrate from one machine to another. Moving from an earlier NAS box with thermal issues to the current setup was simply a matter of transferring the Unraid external boot drive and disk array. The system came back online without any configuration drama, which makes incremental upgrades far less painful.

The processor in this server is a low-power Intel N150, and in practice it has been more than sufficient. It handles multiple Plex transcodes at once and still leaves enough headroom for other Docker containers I run alongside it. That experience has reinforced my view that you don’t need particularly powerful hardware for a small, well-tuned home server so long as your processor supports hardware transcoding. The Intel N100 and N150 chips are available in many affordable mini PCs and entry-level NAS devices.
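The hardware transcoding these chips provide is Intel Quick Sync. As a rough illustration of what offloading decode and encode to the iGPU looks like, and not how Plex’s bundled transcoder is actually invoked internally, here is a sketch that assembles an equivalent ffmpeg command line (the file names and bitrate are placeholders):

```python
# Sketch of a Quick Sync (QSV) hardware transcode command of the kind a
# Plex server relies on internally. File names are placeholders; Plex's
# bundled transcoder uses its own options, so treat this as illustrative.
def qsv_transcode_cmd(src: str, dst: str, bitrate: str = "8M") -> list[str]:
    """Build an ffmpeg argument list that decodes and encodes on the iGPU."""
    return [
        "ffmpeg",
        "-hwaccel", "qsv",      # decode on the Intel iGPU
        "-i", src,
        "-c:v", "h264_qsv",     # encode with the Quick Sync H.264 encoder
        "-b:v", bitrate,
        "-c:a", "copy",         # pass the audio through untouched
        dst,
    ]

print(" ".join(qsv_transcode_cmd("movie.mkv", "movie-8mbps.mp4")))
```

Because the heavy lifting happens on fixed-function video hardware, even a low-power N150 can sustain several of these streams at once.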

I also maintain a second Plex server offsite at a family member’s house, running on a Synology NAS. That system serves double duty as a test bed and as an offsite backup destination, giving me control over where my data lives. To connect everything together securely, I rely on Tailscale. It allows me to access my servers remotely without exposing them directly to the internet, and I can limit access to specific people and devices. That balance between convenience and security has worked well for my use case.

Most of my serious viewing happens at home, particularly higher-bitrate Blu-ray rips that I watch in my home theater. That setup centers around an older LG OLED television paired with an Nvidia Shield from the 2019 generation. Despite its age, the TV still delivers excellent image quality, and the Shield handles Dolby Vision playback from both streaming services and locally ripped discs.

With proper audio passthrough enabled, lossless Dolby Atmos tracks make it from the server to the sound system untouched, which is exactly what I want for that kind of content. I also enable refresh-rate switching so films play back at their native 24 frames per second, avoiding unnecessary judder.

Over time, I’ve built up a sizable library, and lately I’ve found myself revisiting older television series. Plex’s ability to shuffle episodes has become a surprisingly useful feature, especially for shows I know well and don’t feel the need to watch in order. It turns familiar series into something closer to background comfort viewing, without much thought required.

Live TV is another part of my setup, using an HDHomeRun tuner integrated into Plex. I can mix over-the-air channels with streaming channels in a single guide, and when I’m traveling, I can even watch my local channels remotely. Plex doesn’t currently support ATSC 3.0 broadcasts due to encryption and audio codec limitations, so recordings are limited to ATSC 1.0. I also handle actual recording through the HDHomeRun app, with Plex pointed at the directory where those recordings are stored so both systems can access them.

One of the more recent additions to my workflow is Plex’s watch list feature. When I hear about a show or movie that sounds interesting, I add it to the list from my phone. Later, when I sit down to watch something, Plex shows me not just the title but where it’s available, whether that’s on my own server, a friend’s server, or a streaming service. It’s a practical way to reduce the time spent deciding what to watch, especially when free time is limited. The same interface also surfaces trailers and upcoming episode release dates, which acts as a lightweight reminder system.

Music is handled through Plex as well. I’ve been slowly ripping decades’ worth of CDs into lossless files, which now live alongside my video library. Most listening happens through the Plexamp app on my phone, both at home and remotely. For travel, I’ll download albums or playlists directly to the device. While wireless headphones limit some of the benefits of lossless audio, using wired headphones makes a noticeable difference, especially on long flights.

Speaking of travel, the download feature has also been useful for loading TV episodes onto a tablet before trips using the Plex mobile client, letting me watch without relying on in-flight connectivity.

Looking back, Plex has stayed in my workflow because it’s made managing and accessing my media more straightforward. It brings together local files, live TV, and streaming discovery in a way that reduces friction rather than adding to it. For me, that efficiency is the real value, and it’s why the system I set up years ago continues to evolve rather than being replaced.