My Toyota Sienna Van is Now a Lemon Due to an Unaddressed Recall..

Back in December, I shared information regarding a recall affecting my 2025 Toyota Sienna. As of today, March 5, 2026, the vehicle has been sitting at the dealership without a resolution. The van has been out of service for almost 90 days, having been at the dealer since December 12th. I’m about to take Toyota to Lemon Law Court here in Connecticut.

See more in my latest video!

The recall addresses an issue where the second-row seat rails can lose their structural integrity due to defective welds, posing a risk of injury. The manufacturer’s notice explicitly stated no one should sit in these seats until a remedy is performed. While the manufacturer instructed dealers to pull the vehicles from lots on October 7th, 2025, my notice did not arrive until 66 days later. To date, no remedy or timeline for a fix has been communicated.

This situation impacts approximately 50,000 Sienna vans. Faced with a vehicle that cannot be safely used as intended, I researched the lemon law in my home state of Connecticut.

Connecticut requires that the vehicle be new, under two years old, with fewer than 24,000 miles, and that it exhibit a condition that substantially impairs its use, safety, or value. Given that I purchased a seven-passenger van and two of the middle seats cannot be used, the impairment is clear. Furthermore, Connecticut law provides eligibility if a vehicle has been out of service for repair for a cumulative total of 30 days or more.
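To make the criteria above concrete, here is a rough illustrative sketch of the eligibility logic as I've summarized it. The function name and thresholds are drawn from this post's description, not from the statute itself, so treat this as an illustration rather than legal guidance:

```python
def lemon_law_eligible(age_years: float, miles: int,
                       substantial_impairment: bool,
                       days_out_of_service: int) -> bool:
    """Rough sketch of CT lemon-law eligibility as described above:
    a new vehicle under two years old with fewer than 24,000 miles,
    plus either a substantial impairment of use, safety, or value,
    or 30+ cumulative days out of service for repair."""
    if age_years >= 2 or miles >= 24_000:
        return False
    return substantial_impairment or days_out_of_service >= 30

# The Sienna in this post: ~90 days at the dealer and unusable second-row seats.
print(lemon_law_eligible(age_years=1, miles=10_000,
                         substantial_impairment=True,
                         days_out_of_service=90))  # True
```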

I have filed a lemon law complaint with the state, and it has been accepted for a hearing. At the hearing, I will make my arguments for either a replacement or a refund. For other owners dealing with this extended recall, researching state-specific lemon laws is a practical step. Resources like Justia provide a 50-state survey of lemon laws across the United States, detailing varying procedures.

While the process in Connecticut is designed so consumers can file without an attorney, legal counsel may be consulted if the hearing process is intimidating. Following my hearing, I will share my presentation and arguments so other owners have something they can use in their own hearings. Stay tuned!

California Law to Require Age Verification on All Operating Systems (Including Linux)

Recently, a new California law signed by Governor Gavin Newsom caught my attention due to its potential impact on the open-source community, specifically Linux users. The legislation mandates that operating systems for PCs and other general computing devices, like tablets and phones, implement a form of age verification during the initial account setup process.

I take a look at the implications of this law in my latest video.

While California is not the only state pursuing such measures—Texas recently faced legal hurdles over a similar law—this development raises questions about how open-source organizations, rather than traditional corporate entities, will comply.

The text of the California bill, which was signed on October 13, 2025, and takes effect on January 1, 2027, calls for an interface that requires the account holder to provide their birth date or age. This information generates a signal regarding the user’s age bracket—categorized as under 13, 13 to 16, 16 to 18, or over 18—to be read and enforced by applications within a covered app store.

The legislation defines an operating system provider broadly enough to include independent developers creating Linux distributions. Furthermore, a covered application store is defined as a publicly available online service, which could encompass command-line package managers used daily by Linux administrators.

From a practical standpoint, the current requirement relies entirely on self-reporting. Users are asked to volunteer their age, meaning anyone could input inaccurate information to bypass restrictions. Despite this, the penalties for non-compliance are clearly defined. Operating system makers face civil penalties ranging from $2,500 for negligent violations to $7,500 for intentional violations per “affected child.” If a developer has internal data showing a user’s actual age differs from the self-reported signal, they are legally obligated to act on that information or face action from the California Attorney General.
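As a thought experiment, the self-reported bracket signal described above could be as simple as the sketch below. The bill does not specify an implementation; the function name is hypothetical and the bracket boundaries follow my summary of the bill's text (under 13, 13 to 16, 16 to 18, over 18):

```python
def age_bracket(age: int) -> str:
    """Map a self-reported age to the bracket signal that apps in a
    covered app store would read and enforce. Boundaries are taken
    from this post's summary of the bill."""
    if age < 13:
        return "under_13"
    if age < 16:
        return "13_to_16"
    if age < 18:
        return "16_to_18"
    return "over_18"

# Self-reporting means nothing stops a user from typing any age they like.
print(age_bracket(12))  # under_13
print(age_bracket(17))  # 16_to_18
print(age_bracket(30))  # over_18
```

This also illustrates why the scheme is so easy to bypass: the signal is only as truthful as the number the user types in.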

The implications for Linux distributions are notable. Commercial entities with a business nexus in California, such as the organizations behind Ubuntu and Fedora, will likely implement the necessary prompts to comply.

However, smaller projects face a different reality. Many distributions are maintained by volunteer groups without the financial resources or organizational structures to shield them from liability. MidnightBSD has already modified its software license to exclude California residents, but this legal maneuver may not satisfy California regulators if the software remains accessible for download within the state’s borders.

This legislative push is not confined to the West Coast. My home state of Connecticut is currently evaluating controls for minors on the internet, and Colorado is exploring operating system-level age verification. Texas attempted to regulate app stores before a federal court blocked the law, citing First Amendment concerns regarding its broad application. The absence of a unified federal privacy law has resulted in a fragmented regulatory landscape across different regions.

Historically, some internet users have responded to localized regulations by migrating to decentralized platforms. When Discord faced scrutiny over its age verification methods that included video selfies and government IDs, users began exploring open-source alternatives like Revolt and Matrix. These self-hosted and federated platforms demonstrate how technical communities can circumvent centralized data collection and restrictive legal mandates.

As the 2027 deadline approaches, it is likely that many Linux distributions will simply integrate a birth date or age prompt into their installation screens to mitigate legal risks. The technical challenge of passing that age signal consistently to various package managers and standalone applications remains a logistical hurdle. The coming months will test how far state authorities are willing to go in enforcing these mandates on the broader open-source software ecosystem.

DOS Games in a Browser? DOS.ZONE Review

For my annual Christmas retro video, I explored the eXoDOS project, a method for downloading and playing a vast library of classic DOS games. While functional, it requires significant disk space and BitTorrent downloads, and involves some complexity. It also lacks native compatibility with macOS and Linux. Following a recommendation from my friend Adam of TechOdyssey, I recently tested an alternative approach called DOS.zone.

Check it out in my latest video!

DOS.zone is a web-based emulation platform designed to run legacy DOS and Windows 95 games directly within a browser. During my testing on an M2 MacBook Air using the Brave browser, games like Doom booted quickly and ran at standard speeds without requiring any software installation or command-line management. The platform currently hosts approximately 2,000 titles, which is a smaller library compared to the eXoDOS project, but it focuses on immediate accessibility.

The service is primarily free, downloading and executing the game files locally in the browser rather than streaming them from a server. The games run in DOSBox or DOSBox-X, which have been ported to JavaScript (more on that later).

Game progress can be saved to the browser’s local storage, provided the user clicks the designated save icon before exiting. Because local browser data can be cleared or lost, DOS.zone offers an optional subscription for a few dollars a month that enables cloud synchronization for save files. The save files can be synced across devices too.

The emulation includes various adjustable settings to tailor the experience. Users can modify the DOSBox performance by toggling the auto-adjust feature and manually setting CPU cycles, which I found necessary to stabilize the frame rate in titles like Wing Commander. Other options include mouse capture, on-screen control scaling, and an image smoothing toggle for those who prefer altered graphics over the original pixelated rendering. Notably, the platform currently lacks native game controller support, relying instead on keyboard controls or external software mapping.
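For reference, the cycle controls DOS.zone exposes mirror the ones in a standard desktop DOSBox configuration file. A minimal sketch of the equivalent settings is below; the cycle count is only an example value and would need tuning per title:

```ini
[cpu]
core=auto
cputype=auto
# Disable auto-adjust and pin the emulated CPU speed. 10000 is an
# example value -- speed-sensitive titles like Wing Commander may
# need a different fixed cycle count to run at the intended pace.
cycles=fixed 10000
```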

Technically, DOS.zone distinguishes itself from other in-browser solutions by supporting Windows 95 environments and 3dfx hardware acceleration. Loading a game like Road Rash prompts a brief Windows 95 boot sequence before launching the application. While this specific browser port lacks the original full-screen video and music, it runs consistently at 60 frames per second. The 3dfx support extends to titles like the original Grand Theft Auto and various hardware demos, rendering hardware-accelerated graphics entirely within the browser window.

The platform also integrates a multiplayer hub where users can join active network sessions for games like Quake or Half-Life Deathmatch. In my experience, some visual assets load dynamically during the initial session, which can cause minor stuttering, but the performance stabilizes once the caching is complete.

Because the underlying technology utilizes the open-source js-dos API, the emulation extends to mobile devices. Testing Need for Speed on a smartphone demonstrated that the platform automatically maps necessary game controls to the touchscreen, enabling mobile gameplay without requiring a dedicated app installation.

For users seeking a broader library, the Internet Archive remains a viable alternative with over 8,800 browser-playable DOS titles. However, DOS.zone provides a more specialized technical implementation with its inclusion of 3dfx, Windows 95 support, and built-in multiplayer routing, offering a highly accessible route to revisiting legacy software without the need to manually configure local emulators.

See more retro here!

I own a piece of Star Trek’s Ten Forward!

I recently acquired a physical piece of the set from the television series Star Trek: The Next Generation (TNG)! The item is a small square of fabric from a backdrop utilized for the TNG Ten Forward set, serving as the curtain positioned behind the windows. The fabric incorporates Mylar reflective material, which production crews illuminated to simulate the appearance of stars.

I obtained this item directly from Doug Drexler, a former production staff member on the series. Drexler sold off portions of his personal collection to raise funds for an upcoming documentary detailing his life and work. The documentary project was recently financed through a successful Kickstarter campaign.

If you’re a fan of sci-fi shows, there’s a good chance you’ve seen Drexler’s work. His credits include Star Trek: The Next Generation, Deep Space Nine, Voyager, and Enterprise. His recent work includes the amazing Picard Season 3, The Orville, and more. His Facebook page has some awesome behind-the-scenes photos and videos of his work.

I’m looking forward to the documentary!

GMKTec K13 Mini PC Review

GMKTec keeps cranking out new Mini PCs despite the price pressure of RAM shortages. In my latest Mini PC review, I check out their new K13 powered by an Intel “Lunar Lake” Core Ultra 7 256V.

Check it out in my latest video!

It is equipped with 16 gigabytes of LPDDR5X-8533 memory. This memory is integrated into the processor package and cannot be upgraded, which may be a limiting factor depending on specific requirements.

Pricing currently sits at approximately $669 on GMKTec’s website (compensated affiliate link) with a 512-gigabyte solid-state drive, while the one-terabyte version retails for $720. Typically a PC like this would be much less expensive, but market constraints on memory prices are driving these budget PCs into higher price categories. I’d suggest checking out Amazon’s prices too and looking out for sales and promotions (compensated affiliate link).

Storage is expandable by removing a single screw on the bottom panel, which features a bright green design contrasting with the black upper chassis. This reveals an additional NVMe slot, offering the possibility of adding another drive or utilizing an Oculink adapter, though the device lacks a dedicated native Oculink port.

Connectivity options include two front-facing USB 3.2 ports capable of 10-gigabit-per-second data transfers and a standard audio jack. The rear panel houses a 5-gigabit Ethernet port, which reached expected speeds during network testing and is double the speed of the 2.5 gigabit ports typically found on Mini PCs.

Wireless connectivity is handled by a Wi-Fi 6E RZ616 chipset, which provided consistent throughput without the Wi-Fi constraints I sometimes see on mini PCs.

Additional rear ports include a USB 2.0 connection, an HDMI output supporting 4K resolution at 60 hertz, and two 40-gigabit USB4 ports with Thunderbolt compatibility. The system draws a maximum of 70 to 75 watts under load, allowing it to be powered via a 100-watt USB-C connection to one of the USB4 ports or with the included power supply, which attaches to a separate power connector.

Out of the box, the hardware is set to a balanced power profile in the BIOS, and adjusting this to its high-performance setting is necessary to utilize the processor’s full capabilities. The system runs Windows 11 Pro, but it notably included a pre-installed Chinese-language voice assistant called Cherry AI. This addition diverges from the manufacturer’s typical practice of providing clean operating system installations, though no malware was detected during security scans.

In practical use, web browsing and office applications function predictably, yielding a score of 32.5 on the Speedometer benchmark. Basic 4K video editing in DaVinci Resolve operates smoothly for standard cuts, though applying visual effects increases render times due to the reliance on integrated graphics.

For gaming, I was impressed with the Intel chip’s onboard Arc graphics. Testing Cyberpunk 2077 at 1080p on the lowest settings resulted in frame rates between 45 and 50 frames per second in complex environments, occasionally reaching 60 in less demanding areas. I saw similar performance with No Man’s Sky at similar settings. Emulation of PlayStation 2 software ran at full speed at the PS2’s native standard-definition resolution, with some room to improve graphical fidelity inside the emulator.

The system scored 4,375 on the 3DMark Time Spy benchmark and completed the associated stress test with a 99.6% pass rate. During this testing period, CPU temperatures reached 60 degrees Celsius; the exterior casing became warm, but fan noise remained minimal.

The K13 also demonstrated compatibility with Ubuntu Linux, with standard network, audio and video drivers functioning correctly. The unit ships with a VESA mounting plate for attachment to external displays, offering flexible deployment options for those seeking to minimize their hardware footprint.

All in all, the K13 is a bit pricey due to memory constraints, but it is a solid performer. If we lived in different times, this would definitely be a PC to be excited about.

Disclosure: GMKTec provided the Mini PC to the channel free of charge. No other compensation was received and they did not review or approve this content or the video prior to publication.

ATSC 3.0 TV Encryption Update: The Final Arguments are In..

The final arguments regarding the encryption of over-the-air television have been filed with the FCC, and now it’s in the Commission’s hands. In my latest ATSC 3.0 analysis video, we take a look at how broadcasters responded to encryption concerns.

After reviewing hundreds of pages of documents, it appears the industry’s rebuttal to consumer concerns relies heavily on dismissing documented technical failures as mere anecdotes while asserting that encryption is necessary for the future of broadcast media.

The National Association of Broadcasters (NAB) has characterized reports of DRM failure—such as devices refusing to tune channels—as “early deployment friction” that does not justify stalling a national transition. They argue that individual complaints do not reflect systemic flaws. Yet, this stance contradicts the experience of users who have found that encryption often breaks the basic functionality of a television.

For instance, the A3SA, the body managing the encryption keys, argues that software-based devices require internet-based updates for bug fixes. This requirement introduces a significant dependency on internet connectivity for a medium that is marketed as being free and accessible over the air.

I recently demonstrated this vulnerability when an ADTH set-top box, which marketing materials claimed did not require an internet connection, failed to tune encrypted channels during a snowstorm. This inability to access weather information during an emergency challenges the industry’s assurance that content protection does not impede public safety messaging.

Beyond technical reliability, the industry posits that DRM is essential to combat piracy and secure content for sports broadcasting. The A3SA cited a media report claiming billions in losses due to piracy, yet the article in question focused on cable and streaming theft rather than the unauthorized capture of over-the-air signals.

Historically, DRM has been less about stopping piracy—which remains rampant despite encryption—and more about siloing users into specific hardware and software platforms. By making free over-the-air reception more difficult, broadcasters may be incentivizing consumers to stick with paid cable or streaming packages. Furthermore, claims that major sports leagues will withhold content without encryption are not supported by the current landscape, where broadcast contracts are being renewed for extended periods without such mandates being public.

There is also a significant question regarding the neutrality of the A3SA, which acts as the sole gatekeeper for approving tuning devices. While the organization claims to be neutral, it is comprised of major broadcast entities. This structure effectively allows the industry to pick winners and losers in the hardware market.

Manufacturers of popular gateway devices, such as Silicon Dust’s HDHomeRun, have been unable to secure certification under the current regime. The A3SA’s standards remain opaque and protected by non-disclosure agreements, preventing independent verification by even the FCC and effectively locking out devices that distribute signals across a home network to non-Android devices.

Ironically, while the industry argues that DRM protects consumers from the security risks of illicit streaming, the approved hardware itself presents security concerns. The ADTH box mentioned earlier was found to be running an Android security patch level from 2021, leaving it vulnerable to years of known exploits.

It seems unlikely the FCC will mandate a hard transition to ATSC 3.0 in the near term given the abysmal consumer adoption rates. The current ecosystem is too fragmented, and the cost and complexity of encryption have slowed adoption to a crawl.

And ultimately, consumers aren’t getting as much as they did during the prior transition. Back in the early 2000s, TV viewers went from analog standard-definition signals to digital high-definition ones – a huge jump in visual fidelity. While the improvement from ATSC 3.0’s HEVC video encoding is certainly noticeable for enthusiasts, I doubt most mainstream consumers will notice much of a change.

I believe a probable outcome is a “frozen conflict” where the FCC ends the simulcast mandate, allowing stations to voluntarily switch to 3.0 if they choose, while potentially authorizing more efficient video codecs like MPEG-4 for the existing ATSC 1.0 standard.

This would allow the legacy standard to improve and remain viable, effectively leaving ATSC 3.0 to succeed or fail on its own merits without a government mandate forcing consumers to upgrade. We may end up with a better-looking version of the television service we already have, while the next-generation standard struggles to find its footing.

The Nostalgia TV for Plex App Turns Your Plex Media into a Retro Cable TV Experience! (sponsored post)

My latest sponsored Plex post takes a look at Nostalgia TV, an independent application that provides an alternative user interface for Plex media servers. While not an official Plex product, the app utilizes the Plex API to connect to a user’s existing libraries and present content through a 1990s-style cable television interface. This allows users to view their own media files as a series of linear, “live” broadcast channels.

Check it out in the video here!

The application is currently available only for Android and runs on both mobile devices and television-based hardware. You can find it on the Google Play store here.

Setting it up is relatively straightforward compared to other similar tools; it does not require additional server-side installations like Docker. Once the app is pointed toward a Plex server and specific libraries are selected, it automatically generates a variety of themed channels. On the free tier, users have access to about five or six channels that match content based on library metadata, such as children’s programming or specific movie genres.

A pro version is available for a one-time fee of $20, which unlocks deeper customization and additional features. This includes the ability to add or remove channels, change the visual theme—ranging from a “Rad Lad” 80s monitor style to a more polished “Premium” look—and enable commercial breaks. These commercials are pulled from a user’s own designated Plex library and serve to pad the timing of shows so they stick to 15-minute scheduling increments.
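The 15-minute padding idea above is a neat little scheduling trick. Here is an illustrative sketch of it (not Nostalgia TV's actual code; the function name is mine): round each show's runtime up to the next 15-minute boundary and fill the remainder with commercials.

```python
import math

SLOT = 15 * 60  # scheduling increment in seconds

def slot_and_padding(runtime_seconds: int) -> tuple[int, int]:
    """Return (slot length, commercial padding), both in seconds,
    so the show ends on a 15-minute guide boundary."""
    slot = math.ceil(runtime_seconds / SLOT) * SLOT
    return slot, slot - runtime_seconds

# A 22-minute sitcom episode fills a 30-minute slot with 8 minutes of ads.
print(slot_and_padding(22 * 60))  # (1800, 480)
```

This is why the channel guide stays tidy even though home media files have arbitrary runtimes.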

While the core functionality of tuning between channels is notably fast, the application is in its early stages and is a bit buggy, especially when it comes to customization.

Configuration via a remote control can be cumbersome, though the app includes a local web remote feature that allows for easier channel editing through a web browser. Within this web interface, users can adjust content flow using methods like random shuffling, sequential blocks, or “cyclic” ordering to maintain episode chronology. Unfortunately it’s not currently possible to build an hour-by-hour schedule – the app fills in the channel guide automatically.

Beyond the interface itself, the app integrates with the standard Plex ecosystem by reporting playback status back to the server dashboard, supporting both direct play and transcoding when necessary.

This project serves as a practical example of the extensibility now possible through the Plex API. By leveraging the server’s existing handling of video playback and library management, independent developers are able to focus entirely on creating niche user experiences.

Next month’s video will showcase some things that I “vibe coded” using AI tools and connected to the Plex API. Stay tuned!

Disclosure: this is a paid sponsorship from Plex. However they did not review or approve this video or post prior to publication.

Geekom X16 Pro Laptop Review

Geekom, a company traditionally focused on desktop mini PCs, recently expanded its hardware portfolio to include laptops. My latest video review takes a look at their 16-inch model, the Geekom X16 Pro, to see how their engineering translates to a portable form factor.

Check out the video here!

The model I evaluated is now available on Amazon (compensated affiliate link) and I suspect pricing is going to fluctuate wildly due to memory supply constraints. The hardware configuration includes an Intel Core Ultra 9 185H processor, 32 gigabytes of soldered 7500 MHz DDR5 RAM, and a user-upgradable two-terabyte NVMe solid-state drive. Geekom also offers a 14-inch variant (compensated affiliate link) with an OLED screen and an Intel Core Ultra 5 125H processor at a slightly lower cost.

The visual output on my review unit is handled by a 16-inch IPS LCD panel with a 2560 by 1600 resolution and a 16:10 aspect ratio. The refresh rate reaches 120 Hz, and the display produces 400 nits of brightness while covering 100% of the sRGB color gamut.

The chassis is constructed entirely of metal and weighs 2.8 pounds, or 1.27 kilograms. The weight distribution allows the lid to be opened with one hand, which typically indicates thoughtful structural engineering. Inside, a generous 74.92 watt-hour battery provides approximately 12 to 13 hours of standard usage with conservative brightness settings.

Input devices presented a mixed experience during testing. The backlit keyboard features a full number pad, though the key travel feels a bit spongy. The trackpad design falls short of standard expectations for this price tier. A physical barrier separates the left and right click zones, rendering the center unclickable unless utilizing Windows’ tap-to-click software feature. The physical click mechanism also ceases to register past the vertical midpoint of the pad.

The laptop includes a 2-megapixel, 1080p webcam at 30 frames per second with a physical privacy shutter, which is sufficient for standard video conferencing but lags behind more established brands in visual quality. Security features include a fingerprint reader integrated into the power button, though facial recognition is absent.

Connectivity options are varied, featuring a USB Type-C port for charging and 10 Gbps data transfer, a full-size HDMI 2.0 port, and a USB 4.0 port capable of 40 Gbps data, display output, and power delivery. The right side houses a micro SD card slot, two 10 Gbps USB-A ports, and a headphone jack. Wireless connectivity relies on a Wi-Fi 6E radio, which achieved 800 megabits per second downstream and over one gigabit upstream on my multi-gig network. Audio is delivered via downward-firing stereo speakers that reach adequate volume levels due to chassis acoustics, though they lack low-end frequency response.

In terms of performance, the X16 Pro ships with Windows 11 Pro and the Geekom PC Manager software, which facilitates quick power mode adjustments and data cloning from previous Windows 10 machines. General web browsing and 4K 60fps video playback operated smoothly, yielding a Speedometer benchmark score of 29.4. For creative tasks, basic 4K 60fps video editing in DaVinci Resolve was responsive, but the integrated GPU struggled with advanced visual effects. Gaming performance on titles like Cyberpunk 2077 at 1080p on low settings hovered around 40 to 45 frames per second. The system scored 4,128 on the 3DMark Time Spy benchmark, placing its graphical capabilities roughly in line with older entry-level discrete GPUs from around 5 years ago.

Thermal management proved effective, passing the 3DMark stress test at 99.2% with a final CPU temperature of 57 degrees Celsius. This cooling capacity comes with an acoustic tradeoff, as the internal fan generates noticeable noise under load, rivaling levels typically found in gaming laptops.

For those interested in alternative operating systems, testing with the latest version of Ubuntu demonstrated comprehensive hardware compatibility, recognizing components like Wi-Fi, Bluetooth, and audio without issue.

The Geekom X16 Pro balances a capable processing package and substantial battery capacity within a very lightweight metal chassis. This is probably one of the lightest 16″ laptops I’ve looked at. While the core specifications offer dependable performance across general computing and light creative tasks, potential buyers will need to weigh these benefits against the limitations of the trackpad and the acoustic profile of the cooling system under sustained loads.

Disclosure: Geekom sent the laptop to the channel free of charge, however no other compensation was received and they did not review or approve this content prior to publication.

DSPico Review: An Affordable Flash Cartridge for Nintendo DS Handhelds

Twenty years after the release of the Nintendo DS, managing the system’s physical media presents an ongoing logistical challenge for users. In my latest retro video, I take a look at the DSPico, an open-source flash cartridge designed to boot digital copies of DS games directly on original hardware. These sell for around $20 on AliExpress (compensated affiliate link).

Check out the review here!

The device retails for approximately $20, with shipping bringing the total cost to around $30. It is built around a Raspberry Pi RP2040 microcontroller and features a USB-C port alongside an SD card slot. The pre-assembled model I examined shipped with an 8-gigabyte SD card. The hardware requires a FAT32 format, and while some users have reported software lockups when using larger 64-gigabyte and 128-gigabyte cards, developers are currently addressing these bugs through firmware updates on GitHub. Updating the firmware requires connecting the cartridge to a computer via USB-C and transferring the necessary files to the card.

The cartridge has full access to the SD card, so you can store ROM files in an organizational structure of your choice. The DSPico reads and writes save files to the SD card, and saves function just like they do on a regular cartridge. It does not, however, support save states like some Game Boy flash cartridges do.

The DSPico is compatible with the original DS, the DS Lite, the DSi, and the 3DS line. It only runs standard DS software, meaning it cannot be used to load 3DS-specific titles. On compatible hardware like the DSi and the 3DS, the DSPico also supports DSiWare titles that were originally distributed strictly over the Internet – currently the only flash cart to do so.

Beyond preservation of out of print games, the DSPico serves as a loader for homebrew projects. I tested a few community projects, including a 3D role-playing game currently in development called WolveSlayer and a port of Lemmings. Both games played without issues. LemmingsDS goes beyond just a ROM file and utilizes a process where secondary assets are pulled directly from the SD card after the initial ROM loads.

As the secondary market for physical DS cartridges continues to experience price inflation, the DSPico presents a functional method for accessing older software libraries without requiring the original media. Because the entire project is open-source, individuals have the choice to assemble the hardware themselves using the public repository or purchase pre-manufactured units from existing suppliers. The active development surrounding the device suggests that this two-decade-old handheld platform will remain accessible for the foreseeable future.

Disclosure: The DSPico was provided free of charge by the Aliexpress seller linked above. No other compensation was received and they did not review or approve this content prior to publication.

Your ISP Is Spying On You..

Recently, I reviewed a 2021 Federal Trade Commission report detailing the data collection practices of six internet service providers. The report examined AT&T, Verizon, T-Mobile, Google Fiber, Comcast Xfinity, and Charter Communications (Spectrum). It found that standard consumer privacy measures, such as web browser tracking protections, are ineffective against ISPs because many utilize a “supercookie” to persistently track network activity.

In my latest video, I dive into this topic and look at what you can do to stop this data collection.

Because households share a single internet connection, this tracking encompasses all users on the network, including children. ISPs gather information by observing the websites a household visits, the frequency and duration of those visits, and the amount of data transferred. Providers can send a user’s IP address to an ad affiliate, who then passes it to a data broker to build an informational profile. This data extends beyond basic demographics, categorizing users by religious affiliation, ethnicity, and political leanings.

The sale of this information presents distinct privacy risks. Beyond targeted advertising, the FTC report indicates that scammers can purchase access to these profiles. Additionally, a 2019 Motherboard report revealed that bounty hunters were able to buy customer location data originating from AT&T, T-Mobile, and Sprint phones. Despite these practices, consumer engagement with ISP privacy policies remains low. The FTC found that the provider with the highest engagement saw only 6.7 percent of subscribers look at their privacy pages.

I examined my own provider, Comcast Xfinity, to understand their specific policies. Comcast stated in a 2017 blog post and on their current privacy pages that they do not sell personal information without affirmative opt-in consent. However, agreeing to their terms of service during the initial account sign-up functions as that consent.

Navigating Comcast’s privacy section reveals numerous documents and a complex process for managing data disclosures. Users can opt out of certain disclosures, such as participation in audience measurement or personalized ads, but the application of these settings to broader tracking methods is ambiguous.

The ability to view, change, or delete the specific data an ISP holds depends heavily on state laws. For residents in states with applicable laws, Comcast provides a form to request a download of stored data, which includes account information, behavioral inferences, and details about telecommunication usage.

I submitted a data download request over a week ago, a process Comcast notes can take up to 30 days to fulfill. Until comprehensive federal regulations are established, the responsibility remains on the individual subscriber to navigate these varied settings and actively opt out of data collection.

I will be back with an update once Comcast hands over my data. Stay tuned!

Gadget Haul 13! Projectors, Handheld Gaming, Chargers and an Apple Watch iPod

I recently gathered a collection of consumer electronics, ranging from repurposed legacy hardware to a new Anker projector, to evaluate their utility and performance. Check out the full list here (compensated affiliate link – all others below are too).

Check it out in my latest video!

The first item, the RePod, functions as a chassis designed to repurpose an older Apple Watch into a standalone music player resembling an iPod. The device features a physical scroll wheel that mechanically engages the watch’s digital crown, allowing for list navigation, though the center button is non-functional, requiring users to touch the screen for selection.

It accommodates 44-millimeter watch models and permits charging via the standard Apple magnetic charger through an exposed rear port. While it removes fitness tracking capabilities due to the lack of wrist contact, the metal enclosure offers a viable use case for retired hardware, provided the user disables the watch’s locking mechanism to avoid repetitive passcode entry.

For power management, I tested the Anker Prime 3-in-1 wireless charging station, a foldable unit sent by the manufacturer that supports the Qi2 charging standard. The station includes a magnetic pad for phones, a pop-out Apple Watch charger, and a base for AirPods or other Qi-compatible devices. The main wireless charger can deliver up to 25 watts to compatible devices. To manage thermal output during high-speed charging, the unit incorporates an active cooling fan, which is very quiet but can be toggled off via a capacitive button. The package includes a power adapter and features a weighted base with rubber footing to maintain stability during use.

Addressing connectivity over longer distances, I evaluated a 5-meter fiber optic USB-C cable. Unlike standard copper cables, which often suffer signal degradation at this length, this bidirectional optical cable supports 10 gigabit per second data transfer and 60 watts of power delivery. In my testing with a fast external drive and Blackmagic’s Disk Speed Test software, the cable maintained read and write speeds comparable to shorter interconnects, hovering around one gigabyte per second. However, potential users should note that the optical design does not support DisplayPort alt mode, rendering it unsuitable for video transmission to monitors.
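For readers who don’t have a dedicated benchmarking tool handy, a rough sequential-write throughput check can be scripted. The sketch below is my own illustration, not the method used in the review: the function name, test path, and test size are all placeholders, and OS caching means results are only a sanity check, not a proper benchmark.

```python
import os
import time

def write_throughput_mb_s(path: str, size_mb: int = 256) -> float:
    """Time a sequential write of size_mb megabytes and return MB/s.

    A rough sanity check only: we flush and fsync before stopping the
    clock so the OS page cache doesn't wildly inflate the number.
    """
    chunk = b"\0" * (1024 * 1024)  # 1 MiB buffer
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force data to the device before timing ends
    elapsed = time.perf_counter() - start
    os.remove(path)  # clean up the test file
    return size_mb / elapsed

# Example: point this at a file on the drive under test, e.g.
# print(f"{write_throughput_mb_s('/Volumes/ExternalSSD/test.bin'):.0f} MB/s")
```

On a drive connected through a 10Gbps link you would hope to see this approach the one-gigabyte-per-second range observed in my testing, though real numbers depend heavily on the drive itself.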

Moving to visual media, I examined the Soundcore Nebula P1i smart projector, an entry-level LED unit with a brightness rating of 380 ANSI lumens, necessitating a dark environment for optimal viewing. A distinguishing feature is its pair of rotatable 10-watt speakers, which can be oriented to project sound forward, backward or upward. The device runs a certified version of Google TV, ensuring native support for streaming applications like Netflix, though the interface demonstrated some sluggishness during navigation. Regarding gaming latency, high-speed camera tests revealed an input lag of approximately 20 to 22 frames at 240 frames per second; while not comparable to a dedicated monitor, this result indicates very good performance for casual gaming within the projector category.
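To put those high-speed camera counts into more familiar units, a frame count at a known capture rate converts directly to milliseconds. This small helper (the function name is mine, not from the video) does the arithmetic: 20 to 22 frames captured at 240 fps works out to roughly 83 to 92 milliseconds of input lag.

```python
def frames_to_ms(frames: float, capture_fps: float = 240.0) -> float:
    """Convert a frame count from high-speed footage into milliseconds."""
    return frames / capture_fps * 1000.0

low = frames_to_ms(20)   # ≈ 83.3 ms
high = frames_to_ms(22)  # ≈ 91.7 ms
```

The same conversion works for any capture rate, which is handy when comparing lag tests shot at 120, 240, or 960 frames per second.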

Finally, the haul included the AYANEO Pocket Air Mini, an Android-based handheld gaming device featuring a 4:3 aspect ratio IPS display which is well suited for retro titles. The hardware utilizes Hall effect joysticks and triggers, providing precise control without drift. Performance is driven by a MediaTek Helio G90T processor, which I found sufficient for emulating consoles up to the Sega Dreamcast era. Attempts to run PlayStation 2 or GameCube titles resulted in inconsistent frame rates, and the internal fan became intrusive when high-performance modes were engaged. The device supports memory expansion via microSD and includes a 3.5mm headphone jack, with the 3-gigabyte RAM model offering slightly better headroom for operations than the base model.

As I continue to acquire these types of items, the compilation format appears to offer a more efficient method for covering the steady influx of consumer technology accessories and niche devices. I will continue to separate individual segments for my Gadget Picks channel, but for now, this consolidated approach allows for a broader survey of the current gadget landscape.

See more hauls here!

Disclosure: The projector and charging station came in free of charge from Anker. The cable came in free of charge through the Amazon Vine program. No other compensation was received and no one reviewed or approved this content prior to uploading. I paid for the gaming handheld and the RePod with my own funds.

GL.iNet Comet Remote KVM Review (GL-RM1)

I picked up the GL.iNet Comet KVM (compensated affiliate link) the other day, an entry-level remote KVM device designed to provide hardware-level access to computers and other HDMI-enabled equipment.

You can see it in action in my latest review!

I purchased this unit to facilitate remote administration without relying on software-based solutions. Unlike traditional remote desktop applications, the Comet captures the HDMI output from a target device and emulates keyboard and mouse input via USB, allowing for control through a standard web browser. Because it operates independently of the host machine’s operating system, it provides access to the BIOS and functions even when the target computer is not fully booted.

The hardware setup is straightforward, though it requires a wired Ethernet connection as this specific model lacks Wi-Fi capabilities. The device features an HDMI input, a USB-C port for keyboard and mouse emulation, a USB host port for external storage, and an Ethernet port for network connectivity. It is powered via a separate USB-C connection. During my testing, I connected the Comet to a headless mini PC. The device successfully emulated the peripherals, allowing me to navigate the BIOS and initiate a Windows boot sequence remotely from a Mac browser.

A notable feature of the Comet is its independence from mandatory cloud services. While a cloud option exists, the device does not bind itself to external servers by default, offering users greater control over their data privacy. For remote access outside the local network, the unit supports Tailscale and ZeroTier, allowing for secure VPN connections without opening firewall ports. However, users accessing the device via a browser may encounter security warnings caused by its self-signed certificate, a configuration issue that lacks clear documentation for resolution.

In terms of performance, the Comet handles video streaming adequately for administrative tasks, with a latency of approximately 30 to 40 milliseconds on a local network. This delay makes it unsuitable for fast-paced gaming, though it supports resolutions up to 4K at 30Hz, with 1080p at 60Hz being the standard configuration. Audio pass-through is supported but must be manually enabled in the settings. The interface also includes a Wake-on-LAN feature, which can identify and boot compatible devices on the network.

File transfer capabilities are present but limited. The device utilizes a virtual media mounting system where files are uploaded to the Comet and then presented to the target computer as a USB drive. Transfer speeds are restricted by the USB 2.0 interface, resulting in slower performance for larger files. Additionally, the mobile experience is currently suboptimal; the browser interface on tablets is difficult to navigate, and the dedicated mobile app requires a cloud account, which contradicts the self-hosted preference of many users.

I also tested the device with non-standard hardware, specifically a MiSTer FPGA retro gaming setup. The Comet successfully allowed for remote control of the interface and basic operation of emulated systems, although mouse alignment and clipboard pasting were inconsistent.

Despite some rough edges, the device serves its intended purpose effectively, particularly for scenarios where installing remote desktop software is impractical or would interfere with performance benchmarking. I intend to integrate this tool into my workflow for managing test units remotely around the house and when I’m not at home.

ATSC 3.0 DRM Opponents Make Their Case to the FCC

The transition from the current over-the-air television standard to NextGen TV, or ATSC 3.0, continues to generate significant debate, particularly regarding the decision by many broadcasters to encrypt their signals.

In my latest video, I take a look at the filings from organizations and individuals opposing the implementation of Digital Rights Management (DRM) on the public airwaves.

This issue moved from theoretical to practical for me recently during the Super Bowl. I was unable to tune into the game over the air because my local NBC affiliate had encrypted their channel, and the legacy ATSC 1.0 signal was unreliable at my location, forcing me to stream the event instead.

I submitted my own filing to the FCC docket, effectively mirroring the arguments I raised in my prior video on this topic regarding the industry’s justification for encryption. To circumvent file size limitations on the docket, I attached a PowerPoint presentation with embedded video evidence, a method that allows for the submission of multimedia documentation under the 100-megabyte limit. This approach is useful for anyone wishing to demonstrate the real-world impact of these restrictions, such as devices failing to decrypt channels they are theoretically certified to receive.

One of the most comprehensive filings came from Public Knowledge, a consumer advocacy group. They commended the FCC for scrutinizing the issue but raised substantial concerns about the A3SA, the authority managing the encryption program. Public Knowledge argued that the A3SA operates without meaningful external oversight, maintaining confidential licensing terms and opaque decision-making processes. They contend this entity acts as a private gatekeeper to the public airwaves without accountability to consumers or public interest stakeholders.

Public Knowledge also highlighted the potential for consumer confusion arising from the current certification regime. There are now two distinct logos for consumers to navigate: the NextGen TV logo and the A3SA logo. A device might carry the NextGen TV certification, like the HDHomeRun gateway I use, yet lack the ability to decrypt content. Conversely, a device like the Zapperbox may have A3SA certification for decryption but lack the NextGen TV designation. During a recent visit to a major electronics retailer, I observed that neither logo was displayed on television sets that support the new standard, suggesting that this certification system has yet to effectively reach the consumer marketplace.

Furthermore, Public Knowledge drew a parallel between the current situation and the “broadcast flag” rule from the previous digital transition. They argued that the A3SA certification requirements essentially function as a new, more sophisticated broadcast flag, allowing broadcasters to dictate which devices can receive programming and potentially restricting recording capabilities. They also reminded the Commission that the FCC’s 2017 order to begin the ATSC 3.0 transition emphasized that encrypted programming should not require special equipment supplied by the broadcaster, a standard the current regime may be failing to meet.

Opposition also came from within the broadcast industry itself. Weigel Broadcasting, which operates stations reaching a vast majority of US households, filed comments expressing concern over the direction taken by larger broadcasting consortiums. Weigel presented evidence suggesting that some competitors view the new standard primarily as a vehicle for monetization, such as integrating gambling platforms or treating the spectrum as a financial asset rather than a public service. They acknowledged that the current implementation of DRM has created adoption hurdles and suggested that if encryption must exist, it should not require a persistent internet connection—a requirement that has already caused functionality issues with some commercially available tuners as noted in my prior video.

The Consumer Technology Association (CTA), which represents device manufacturers, also weighed in. While their filing focused largely on opposing a mandate for ATSC 3.0 tuners in all televisions, they acknowledged the friction caused by DRM. This is a complex position for the CTA, as the encryption technology being used is owned by Google, a major industry player and CTA member, yet the implementation is harming other member companies like SiliconDust. Their filing recommends that the Commission continue to monitor the intersection of DRM and the new standard, a notable admission from an organization that typically advocates against government intervention in its industry.

Similarly, the NCTA, representing cable and internet providers, cited encryption as a complicating factor that adds cost and technical challenges to the transition. They argued that these complexities support their stance against a forced transition to the new standard, noting that the need to support new audio and interactive formats is already a heavy burden without the additional layer of decryption requirements.

For those who have experienced issues with encrypted channels or malfunctioning hardware, the opportunity to place these experiences on the record is closing. The reply deadline for this docket is February 18. Under FCC rules, new filings at this stage must be in direct response to arguments already present in the record. This provides a narrow window for consumers to submit evidence countering the claims made by broadcasters, such as documenting instances where “offline” DRM failed to function as advertised. The record is currently being shaped by these final arguments, and the volume and specificity of these replies may influence the Commission’s next steps.

You can get more information about how to file here. I also did a video on the topic here.

Is the 2022-2026 MacBook Air the Greatest Laptop of All Time?

Typically, purchasing a laptop involves a compromise. If the budget is limited, one usually has to sacrifice performance, battery life, or portability. Finding a machine that adequately addresses all three requirements is rare, yet over the last few years, my 2022 MacBook Air M2 has largely managed to balance these competing needs. Despite the release of newer models, this device remains a significant benchmark for what a portable computer can achieve – and new versions cost less than the one I bought almost four years ago. Check out current offerings on Amazon (compensated affiliate link).

I take a deeper dive in my latest video.

Looking back at the hardware after nearly four years of daily use, the durability is notable. While there is some minor cosmetic wear—specifically some color rubbing off on the sides and the accumulation of oil on the keyboard—the metal chassis has held up against standard knocks and bumps. The display has maintained its brightness without flickering, and the keyboard, a departure from Apple’s troubled butterfly mechanism, remains fully functional with no stuck keys. Weighing in at roughly 2.7 pounds, the device is balanced enough to be handled with one hand, a feature that aids its portability.

From a port standpoint, the inclusion of the MagSafe charging connector was a practical decision. It frees up the two Thunderbolt ports for peripherals and prevents the laptop from being pulled off a surface if the cable is snagged. While the computer side of the MagSafe cable is proprietary, the other end is standard USB-C. The Thunderbolt ports will still charge the laptop when using a desktop docking station.

The primary limitation regarding connectivity remains the inability to natively drive two external displays, a feature reserved for the “Pro” tier devices. However, for a single-monitor setup, the clamshell mode functions effectively as a desktop replacement.

When I originally purchased this unit, I opted for the 16GB RAM configuration rather than the base 8GB, a decision that appears to have contributed significantly to the machine’s longevity. Interestingly, a comparable configuration today—equipped with the newer M4 chip—actually costs approximately $400 less than what this M2 model cost in 2022. While the new chips offer performance gains, the 10-core GPU in this older model still handles demanding tasks competently.

Battery performance has been perhaps the most consistent aspect of the ownership experience. Across extensive travel and full days of conferences, I have yet to encounter a low-battery notification during standard operational hours. Even after approximately three and a half years and 364 charge cycles, the battery has retained about 89% of its original health. This endurance persists even when the machine is subjected to heavier workloads that typically drain portable devices quickly.

Regarding those workloads, the machine handles 4K video editing at 60 frames per second without significant friction. Using Final Cut Pro, scrubbing through footage and rendering effects happens almost instantaneously. It is a level of responsiveness often absent in lower-end Windows laptops running similar software like DaVinci Resolve. While I did not purchase this machine specifically for video production, it has proven capable of serving as a mobile editing station when I need to travel light.

The architecture also supports robust virtualization. Using UTM, I have been able to run the ARM version of Windows 11 alongside Ubuntu Linux, and even emulate older environments like Mac OS 9 and Windows 95 simultaneously. The performance is stable enough to browse the web within the virtualized Windows environment or run office applications in Linux without noticeable slowdowns.

Gaming on Apple Silicon has also evolved. With titles ported to the native architecture, performance on a fanless laptop is surprisingly viable. Running Cyberpunk 2077 on low settings yields a steady 30 frames per second. While it doesn’t reach the high frame rates of a dedicated gaming rig, it offers a playable experience for casual sessions. The lack of active cooling means the system might throttle under sustained load, but I have not observed significant performance drops during use.

Finally, the device shows promise with local AI workloads. In the video I demoed the Locally app, which connects to open-source models like Gemma. My aging laptop, which was released a few months before the commercial introduction of ChatGPT, processes queries with reasonable speed. While newer chips are further optimized for these tasks, the unified memory architecture allows this older model to handle basic language models and light automation without excessive memory or processing penalties.

Given its sustained performance across varied tasks—from virtualization to media creation—I see no urgency to upgrade to the M4 generation. The M2 MacBook Air continues to function as a reliable, well-constructed tool that meets daily professional demands. For those who can find this model on the secondary market or on sale, it represents a hardware investment that still offers substantial utility years after its initial release.

GMKTec K15 Mini PC Review

I recently received the new GMKTec K15, marking my first mini PC review of 2026. If I had to characterize this device with a single analogy, I would describe it as the Toyota Camry of its category: It is neither a stripped-down budget device nor a high-end powerhouse; rather, it occupies a functional middle ground. You can find it on Amazon here (compensated affiliate link).

See it in action in my latest review!

The system is built around the Intel Core Ultra 125U processor from the Meteor Lake family. This chip features a 12-core architecture—comprising two performance cores, eight efficiency cores, and two low-power efficiency cores—delivering a total of 14 threads. My unit arrived equipped with 32 GB of DDR5-4800 RAM and a 1 TB NVMe SSD. The current price sits higher than it otherwise would due to volatile memory prices; if those pressures ease, it should sell for less.

Despite the cost, the expandability is notable; the system supports up to 96 GB of RAM and features three NVMe slots, which is generous for a device of this footprint.

Connectivity is a strong suit for the K15. The front panel includes a 10Gbps USB-C port and three USB-A ports. The rear I/O offers a 40Gbps USB4 port, which is Thunderbolt compatible, dual 2.5GbE Ethernet ports, and an OCuLink port. The OCuLink addition is particularly useful for those interested in external GPUs, as it connects directly to the PCIe bus, offering superior bandwidth compared to USB4. During my tests, the Wi-Fi 6 chipset performed well, maintaining speeds close to gigabit levels, and the variety of ports suggests this unit could easily be repurposed as a home server.

In terms of daily performance, the K15 handles standard desktop workloads efficiently. Web navigation is snappy, and 4K video streaming presented no issues aside from the expected minor frame drops upon initial loading. Content creation capabilities, however, have a clear ceiling. When editing 4K video in DaVinci Resolve, simple cuts and transitions were smooth, but the system bogged down significantly when attempting complex color grading or heavy effects. It is serviceable for basic edits, but anything more demanding would necessitate an external graphics solution.

Gaming performance aligns with the limitations of the integrated graphics and the reduced GPU performance of the 125U compared to the higher-end 125H. Testing Cyberpunk 2077 at 1080p with the lowest settings resulted in frame rates hovering between 25 and 30 frames per second. It’s certainly playable, but it lags behind some higher-end mini PCs. While it struggles with modern, graphically intensive titles, it is perfectly adequate for older games or emulation. Thermals were well-managed throughout these stress tests; the CPU temperature stayed around 43°C, and the fan noise was minimal, likely due to a larger fan design that moves air efficiently at lower RPMs.

The device arguably shines brightest when running Linux. My experience with the OS was seamless, with all hardware—including Wi-Fi and Bluetooth—detected immediately. The system felt more responsive on Linux than on Windows, which has become increasingly bloated. Between the stable performance, the quiet operation, and the extensive storage options, the K15 stands out as a sensible, if modest, choice for a reliable workstation.

Disclosure: GMKTec sent the K15 to the channel free of charge but no other compensation was received. They did not review or approve my review prior to publication and all opinions are my own.

This Was the Best Selling Game Console of 1976

To commemorate my upcoming 50th birthday, I acquired a piece of technology that shares my birth year: the Coleco Telstar, a video game console released in 1976. It’s the subject of my latest retro video!

I purchased this device for a local historical society project celebrating the United States’ 250th year, intended to demonstrate to younger generations what home entertainment looked like when the country turned 200. The unit, a Pong clone, was manufactured by Coleco, formerly known as the Connecticut Leather Company, making it quite relevant for a local Connecticut historical society!

This specific model, the 6040, was the first edition released by Coleco. Its market success was largely due to its price point; while competitors like the Magnavox Odyssey and Atari’s Pong console retailed for approximately $100, the Telstar launched at just $50. Adjusted for inflation, that $50 price tag is roughly $290 today. This aggressive pricing strategy helped the company sell over a million units, a figure surpassed only by a Nintendo Pong clone sold exclusively in the Japanese market.
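The inflation adjustment above is a simple ratio of consumer price index values. As an illustration only: the CPI figures below are approximate values I’m supplying, not numbers from the post, so the result lands near (not exactly on) the roughly $290 figure cited.

```python
def adjust_for_inflation(amount: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical dollar amount by the ratio of two CPI index values."""
    return amount * (cpi_now / cpi_then)

# Approximate annual-average CPI-U index values (assumed for illustration):
CPI_1976 = 56.9
CPI_2025 = 321.5

telstar_today = adjust_for_inflation(50, CPI_1976, CPI_2025)  # roughly $280-$290
```

The same two-line helper works for any of the 1970s console prices mentioned here, such as the roughly $100 Odyssey and Pong machines.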

Internally, the device is distinct from modern consoles as it lacks a central processing unit. Instead, it operates using a dedicated chip, General Instrument’s AY-3-8500, which has the game logic hardcoded directly into its circuitry. Because the software is fixed on the chip, the system is not programmable. It generates sound through a built-in speaker rather than the television set and connects to displays via an analog RF connector, originally designed to work with a switch box on the VHF band’s channel 3. While a power connector was available as an add-on, the device was primarily intended to run on six C batteries.

The gameplay experience is controlled by knobs that move paddles on the screen, with a difficulty slider available to adjust the game mechanics. The console features three variations: a standard tennis-style Pong game, a single-player handball mode, and a hockey game where players control both a goalie and a forward. Upon testing this specific unit, I noted several functional issues consistent with its age, including a stuck game selector switch and a malfunctioning difficulty slider that fails to resize the paddles correctly on the “pro” setting.

This device represents the entry of Coleco into the video game market, a venture that eventually led to the release of the legendary ColecoVision console and the less successful ADAM personal computer. The Telstar remained on the market for approximately two years before the company shifted focus to handheld games and programmable consoles. It serves as a historical marker for home gaming in 1976, predating the significant technological leap that occurred just a decade later with the introduction of titles like The Legend of Zelda.

How I’m Using Plex in 2026 (sponsored post)

I’ve been using Plex for well over a decade now, long before any sponsorships entered the picture, and it remains the backbone of how I manage and watch my media at home and on the road. As a point of disclosure, this video and the transcript it’s based on are part of a paid sponsorship with Plex, but they did not review or approve the content beforehand.

My current Plex server runs on Unraid, which has proven to be a flexible choice that makes installing the Docker version of Plex super easy. Right now, the server itself is a small Beelink ME Mini NAS/PC paired with a USB-connected multi-bay SATA enclosure. It’s not a particularly elegant setup in terms of cabling, but it’s been reliable.

One of the reasons I’ve stuck with Unraid is how easy it is to migrate from one machine to another. Moving from an earlier NAS box with thermal issues to the current setup was simply a matter of transferring the Unraid external boot drive and disk array. The system came back online without any configuration drama, which makes incremental upgrades far less painful.

The processor in this server is a low-power Intel N150, and in practice it has been more than sufficient. It handles multiple Plex transcodes at once and still leaves enough headroom for other Docker containers I run alongside it. That experience has reinforced my view that you don’t need particularly powerful hardware for a small, well-tuned home server so long as your processor supports hardware transcoding. The Intel N100 and N150 chips are available in many affordable mini PCs and entry-level NAS devices.
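On Linux hosts like Unraid, Intel Quick Sync transcoding is exposed to Plex through a VAAPI render node under /dev/dri. A quick way to confirm the GPU is visible is to list those nodes; this is my own sketch using the standard Linux device path, and it’s only meaningful when run on the server itself.

```python
import glob

def vaapi_render_nodes() -> list:
    """List VAAPI render nodes (e.g. /dev/dri/renderD128) on a Linux host.

    An empty list on the Plex server usually means hardware
    transcoding (Quick Sync) won't be available to the container.
    """
    return sorted(glob.glob("/dev/dri/renderD*"))

nodes = vaapi_render_nodes()
print("Hardware transcode device(s):", nodes or "none found")
```

When Plex runs in Docker, the /dev/dri device generally also has to be passed through to the container for the node to be usable inside it.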

I also maintain a second Plex server offsite at a family member’s house, running on a Synology NAS. That system serves double duty as a test bed and as an offsite backup destination, giving me control over where my data lives. To connect everything together securely, I rely on Tailscale. It allows me to access my servers remotely without exposing them directly to the internet, and I can limit access to specific people and devices. That balance between convenience and security has worked well for my use case.

Most of my serious viewing happens at home, particularly higher-bitrate Blu-ray rips that I watch in my home theater. That setup centers around an older LG OLED television paired with an Nvidia Shield from the 2019 generation. Despite its age, the TV still delivers excellent image quality, and the Shield handles Dolby Vision playback from both streaming services and locally ripped discs.

With proper audio passthrough enabled, lossless Dolby Atmos tracks make it from the server to the sound system untouched, which is exactly what I want for that kind of content. I also enable refresh-rate switching so films play back at their native 24 frames per second, avoiding unnecessary judder.

Over time, I’ve built up a sizable library, and lately I’ve found myself revisiting older television series. Plex’s ability to shuffle episodes has become a surprisingly useful feature, especially for shows I know well and don’t feel the need to watch in order. It turns familiar series into something closer to background comfort viewing, without much thought required.

Live TV is another part of my setup, using an HDHomeRun tuner integrated into Plex. I can mix over-the-air channels with streaming channels in a single guide, and when I’m traveling, I can even watch my local channels remotely. Plex doesn’t currently support ATSC 3.0 broadcasts due to encryption and audio codec limitations, so recordings are limited to ATSC 1.0. I also handle actual recording through the HDHomeRun app, with Plex pointed at the directory where those recordings are stored so both systems can access them.

One of the more recent additions to my workflow is Plex’s watch list feature. When I hear about a show or movie that sounds interesting, I add it to the list from my phone. Later, when I sit down to watch something, Plex shows me not just the title but where it’s available, whether that’s on my own server, a friend’s server, or a streaming service. It’s a practical way to reduce the time spent deciding what to watch, especially when free time is limited. The same interface also surfaces trailers and upcoming episode release dates, which acts as a lightweight reminder system.

Music is handled through Plex as well. I’ve been slowly ripping decades’ worth of CDs into lossless files, which now live alongside my video library. Most listening happens through the Plexamp app on my phone, both at home and remotely. For travel, I’ll download albums or playlists directly to the device. While wireless headphones limit some of the benefits of lossless audio, using wired headphones makes a noticeable difference, especially on long flights.
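For anyone tackling a similar ripping project, a minimal batch-encode sketch might look like the following. The library path is a placeholder, and it assumes the `flac` encoder is installed (it skips cleanly if it isn't):

```shell
# Placeholder library path -- point this at your ripped WAV files.
MUSIC_DIR="$HOME/Music/rips"
mkdir -p "$MUSIC_DIR"

if command -v flac >/dev/null 2>&1; then
    # Encode each WAV losslessly; flac writes a .flac next to each original.
    for wav in "$MUSIC_DIR"/*.wav; do
        [ -e "$wav" ] || continue   # no WAVs yet? do nothing
        flac --best "$wav"
    done
else
    echo "flac encoder not found -- install it before batch-encoding"
fi
```

Since FLAC is lossless, the encoding level (`--best` here) only trades CPU time for file size; the decoded audio is bit-identical either way.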

Speaking of travel, the download feature has also been useful for loading TV episodes onto a tablet before trips using the Plex mobile client, letting me watch without relying on in-flight connectivity.

Looking back, Plex has stayed in my workflow because it’s made managing and accessing my media more straightforward. It brings together local files, live TV, and streaming discovery in a way that reduces friction rather than adding to it. For me, that efficiency is the real value, and it’s why the system I set up years ago continues to evolve rather than being replaced.

GeForce Now Gets a Linux Client

In my latest video, we revisit GeForce Now and take a look at the new official Linux client for Nvidia’s game streaming service.

This release is not as feature-heavy as some previous updates, but it represents a meaningful change for Linux users who until now have primarily relied on browser-based access to the service. This follows a Steam Deck client that I took a look at recently.

GeForce Now is a subscription-based service that streams games users have already purchased from platforms such as Steam, GOG, Epic Games Store, Ubisoft, EA, and certain Xbox titles with PC versions. Xbox PC Game Pass titles can also be accessed if the user has an active subscription. Not every game in a user’s library is supported, as developers must opt in to cloud streaming, but the catalog covers many well-known titles. In my previous update we also looked at Nvidia’s new feature that allows users to install unsupported games.

Because GeForce Now runs games remotely, client-side hardware requirements are relatively modest. To test the client, I ran it on a very low-end system: an inexpensive mini PC powered by an Intel N100 processor. This is the type of super-low-cost hardware that simply can’t run modern games, which makes it useful for evaluating how much value a game streaming service can add. Back before the RAM crisis, this PC was selling for well under $200.

The service is offered in multiple tiers. The free tier supports up to 1080p at 60 frames per second, includes advertisements, limits sessions to one hour, and places users in a queue for access. The Performance tier increases resolution to 1440p at 60 frames per second, while the Ultimate tier offers access to higher-end GPUs in the cloud, enabling resolutions up to 5K and frame rates as high as 240 frames per second on supported games. Both paid tiers include a monthly cap of 100 hours, with additional time available for purchase once that limit is reached.

The install process for the new Linux client is functional but still feels a bit rough around the edges, particularly compared to more polished platform-native installers. The new client is designed for x64-based PCs and is currently targeted at Ubuntu 24.04 LTS. Installation is handled through a Flatpak package downloaded directly from Nvidia rather than through a distribution’s package manager. After downloading, the installer needs to be marked as executable before it can be run.
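For reference, the manual steps look roughly like this. The installer filename is a placeholder (use whatever Nvidia's download page gives you), and the `touch` line just creates a stand-in file so the sketch is self-contained:

```shell
# Hypothetical installer name -- substitute the actual file from Nvidia's site.
installer="GeForceNOW-install.run"
touch "$installer"      # stand-in for the downloaded file in this sketch

# The download isn't executable by default, so mark it before running it.
chmod +x "$installer"

# Then launch it (e.g. ./GeForceNOW-install.run) to install the Flatpak package.
ls -l "$installer"
```

It's a small extra hurdle compared to installing from a distribution's app store, which is part of why the process still feels rough around the edges.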

After installing the Linux client, I launched No Man’s Sky from my Steam library. As with other GeForce Now clients, the service spins up a remote PC instance running Steam, which allows cloud-synced save files to load automatically. In this case, my existing save was available without any additional steps.

Running at 4K and 60 frames per second, the game performed smoothly on the low-cost mini PC. Network statistics showed a latency of around 11 milliseconds from my home in Connecticut to Nvidia’s New Jersey data center. The system was connected via Ethernet, which remains the recommended way to use the service given the bandwidth demands of high-resolution game streaming, but a decent Wi-Fi 6 or 7 access point should deliver adequate performance on a wireless connection as well.

I also tested the client earlier on a 1080p, 144 Hz display and was able to exceed 60 frames per second without issue, despite the limited client hardware. While the Linux client currently lacks support for features such as HDR and cloud-based G-Sync, it does support server-side options available to higher-tier subscribers, including DLSS and hardware ray tracing for compatible games.

There are some usability issues to note. Display scaling was not respected on a 4K desktop set to 200 percent scaling, resulting in very small interface elements. The interface also felt slow and clunky on my low-end hardware, though thankfully the clunkiness went away once a game was loaded up.

Overall, the Linux client delivers a more consistent experience than running GeForce Now in a browser and makes the service more accessible to users who have adopted Linux as their primary operating system. For those with lower-end hardware, it provides a way to run demanding games using remote resources, with performance that is largely dictated by network quality rather than local specifications.

ADTH’s ATSC 3.0 Box Woes Kill the Industry’s Arguments Regarding Over-the-Air TV Encryption

I’ve been spending the last few days reading through the filings in the FCC’s ATSC 3.0 docket now that the comment period has closed, trying to understand how broadcasters, device makers, and industry groups are framing the next phase of the over-the-air television transition.

While I was doing that, I went upstairs to check on my own ADTH tuner, a device that’s supposed to handle encrypted ATSC 3.0 channels without needing an internet connection. It wasn’t working. Encrypted channels wouldn’t tune at all, and the box was throwing content protection errors that hadn’t been there before.

In my latest analysis piece, I talk about how widespread problems with this box tuning encrypted channels popped up just as the industry was saying there were no concerns with DRM.

That problem sent me down a familiar path. ATSC 3.0 is the planned successor to today’s ATSC 1.0 broadcast standard, and on paper it brings technical improvements. In practice, the transition has been complicated by broadcasters choosing to encrypt free, over-the-air signals. That decision has narrowed consumer choice and added layers of complexity that simply didn’t exist before. The industry’s assurances that this system is mature and reliable don’t line up with what I’m seeing in my own home.

One of the filings I reviewed came from ADTH itself. The company strongly supports the transition and argues that there are no real technical barriers to consumer devices receiving encrypted broadcasts. Encryption and digital rights management, they say, are routine in modern electronics.

That’s hard to square with my experience. After repeated errors, I tried a factory reset. Instead of fixing anything, the device dropped into a boot loop, endlessly scanning channels and rebooting. Even with an internet connection restored, it refused to recover. At that point it stopped being a TV tuner and effectively became a brick.

What made this more than a minor inconvenience was timing. We were in the middle of a significant snowstorm, the kind of situation where over-the-air television has historically been a reliable source of local information. Because the encrypted channels wouldn’t tune, that information simply wasn’t available on this device. And this doesn’t appear to be an isolated issue. I’ve heard from viewers and seen reports on Reddit and AVS Forum from people around the country whose boxes stopped working around the same time. Some even reported that disconnecting the internet made their tuners work again, which raises uncomfortable questions about how these systems are actually operating.

At the same moment consumer devices were failing, the group that oversees the encryption system, the A3SA, told the FCC it has seen no evidence of approved devices failing to work with encryption. They also suggested that any reported issues are generally resolved with firmware updates. That response glosses over a basic problem: firmware updates require an internet connection. Requiring internet access just to watch free, over-the-air television undermines one of broadcast TV’s core purposes, while adding cost and fragility.

The A3SA also describes itself as a “neutral, standards-based administrator.” From what I’ve seen, that neutrality is questionable. The group is made up of major broadcasters and has effectively decided which manufacturers can and can’t participate. SiliconDust’s HDHomeRun, a widely used network tuner, has been denied approval, while other devices with similar technical characteristics have been cleared.

Another theme running through the filings is piracy. Broadcasters cite tens of billions of dollars in losses and argue that encryption is necessary to protect their content. When you dig into the examples they reference, though, the picture changes. One high-profile piracy case they cite involved stealing encrypted signals from cable and satellite providers, not rebroadcasting free over-the-air signals.

Encryption, it appears, inconveniences only those who are viewing content lawfully – not the pirates.

Broadcasters also warn that without encryption they risk losing premium sports programming. Yet recent rights deals tell a different story. The NFL, NBA, MLB, NASCAR, and major college conferences have all committed to long-term agreements that keep marquee events on broadcast television for years to come. These deals were struck without any guarantee that over-the-air signals would be encrypted, which undercuts the argument that encryption is essential to retaining top-tier content.

The FCC has also raised questions in this filing round about consumer rights, particularly the long-standing right for consumers to record broadcasts at home for personal use. That right was established decades ago, but encryption complicates it. Circumventing DRM, even for lawful personal recording, can be illegal. The A3SA argues that internal rules already protect home recording, but those assurances are tied to current simulcasting requirements that may disappear. Once they do, the only remaining safeguards would be voluntary commitments from broadcasters whose financial incentives don’t necessarily align with consumer flexibility.

Underlying all of this is a business reality that the National Association of Broadcasters acknowledged more directly in its own filing. Encryption is about protecting retransmission fees, the charges cable and streaming providers pay to carry broadcast channels. Those fees have risen sharply over the years, and making free reception less convenient creates pressure to return to paid services. That strategy may make sense from an industry perspective, but it runs counter to the idea of broadcast spectrum as a public resource.

There’s also nothing in the current framework that limits encryption to a single system. The ATSC admits in its filing that multiple, incompatible schemes could emerge, adding yet another layer of confusion for viewers and device makers alike. At that point, the promise of ATSC 3.0 as a straightforward upgrade starts to look like something else entirely.

After reading the docket and dealing with a tuner that worked one day and failed the next, I’m left with the sense that encryption over the public airwaves is creating problems faster than it’s solving them. Broadcasters were granted access to spectrum at no cost, with the understanding that they would serve the public interest. Turning free television into a fragile, tightly controlled experience doesn’t seem consistent with that mission. I plan to file a reply in the FCC proceeding during the response window, and there’s more in these filings worth unpacking.

Stay tuned for more and see my full ATSC playlist here!

Nine Reviews in 24 Minutes – My Latest Amazon Tech Haul!

It took me six months, but I finally pulled together enough random gadgets for my next Amazon Gadget Haul “lightning round” of product reviews!

Check it out here!

This time I have nine different devices to check out! A majority came in free of charge from their manufacturers, but this is not a sponsored review nor has anyone reviewed or approved this video prior to uploading. All product links below are compensated affiliate links.

The first item I looked at was the Ostation 2 Pro, a battery charging system designed for AAA and AA cells. It accepts nickel metal hydride batteries as well as Ostation’s own rechargeable lithium options, which provide a full 1.5 volts for devices that expect alkaline batteries. Batteries drop into the top and the unit mechanically feeds them into the charging bays, displaying charge status on a small screen. Once charged, batteries are deposited into a drawer at the bottom, making it easy to grab fresh ones. It can only charge two AA and two AAA batteries at a time, which limits throughput, and it does make some motor noise while operating, but it functions as a kind of battery inventory system that keeps everything in one place. The Pro version also includes a magnetic charging pad for Ostation flashlights, though the display features themselves don’t add much beyond status information.

Staying on the theme of power, I then moved to desktop chargers. One was the Joyroom Podix, a 140-watt GaN charger with two retractable USB-C cables built into the unit. It’s fairly large, which makes it less ideal for travel, but convenient for a desk setup where cables often go missing. A small display shows total power draw, and while it comes with a base and strong adhesive option to keep it in place, that adhesive feels aggressive enough to warrant caution on finished surfaces.

I also tried Anker’s new 160-watt Prime charger, which packs three USB-C ports and a built-in display into a wall charger. What sets it apart is app integration over Bluetooth, allowing real-time monitoring of power delivery and per-port configuration, including priority modes and wattage limits. It doesn’t offer remote access over Wi-Fi, but standing near the charger you can see detailed data about what each device is drawing. The physical design holds more securely in the outlet than some older Anker models I’ve used, and it’s likely to replace my existing everyday charger.

From power to input devices, I next looked at the Retro Fighters Hunter 360 controllers. These are modern replacements for the Xbox 360 controller, complete with Hall effect analog sticks and mechanical D-pad switches. They work on PCs and with the Xbox 360 itself, though the console requires a dedicated 2.4 GHz dongle per controller; the dongle is necessary because these controllers won’t connect to the console’s built-in wireless controller system. Input lag was minimal on both wired and wireless connections, and the build quality felt solid. Voice chat isn’t supported through the controller, which is one limitation for anyone still using those features on original hardware.

Next up is a Thunderbolt 5 dock from WaveLink. With Thunderbolt 5 docks now priced similarly to Thunderbolt 4 models, it makes sense to consider the newer standard even if your current computer doesn’t fully support it. You’ll be ready to go when upgrading your hardware to a Thunderbolt 5 model. The increased bandwidth allows for more demanding multi-display setups, and the dock offers multiple Thunderbolt passthrough ports along with USB-A, SD card, and audio connections. Ethernet performance was unfortunately typical of what I’ve seen on similar docks, with slightly reduced downstream speeds on macOS despite having a 2.5-gigabit port. Like most docks in this class, it relies on a large external power supply to deliver up to 140 watts to a connected computer.

Audio came up next with the latest Amazon Echo Studio which I purchased with my own funds. It’s smaller than earlier versions but still produces a wide, bass-heavy sound that feels substantial for its size. Beyond audio, it now serves as an entry point to Amazon’s Alexa Plus features, which include more conversational responses and, more interestingly, the ability to create smart home routines using plain language. I was able to set up a lighting routine simply by asking for it, without navigating menus in the app. While the assistant tends to be more verbose than earlier versions, the routine creation alone could save time for people who struggle with smart home configuration.

Another device aimed at productivity was the Plaud Note Pro, an ultra-thin voice recorder designed to live on the back of a phone. It records phone calls or ambient audio, stores hours of recordings locally, and syncs them to a phone app. From there, recordings can be transcribed and processed into meeting notes using built-in AI templates. While the subscription model and upselling are hard to ignore, the hardware itself is compact and practical, and the all-in-one workflow may appeal to people who want transcription and summaries without juggling multiple tools or needing knowledge of AI prompt optimization.

The final item was more of a preview: the RØDECaster Video S. It’s a compact video switcher with multiple HDMI inputs, audio inputs including XLR, and the promise of features like NDI support. I didn’t review it in depth yet, but unboxing it gave a sense of how it might fit into lower-cost video production setups, especially compared to older switchers that haven’t seen updates in a while.

These haul videos don’t run on a fixed schedule—they happen when enough interesting items pile up—and this batch covered a wide range of everyday tech problems, from keeping batteries charged to simplifying workflows at a desk. You can check out prior editions here!