Abbott Lingo Review: Over-the-Counter Glucose Biosensor for Non-Diabetics

I like data. I spend a lot of time looking at analytics from my YouTube channel, telemetry from devices around my house, and usage stats from the services I rely on every day. What I don’t usually have access to is real-time data about what’s going on inside my own body. But at CES this year, I ran into Abbott and was provided with a two-week trial of their over-the-counter Lingo continuous glucose monitor for non-diabetics.

Check it out in my latest review!

The sensor sits on the back of my arm and sends blood sugar readings to my phone every few minutes. Abbott has long made glucose monitors for people managing diabetes, but Lingo is positioned differently. It’s aimed at people without a diabetes diagnosis who want more insight into how food, exercise, and daily habits affect their blood sugar.

The hardware itself was easier to live with than I expected. The applicator looks intimidating at first glance, but the actual installation was painless for me. I didn’t feel a prick or sting—just a click, and it was done. One small omission in my box was an alcohol wipe, so you’ll want to have one handy before applying it. Once attached, the sensor stayed firmly in place through showers and daily activity, to the point where I mostly forgot it was there.

After pairing it with my phone over Bluetooth, it took about an hour for the first reading to appear. From there, the app updates roughly every five minutes. The real value comes from seeing how those numbers change in response to everyday choices. Eat lunch, and you can watch the curve start to rise. Go for a walk, and you can see how even light exercise affects the slope and duration of that spike.

Lingo tries to make this approachable by translating glucose spikes into what it calls a “Lingo score.” The score reflects how high your blood sugar rises and how long it stays elevated. One evening, I had leftover sausage and pepper pizza. The resulting spike was sharp, dipped, then rose again as digestion continued, earning me a high score for that meal. Earlier in the day, a healthier and more protein-heavy turkey sandwich produced a much smaller, shorter-lived rise.

What surprised me most was how quickly this started influencing my behavior. Knowing that a certain food would likely generate a bigger spike made me think about timing—whether I could follow it with a walk—or whether it made more sense to choose something else. The app reinforces this by suggesting simple mitigations, like light exercise after eating, and by offering challenges focused on habits rather than calorie counting.

Logging matters too. The app isn’t asking you to obsess over nutrition labels, but it does want you to note when you eat, when you exercise, and even when you’re feeling stressed. If you use a smartwatch, some of that happens automatically. In my case, dog walks detected by my watch showed up in the app without any extra effort on my part.

There’s also an educational side, with recipes, short videos, and explanations designed to help you interpret what you’re seeing. The content feels more like guidance than instruction, which fits the overall tone of the product. This isn’t positioned as a medical device for diagnosis, but rather as a feedback tool. If you did see readings that looked concerning, that would be a conversation to have with a doctor.

Abbott sells Lingo as a two-week kit along with options for longer durations. I found that you can get enough data out of the device in two weeks, making a longer-duration purchase unnecessary. The data doesn’t disappear when the sensor comes off, and you can export it or keep it in your health app for reference later.

After a few days of use, I had a much clearer picture of how my body responds to foods I already thought I understood. That awareness alone was enough to start nudging my choices in a different direction. For something that measures just one variable, it ended up saying a lot about daily habits I don’t usually think twice about.

The RAM Crisis Explained: An Interview with Framework’s Nirav Patel

The price of memory is climbing, and it’s not just a problem for people building a new PC. RAM for laptops, desktops, phones, and tablets is getting more expensive as AI data centers absorb an increasing share of global supply. To better understand what’s happening behind the scenes, I called up Nirav Patel, CEO of PC maker Framework, to talk through how this shortage developed and what it means for consumers over the next several months.

Check out the interview in my latest video!

Patel described the current situation as a classic supply-and-demand imbalance, but on a scale the consumer market hasn’t seen before. Only a handful of companies—Micron, SK hynix, and Samsung—manufacture most of the world’s DRAM, and expanding capacity requires massive capital investment.

“What we’re seeing right now is just a massive excess of demand relative to the supply available,” Patel said.

With AI servers commanding higher margins, manufacturers are prioritizing those customers, leaving consumer products with tighter allocations. That imbalance has been building quietly for years, but it became much more visible when Micron announced it was shutting down its Crucial consumer memory brand last month. For PC builders, Crucial had long been a reliable option. Patel said the decision made sense given current conditions.

“When memory is in allocation, it doesn’t make sense to compete with your own customers,” he explained, noting that Micron supplies chips not only to large OEMs like Dell and HP, but also to other consumer memory brands.

One reason Framework has been able to navigate repeated supply disruptions—from pandemic shortages to GPU crunches and now memory—is its modular design philosophy. Patel credited flexibility as a survival tool.

“We built the product to be modular, and that gives us a lot of flexibility to navigate these kinds of environments,” he said.

Because many Framework systems are sold as DIY editions, customers can source their own memory and storage when shortages hit, sharing some of the burden rather than leaving the company entirely exposed.

The uncertainty isn’t limited to pricing. Patel described a market filled with overlapping orders, canceled allocations, and even hoarding.

“It is actually very unclear to anyone what the true ground truth is in the market when it comes to the supply and the demand,” he said.

Companies are placing duplicate orders with multiple suppliers, unsure which ones will be fulfilled. That behavior, he noted, can make shortages appear worse than they ultimately are, at least until reality catches up.

Geopolitics are also playing a role. Chinese memory maker CXMT has historically been avoided by many U.S. companies due to sanctions and long-term sourcing concerns, but Patel said that’s starting to change. “If you’re not sure where you’re going to be able to get your memory in two months, you better go and qualify every possible source,” he said, adding that some major OEMs are now testing and approving parts they previously wouldn’t have considered.

For consumers, the immediate concern is quality as prices rise and supply tightens. Patel’s advice was straightforward: stick with established brands. He doesn’t expect major manufacturers to compromise their reputations to chase short-term gains.

“Those brands are not going to torch all of their credibility in this short window of time,” he said, though he acknowledged that lesser-known vendors may try to take advantage of the situation.

While memory is the biggest constraint right now, Patel doesn’t believe every component will remain scarce long term. If memory remains the bottleneck, other parts like GPUs and storage should eventually stabilize because they can’t be deployed without sufficient RAM. In the near term, however, he expects continued volatility as the market works through excess orders and misaligned expectations.

Looking further ahead, Patel pushed back on the idea that soldered or unified memory is a solution to shortages. Even systems that place memory on the same package as the processor often rely on separately sourced components. For Framework, modular memory remains central to its roadmap, especially during periods like this. “Buy what you can afford today,” he said, “and buy solutions that let you upgrade in the future.”

Patel emphasized uncertainty as the defining market feature of the moment. AI demand has reshaped how memory is allocated, and the consumer market is now competing in a space it no longer dominates.

Curious about Framework? Check out my Framework videos here and my other interviews here!

More DOS Game Fun: Unlocking the Potential of GOG’s DRM-Free DOS Games

While the ExoDOS project serves as a comprehensive effort to preserve nearly every DOS game ever created, its massive 638-gigabyte archive can present a significant barrier to entry for users seeking just a few specific titles. For those interested in acquiring individual games without managing a huge download, the GOG platform offers a practical alternative.

In my latest retro video, I explore how the platform’s DRM-free policy allows users to extract game data from the default installation wrapper and migrate it to other environments, such as Linux, custom DOSBox configurations, or even original retro hardware!

To demonstrate this process, I picked the classic Wing Commander II, which is available on the platform bundled with the first game and expansion packs for approximately three dollars—a significant reduction from its original retail cost in 1990.

The extraction process varies slightly by operating system. On a Mac, rather than using the Galaxy client, I downloaded the offline backup game installer. After bypassing the standard security prompts and installing the legacy software, you’ll typically find the game files contained within the application package. By right-clicking the executable and selecting “Show Package Contents,” then navigating to the “Resources” and “Game” subfolders, users can locate the raw game data and executables. Moving them is as easy as copying them to a new directory or drive.
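As a rough illustration of that copy step, here’s a minimal shell sketch. The app name and internal folder names are assumptions based on the layout described above (your installer’s bundle and capitalization may differ), and the snippet simulates the bundle structure so the commands run end to end; with a real install you’d skip the simulation lines and point APP at the actual package.

```shell
# Hypothetical paths -- adjust to match your installer's actual bundle.
APP="./Wing Commander II.app"      # assumed install location
DEST="./wc2-extracted"             # where the raw DOS files will live

# Simulate the bundle layout so this sketch runs as-is; with a real
# GOG install, remove these two lines and set APP to the real .app.
mkdir -p "$APP/Contents/Resources/game/gamedat"
touch "$APP/Contents/Resources/game/WC2.EXE"

# The actual extraction: copy the game folder's contents out of the wrapper.
mkdir -p "$DEST"
cp -R "$APP/Contents/Resources/game/." "$DEST"

ls "$DEST"
```

Once copied, the files in `$DEST` are ordinary DOS-era data with no dependency on GOG’s wrapper.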

I copied these files to a separate directory to test them with Boxer, an open-source DOSBox port for macOS that has been forked for compatibility with Apple Silicon. One functional advantage of this manual extraction is the ability to enable features not active in the default wrapper. In the case of Wing Commander II, the default installation uses Sound Blaster audio; migrating the files allowed me to configure Boxer to support the Roland MT-32 soundtrack.
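For reference, the MT-32 change in a plain DOSBox-style setup looks something like the fragment below. These keys come from DOSBox Staging’s configuration format; Boxer exposes similar options through its own UI, so treat the exact section and key names as an assumption rather than Boxer’s internal format. MT-32 emulation also requires ROM files that are not distributed with the emulator.

```ini
; DOSBox Staging-style config (assumed; Boxer's own settings may differ).
[midi]
mididevice = mt32

[mt32]
; Directory containing the MT-32 control/PCM ROMs you supply yourself.
romdir = /path/to/mt32-roms
```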

The procedure on Windows is equally straightforward. After running the offline installer, the necessary game data—specifically Wing Commander II’s “gamedat” folder and root files—can be found directly in the installation directory, usually located on the C drive. While the folder may contain modern cloud save data or platform-specific wrappers, these are not required for the game to function in other environments. Just like the Mac version, the game files can simply be copied out of the installation directory.

To verify the portability of these DRM-free files, I transferred the extracted Wing Commander II data onto a Compact Flash adapter and loaded it into my 26-year-old college laptop running Windows 98. This test confirms that the software sold through GOG remains independent of the delivery mechanism, granting users the flexibility to execute the code on the hardware or emulator of their choice. It’s a rare example of true digital ownership!

BuzzTV Powerstation P6 Review: It’s Not an Nvidia Shield

I review a lot of TV streaming boxes, and for enthusiasts the Nvidia Shield has long been the reference point. It has been around since 2015 and remains a capable device for people running their own media servers, with support for full 4K Blu-ray rips, Dolby Vision, and lossless audio formats. It also shares its core hardware lineage with the Nintendo Switch, which gives it enough performance headroom for gaming and emulation.

While attending CES, I came across a company called BuzzTV showing a device called the Power Station 6 that they said was more powerful than the Shield. I decided to purchase one to see how it performed and whether it could serve as a realistic alternative. Spoiler alert: it can’t.

Check it out in my latest review!

The model I chose was the least expensive version, which includes 8 GB of RAM and sells for just under $300. There are higher-end variants with 16 GB and even 32 GB of DDR5 RAM, with the most expensive version priced at around $500. At that level, it starts competing directly with compact Ryzen-based mini PCs, which generally offer more flexibility and stronger overall performance for similar money.

All versions of the Power Station 6 use the same Rockchip RK3588 processor. Storage on the base and mid-tier models is 128 GB, while the highest-end version includes 256 GB. There is also an SD card slot and an internal bay for an NVMe SSD, allowing for quiet, solid-state expansion. Physically, the unit looks appealing but feels lightweight and somewhat hollow. The port selection is reasonable, with USB 3.0, USB 2.0, USB-C, optical audio out, gigabit Ethernet, and HDMI output rated for up to 8K. In practice, however, its usefulness as a home theater device quickly runs into limitations.

In my testing, this was not a strong Plex client. When I opened the box, I was greeted with a warning that the Powerstation 6 should not be plugged into a home theater receiver, with no explanation as to why.

Dolby Vision was not supported, and lossless audio passthrough to my receiver did not work. While it can output 4K and 8K video, the lack of video and audio passthrough features means it doesn’t support the enthusiast-grade playback experience that the Shield is known for. That was disappointing given the price category this device occupies.

Performance is one of the few areas where the Power Station 6 shows some promise. In the 3DMark Wildlife benchmark, it slightly outperformed the Nvidia Shield, though not by a wide margin. That extra headroom shows up in emulation. GameCube titles like Wave Race using the Dolphin emulator ran at full speed in my testing. PlayStation 2 emulation was more mixed. Using NetherSX2 at minimum settings, demanding games like Burnout Revenge struggled to maintain full speed when there was a lot happening on screen. At this price point, a mini PC generally handles this workload better.

The software experience reinforces that concern. This is not an officially certified Android TV or Google TV device. BuzzTV uses its own interface, and while the Google Play Store is present, many mainstream streaming apps either cannot be installed or do not function properly. Disney+ would not play content at 4K and repeatedly errored out after only a minute or two of playback. Netflix was available only in a tablet-style version, with the TV version failing to launch entirely. Features like Dolby Vision, Dolby Atmos, and reliable HDR support were absent across these apps.

Although the device reports Widevine L1 certification, which should allow for high-resolution HDR streaming, real-world results did not reflect that capability. Compounding this is an outdated security posture. The box runs Android 13, but its most recent security patch dates back to August 2023. Google Play Protect was disabled by default, and there are numerous preinstalled apps of unclear origin. Taken together, this raises both usability and security concerns.

One area where BuzzTV clearly invested effort is the remote control. It feels solid, is backlit, and avoids the advertising buttons common on many streaming remotes. The programmable color buttons and the accompanying configuration app are well executed, and HDMI-CEC controls are easy to access. The remote ends up being the best-designed part of the product, even though it cannot compensate for the broader software and compatibility issues.

After spending time with the Power Station 6, what stood out most was how poorly integrated the overall experience felt. The interface itself is not cluttered with ads, but many of the things enthusiasts expect simply do not work. Between limited app compatibility, missing audio and video features, outdated security updates, and unusual hardware restrictions, the device falls short of what its high pricing suggests. There is some performance potential here, but in its current form, it is difficult to justify as an enthusiast-grade streaming box, especially when more capable and flexible mini PCs exist at similar prices.

My First Cord Cutting / ATSC 3 Update of 2026

In the days leading up to the CES show and throughout the week in Las Vegas, several cord cutting news items related to the ATSC 3.0 over the air TV standard were announced. In my latest video, I provide a more in-depth overview of these developments that I touched on briefly during my CES Dispatch series.

Watch the update here!

As a recap, a central issue remains DRM encryption over the new ATSC 3.0 broadcast standard. Broadcasters are pushing to lock down over-the-air signals, limiting how viewers can receive and use content that has traditionally been freely accessible. While they say this is to prevent piracy, the real outcome is that it pushes consumers toward expensive cable and streaming plans to keep the recording and time-shifting features they enjoy today.

This matters because retransmission fees charged by broadcasters continue to rise sharply year after year. In my area, the Broadcast TV fee is now $48.30 per month – and that’s before other cable charges. Even the most basic cable subscription will now cost consumers more than $60 monthly. Of course, using an antenna to receive television is completely free.

Shortly after I began asking viewers to download and share their Comcast rate cards, Comcast removed the broadcast TV fee line item from their published rates entirely. The company says this was done to simplify pricing, but the effect is reduced transparency. The costs haven’t disappeared; they’ve simply been folded into higher base prices.

At CES, Pearl TV announced what it described as an affordable ATSC 3.0 converter box program. This is positioned as a way to lower the barrier to entry for consumers and manufacturers, but it closely resembles a similar failed effort announced in 2022 that never materialized.

The root cause of Pearl’s consumer adoption troubles hasn’t changed. Encryption and certification requirements add cost and complexity in a market that is already small. Even the proposed “affordable” devices are expected to cost under sixty dollars, still roughly double the price of many ATSC 1.0 tuners (compensated affiliate link) that include DVR functionality.

The certification process itself remains a problem. Pearl TV and the A3SA encryption body are private entities made up of the same major broadcasters, effectively controlling which devices are allowed to receive encrypted signals and ultimately be sold to consumers. This introduces a layer of private regulation on top of what has traditionally been governed by FCC standards alone.

Another announcement hinted at some movement on gateway devices, which take an antenna signal and distribute it across a home network. After years of delays, A3SA says encrypted gateway functionality is now working on a limited number of products, including the ZapperBox and an upcoming ADTH device. These solutions, however, are expensive and tightly constrained. ZapperBox requires multiple expensive proprietary devices for multi-TV households, and the ADTH approach is limited to Android and Fire TV platforms, excluding market leader Roku.

Visiting the ATSC booth at the CES show made it clear how confusing this ecosystem has become. Devices carried different combinations of NextGen TV and A3SA certifications, each with different implications for compatibility and functionality. By contrast, current ATSC 1.0 devices work simply because they can receive the signal, without needing approval from a private consortium.

There may be signs of easing tensions. An interview with SiliconDust CEO Nick Kelsey suggested that support for encrypted ATSC 3.0 signals could eventually come to HDHomeRun devices without additional hardware. That would be a meaningful shift, though it still leaves unanswered questions about support on non-Android platforms and the broader role of DRM on public airwaves.

FCC Chairman Brendan Carr addressed these issues during a CES discussion, emphasizing the public interest obligations tied to broadcast licenses. He noted that broadcasters unwilling to meet those obligations have other distribution options, from cable to online platforms, and raised the possibility of revisiting how spectrum is allocated if public interest standards are not upheld. Those comments echo questions raised by the FCC in its current ATSC 3.0 docket, particularly around whether encryption serves consumers or primarily protects broadcaster revenue.

That docket remains open for public comment, with additional opportunities to respond once broadcasters file their answers. The outcome is still uncertain, but it’s clear the FCC has heard our concerns and is waiting for the broadcasters to make their case as to why restricting access to the public airwaves will better serve the public.

CES 2026 Is a Wrap!

I’m back from a whirlwind trip to Las Vegas for CES 2026! Like last year, I managed to crank out four dispatch videos along with a bonus episode that I posted last night.

You can see the full playlist here.

Like the last couple of years, this show felt very incremental as far as new innovation was concerned. Robotics were plentiful, but very few were useful beyond doing some visually impressive demos. For example, this robot struggled with the relatively simple task of folding laundry.

Another theme was concern over memory pricing. Deep-pocketed AI companies are gobbling up all of the silicon they can find, which is dramatically increasing prices for consumers and manufacturers alike. Micron recently shuttered its 30-year-old Crucial consumer memory line. Every company I spoke with, large and small, is very concerned about this issue, and some note that the worst has yet to be felt by consumers. What’s worse, there is no end in sight.

This year’s coverage wasn’t sponsored. Everything you saw from CES this year was made possible by all of you! That includes everyone who watched and subscribed, along with those of you who have contributed to the channel. That support mattered more than ever, because CES is expensive, time-consuming, and physically demanding, especially when you’re doing it solo.

By my math, CES 2026 coverage cost me around $2,500. That includes flights, hotel, food, and a lot of Uber and Lyft rides around Las Vegas. I stay at one of the lower-cost CES hotels and take advantage of the free CES shuttle when I can, but there are still plenty of events and off-site locations that require Uber or Lyft. Going alone keeps those costs manageable, and between ad revenue, affiliate links, and viewer support, I should roughly break even. Bringing additional people would change that equation dramatically, which is why many larger channels rely on sponsorships.

Sponsorships are a tricky subject at CES. I’m not opposed to them in principle, but most of the offers aimed at smaller creators come with editorial strings attached—covering specific booths, submitting footage for approval, or spending time on things that aren’t particularly interesting to viewers. I’d consider a sponsor if it genuinely improved the quality of the reporting without compromising independence, but those opportunities are rare.

From a workflow standpoint, my goal at CES is all about quick hits and efficiency. I shoot everything live, on location with my iPhone, and try to keep editing to an absolute minimum so I can get videos out quickly without staying up all night in a hotel room. That approach drives nearly every decision I make, from how I shoot to what gear I carry.

Audio is the biggest factor. The show floor is loud—often overwhelmingly so—and clean sound matters more than anything else. That’s why I stick with a handheld microphone, even though it’s not fashionable and some viewers wish my hands were free. There’s a reason TV news crews still use stick mics: they work. They isolate voices naturally, and they save time in post. For my purposes, reliability beats aesthetics every time.

The rest of the setup is intentionally simple. Fewer components mean fewer things to forget, lose, or troubleshoot while running between halls and events. The phone-based workflow continues to hold up well, and storage hasn’t been an issue on my 256GB iPhone.

This year also reinforced how much planning matters. When I was deliberate about which areas and events to cover, the reporting felt stronger and more focused. When I wandered without a plan, the results were mixed. Viewer suggestions were incredibly helpful, but relying on comment threads alone made it easy to miss good ideas. For next year, I’m planning to set up a more structured way to collect booth and topic suggestions from viewers so I can reference them quickly on the floor.

I also want to do more advance research before arriving—especially on the larger exhibit halls of the show. CES rewards preparation, and the better the groundwork, the more efficient the days become once you’re there.

As always, feedback is welcome. Hearing what worked and what didn’t helps shape how I approach future coverage. And if you’re new here, stick around—CES may be a big moment each year, but the rest of the calendar is filled with reviews, nerdy experiments, and tech commentary the same way it’s always been.

CES 2026 Dispatches Have Begun!

I’m at CES 2026 for the start of my latest Dispatch series, beginning with CES Unveiled, one of the early showcase events ahead of the main show floor. These Unveiled events are dense and efficient, with dozens of companies packed into a single room, which makes them well suited to fast-paced coverage.

This year’s Dispatch videos are intentionally lightweight in production, just me, an iPhone, and a backpack, with no sponsors influencing what gets covered.

Check out my first dispatch here!

Walking the floor, I ran into a wide mix of products, ranging from practical home tech to more experimental ideas. There were new security cameras and smart displays tied into network-attached storage systems, emphasizing local recording and the absence of subscription fees, though often requiring higher-end hardware. Music-focused gadgets showed up as well, including a second-generation “Guitar Hero”-style instrument designed less around learning technique and more around casual, stress-free music creation, with no musical experience required.

Mini PCs continue to evolve, and one of the more interesting concepts was a modular system that can dock into different enclosures, including a GPU base or a portable laptop shell, depending on how and where it’s being used. On the more whimsical side, I also came across a water-based drone meant for pools, capable of lighting effects and coordinated movement, clearly aimed at a niche (and wealthy) audience but at least something new and different.

Battery charging solutions were also on display, including systems that remove most of the decision-making by automatically handling orientation, charge state, and battery type.

One standout from my first night was a compact, portable ice maker that cranks out ice cubes in five minutes. Other devices targeted professional and industrial users: a new FLIR thermal imaging device running Android positioned itself as a full workflow platform rather than a single-purpose camera, with industry-specific apps and built-in collaboration tools.

There were also familiar brands making a return. Pebble watches are back, focusing on long battery life and simple notifications rather than health tracking, alongside a new recording ring concept that lets users capture voice notes. In the outdoor automation category, autonomous yard robots continue to expand beyond mowing, with attachments for leaf collection and snow plowing, including models designed to be more affordable than earlier versions.

My dispatches are meant to be a snapshot rather than a deep dive, showcasing the range of ideas at the show. More Dispatches are coming as the show continues so be sure to subscribe to my YouTube channel and follow my CES 2026 playlist here!

Plex Q&A: Optimizing Video vs. Live Transcoding (sponsored post)

I took a look at a Plex feature that has been around for a while but hasn’t come up much in my previous coverage: media optimization. The idea is straightforward: instead of relying on your Plex server to transcode video on the fly every time someone watches something on a limited connection, you can have the server create alternate, smaller versions of your media in advance. Those optimized files sit alongside the original and can be played back directly, reducing the load on the server.

My latest video dives into this feature here.

In normal use, Plex’s transcoder does a good job adapting high-bitrate files for bandwidth-challenged connections, but that assumes the server has enough CPU or hardware acceleration available at the moment someone presses play. If it doesn’t, or if multiple users are competing for those resources, performance can suffer. Optimization shifts that work to the background, letting the server do the heavy lifting ahead of time rather than in real time.

The feature lives in the Plex web app. From the menu attached to a movie or episode, there’s an option to optimize the media. You can also access it from the title’s detail page. When you start an optimization job, Plex lets you give the optimized version a custom name and choose a quality preset. The built-in options are aimed at common use cases like mobile viewing or TV playback, with defined resolutions and bitrates, but there’s also a custom mode if you want more control.

Custom optimization profiles allow you to pick specific resolutions and bitrates, ranging from 1080p down to very low-bandwidth 480p options. These profiles can be named and reused, which makes it easier to target particular devices or scenarios. Once an optimization job is started, the server processes it in the background and uses hardware video encoding if it’s available, which can significantly speed things up.
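Conceptually, what the optimizer produces is similar to pre-transcoding a file yourself. The sketch below is not Plex’s actual command (Plex runs its own transcoder internally); it just illustrates the idea of creating a smaller 720p, roughly 2 Mbps version of a file in advance with ffmpeg. The first command generates a short synthetic test clip so the example runs without real media; with your own library, you’d substitute an actual file for original.mkv.

```shell
# Illustration only -- Plex's optimizer uses its own internal transcoder.
# Generate a short 1080p sample clip so this sketch runs end to end.
ffmpeg -y -f lavfi -i testsrc2=duration=2:size=1920x1080:rate=24 \
  -c:v libx264 original.mkv

# Pre-create a smaller "optimized" version: 720p at a capped ~2 Mbps.
ffmpeg -y -i original.mkv \
  -vf scale=-2:720 \
  -c:v libx264 -b:v 2M -maxrate 2M -bufsize 4M \
  -c:a aac -b:a 128k \
  optimized-720p.mp4
```

The trade-off is the same one Plex’s feature makes: extra disk space for the alternate file in exchange for little or no CPU work at playback time.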

There are also some useful controls when optimizing TV series. You can limit optimization to unwatched episodes and cap the number of optimized files Plex keeps around. As episodes are watched, Plex can automatically delete the older optimized versions and generate new ones for upcoming episodes. That adds a layer of automation that helps keep storage usage in check.

When it comes time to watch something, Plex exposes the different versions through a “watch version” option. From there, you can explicitly choose between the original file and any optimized versions that exist.

Management of these files is centralized in the server settings under optimized versions. From that screen, you can see what’s currently being processed, what’s finished, and delete optimized media you no longer need. There’s also the ability to reorder optimization jobs if you want one item to be processed before others in the queue.

One additional setting ties optimization to the transcoder configuration. If HEVC optimization is enabled, Plex can create optimized files using HEVC where possible, which can deliver better quality at lower bitrates. That can be especially useful if storage and bandwidth efficiency is a priority.

Overall, optimization feels most useful for servers with limited processing power, older hardware, or libraries full of high-bitrate content that’s frequently accessed remotely. By preparing alternate versions in advance, more users can enjoy content simultaneously.

Disclosure: This video was a paid sponsorship by Plex. They did not review or approve this content prior to uploading and all opinions are my own.

GMKTec M8 Mini PC Review

My latest mini PC review takes a look at the GMKtec M8, a mid-range mini PC that sits comfortably between entry-level systems and higher-end compact desktops. It’s built around AMD’s older Ryzen 5 6650H, a six-core, twelve-thread processor, paired with 16 GB of DDR5 memory and a 512 GB NVMe SSD. On paper, it’s not cutting-edge hardware, but in practice it feels capable enough for most everyday workloads without calling too much attention to its limitations.

Check it out in my latest video review.

What stood out immediately was the port selection, especially given the price point, which was quite reasonable when I recorded the video. You can see current pricing over at Amazon (compensated affiliate link).

On the front, GMKtec includes both an OCuLink port and a 40 Gbit-per-second Thunderbolt-compatible USB4 port. OCuLink is still relatively uncommon on systems in this price category, but it opens the door to directly attaching PCI Express devices like external GPUs with less overhead than Thunderbolt.

The USB4 port performed as expected when I tested it with an external Thunderbolt SSD, delivering transfer speeds consistent with a full-bandwidth implementation. Alongside those are two USB-A ports, a combined headphone and microphone jack, and the power button. Around back, there’s another mix of USB ports, HDMI and DisplayPort outputs, and dual 2.5-gigabit Ethernet connections.

Display support was solid in my testing. While GMKtec advertises support for up to three 8K displays, I don’t have an 8K panel on hand. With multiple 4K displays connected, everything worked as expected through HDMI, DisplayPort, and the USB4 port via a dongle. Networking performance was also better than I usually see on small PCs. Both Ethernet ports hit their rated speeds, and the Wi-Fi 6E adapter delivered strong throughput, including upstream speeds that cleared a gigabit on my network.

Internally, there are some tradeoffs. The 16 GB of DDR5 memory is soldered, so RAM upgrades aren’t an option. Storage, however, is more flexible. After unscrewing the rubber feet and opening the case, I was able to access the Wi-Fi card and space for two NVMe drives, which makes dual-boot setups feasible.

Out of the box, the system ships with Windows 11 Pro pre-installed. The operating system comes activated with a proper license.

For basic use, the M8 behaved the way I’d expect a six-core Ryzen 6000 series processor to behave. Web browsing at 4K60 felt responsive, with smooth scrolling and no obvious slowdowns. Media playback was similarly uneventful in a good way, with only the occasional dropped frame during YouTube 4K60 playback; nothing I would have noticed without looking for it with the “stats for nerds” diagnostics screen enabled. Benchmark testing put it in line with other systems in this price range with similar processors.

Light video editing was workable as well. Simple 4K timelines with basic transitions played back reasonably smoothly, though this is not the kind of machine I’d recommend for heavy editing without adding an external GPU. That option is there, though, and connecting one through OCuLink or USB4 would dramatically change what the system is capable of doing.

Gaming is where expectations need to be managed. Modern, demanding titles like Cyberpunk 2077 are playable, but only at low settings. At 1080p, performance hovered around 30 frames per second, with better results at 720p, where frame rates climbed into the mid-40s and occasionally higher in less complex scenes. In that sense, the experience reminded me a bit of a Steam Deck connected to a monitor. Emulation, on the other hand, was a strong point. PlayStation 2 emulation at native resolution ran at full speed, and older systems performed without issue.

Thermally, the system held up well under sustained load, passing stress tests without significant throttling. The fan is audible in performance mode, which runs the processor at its full 40-watt envelope, but it’s not among the loudest mini PCs I’ve tested. BIOS options allow you to dial things back with balanced and quiet modes if noise is a concern, trading off some performance in exchange for lower fan activity.

I also spent some time with Linux, booting a recent Ubuntu release. Hardware detection was smooth across the board, including Wi-Fi, Bluetooth, audio, and networking, which suggests the M8 would be a comfortable choice for Linux users or anyone planning a dual-boot setup.

Taken as a whole, the GMKtec M8 feels like a system built around practical choices. You give up upgradeable memory, but for a reasonable price you get unusually fast I/O for the class, solid networking, and performance that’s adequate for everything from everyday computing to light creative work and emulation.

See all of my Mini PC reviews here.

Disclosure: GMKtec sent the Mini PC to the channel free of charge. However, they did not review or approve this content prior to publication, no other compensation was received, and all opinions are my own.

Holiday Retro : The eXoDOS and eXoWin9x Projects Seek to Preserve 80s and 90s PC Gaming in a Single Collection

Every year around Christmas I try to find a piece of retro technology to feature on the channel, and this time I landed on something for fans of 80s and 90s PC games. The eXoDOS project is an attempt to make nearly the entire history of DOS gaming accessible with a single click. With that project largely done, the group is now focusing on the Windows 95/98 era with eXoWin9x.

In my latest retro video I take a look at both running on a lower-end mini PC.

Similar to projects like EmuDeck, the eXo project ships its entire library of games preconfigured and ready to run, usually with just a single click. Scripts for each game select the best emulator (either DOSBox or 86Box) along with the best settings for optimal performance.

eXoDOS is downloaded from the Retro-Exo site and can be installed either as a massive full archive or as a much smaller “lite” version that pulls down individual games on demand. The full collection weighs in at well over 600 GB, but the lighter option lets games download as you play them, after which they stay local. Setup is handled through a batch file, and once installation finishes everything runs through LaunchBox. The result is a browsable library of roughly 7,600 DOS games, searchable by title, publisher, or hardware features. This is mostly a Windows-centric project, although there are some patches to get it working on Linux.

To see how well this works on modest hardware, I ran everything on a midrange mini PC with a Ryzen 6650H processor and 16 GB of RAM (compensated affiliate link). That turned out to be more than sufficient, even for titles that originally required specialized hardware.

One example is Wing Commander II, which in this setup includes the CD-ROM edition with speech and Roland MT-32 audio. Selecting the MT-32 option recreates a sound experience that was out of reach for many players in the early 1990s, when the Roland synth hardware was expensive and uncommon. For those leaning into the nostalgia, Sound Blaster FM synthesis is also an option.

What stood out immediately is how quickly these games launch. Game controllers work out of the box, manuals are included as PDFs, and supplemental materials like box art and disk images are bundled alongside the games.

The archive also functions as a memory jogger. Games that are half-remembered from BBS downloads or shareware disks tend to be here, including titles like Night Raid, a Paratrooper-style game that circulated widely on BBSes in the early 1990s. For adventure fans, the collection includes both floppy and CD-ROM versions of games like Space Quest IV and many others from Sierra and LucasArts.

eXoDOS also organizes games by technical capabilities, including a playlist of DOS titles that supported early 3dfx Glide 3D acceleration. Running something like Battle Arena Toshinden with emulated 3dfx support shows how well these setups scale, even if performance varies slightly depending on host hardware and settings. The important part is that the environment detects and configures the right components automatically.

Alongside eXoDOS is the newer eXoWin9x project, which applies the same philosophy to Windows 95 and Windows 98 games. These titles run inside carefully optimized virtual machines that avoid duplicating full Windows installations for every game. Instead, system changes are swapped in as needed, saving space and simplifying management. At the moment the collection covers games from 1995 and 1996, with more planned for the future.

Running Windows-era games like Beavis and Butt-Head: Virtual Stupidity or Wing Commander IV highlights how much effort has gone into preservation beyond just making the games start. Virtual CD-ROMs are fully browsable, bonus videos are intact, and even obscure developer easter eggs remain accessible. Different emulators are used depending on what a game needs, and the system quietly selects the appropriate one.

Downloading these projects can only be done over BitTorrent given the huge file sizes involved with each. But once it’s done, it’s done.

What ties all of this together is the focus on removing friction. These projects prioritize playing over configuring, while still preserving the original context of the software. Instead of reconstructing old setups from memory, the experience becomes as simple as browsing, clicking, and playing.

See more retro videos here!

Texas Files Suit Against Smart TV Makers Over Spying Features

Back in October, I started a video series looking at “Automatic Content Recognition” (ACR) which is a technology that modern smart televisions use to collect data about what people are watching. The televisions take actual visual and audio snapshots of what is on the screen several times a second.

In my latest video on this topic, I look at a new set of lawsuits filed by the Texas Attorney General against five major TV manufacturers—Sony, Samsung, LG, Hisense, and TCL—over their use of this technology. The Texas AG even scored an early victory, requiring Hisense to turn off their ACR systems for Texas residents.

The lawsuits were all filed in a Texas state court, which means any outcomes would apply only to Texas residents. Still, they offer a detailed look at how this technology works and how aggressively it is being deployed.

The frequency of ACR sampling varies by manufacturer and model, but in some cases the sampling happens multiple times per second. That information is converted into a digital fingerprint and sent to a remote server, where it is matched against a massive database of media. Once identified, the viewing data can be sold to advertisers and data brokers.
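To make the fingerprint-and-match step concrete, here is a toy average-hash sketch in Python. This is a generic perceptual-hashing technique used purely for illustration; the actual ACR algorithms these manufacturers use are proprietary and not spelled out in the filings. The idea is that a compact hash can still identify content even though no raw image leaves the device:

```python
# Toy illustration of content fingerprinting (NOT any vendor's actual algorithm):
# reduce a frame to a compact bit pattern, then match by Hamming distance.

def frame_fingerprint(pixels: list[list[int]]) -> int:
    """Average-hash a grayscale frame: one bit per pixel vs. the frame's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

frame = [[10, 200], [220, 30]]   # a tiny 2x2 "screen capture"
noisy = [[12, 198], [219, 28]]   # the same frame with slight noise
other = [[200, 10], [30, 220]]   # a different image

fp = frame_fingerprint(frame)
print(hamming(fp, frame_fingerprint(noisy)))  # 0 — matches despite the noise
print(hamming(fp, frame_fingerprint(other)))  # 4 — clearly different content
```

A server holding fingerprints for a large media catalog can match an incoming hash in this way, which is why transmitting "only fingerprints" still reveals exactly what is on screen.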

As I noted in my earlier videos, ACR can also analyze anything coming into the television through HDMI, including cable boxes, streaming devices, and video game consoles. The lawsuits allege that video games are a big area of data collection for TV manufacturers and data brokers, which raises questions about whether they are illegally capturing data from children under the age of 13.

Marketing materials cited by the Texas Attorney General in the lawsuits suggest that some companies use this data to track users across devices and platforms, following them from their TV screens to social media sites and other parts of the internet. In one example cited in the filings, LG is accused of collecting screen data as frequently as every 10 milliseconds and building detailed consumer profiles that may include political interests, religious viewing habits, and other personal characteristics.

Another major issue raised in the lawsuits is informed consent when users are asked to opt into these features. While most smart TVs technically require users to opt in before data collection begins, the Attorney General argues that the process is designed to push users toward agreement. Opting in is often a single click, while opting out can require navigating dozens of menu options spread across multiple screens. In some cases, declining data collection disables core smart TV features altogether, effectively forcing users to choose between privacy and functionality. Screens shown in the lawsuits for brands like TCL and Hisense often lack a clear “disagree” option, while others rely on confusing layouts that make refusal unintuitive.

Samsung is also accused of misleading customers by claiming it does not collect video or screen content. The state argues that even if the company only transmits hashed fingerprints rather than raw images, the end result is the same because the system can still identify exactly what is being watched.

The Attorney General is seeking jury trials, civil penalties of up to $10,000 per violation, and additional restraining orders against the manufacturers. Beyond the legal details, the lawsuits highlight how valuable viewing data has become. It helps explain how large televisions can be sold so cheaply: the hardware is often subsidized by ongoing data collection.

For viewers who are concerned about this practice, the most reliable option remains disconnecting the television from the internet entirely and using an external streaming device like an Apple TV, which has stronger privacy controls. Even then, avoiding tracking altogether is difficult, but these cases shed light on just how much data smart TVs can collect—and how little most users may realize about what is happening in the background.

I’ll continue watching how these lawsuits develop, since they may signal whether other states are willing to challenge an industry practice that has largely operated out of public view until now.

The QMTech MiSTer Clone is Affordable and Available!

The latest video in my MiSTer series features a system from QMTech, a fully assembled clone that, at the moment, is actually available to buy over at AliExpress (not an affiliate link). Given how difficult it has been to find MiSTer hardware in stock over the past few years, I was curious to see how this one would stack up, especially at its asking price. So I ordered one to find out!

Check it out in my latest review!

For anyone unfamiliar with MiSTer, the appeal lies in how FPGA hardware recreates the original logic of classic consoles and computers. Instead of translating software instructions the way an emulator does, the FPGA is reconfigured to behave like the original hardware itself. That approach is particularly valuable for complex systems such as the Sega Saturn or Sega 32X, where multiple processors need to operate in parallel with precise timings. The result is very low input latency and timing behavior that closely matches the original machines, whether the output is going to a modern flat panel or a CRT-based display.

The QMTech unit is based around a cloned DE10-Nano FPGA board with an integrated heatsink and fan, paired with a custom analog I/O board. From a compatibility standpoint, it behaves like any other standard MiSTer setup, with full support for the existing ecosystem of cores and tools. In day-to-day use, it feels no different from other MiSTer systems I’ve tested, including the MiSTer Pi I looked at last year. QMTech sells two versions of the device, one priced lower for U.S. buyers and a higher-priced option for international customers, but both arrive fully built rather than as kits.

Physically, the QMTech system is straightforward. The built-in USB hub provides four ports, which is fewer than some other MiSTer builds, though adding an external hub is an easy workaround. Ethernet is included, but there is no onboard Wi-Fi or Bluetooth, so wireless connectivity requires a USB adapter for updates. A SNAC port is present for connecting original controllers directly to supported cores, which allowed me to use a Nintendo Zapper with the NES core on a CRT just as I would on original hardware.

The unit ships with a 32 GB SD card with a basic MiSTer installation already in place. As with other preconfigured systems, some additional setup is still required to get everything working the way I prefer, including running updater scripts and making configuration tweaks. Since the hardware is fully MiSTer-compatible, the setup process is identical to other systems and well documented elsewhere.

On the back, the system offers HDMI for modern displays and analog video output suitable for VGA monitors or CRT televisions with component RGB inputs. With the appropriate cable, the analog output delivers a clean signal that looks amazing on a late-model CRT. Audio for CRT televisions and monitors is available via analog or optical output. The system powers on immediately when plugged in, as there is no physical power switch.

To see how the hardware handled more demanding workloads, I spent time running several cores known to stress the MiSTer platform. Arcade titles like Street Fighter Alpha 3 ran without issue, even after extended periods in attract mode, suggesting that both cooling and memory stability were solid. Switching between cores was quick, and the system handled rapid transitions from late-1990s arcade hardware to mid-1980s home computers without complaint.

I also tested computer and console cores that are often used as benchmarks for system stability. Amiga demos and games ran cleanly, Neo Geo titles like King of Fighters 2003 loaded and played as expected, and Sega Saturn games such as Daytona USA worked within the known limitations of the MiSTer’s memory configuration. The Nintendo 64 core, which has matured significantly, performed well across the titles I tried, and other complex systems like the Sega 32X behaved correctly without any issues.

Even cartridge-based games with custom chips, such as Star Fox on the Super Nintendo, ran properly, demonstrating that the necessary co-processors were being accurately reproduced. At the other end of the spectrum, earlier consoles like the Atari 2600 and ColecoVision also worked as expected, complete with the quirks of their original control schemes.

To round things out, I ran memory stress tests at 130 MHz for an extended period and saw no errors. While the RAM could be overclocked slightly, there was no real benefit in doing so, as none of the existing cores require more than the standard operating speed.

After spending time with it, the QMTech MiSTer left me with the impression that it is a competent and well-executed implementation of the platform. It handled everything I threw at it, stayed cool, and ran quietly in the process even with its tiny on-board fan. For a device that is currently available to purchase at a relatively accessible price point, that combination is noteworthy, especially in a market where MiSTer hardware is often difficult to find at all.

2025 Year in Review and 2026 Channel Plans

As the year winds down, I wanted to take a moment to look at where things on the channel stand and where they’re headed next.

You can check it out in my latest video.

Viewership climbed to just under nine million views, up from about 7.8 million last year, and subscriber growth also ticked up. That tells me something is working, even if it’s not always obvious what that something is. I still approach the channel as a generalist, largely because that’s how it started more than a decade ago, but the platform and its audience have changed. Many subscribers don’t see every upload anymore, which remains a point of frustration I have with the platform.

To make sure people don’t miss uploads, I’ve leaned more on the weekly email newsletter, the daily digest, and the blog, which serves as an archive and an alternative way to follow along. If you’re reading this you likely know about these already!

One area that stood out this year was audience retention. Regular viewers now make up about ten percent of the audience, a sharp increase from last year, when they were under one percent. The audience itself continues to skew older, which reflects how much YouTube has evolved. While shorter videos tend to attract younger viewers, long-form content remains where I spend most of my time, both as a creator and a viewer.

In total, I uploaded 175 long-form videos this year, along with a smaller number of shorts and live streams. Live content slowed down compared to previous years, partly due to scheduling, but it’s something I plan to revisit, especially around events like CES. Looking at what resonated most with viewers, it was clear that topical and consumer-focused videos outperformed traditional product reviews. Issues that directly affect viewers, like privacy concerns and our ongoing ATSC 3.0 DRM fight, drew the most attention.

That shift has influenced how I’m thinking about the year ahead. I’ve started rebuilding the Gadget Picks channel, now focused on smaller gadget reviews that may not find a large audience on the main channel but still serve a purpose elsewhere, particularly on Amazon. Amazon itself has become a more important platform, quietly adding social features that make it worthwhile to publish there and diversify beyond YouTube. Follow me on Amazon here!

Product reviews remain a core part of the business, accounting for a significant share of revenue through platform revenue sharing and affiliate links, even if they don’t always align with every viewer’s interests. That tension between sustainability and audience interest is something I’m still trying to solve, but the numbers suggest progress.

Early next year begins, as usual, with CES in Las Vegas. I’ll be covering the show solo again, focusing on fast, on-the-ground dispatches that give a sense of what’s new and interesting without a lot of polish. Those videos have connected well with viewers in recent years, and I plan to stick with that approach.

Beyond CES, the main channel will continue leaning into consumer advocacy topics, building on the momentum around issues like broadcast encryption and other consumer focused topics. Even when outcomes are uncertain, raising awareness and engaging regulators feels like work worth doing.

Product reviews will still be part of the mix, especially for larger items that fit the audience here, while smaller reviews will be on Gadget Picks and production-focused content will live on my production nerd channel. It’s not the simplest structure, but it reflects the reality of how platforms and audiences behave today. I don’t expect explosive growth, but steady progress has long been my strategy, and after more than a decade on the platform, that’s enough to keep me moving forward.

Intellivision Sprint Review – A great recreation of a classic

The Intellivision Sprint is a newly released console that looks and feels like it belongs to the early 1980s. With faux woodgrain panels and metallic accents, it closely resembles the original Intellivision, the Mattel-produced system that competed with the Atari 2600. What’s notable here is that this hardware is now coming from Atari itself, following its acquisition of the Intellivision brand.

You can see it in action in my latest review.

The console is smaller than the original unit, but the controllers retain the familiar size and layout. They are wireless and include the full numeric keypad that defined the original experience, along with physical overlays that slide over the buttons to indicate game-specific functions. Forty-five games come preinstalled, drawing from much of the classic Intellivision library. While the system does not support original cartridges, it does provide alternative ways to run additional software via its USB port.

Inside, the hardware is relatively modest, built around an ARM processor running games through emulation. The emulator chosen here is well regarded within the Intellivision community, and everything I tested ran as expected. Video output is limited to 720p at 60 frames per second, and the system does not require an internet connection to function. Power is supplied over USB-C, though no power adapter is included.

On the back of the unit are USB ports used for firmware updates and for connecting wired controllers. While the system does not support Bluetooth pairing with third-party controllers, plugging one in directly is an option. That said, a modern USB controller I tested introduced more input lag than the included wireless controllers.

The included controllers use Intellivision’s distinctive disc-style directional control, which behaves more like a rocker pad – kind of an early precursor to the modern d-pad. It’s a design that can take some getting used to (especially with the side buttons), but it closely matches how the games were originally designed to be played.

Turning the system on highlights how much attention was paid to physical details. The power switch has a firm, mechanical click that feels deliberately old-fashioned. The main menu provides individual information screens for each game, including a visual reference for the controller overlays. Games can be rated and marked as favorites, making it easier to return to specific titles later.

Playing through the built-in library underscores how many of these games were designed around shared, two-player competitive play. Titles like Shark! Shark! are simple in structure but clearly more fun when a friend is playing too. The controls feel close to how I remember them from years ago.

One standout experience was B-17 Bomber, a game I had heard about but never previously played. It makes use of synthesized speech and places you in various roles aboard a bomber during missions, switching stations through the keypad. The voices announce the direction of incoming fighters along with the proximity of the desired target. For an early ’80s game it’s pretty impressive.

The system also allows additional games to be loaded from a USB drive. With the right setup, titles like Pac-Man and Donkey Kong can be run even though they are not included with the console. Definitely check out GenXGrownUp’s tutorial on getting this right. This works reliably once configured, but the hardware is particular about USB drives. Several modern sticks I tried were not recognized, while an older, generic drive worked without issue.

I didn’t own an Intellivision growing up, so nostalgia isn’t driving my reaction here. Even so, the effort put into replicating the look, feel, and behavior of the original hardware is evident. This is clearly a niche product aimed at a limited audience, but it treats that audience seriously. If you’re an Intellivision fan or Intellivision curious, definitely check this one out. I don’t think it’ll be around for long.

Last year, Atari gave its own classic console a modern refresh with the Atari 2600+. That one works with original Atari cartridge games and controllers. It even comes packed in with a 10-in-1 cart that runs on original hardware too!

Check out more retro reviews here!

Zapperbox’s “Big Deal” on DRM Gateway Devices is a Bigger Deal for Them Vs. Consumers

Over the next few weeks we are likely to see the broadcast industry tout “gateways” that work with their private, opaque DRM regulatory framework. The first one you will hear about comes from Zapperbox. Last month they released news of a “Big Deal,” stating their device now allows in-home streaming of DRM content from one Zapperbox device to another. 

While this is a “big deal” given how difficult the private, opaque DRM regulation has made the simple act of watching television, it underscores how difficult it’s been for the industry to implement a feature that has worked on ATSC 1.0 broadcasts for nearly two decades. But this is far from being at parity with the ATSC 1.0 experience – at the moment the Zapperbox solution only works with other expensive Zapperbox devices. 

Unfortunately for consumers, getting back the functionality that DRM has taken away will come at a significantly higher cost. Since this only works with Zapperbox devices, consumers will need to purchase a Zapperbox tuner starting at a whopping $199 for a single tuner device, $274 for a dual tuner device, or $300 for a quad tuner.

On top of that, consumers need additional Zapperbox hardware for each of their televisions. Their “Zapper Mini” client device currently sells for $139 each. And if that’s not enough, Zapperbox requires a subscription for its whole home DVR features to record content: an additional $5 monthly, $29.99 a year, or $240 for a lifetime subscription. Quad tuner device subscriptions cost even more. So a three-TV setup (a dual tuner plus two Zapper Minis) will cost $552 plus subscription fees.
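For clarity, here's the arithmetic behind that three-TV figure, assuming one dual-tuner Zapperbox at the main TV and a Zapper Mini at each of the other two sets, using the prices quoted above:

```python
# Hardware cost of a three-TV Zapperbox setup, using the prices quoted above.
dual_tuner = 274       # dual-tuner Zapperbox at the main TV
zapper_mini = 139      # Zapper Mini client, one per additional TV
additional_tvs = 2

hardware_total = dual_tuner + zapper_mini * additional_tvs
print(hardware_total)  # 552

# The whole-home DVR subscription comes on top of that: $5/month,
# $29.99/year, or $240 lifetime (quad tuner plans cost more).
first_year_with_annual_plan = hardware_total + 29.99
print(round(first_year_with_annual_plan, 2))  # 581.99
```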

On ATSC 1.0 devices, gateway tuners like the Tablo can be purchased for far less right now with no subscription fee (compensated affiliate link) and will work with the smart TV or streaming devices consumers already have. An ATSC 1.0 SiliconDust HDHomeRun also costs less than that Zapperbox (compensated affiliate link) and will work with nearly every streaming platform in existence along with mobile devices too. That’s because there is not an expensive and complicated private, opaque regulatory scheme driving up cost.

This certainly is a “big deal” for Zapperbox as the A3SA is currently picking the winners and losers in this space. But shouldn’t the market decide instead? 

My Best Tech of 2025

Over the course of the past year, I reviewed a wide range of tech products, and as the calendar wraps up, I like to take a step back and look at the ones that stood out to me after extended use. This list is limited to things I actually reviewed during the year, which means some notable products are absent simply because I never had them in hand. What I enjoy most about this process is that many of the items I cover tend to be a bit off the beaten path, and revisiting them offers a useful snapshot of how they held up beyond the initial review period.

You can watch my Best of 2025 video here!

Among PCs, the GMKtec G3 Plus was the most compelling system I looked at this year in terms of bang for the buck. It is a compact Windows mini PC built around Intel’s N150 processor, with support for up to 16 GB of memory and dual storage devices. Despite its low cost, performance proved solid for everyday computing, Linux installations, and light server duties. Pricing has crept up due to broader market pressures, but it remains accessible, especially given its flexibility. Find it here on Amazon or direct from GMKTec (compensated affiliate links).

Another system built on the same processor, the Beelink ME Mini, distinguished itself as a home server platform thanks to its six NVMe slots. I have been running it continuously as a Plex server under Unraid (compensated affiliate link), alongside additional Docker containers, and it has been reliable, quiet, and well cooled over several months of use. You can find the ME Mini on Amazon or direct at Beelink’s website.

In gaming hardware, one of the more interesting devices was GMKtec’s AD-GP1 external GPU. It pairs an AMD RX 7600M XT with multiple connection options, including Thunderbolt, USB4, and OCuLink. Beyond using it as a conventional external GPU for laptops, I also experimented with connecting it directly to a mini PC via OCuLink, which opened up some unconventional but functional configurations. You can find it on Amazon here or direct at GMKTec’s website (compensated affiliate links).

Another gaming highlight was Lenovo’s Legion Go S, a handheld PC running SteamOS. It offers a modest performance and display upgrade over the Steam Deck and signals a broader ecosystem of licensed SteamOS devices that should expand further in the coming years. You can find the Legion Go S at Best Buy (compensated affiliate link).

For retro enthusiasts, the SummerCart64 stood out as an affordable flash cartridge for the Nintendo 64. Based on an open-source design and produced by multiple manufacturers, it enables playback of the full N64 library, including 64DD titles and modern homebrew software. It also worked seamlessly with the Analogue 3D console in my testing. Find one at Aliexpress (compensated affiliate link).

On the software side, I spent time with NES-to-SNES game ports developed by Infidelity. These ports preserve the original gameplay while reducing hardware-related limitations such as flicker and adding small quality-of-life improvements. They run on original hardware as well as emulators, making them broadly accessible. You can download the ROMs here!

In the camera and photo space, the Kodak-branded Slide N Scan offered a practical, if imperfect, solution for digitizing negatives and slides. Image quality is limited, but the speed and simplicity of the workflow make it useful for casual archiving and sharing. When paired with modern AI-based enhancement tools, the resulting images can be significantly improved, which extends the usefulness of the hardware beyond its original capabilities. Find the Slide N Scan at Amazon (compensated affiliate link).

Two free applications also earned spots on the list. LocalSend provides a straightforward way to transfer files across platforms on a local network, effectively filling the gap left by proprietary solutions like AirDrop. It has become a regular part of my workflow. You can find LocalSend here.

UTM, available on the Mac, offers virtualization and emulation support for both ARM and Intel operating systems. It allowed me to quickly spin up Windows, Linux, and even classic operating systems without relying on heavier commercial software, making it a practical tool for testing and legacy access. UTM can be downloaded here.

There were also a few honorable mentions. The MiSTer Pi offered a low-cost, turnkey entry into the MiSTer FPGA ecosystem, but limited availability kept it from a wider recommendation. Keep an eye out for the SuperStation One from the same manufacturer, which promises to be a more readily available (and more elegant) MiSTer solution.

A UniFi 10-gigabit Ethernet adapter proved to be a reliable, quiet, and low-cost option for high-speed networking over Thunderbolt or USB4. You can find it at B&H.

Finally, the 8BitDo Ultimate 2C controller demonstrated that a low-priced gamepad does not have to feel disposable, making it suitable for both casual and multiplayer use. The days of the lousy “little sibling” controller are finally over. Find it on Amazon (compensated affiliate link).

As I head into the next year, my 14th doing this YouTube thing, I’ll be once again attending CES for a series of dispatch videos. I expect that same mix of mainstream and niche hardware to continue shaping what I cover, and I appreciate everyone who followed along as I tested and revisited these devices throughout the year. More to come!

Disclosure: the GMKtec, Beelink and 8BitDo devices came into the channel free of charge. The Lenovo Legion Go S was provided on loan from Lenovo. No other compensation was received; the brands did not suggest, review, or approve content prior to publication, and all opinions are my own.

2025 Toyota Sienna Recall : A Tale of Betrayal by a Once-Trusted Brand

I bought a new Toyota Sienna with my wife in January of 2025, a Woodland Edition that replaced our 2019 Sienna. It was our third Toyota, following a Highlander and that 2019 model. Until recently I had no reason to question the brand. The vehicle itself has been solid, and nothing about the driving experience suggested there was a serious issue lurking beneath the surface.

That changed when a recall notice arrived in the mail in mid-December. The letter explained that Siennas manufactured between January and July of 2025 may have defective middle-row seat rails. In certain high-speed collisions, those seats could lose structural integrity if occupied, increasing the risk of injury. Toyota’s guidance was blunt: no one should sit in the second-row seats while the vehicle is moving until a fix is implemented. At the time of the notice, no remedy had been defined.

Explore more in my most recent commentary video.

What troubled me was not just the defect, but the timeline. According to Toyota’s own filings, the company became aware by September that the seats could dislodge in a crash. A voluntary safety campaign decision was made on October 1, and the National Highway Traffic Safety Administration was notified shortly thereafter. Dealers were also informed at that time and instructed not to sell affected vehicles. Yet as a customer, I did not learn about the issue until roughly two months later, despite continuing to drive my family in a vehicle Toyota already knew had a potentially serious safety problem.

Toyota did issue a press release when the recall was filed, but it was easy to miss if you are not actively following automotive news. When I asked Toyota’s PR department why customers were not contacted sooner, I was told that assembling mailing lists takes time and that federal regulations allow up to 60 days for notification by first-class mail. I was also told there is no comprehensive digital database of owner contact information. That explanation rang hollow, especially after customer service was able to pull up my details immediately using the VIN when I called them.

There is also the role of the dealership. I purchased this vehicle locally, from a dealer that has sold me multiple cars over the years. They had the same information Toyota had in early October, yet there was no proactive outreach to customers who had recently driven off the lot in affected vehicles. A phone call warning families about a seating restriction would not have required a regulatory mandate, only a basic sense of responsibility and duty of care for customers.

Seeking a workaround introduced a second layer of frustration. The service bulletin indicated that impacted customers were eligible for a loaner or a rental vehicle with a daily allowance. When I contacted the dealer, I was told there were no loaners available and that any replacement vehicle would be “whatever was on hand.” The option of a rental was initially dismissed, despite being clearly outlined in Toyota’s own documentation. It took several calls between the dealership and corporate support before a rental was finally arranged.

For now, we will be driving a rented minivan on Toyota’s dime while waiting for the company to determine how it will address the defect. The inconvenience is manageable, but the experience has shaken my confidence.

This was not a minor oversight or a cosmetic issue. It involved seating where children ride, and it carried acknowledged safety risks. Knowing that both the manufacturer and dealers were aware of the problem months before customers were directly notified is difficult to reconcile with the trust that brand loyalty is built on.

I still like the vehicle, and I still want this to be resolved properly. But this episode raises broader questions about how companies communicate with customers when safety is at stake, and whether meeting the minimum regulatory requirement is an adequate substitute for timely, direct warnings.

Kensington TB800 EQ Trackball Review

I don’t usually get early access to new phones or laptops, but every so often something more niche shows up instead. In this case, it’s a pre-release trackball from Kensington, the new TB800 EQ.

I first saw it at a local trade show a few weeks ago, and Kensington later sent one over for me to try out. The TB800 is available for preorder through Kensington directly and on Amazon (compensated affiliate link). If you buy it through Amazon, you can choose between different ball colors; the unit I’ve been using has a burgundy ball, though a silver option is also available.

One of the first things I noticed is how securely the ball is held in place. It snaps into the housing with enough resistance that it won’t fall out even if you flip the device upside down. That may sound minor, but anyone who has used older trackballs knows how easy it is for the ball to pop loose when moving the unit from one office to another.

Like other Kensington trackballs, there’s a large central scroll wheel, and this one has a solid feel with a bit of weight to it. By default, it spins freely and supports accelerated scrolling. Pressing a mechanical button on the top switches it into a click-by-click mode with an actual mechanical detent so you can physically feel each step as you scroll line by line. In addition to that main wheel, there are two more scroll wheels built into the device on the left- and right-hand sides. One handles horizontal scrolling, while the other can be used for zooming, depending on the application. In a spreadsheet, for example, I was able to scroll vertically, move left and right across columns, and zoom in and out without touching the keyboard.

Those extra wheels are positioned far enough away from the main buttons that they don’t get triggered accidentally. After using the trackball over several days, I didn’t find myself activating them unintentionally during normal use. They also have a balanced resistance, so they don’t feel loose, but they aren’t stiff either. If you decide you don’t want to use one or more of the scroll wheels at all, Kensington included physical switches on the bottom of the device that let you disable them individually. That avoids the need to dig into software profiles just to turn a wheel off for a particular task.

There are additional programmable buttons along the top, which by default handle actions like browser back and forward as well as volume control. These, along with the rest of the buttons and scrollers, can be reassigned through Kensington’s software.

Connectivity is another area where the TB800 offers flexibility. It can connect via USB-C, through a USB-C wireless dongle, or over Bluetooth, with support for pairing to two Bluetooth devices. A button on the side lets you toggle between connection modes, making it possible to move quickly between multiple computers or tablets. Kensington estimates around four months of battery life per charge, depending on usage.

On the performance side, there’s an on-the-fly DPI switch to adjust pointer sensitivity, along with a polling rate button labeled in hertz. When connected via USB or the wireless dongle, the polling rate can be set as high as 1,000 Hz, which puts it in the same range as many gaming mice. Bluetooth connections don’t support the higher polling rates, but for wired or dongle use, the higher setting results in smoother cursor movement, especially on high-refresh-rate displays.
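As a rough illustration of what those polling-rate figures mean in practice (the 125 Hz comparison point is my own assumption for contrast, not a Kensington spec), the rate maps directly to the interval between pointer position reports:

```python
def report_interval_ms(polling_rate_hz: float) -> float:
    """Milliseconds between pointer position reports at a given polling rate."""
    return 1000.0 / polling_rate_hz

# At 1,000 Hz (USB or the wireless dongle) the position is reported every
# millisecond; at a hypothetical 125 Hz it would arrive only every 8 ms,
# which is why the higher setting feels smoother on high-refresh displays.
assert report_interval_ms(1000) == 1.0
assert report_interval_ms(125) == 8.0
```

A 1 ms report interval also lines up nicely with displays refreshing at 120 Hz or more, since a fresh position is always available for each frame.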

The configuration software, Kensington Connect, presents a visual layout of the device and allows extensive customization. Any button can be mapped to a wide range of actions, including macros, text snippets, system controls, or application-specific functions. There’s an easy mode for basic assignments and an advanced mode that allows combinations of buttons to trigger additional actions. Pointer behavior, DPI steps, polling rates, and scroll wheel functions can all be adjusted, and profiles can be set on a per-application basis so the controls behave differently in, say, a video editor versus a spreadsheet.

Trackballs have always attracted users who want a high degree of control, and that’s clearly the audience Kensington is aiming for here. I’ve been using Kensington trackballs for decades, going back to one I bought around 1989 or 1990 for an Apple IIgs that still works today. The TB800 feels like a continuation of that lineage, with large, accessible buttons that accommodate different hand sizes and grip styles. All in all, a very solid offering for trackball fans.

Disclosure: Kensington sent the trackball to me free of charge. No other compensation was received and they have not reviewed or approved this content prior to upload.

Using Gemini AI’s “Nano Banana Pro” To Enhance Old Digital Photos

On my Gadget Picks channel, I reviewed the Kodak Charmera, a cheap, keychain-sized 1.6-megapixel camera whose main appeal seems to be less about image quality and more about novelty. The camera is sold as a Labubu-style blind-box product, with different designs packaged randomly, and that scarcity has led some scalpers to charge far more than its original price. Amazon does have them in stock at the time of this writing (compensated affiliate link).

The image quality straight out of the camera is pretty bad—similar to what one might experience from an early consumer digital camera. But could Google’s new Nano Banana Pro AI model fix these images up and make them look modern? That’s what I explore in my latest video.

The Charmera produces images that are noisy, soft, and lacking in detail. On their own, they are barely usable. Using a prompt that Gemini itself helped generate, I fed in a selfie taken at my desk. The original file was a blur of digital noise, but the output that came back was far more detailed, with accurate colors and recognizable objects in the background. While there was some smoothing that made the image look slightly retouched, it largely preserved what was actually there.

That initial result led me to try a variety of other images. I photographed a small holiday decoration, a candle, my dog, and an outdoor scene, all using the Charmera. In each case, Gemini produced images that looked closer to what I might expect from a modern smartphone. Details that simply were not visible in the original files appeared in the processed versions, from textures on a figurine to fur and reflections. The framing and perspective stayed consistent, even when depth-of-field effects were introduced.

The experiment didn’t stop with new photos. I also revisited digital images from the late 1990s, taken with a Kodak DC120 camera. I had saved many of those files at very low resolutions, such as 320×240, which looked sharp on my 1024×768 display at the time but look especially rough on today’s high-resolution screens. Running those decades-old images through Gemini produced mixed but often striking results. In some cases, textures and facial details appeared that made the photos feel contemporary, even though the originals had almost no usable information at the pixel level.
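To put those old resolutions in perspective, here is a quick pixel-count comparison (the 4K display is just an illustrative modern reference point, not a display mentioned in my testing):

```python
def pixel_count(width: int, height: int) -> int:
    """Total pixels in an image or display of the given dimensions."""
    return width * height

old_photo = pixel_count(320, 240)      # 76,800 pixels (~0.08 MP)
uhd_display = pixel_count(3840, 2160)  # 8,294,400 pixels (~8.3 MP)

# A 4K display has over 100x the pixels of those old files, so each
# source pixel must cover a large block of screen when shown full size.
print(round(uhd_display / old_photo))  # prints 108
```

That gap is exactly what the AI model is papering over: it has to invent roughly a hundred plausible pixels for every real one it receives.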

I also found Nano Banana to be a great complement to another Kodak-licensed product, the Slide N Scan photo negative scanner. The scanner is comparatively inexpensive and can rapidly scan photo negatives and slides, but its output quality is nowhere near what professional use demands. Gemini, however, was able to dramatically transform a few of the images I fed through it from that scanner.

Not every result was faithful to the original. In some images, Gemini appeared to invent details when there wasn’t enough data to work with. A dog’s fur texture changed noticeably, and in one image of me running with my dog, my face was clearly not my own.

Scanned photos from books and yearbooks were generally handled well, including colorization, but there were occasional distortions in faces or text. Logos and lettering were sometimes incorrect or duplicated, especially when the source material was ambiguous or mirrored.

I also found that context mattered. When I scanned a 1994-era Polaroid of my PowerBook 180c and a Newton, I had to give Gemini more specific hints about what was in the image. Gemini convincingly recreated the devices and dropped them in place. At first glance it looked amazing. But some elements—particularly text—were reconstructed inaccurately. In the below example you’ll see that Gemini replaced the “Macintosh” text on the computer with “Powerbook.”

Working through these examples made it clear that tools like Gemini are doing something very close to what modern smartphone cameras already do. Computational photography has shifted the process away from simply capturing light and toward interpreting data. In that sense, using Gemini on an extremely poor image from a toy camera is not all that different from what happens inside many smartphones today.

Used carefully, it can make old or low-quality images usable again. But it can very quickly cross the line from enhancement into fabrication. That balance is something worth keeping in mind as these tools become more accessible and more powerful.

Check Your Cable Bill for Increased Broadcast TV Fees!

If you’re still paying for cable TV, you might be seeing a sizable increase this month on the local TV fee paid to broadcasters. In my part of Connecticut that increase is substantial: according to the latest Comcast rate card, the fee is going up by more than $10, to $48.30 a month! It sits outside any contract pricing, so even subscribers locked into a package are getting hit with this increase.

In my latest video, we take a look at what this fee is all about and how broken the system is.

Comcast publishes very detailed rate cards that break down every charge and bundle for each of the markets they serve. But I can’t access cards from other regions without logging in as a customer. Because of that, I put together a form where viewers can share what they’re seeing locally.

I went back to an older video I made in 2018 where I had pulled this same section from the rate card. At the time, the fee was only $8. In under eight years, that’s an increase of over 500 percent!
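The math behind that figure is straightforward, using the two rate-card numbers mentioned above:

```python
old_fee = 8.00    # 2018 broadcast TV fee from the Comcast rate card
new_fee = 48.30   # current fee

increase_pct = (new_fee - old_fee) / old_fee * 100
annual_cost = new_fee * 12

print(f"{increase_pct:.0f}% increase")  # prints 504% increase
print(f"${annual_cost:.2f} per year")   # prints $579.60 per year
```

Nearly $580 a year for this one line item alone — before the actual programming package is counted.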

Much of this money is going to large broadcast groups like Nexstar, Sinclair, Gray, and Scripps. Nexstar, for example, is currently asking regulators for permission to grow even larger by taking over Tegna. As more people cancel cable, the subscriber base that funds retransmission consent fees keeps shrinking, and the broadcasters have been raising rates to maintain the revenue they’ve grown accustomed to. Many of these companies now rely on retransmission for half or more of their income, regardless of how many people actually watch their stations.

The natural question is how broadcasters are allowed to keep raising these fees. The answer lies in the retransmission consent framework. Cable companies once had to carry every local station for free under “must-carry,” but court decisions in the 1980s and a 1992 law shifted the landscape. Broadcasters can now choose between must-carry or negotiating a paid consent agreement. Nearly all of them opt for the paid agreement. Cable providers, meanwhile, are required to negotiate in good faith and can’t walk away. Broadcasters, on the other hand, can pull their signals if they’re unhappy, and the cable company can’t replace a local station with the same network from another market. If the ABC affiliate in my area is owned by Nexstar, that’s the one Comcast has to carry—no alternatives.

Cable companies also must place local stations on their most basic tier. Years ago that tier was called “lifeline cable,” but with a $48.30 broadcast fee added on, even a “lifeline” subscription has become expensive.

The FCC is revisiting national broadcast ownership rules, which has drawn in comments from groups across the political spectrum. One proposal from the International Center for Law and Economics argues for eliminating the retransmission consent system entirely and treating broadcasters more like any other content supplier under copyright and contract law. That could allow cable companies to negotiate outside their markets and potentially reduce costs by choosing among multiple affiliates of the same network. It wouldn’t preserve local newscasts, but it could give cable companies some leverage they don’t currently have.

Streaming services like YouTube TV and Hulu are not subject to the same rules that bind cable companies. Broadcasters want that changed, which would likely raise streaming prices as well. Some smaller networks such as Newsmax have raised concerns about consolidation for a different reason: if large broadcast groups force cable operators to carry their affiliated news channels on the basic tier, smaller channels could be pushed off the lineup entirely.

There’s a lot happening at once—shrinking cable audiences, aggressive fee increases, regulatory reviews, and pressure on both distributors and programmers to keep revenue flowing. I’ll continue following these developments. In the meantime, if you have a recent cable rate card, sending it in will help build a clearer picture of what subscribers are facing across the country.