Broadcasters Ask the FCC for a 2028 ATSC 3.0 / NextGenTV Transition Date

The nation’s broadcasters are making a push for the Federal Communications Commission (FCC) to lock in a firm February 2028 date for the transition to the NextGen TV standard, ATSC 3.0.

I take a look at their filing in my latest video.

The broadcaster proposal includes setting a February 2028 date for the top 55 television markets to fully switch over, with smaller markets following by February 2030. Along with that date request, they’re asking the FCC to make a number of policy changes to accelerate the transition.

One ask is for the FCC to lift the simulcasting requirements that exist under the current ATSC 3.0 rules. Right now, stations are required to offer “substantially similar” broadcasts in both the current ATSC 1.0 and newer ATSC 3.0 formats. Broadcasters want to move their higher-value programming to ATSC 3.0 to push more viewers to upgrade their televisions or tuners.

Last month’s long-awaited “Future of Television” report indicated significant adoption issues centered around ATSC 3.0 tuner availability. At the moment, only higher-end TV sets have the new tuners built in, and standalone tuners are expensive and lousy. Broadcasters are asking the FCC to mandate the inclusion of ATSC 3.0 tuners in all new televisions as soon as possible to get more of them out to consumers.

One hurdle to this request is an ongoing legal dispute over patents related to the ATSC 3.0 tuning technology. A company has already won a lawsuit against LG, requiring the manufacturer to pay excessive licensing fees on every television sold with an ATSC 3.0 tuner. The case is currently before an appeals court, but the unresolved litigation will no doubt make a tuner mandate difficult to put in place right now.

Another contentious issue is the digital rights management (DRM) encryption that broadcasters are building into the new standard. Broadcasters acknowledge the concerns raised by consumers but tell the FCC that their existing “encoding rules” allow unlimited recording and storage of TV broadcasts. They fail to mention that these rules only apply to simulcasts of ATSC 1.0 content, not dedicated ATSC 3.0 broadcasts. If simulcasting is phased out, broadcasters would have more control over how content is recorded and accessed. And on top of that, there are significant compatibility issues that limit how consumers can access and record these broadcasts.

Currently, the only tuners capable of decrypting these broadcasts rely on Google’s Android TV operating system and Google’s DRM technology. This means broadcasters, who argue they need regulatory relief to compete with Big Tech, are indirectly reliant on Google’s ecosystem to distribute their content. Additionally, consumers have expressed a strong preference for networked tuner solutions—such as gateway devices that connect to a home network—yet broadcasters have struggled to deliver on their promise to support them.

Cable providers are likely to push back against this transition timeline due to the costs involved in upgrading their infrastructure to support ATSC 3.0’s DRM along with its new video and audio codecs. Broadcasters argue that setting firm deadlines will give cable companies enough time to prepare and budget, but they make no offer to help cover cable providers’ transition expenses.

Alongside this requested transition, broadcasters are also asking for policy changes that could impact local station ownership rules and streaming services like YouTube TV.

They asked the FCC to lift restrictions on station ownership, claiming they need the ability to scale up their businesses in order to compete for advertising revenue. Unlike digital platforms that can expand without regulatory barriers, broadcasters face limitations on how many TV and radio stations they can own, both nationally and within local markets.

Another significant request involves treating streaming services that carry local stations — such as YouTube TV and Hulu — the same as cable providers when it comes to retransmission negotiations. Currently, national networks negotiate these deals for streaming platforms on behalf of their locally owned affiliates, whereas cable companies must negotiate with each individual station. If the rule changes, it could drive up the cost of streaming services as local broadcasters gain leverage to negotiate their own carriage fees.

The broadcast industry’s current business model defies basic economic principles: they continually raise prices even as demand for their product declines, while simultaneously making it more difficult for cord-cutters to tune in over the air due to the industry’s insistence on broadcast DRM. This FCC chair has already indicated that there are better uses for TV spectrum, so I predict he will approve broadcasters’ request just to hasten their demise.

Plex Adds HEVC Transcoding (sponsored post)

I spent some time experimenting with a new feature in Plex’s hardware transcoder that allows for HEVC transcoding of media. This means that high-quality 1080p streams can be sent remotely at the same bit rate (or less) as a 720p H.264 stream. You can see it in action in my latest monthly sponsored Plex video.

The goal was to see how well this feature performs in terms of efficiency and quality and how easy it is to set up on a Plex server. My test system was a low-cost GMKTec G3 Plus mini PC running Linux, equipped with an Intel N150 processor.

Setting up the feature was straightforward. In the Plex web interface, under the server settings, I enabled the experimental HEVC video encoding option. It was also necessary to ensure that hardware acceleration was turned on. Additionally, Plex provides an option for HEVC optimization, which pre-encodes videos for better playback on low-powered servers.
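
For those who prefer scripting to clicking through menus, these same server settings can also be reached through Plex’s API. Below is a minimal Python sketch using the python-plexapi library; the exact setting IDs vary by server version, so rather than assuming their names, the snippet just searches the settings list for the relevant toggles. Treat it as a starting point, not a definitive recipe:

```python
# Minimal sketch: inspecting Plex transcoder settings with python-plexapi
# instead of the web UI. Setting IDs vary by server version, so we search
# for them rather than hard-coding names.
from plexapi.server import PlexServer

PLEX_URL = "http://localhost:32400"   # adjust to your server's address
PLEX_TOKEN = "YOUR_TOKEN_HERE"        # your X-Plex-Token

plex = PlexServer(PLEX_URL, PLEX_TOKEN)

# Print every setting related to transcoding, HEVC, or hardware
# acceleration so you can find the real IDs on your server.
for setting in plex.settings.all():
    sid = setting.id.lower()
    if "transcod" in sid or "hevc" in sid or "hardware" in sid:
        print(setting.id, "=", setting.value)

# Hypothetical example of flipping a toggle once you know its real ID:
# plex.settings.get("HardwareAcceleratedCodecs").set(True)
# plex.settings.save()
```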

To test performance, I loaded a 4K HDR Blu-ray movie onto the Plex server and played it back on my laptop. Initially, the video was streamed in full 4K resolution, but I then switched to a lower bitrate of 720p at 2 Mbps to force a transcode. The server responded quickly, and the video quality remained impressive. Due to copyright restrictions, I couldn’t share a direct visual comparison, but the results were noticeably better than the standard H.264 encoding.

Checking the Plex dashboard, I confirmed that both decoding and encoding were being handled in hardware, with the output using HEVC. The CPU usage remained relatively low, hovering between 25% and 36%, which was similar to what I had observed with H.264 encoding. This suggests that enabling HEVC does not significantly increase the processing load, at least on a modern Intel processor like the one in my test setup. With this level of efficiency, I estimate that the system could handle three or four simultaneous transcodes without much issue.

For those considering enabling this feature, you’ll need at least a 7th-generation Intel Core i3, i5, or i7 processor. Lower-end hardware needs to have Jasper Lake or a newer architecture to be fully supported. Even if a system supports hardware transcoding, that doesn’t necessarily mean it will support HEVC encoding, as some older Intel chips lack the necessary features.

Playback device compatibility also plays a role in whether a client can receive an HEVC stream. On Apple and Android devices, including Apple TV and Android TV-based systems, the automatic quality adjustment feature defaults to H.264. To ensure HEVC transcoding is used, the resolution and bitrate must be manually selected. Additionally, HEVC playback requires a Chromium-based browser on Windows, macOS, or Linux, or Safari on macOS. Other browsers like Firefox and Opera won’t work. Similarly, the Xbox One S doesn’t support HEVC playback but will automatically revert to H.264 when necessary.

The improved efficiency and quality of HEVC make it a useful addition to Plex’s transcoding capabilities. It’s worth experimenting with if you have the right hardware.

Disclosure: This was a paid sponsorship by Plex; however, they did not review or approve this content before it was uploaded.

Survey: Half of Americans Still Use Physical Media

Physical media is still going strong according to a recent survey from Consumer Reports. Despite the shift toward digital downloads and streaming services, a significant number of consumers continue to hold on to tangible media, whether out of nostalgia, preference, or practicality. While we typically look at sales data to determine format preferences, this survey reveals what consumers are actually using on a regular basis.

In my latest video, we dive into the survey results and also interview the Consumer Reports journalist who initiated the survey.

The survey, which included over 2,000 respondents weighted to reflect the American population, found that 45% of Americans still listen to CDs. That surpasses vinyl records, which have outsold CDs in recent years; sales, it turns out, don’t necessarily reflect actual usage. Even cassette tapes have a notable presence, with 15% of respondents saying they still use them. Surprisingly, 5% of Americans still listen to eight-track tapes, a format that largely disappeared decades ago.

On the video side, DVDs and Blu-rays remain in use by almost half of Americans. Even as streaming services dominate entertainment consumption, many consumers still rely on physical copies, whether for better quality, affordability, or simply because they own large collections. VHS tapes, once considered obsolete, are still watched by 15% of respondents. Even laser discs, a niche format from the 1990s, still have a small but dedicated following, with 3% of Americans reporting they still watch them.

Consumer-generated media, meanwhile, has seen a more dramatic shift away from older formats. Only 9% of respondents say they use a dedicated camcorder, a sharp decline from past decades when handheld video cameras were common in households. The rise of smartphones with high-quality video capabilities has made camcorders largely redundant. DVR usage has also declined, with only 4% of Americans still relying on devices like TiVo.

Classic video game systems remain popular, however, with 14% of Americans still using older consoles. While this number may seem lower than expected given the strong online retro gaming community, it reflects the difference between casual users and dedicated collectors. Many small businesses and conventions continue to thrive around vintage gaming, and many enthusiasts like myself have even returned to using CRT televisions for a more authentic experience. I think we may see this number actually increase over time.

Legacy home office equipment also persists in some households. About a quarter of Americans still use landline telephones, though many of these are now VoIP-based rather than traditional copper-line connections. Fax machines continue to be used by 11% of respondents, and even Rolodexes and floppy disks still have their niche users, with 5% and 4% respectively.

The journalist behind the Consumer Reports article, Jim Willcox, joined me in the video to discuss how he personally added the questions about legacy technology to the survey out of curiosity. He noted that the longevity of physical media often defies industry expectations. While new formats tend to be predicted as the downfall of older ones, the transition is rarely immediate. Communities continue to form around niche formats, and the appeal of tangible media has proven resilient.

Willcox also highlighted the changing landscape of content ownership. With the rise of streaming, consumers have become increasingly aware of the drawbacks—such as the unpredictability of content availability and the necessity of multiple subscriptions to access favorite shows or movies. In contrast, physical media ensures long-term ownership without concerns over shifting licensing agreements or digital rights management.

While digital convenience is undeniable, the enduring appeal of physical media suggests that many consumers still value having something they can hold, play, and collect. Whether it’s a preference for higher-quality audio and video, a sense of nostalgia, or simply wanting control over their media, this survey shows us that physical formats are far from extinct.

GMKTec AD-GP1 External GPU (eGPU) Review

The GMKTec AD-GP1 is a compact external GPU that houses an AMD RX 7600M XT graphics card with 8GB of video memory. Designed for portability, it connects via USB 4, Thunderbolt, or OCuLink. This device is a good external graphics option for those looking to boost the graphical capabilities of an ultrabook while maintaining the flexibility of a lightweight laptop. You can check it out in my latest review.

It is important to note that while the GPU supports Thunderbolt-enabled devices, it does not function with Apple Silicon Macs, limiting its compatibility to certain Intel-based Macs and Windows ultrabooks with Thunderbolt, USB 4, or OCuLink connections.

The price point is approximately $469 on GMKTec’s website. Depending on sales you might find a lower cost option on Amazon (compensated affiliate links).

The AD-GP1 features two HDMI 2.1 outputs and two DisplayPort 2.0 outputs, allowing for up to four external displays with resolutions up to 8K at 60Hz. However, despite its compact form factor, the GPU requires an external 240W power supply, which is roughly the same size as the unit itself. This power supply not only supports the GPU but also provides up to 100W of power back to the host device.

In testing, the GPU demonstrated solid performance when paired with an Asus Vivobook S 14 ultrabook with an Intel Core Ultra 7 258V. Running No Man’s Sky at 1080p on high settings, the system maintained a consistent 60 frames per second (fps). At ultra settings, performance fluctuated between 45 and 60 fps. However, in Red Dead Redemption 2, performance gains were negligible due to CPU bottlenecks, highlighting the fact that the GPU’s benefits will depend on how graphically demanding a game is relative to the processor’s capabilities.

Benchmark testing using 3DMark Time Spy revealed a significant increase in graphical performance with the external GPU attached. The laptop’s base score of 4,385 jumped considerably to 9,421 when the AD-GP1 was connected, though the improvement was primarily in GPU-intensive tasks, with the CPU performance remaining unchanged.
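
A quick bit of arithmetic puts that jump in perspective:

```python
# Relative uplift from the eGPU, using the Time Spy scores above.
base, egpu = 4385, 9421

print(f"Uplift: {egpu / base:.2f}x ({(egpu - base) / base:.0%} improvement)")
# -> ~2.15x, roughly a 115% improvement over integrated graphics alone.
```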

Additional testing was conducted using a GMKTec Evo X1 mini PC (compensated affiliate link) equipped with a Ryzen AI 9 HX-370 processor. When connected via OCuLink, the external GPU delivered a performance score of 10,026, which was nearly identical to its performance over USB 4, suggesting that the GPU was not pushing beyond the bandwidth limitations of the connection.

Beyond gaming, the external GPU proved beneficial for tasks like local AI processing. Running a distilled version of DeepSeek 8B using the GPU significantly outperformed CPU-only processing.

Fan noise is minimal even when running at full blast for extended periods of time. The 3DMark Stress Test came in at 99.2%, indicating that there won’t be much thermal throttling under sustained loads.

While external GPUs like this remain a niche product, they offer a good solution for users who need enhanced graphical power with a lightweight laptop. For those with compatible hardware, it’s an option worth considering for boosting graphics performance at home or in the office.

Kodak Slide N Scan Review – Rapid photo negative scanner

Scanning and digitizing old film negatives and slides is often a daunting task, requiring expensive equipment and meticulous effort. The Kodak Slide N Scan simplifies this process, providing a rapid and accessible way to convert old photo negatives and slides into digital images. I took a close look at this device in my latest review to see how well it performs and whether it’s a viable solution for casual users looking to preserve their film-based photos.

The Slide N Scan is found at many retailers including Amazon and Best Buy (compensated affiliate links) so shop around for the best price.

Unlike traditional scanners that require software and complex settings, the Slide N Scan operates without a PC. Negatives or slides are inserted into the film tray, the device automatically converts them into positives, and a simple push of a button saves the image onto an SD card (not included). It supports various film formats—including 35mm negatives and slides along with 110 and 126.

Quality, however, is where expectations need to be tempered. The Slide N Scan is not an archival-quality scanner. The 13-megapixel sensor interpolates images up to 22 megapixels, but the results lack fine detail and sharpness. Color reproduction is also inconsistent, as the device attempts to automatically determine the correct balance. While there are some limited color and exposure adjustments on the device, enthusiasts will be looking for a lot more. This means additional editing is often necessary after scanning to achieve accurate colors and exposure. But for snapshots, the automatic settings will usually be good enough.

In practical use, scanning is incredibly fast. The device writes images directly to an SD card, which can then be transferred to a computer, phone or tablet. The process is similar to using a digital camera—plug in the SD card, and the images are readily accessible. The Slide N Scan also features an HDMI output, allowing users to project the output of its 5″ screen onto a larger television.

Examining the scanned images, it’s clear this is not a professional-grade solution. Compression artifacts are visible, and while the device outputs large JPEGs, there is no option for saving uncompressed formats like TIFF. The upscaling process to 22 megapixels does little to enhance detail. I found that black-and-white negatives tend to look better when using the lower 14-megapixel resolution setting, especially since popular films like Tri-X are quite grainy and can interfere with the upscaling and image compression process at the 22-megapixel option.

Despite these shortcomings, the Kodak Slide N Scan serves a purpose. For casual users looking to quickly digitize old photos for sharing on social media or archiving in a non-professional capacity, it provides a convenient solution. The speed and ease of use make it appealing, especially for those with a large number of negatives or slides to process. However, users seeking high-quality digital preservation of film-based images will need to explore more advanced scanning options.

A device in the $500–$600 range with a better sensor and uncompressed file-saving capabilities would fill a gap in the market for those who want high-quality results without the time investment of professional scanning solutions. While the Slide N Scan doesn’t meet that standard, it represents progress in the space, providing an affordable and efficient way to convert old film into digital format.

See more products like this here.

A Nifty Smartphone SSD with Real-time Diagnostics! Twopan SSD review

This smartphone SSD was originally going to be part of my next Amazon haul, but its unique features made it worth a dedicated look in a standalone video.

This SSD, from a company called Twopan (compensated affiliate link), offers some interesting functionality. It connects directly to an iPhone or Android phone via USB-C and includes a built-in real-time diagnostic display. That means users can monitor power consumption, read and write speeds, and even temperature in real-time. The drive also features a built-in single-port USB 2.0 hub, allowing additional devices to be plugged in, as well as MagSafe compatibility for easy attachment to the back of an iPhone.

However, one drawback is that plugging in power to that USB port to charge the phone causes the drive to reset, potentially disrupting ongoing work. So be sure to plug in power before recording.

One of the most crucial aspects of using an external SSD with an iPhone is power consumption. iPhones cut off devices that draw more than 4.5 watts through the USB-C port, but this SSD consistently operates at around 2 watts, making it a safe option for ProRes video recording. In testing, it handled recording ProRes 4K video at 60 frames per second without issue, maintaining a steady data rate of about 180MB per second.
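
To put those numbers in context, here’s the quick math on power headroom and storage consumption, using the figures above:

```python
# Power headroom and storage math for ProRes 4K60 recording, based on
# the ~2 W draw and ~180 MB/s data rate observed on the drive's display.
write_rate_mb_s = 180          # observed ProRes 4K60 data rate
drive_power_w = 2.0            # observed draw on the diagnostic display
iphone_budget_w = 4.5          # iPhone's USB-C accessory cutoff

gb_per_minute = write_rate_mb_s * 60 / 1000
print(f"Power headroom: {iphone_budget_w - drive_power_w:.1f} W to spare")
print(f"Storage used: ~{gb_per_minute:.1f} GB per minute of footage")
# -> ~10.8 GB per minute, so a 1 TB drive holds roughly 90 minutes.
```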

One of the standout features is the ability to plug in a USB microphone while recording into its USB 2.0 port. When testing with a DJI wireless microphone, the SSD continued to function smoothly, though power consumption increased slightly. This could be particularly useful for mobile video creators who need external storage and high-quality audio input simultaneously.

The drive’s MagSafe compatibility is another convenient feature. With the included angled USB-C cables, it attaches magnetically to the back of an iPhone, providing a more secure connection than just the SSD hanging off the port. However, the package does not include a cable for connecting the SSD to a computer. When plugged directly into a MacBook, it blocked all other ports, making a USB-C extension cable necessary for practical use.

Performance testing on a MacBook using Blackmagic Disk Speed Test showed read speeds close to advertised numbers but write speeds that fell short, averaging around 600MB per second instead of the promised 960MB per second. While this may be due to power-saving measures, it still delivers sufficient performance for ProRes video recording.

Overall, this SSD presents an interesting solution for those looking to record high-quality video on an iPhone. It addresses several pain points associated with external drives, including power management, real-time performance monitoring, and USB accessory support. While having two USB ports—one for power and one for peripherals—would have been ideal, the drive still manages to offer a solid, functional experience. A niche product, but one that solves a very specific problem effectively.

Disclosure: This product came in free of charge through the Amazon Vine program. However, nobody reviewed or approved this content before it was uploaded and no other compensation was received. All opinions are my own.

Pay up: Paramount Threatening to Pull Channels off of YouTube TV

Last week, negotiations between YouTube TV and Paramount broke down, with both sides announcing that if an agreement isn’t reached, Paramount’s networks—including CBS, Nickelodeon, Comedy Central, and local CBS affiliates—could disappear from YouTube TV. Paramount is now running an aggressive social media campaign to put pressure on the streaming service.

Learn more in my latest video!

YouTube TV has already announced that if these channels are removed, subscribers will receive an $8 discount on their monthly bill. That happens to be the same price as the base tier of Paramount Plus, which might not be a coincidence. The offer seems like a calculated move to pressure Paramount, which likely earns more from its YouTube TV carriage deal than from direct streaming subscriptions. Meanwhile, Paramount has been running ads urging viewers to petition YouTube TV to keep their channels—essentially advocating for a rate increase, since any deal in Paramount’s favor will likely result in higher subscription costs for users.

Beyond the channels disappearing, all user DVR recordings from those channels made with the YouTube TV service would also get deleted when the deal expires. In this digital age we truly control and own nothing.

The trajectory of YouTube TV’s pricing tells a familiar story. When it launched in 2017, it was a competitively priced alternative to traditional cable. Now, at about $83 per month, it’s in the ballpark of what a cable subscription costs. Much of this increase comes from rising content costs, as networks demand higher rates during contract renewals. And the issue isn’t limited to YouTube TV—cable, satellite, and streaming providers all face similar struggles as content owners seek to maximize their revenue, even as traditional TV viewership declines.
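
For the curious, here’s the rough math on that trajectory, assuming YouTube TV’s widely reported $35 launch price:

```python
# Annualized price growth for YouTube TV, assuming the widely reported
# $35/month launch price in 2017 against today's ~$83/month.
launch_price, current_price = 35, 83   # USD per month
years = 2025 - 2017

cagr = (current_price / launch_price) ** (1 / years) - 1
print(f"Total increase: {(current_price - launch_price) / launch_price:.0%}")
print(f"Compound annual growth: {cagr:.1%} per year")
# -> roughly 137% total, or about 11% per year -- far outpacing inflation.
```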

Compounding the issue, local broadcast affiliates are currently lobbying the FCC and Congress to negotiate directly with streaming services instead of being bundled into larger deals by the national networks. If that push is successful, it could lead to even higher costs, as each local broadcaster would have the ability to demand separate fees. This mirrors the problem that led to the cord cutting movement in the first place.

The changes aren’t limited to streaming. Over-the-air television, which has long been a free alternative, is also undergoing a transformation. The new NextGenTV standard introduces encryption, meaning that even recordings from an antenna will require authentication to watch and retain. While NextGenTV promises better picture quality and features, it also represents a shift toward restricting user control, pushing more viewers toward paid services.

As these disputes play out, the power ultimately lies with consumers. Cord-cutters have more options than ever, from free streaming platforms to on-demand purchases, and shifting away from expensive, restrictive services sends a clear message. While networks and providers continue their negotiations, viewers can choose where their money goes—and that choice may be the strongest leverage available.

Is Roku In Trouble? No, but they are pivoting..

Roku has been a dominant force in the streaming market for well over a decade, but recent data suggests that its hardware market share may be slipping. A report from Pixalate, a market analytics firm, indicates that Roku still leads the industry with a 39% market share in the fourth quarter of 2024. However, this represents a significant 16% decline compared to the same period last year. The shift appears to favor competitors like Amazon’s Fire TV and Samsung’s smart TV platform, both of which have gained ground.

We explore what might be going on in my latest video.

Pixalate’s methodology for measuring market share relies on analyzing the placement of advertising across various streaming platforms. Each time an advertisement is served, the firm tracks the device that played the ad, compiling these figures to determine the relative market share of different streaming devices. By assessing this data across an entire quarter, Pixalate provides a snapshot of shifts in user engagement and hardware adoption. This, of course, only covers hardware used to watch free, ad-supported content and won’t account for consumers watching ad-free subscription services.

One possible explanation for Roku’s measured decline is the affordability and availability of alternatives. Amazon, for instance, has the ability to sell Fire TV devices at a loss, making them highly attractive to consumers. At the time the report was compiled, the Fire TV Stick 4K was available for as little as $25, offering high-end features like Dolby Vision support at a fraction of the cost of premium devices like Apple TV. With new TVs often coming equipped with built-in smart platforms, fewer people may feel the need to purchase standalone streaming boxes like Roku’s.

Despite the decline in hardware market share, Roku is not necessarily struggling. The company has been shifting focus toward its streaming platform and advertising-based revenue. Roku now reaches 90 million households, including users who access the Roku Channel app on competing devices. This means that even if fewer people are using Roku-branded hardware, the company is still generating revenue through streaming and advertising.

The Roku Channel has emerged as a key asset in this transition. The free, ad-supported streaming service is available on multiple platforms, including Google TV, Fire TV, and Samsung devices. With a growing catalog of original content—including the Emmy-nominated Weird Al documentary and programming from Martha Stewart—Roku is positioning itself as a major player in the ad-supported streaming space. Nielsen data from November 2024 showed that the Roku Channel held a 1.9% share of streaming viewership, placing it on par with Disney+ and ahead of competitors like Pluto TV and Tubi.

Financially, the pivot to streaming and advertising revenue has been paying off. In the third quarter of 2024, Roku generated over $900 million in revenue from its platform business, compared to just $154 million from device sales. The company continues to subsidize hardware to attract users but is increasingly reliant on platform revenue for profitability. By integrating its service into Google and Android TV search functions, Roku is ensuring that its content remains easily accessible across different ecosystems.

Retro Fighters Game Controller Haul! Quick reviews of the D6 Dreamcast, Hunter and BattlerGC Pro

When it comes to modern takes on classic gaming, Retro Fighters has been making a name for itself by designing controllers that blend nostalgia with contemporary features. Their latest lineup includes controllers inspired by the Dreamcast, GameCube, and original Xbox, each offering wireless connectivity to their respective consoles while also working with modern platforms like the Nintendo Switch and PC.

You can see them all in my latest video.

The Dreamcast-inspired D6 is a six-button controller designed with fighting games and shooters in mind. It features mechanical switches for responsive actuation and comes with dongles that allow it to function wirelessly with the Dreamcast as well as on modern systems. While the button feel is satisfying thanks to its mechanical Kailh switches, the D-pad’s rolling maneuverability felt bumpier than expected. This could be a pre-release issue, and Retro Fighters acknowledged that it shouldn’t feel this way. Otherwise, the controller offers solid performance with minimal input lag.

An original VMU attached to a Dreamcast console using the Retro Fighters D6 Dreamcast wireless dongle

The GameCube-style BattlerGC Pro brings modern enhancements to the classic design. Its octagonal-gated sticks use Hall effect sensors, meaning no drift issues, and the triggers replicate the analog and digital functionality of the originals, including a digital button push when the trigger is fully engaged. Like the D6, it includes a dongle for GameCube compatibility and also works on the Nintendo Switch, PC, and other platforms with a second dongle. Interestingly, Bluetooth connectivity is an option, which actually provides lower latency on the Switch compared to the USB connection. The controller performs well across platforms, offering a familiar feel for Super Smash Bros. players.

The Hunter controller is a modernized take on the original Xbox gamepad. While visually similar to an Xbox One controller, it’s designed exclusively for the original Xbox, along with PC and emulation compatibility. The controller maintains pressure-sensitive face buttons, a feature used by certain original Xbox titles, while integrating Hall effect sensors for the sticks and triggers. A slight drawback is the D-pad, which feels restricted by a raised plastic lip, though this is a minor issue given the Xbox’s limited reliance on the D-pad. Unlike the other controllers, the Hunter doesn’t support wired connectivity, functioning solely through its included dongles.

Across all three controllers, latency performance was excellent, as measured with my usual method of filming a display at 240 fps. Wired connections on each came in at seven frames, with wireless at around ten to eleven frames, for a button push to register on screen. Given the industry’s improvements in reducing input lag, these controllers are competitive with the fastest options available.
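
For those who want the frame counts in more familiar units, the conversion is straightforward: each frame of 240 fps footage represents 1/240th of a second.

```python
# Convert high-speed-camera frame counts into milliseconds of latency.
CAMERA_FPS = 240
ms_per_frame = 1000 / CAMERA_FPS   # ~4.17 ms per captured frame

for label, frames in [("wired", 7), ("wireless (low)", 10), ("wireless (high)", 11)]:
    print(f"{label}: {frames} frames = {frames * ms_per_frame:.1f} ms")
# -> wired ~29.2 ms; wireless ~41.7-45.8 ms, end to end including the display.
```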

All in all, the Retro Fighters controllers bring a welcome update for those looking to get the right ‘feel’ for their emulated console favorites on modern platforms, along with the ability to plug these same controllers into the original consoles that inspired them.

You can find the controllers on Amazon or via the Retro Fighters website (compensated affiliate links). If you purchase direct from Retro Fighters you can get 10% off your order if you use the code lontv.

Disclosure: Retro Fighters provided the controllers free of charge. However, they did not review or approve this content before it was uploaded and no other compensation was received. All opinions are my own.

The Decade Old Nvidia Shield TV Still Gets Updates!

The Nvidia Shield has been a fixture in my home since 2015, and it remains one of the longest-supported devices I’ve ever owned. Even after a decade, Nvidia continues to provide updates—not just security patches but meaningful improvements. The latest update addresses an issue that’s been a long-standing frustration for me: 24p frame rate switching. Check it out in my latest video.

For those unfamiliar, frame rate switching is important because nearly every movie and most modern television shows are presented at 24 frames per second. Without proper frame rate switching, a 24p movie playing on a system locked to 60Hz can create a jittery effect that’s noticeable, especially to those sensitive to motion inconsistencies.
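
If you want to see why the math works out that way, this little Python sketch shows the uneven cadence a 60Hz display is forced into when showing 24p content:

```python
# Why 24p judders on a 60 Hz display: 60 isn't an even multiple of 24,
# so alternate film frames are held for 3 and then 2 display refreshes
# (the classic 3:2 cadence), making on-screen motion uneven.
DISPLAY_HZ, FILM_FPS = 60, 24

hold_counts = [0] * FILM_FPS
for refresh in range(DISPLAY_HZ):
    # Which film frame is on screen during this display refresh?
    film_frame = refresh * FILM_FPS // DISPLAY_HZ
    hold_counts[film_frame] += 1

print(hold_counts)  # [3, 2, 3, 2, ...] -- uneven hold times = judder
```

With frame rate matching, the display instead drops to 24Hz (or a multiple of it), so every film frame is held for the same amount of time.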

Apple TV has long handled this seamlessly for streaming apps, but the Shield has struggled with it even though apps like Plex and Kodi do it properly. This new update doesn’t completely fix the issue, but it brings the Shield much closer to where it needs to be after all this time.

The process for enabling this feature is relatively simple but requires some setup. If you’re using the 2019 Shield remote, a button can bring up the settings menu where you can toggle the frame rate match feature. For those using older remotes, Nvidia’s mobile app offers an alternative, allowing users to access the menu without purchasing a newer remote. It’s a small but useful workaround. The feature has to be summoned while the content is playing.

I tested this update across various streaming services—Netflix, Disney+, Amazon Prime Video, and Apple TV+. Across the board, the Shield successfully switched to 24p when prompted, something it failed to do consistently in the past. However, the feature still requires manual activation every time content starts, which is less convenient than Apple TV’s fully automatic implementation. Still, seeing it work across multiple platforms is an encouraging sign of progress.

Beyond frame rate improvements, the update brings support for Auro 3D, an immersive audio format. Nvidia also shipped updated security patches, ensuring the device stays protected even though it remains on Android 11.

What stands out most is Nvidia’s continued support for this hardware. The Shield has gone through hardware revisions in 2017 and 2019, yet the original model still receives updates. This level of longevity is rare in consumer electronics, where most companies push users toward upgrading every few years.

Check out my Shield TV appreciation video for more on the Shield’s history and potential future!

Orico 10 Gigabit Thunderbolt / USB 4 Ethernet Adapter Review

As many of you know, I have this crazy 10 gigabit Internet connection from Comcast. It started out as a 2 gigabit connection which they ramped up to 10 over the last couple of years to keep up with local competitors. As cool as it is to have a connection this fast, you do need specialized ethernet gear to hit those speeds. I’m therefore always on the lookout for higher-speed gear, especially USB-based solutions.

Orico reached out to me recently to check out their new 10 gigabit adapter. This is the smallest 10-gig adapter I’ve tested, and it works with Thunderbolt 3, 4, and 5, as well as USB 4 devices. However, it won’t function with USB 3, which is something to keep in mind before purchasing.

You can check out my full video review here.

For this review, I used a USB 4-based mini PC from Beelink, an AMD-powered machine with 40 gigabit per second USB 4 ports on the back. I also tested it on my Mac with Thunderbolt.

At $159, the Orico adapter is priced a bit lower than other 10-gig adapters I’ve seen over the years, though it’s significantly more expensive than the more common 2.5-gig models. The hardware itself is minimalistic, with a USB-C/Thunderbolt port on one end and a standard 10GBase-T Ethernet port on the other.

It runs off bus power, but like most 10-gig adapters, it generates a lot of heat over time. Instead of a large heatsink, this one uses an internal fan, which stays on and becomes quite loud after extended use—louder, in fact, than the mini PC I was testing it with!

The chipset inside is a Marvell AQtion, which I hadn’t worked with before. It worked immediately on macOS, but on Windows, I had to check for driver updates before it functioned properly. If it doesn’t work right away on a Windows machine, running Windows update and rebooting should solve the issue.

Performance-wise, I conducted two different tests. The first was a basic Speedtest.net run, but as expected, the results varied based on internet traffic conditions and the limitations of the test servers. While I have a 10-gig symmetrical internet connection, getting full speeds on a public server is rare. A more reliable measure came from a local iPerf test between two 10-gig connected devices. Here, the Orico adapter delivered consistent speeds of around 9.48 Gbps in both directions, which is in line with expectations when accounting for network overhead.
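
That 9.48 Gbps figure is almost exactly what protocol overhead predicts. Here’s a back-of-the-envelope calculation, assuming a standard 1500-byte MTU and no TCP options:

```python
# Sanity check on the iPerf result: on 10GbE with a 1500-byte MTU,
# Ethernet framing plus IP and TCP headers cap TCP payload throughput
# below the raw line rate.
LINE_RATE_GBPS = 10
MTU = 1500
ETH_OVERHEAD = 38    # 8 preamble + 14 header + 4 FCS + 12 inter-frame gap
IP_TCP_HEADERS = 40  # 20-byte IPv4 header + 20-byte TCP header

payload = MTU - IP_TCP_HEADERS   # 1460 bytes of TCP payload per packet
on_wire = MTU + ETH_OVERHEAD     # 1538 bytes actually on the wire

goodput = LINE_RATE_GBPS * payload / on_wire
print(f"Theoretical max TCP goodput: {goodput:.2f} Gbps")  # ~9.49 Gbps
```

In other words, the adapter is delivering essentially everything the connection can carry.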

Functionally, the adapter does exactly what it promises, delivering full 10-gig speeds over USB 4 or Thunderbolt. The trade-off comes with the fan noise. While it’s compact and portable, the fan’s constant hum is hard to ignore. Those who prioritize silence might prefer a larger device with passive cooling, such as the slightly more expensive OWC 10G adapter, which is bulkier but completely quiet.

That said, if portability is a priority and the noise isn’t a deal-breaker, the Orico adapter is a solid choice. Just keep in mind that while 10-gig speeds sound great on paper, actual internet usage rarely maxes out that capacity, meaning a 2.5-gig adapter might be a more practical and cost-effective alternative for most users.

Disclosure: Orico sent this device to my channel for review, but they did not review or approve the content before uploading. No other compensation was received and all opinions are my own.

Revisiting my Top Tech Products of 2015

It’s funny how some devices fade into obscurity while others remain surprisingly relevant. To some degree technology (at least on the hardware side) has been more incremental than revolutionary with each product cycle, dramatically extending the lifecycle of some devices.

In my latest video, we take a look at my top picks of 2015 and see which devices stood the test of time. You can also find the original reviews here.

One of the more unique products from that year was the Kangaroo Mini PC. At $99, it was a fully functional Windows 10 machine, compact enough to fit in a pocket. Manufactured by InFocus, better known for projectors, it was an interesting concept that ultimately didn’t last. The company even experimented with a laptop dock that allowed users to swap out computing modules, but this modular approach never really took off. While small form-factor PCs are still around, modern equivalents offer significantly more power at roughly the same price.

Some products from 2015 still hold value today. The Sony AX33 camcorder was one of the first consumer 4K camcorders, featuring optical image stabilization that made it stand out. Even now, similar models are available, and used units fetch respectable prices on eBay.

Retro gaming handhelds have come a long way, but back in 2015, the GPD XD was an early standout. Running Android, it could emulate up to Dreamcast-level consoles with decent performance. The clamshell design protected its screen, and it even had HDMI output for TV gaming. While GPD has since pivoted to making Windows-based handheld gaming PCs, this device was an early indication of the growing market for portable emulation.

Accessories like the Logitech K830 keyboard have also proven their longevity. With a built-in trackpad, backlit keys, and dual Bluetooth and dongle connectivity, it was an excellent all-in-one input device. Despite being discontinued, it remains in demand, partly due to its compatibility with Meta Quest’s VR desktop experience and Harmony remote controls.

Apple made waves in 2015 with several key product launches. The first-generation Apple Watch debuted, and while it was slow and somewhat limited, it set the stage for the dominant wearable platform it is today. Personally, I didn’t expect to keep using one, but the health tracking features became useful enough to keep one on my wrist for the last ten years.

The 12-inch MacBook was another notable Apple release, an ultra-light laptop whose single USB-C port sparked discussions about the versatility of the then-new connector. Performance-wise it was pretty slow – even for 2015 – but it survived for a number of years as a convenient and super lightweight portable. It’s a shame Apple hasn’t brought it back given how efficient their processors have become.

Apple’s iPhone 6S was another highlight, demonstrating significant performance gains over its predecessor. Unlike many phones that become obsolete quickly, Apple supported the 6S with security updates and hardware support until 2024, making it one of the longest-supported smartphones ever.

On the Android side, the Moto X Pure offered an impressive unlocked phone experience at a time when such options were rare. With a great camera and solid performance that was on par with most of the flagships at the time, it challenged the idea that flagship performance had to come with carrier lock-in.

2015 was also the year portable SSDs became mainstream. Drives from Samsung and SanDisk delivered near-SATA speeds at a fraction of the size, transforming workflows for videographers and professionals. Today, these drives have become much faster with USB 4 and Thunderbolt versions, but the fundamental utility remains unchanged.

Perhaps the most enduring product from 2015 is the Nvidia Shield Android TV box. Still sold today, it retains much of the original hardware and continues to be a top choice for streaming and gaming. In an industry where most streaming devices become outdated quickly, its longevity is remarkable.

2015 might be the last year where we had so many gadgets to be excited about – wearables debuting, powerful new TV boxes, awesome developments for 4K video shooting, etc. etc. Today’s tech feels a bit “blah” by comparison. Even though we have seen some major leaps in performance over the last decade, the devices themselves feel largely the same.

DIY AI: Running Models on a Gaming Laptop for Beginners!

When DeepSeek AI burst onto the scene a week or two ago, it shook up the industry by proving that large language models can be made more efficient – in fact it’s possible to get the full DeepSeek model running on hardware that a mere mortal could acquire with a few thousand bucks. This shift raises an interesting question—can useful AI models run locally on consumer-grade computers now without relying on cloud-based data centers?

In my latest video, we take a look at running some “distilled” open source versions of DeepSeek and Meta’s Llama large language models. I’m surprised how far the quality of locally run models has come in such a short period of time.

To find out, I tested a distilled version of the DeepSeek model on a Lenovo Legion 5 laptop, which is equipped with an Nvidia RTX 3070 GPU and 8GB of VRAM. The goal was to see if local AI could generate useful results at a reasonable speed.

The setup process was straightforward. After downloading and installing Nvidia’s CUDA toolkit to enable GPU acceleration, I installed Ollama, a command-line tool for downloading and running many of the available models. From there, it was just a matter of selecting and downloading an appropriate AI model. Since the full DeepSeek model requires an impractical 404GB of memory, I opted for the distilled 8B version, which uses 4.9GB of video memory.
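
Once a model is pulled down, you’re not limited to the command line: Ollama also exposes a local REST API on port 11434. Here’s a minimal Python sketch of querying it; the model tag is whatever `ollama list` shows on your machine, and I’m assuming the distilled 8B build here:

```python
# Minimal sketch: querying a locally running Ollama server over its
# REST API (Ollama listens on localhost:11434 by default).
import json
import urllib.request

payload = json.dumps({
    "model": "deepseek-r1:8b",  # adjust to the tag shown by `ollama list`
    "prompt": "Explain model distillation in two sentences.",
    "stream": False,            # return one complete JSON response
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

This is the same API that front-ends like Open WebUI (which I get to below) build on top of.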

With everything in place, I launched the model and checked that it was using the GPU correctly. The first test was a basic interaction in the command line. The DeepSeek model responded quickly and even displayed its thought process before generating a reply, which is a unique feature compared to traditional locally hosted chatbots. Performance-wise, it was surprisingly snappy for a locally run AI.

To gauge the model’s practical utility, I compared it to Meta’s open-source Llama model, selecting a similarly sized 8B variant. Performance between the two was comparable in terms of speed, but the responses varied. While DeepSeek’s output was structured and fairly coherent, Llama’s responses felt more refined in certain cases.

To take things further, I integrated Open WebUI, which provides a ChatGPT-style interface for easier interaction. This required installing Docker, but once set up, it significantly improved usability.

Next, I tested both models with a programming task—creating a simple Space Invaders game in a single HTML file. DeepSeek struggled, generating a mix of JavaScript and Python code that didn’t function correctly. Even when prompted differently, the results were inconsistent. The larger 14B version of DeepSeek running on my more powerful gaming PC did slightly better but still failed to produce a playable game. The Llama model performed marginally better, generating a somewhat functional version, but it was still far from the quality produced by cloud-based AI models like ChatGPT, which created a polished and working game on the first attempt.

For a different type of challenge, I had the models generate a blog post based on a video transcript. Initially, DeepSeek only provided an outline instead of a full narrative. After refining the prompt, it did produce something usable, though still less polished than ChatGPT’s output. Llama performed slightly better in this task, generating a clearer and more structured narrative after a nudge to get it out of its outlining mindset.

While local AI models aren’t yet on par with their cloud-based counterparts, the rapid improvements in efficiency suggest that practical, high-quality AI could soon run on everyday devices. Now that DeepSeek is pushing the industry to focus on optimization, it’s likely that smaller, more specialized models will become increasingly viable for local use.

For now, running AI on consumer hardware remains a work in progress. But it’s come a long way from where it was just a year ago, so it’ll be exciting to see what happens next.

ATSC 3 / NextGenTV Interactive Features Broke my ADTH Tuner

My ATSC 3 / NextGen TV woes continue.. My CBS affiliate, along with my local NBC station, has enabled interactive content, offering options like on-demand news segments, weather updates, and even the ability to restart live broadcasts. While the potential of this technology is promising, my experience with it has been far from seamless. You can see it in action in my latest video.

For this test, I used the ADTH box, currently the least expensive ATSC 3.0 tuner on the market. This is one of the devices that the broadcast industry is touting as an acceptable device to help people transition to the new standard on a budget.

The interactive features themselves are designed to provide a more dynamic viewing experience. When the prompt appears on a supported channel, selecting the interactive option opens a menu where viewers can choose from various content categories. This might include local news updates, weather reports, emergency alerts, and special event coverage. For instance, NBC’s interface included information about the Paris Olympics, although that content was outdated. These features require an internet connection, as they pull in real-time updates from online sources rather than relying solely on the broadcast signal.

However, on my ADTH box, the interactive pop-up became a persistent annoyance. It appears and lingers on screen for a long time on supported channels. And in the case of my NBC affiliate, the interactive prompt prevented me from navigating back to the channel guide without switching to another channel first.

A particularly strange problem emerged when watching unencrypted ATSC 3.0 channels. A persistent large grey play button overlay appeared on these channels, blocking a significant portion of the screen. Oddly enough, this issue did not occur on encrypted channels. The play button glitch is not intentional, but it underscores the broader problem with the current implementation of ATSC 3.0’s encryption system. Broadcasters are misleading the FCC and the public by claiming these cheap boxes are ready for a major broadcast TV transition.

To troubleshoot, I updated the firmware, even trying a beta version. I performed a factory reset, plugged the box directly into my television to rule out HDCP issues, and tried multiple setups. Nothing fixed these problems. The ADTH box, which should be an accessible entry point for consumers into ATSC 3.0, instead became an example of the complications that DRM and unfinished software introduce to the experience.

Despite these issues, I do see value in the interactive features themselves. On-demand access to local news and alerts could be useful, and the ability to restart live broadcasts is a welcome addition. However, the current execution—at least on this hardware—is deeply flawed. The performance lags, interface issues, and DRM restrictions hinder what could be a major advancement in over-the-air television.

Beyond the interface frustrations, there were issues with HDR implementation. My NBC affiliate broadcasts in HDR, but the ADTH box doesn’t seem to tone map correctly, resulting in an overly dark picture. Switching between channels was also sluggish, and once interactive features were engaged, performance slowed down significantly. The delay in accessing menus and content made navigation frustrating, even when just trying to check the weather or local news updates.

With ATSC 3.0’s continued rollout, broadcasters and hardware manufacturers need to ensure that these features work as intended across a variety of devices. If the most affordable tuner on the market struggles this much, it’s hard to see widespread consumer adoption happening smoothly. For now, I’ll keep testing and see if future updates bring any improvements.

Asus Vivobook S 14 (S5406SA) Review: a Great Value at $799

For those looking for a well-rounded laptop at a competitive price, the Asus VivoBook S presents an appealing option. Currently selling for $799 at Walmart (compensated affiliate link), this machine features a Core Ultra 7 258V processor, 32GB of DDR5 RAM, a 1TB SSD, and Wi-Fi 7 support. It also comes with an OLED display, a rarity in this price range.

There’s also a more affordable version with a Core Ultra 5 chip, 16GB of RAM, and 512GB of storage for $649 at Best Buy (compensated affiliate link).

You can see it in action in my latest laptop review.

The 14-inch OLED screen runs at a 1920×1200 resolution with a 16:10 aspect ratio. It delivers 600 nits of brightness and supports 100% of the sRGB color space, making it suitable for light creative work. However, it is not a touchscreen, and the glossy finish means reflections can be noticeable. Despite that, the display quality is higher than what is typically found in this segment, with vibrant colors and deep contrast.

The keyboard and trackpad are well-designed, featuring a backlit layout with comfortable key travel. The trackpad is responsive, though slightly springier than ideal. Weighing just under 3 lbs (1.3 kg), the aluminum chassis is lightweight and well-balanced, allowing the display to be opened with one finger. The 1080p webcam includes a privacy shutter and supports Windows Hello for facial recognition login.

In terms of ports, the VivoBook S provides a solid selection. On the left side, there is a full-size HDMI output, two Thunderbolt 4 ports, a microSD card slot, and a headphone/microphone jack. While Thunderbolt 5 would have been preferable, Thunderbolt 4 remains capable for most users and provides the option of using an external GPU to boost graphics capabilities. The right side houses two full-sized USB-A ports, each running at 5 Gbps.

For everyday tasks, the laptop performs smoothly. Web browsing, streaming, and basic productivity tasks run without issue. The OLED display enhances video playback, though some minor frame drops were noted with 4K 60fps content.

Battery life is respectable, with 10 to 12 hours achievable under moderate use when keeping brightness at around 80%.

The integrated graphics on the new Intel processor provide enough power for light video editing and quick exports. The laptop handled 4K 60fps clips in DaVinci Resolve with smooth playback and efficient rendering.

Casual gaming is another strong point of this Intel hardware. Running Red Dead Redemption 2 at 1920×1200 on the lowest settings yielded 45-55 FPS, showing that it can handle even some AAA titles at reasonable framerates. However, more graphics-intensive titles like Starfield may struggle. Despite its slim profile, fan noise remains relatively subdued, avoiding the loud operation typical of gaming laptops.

Linux users may find the VivoBook S a viable option, though some minor quirks were observed when testing Ubuntu 24.10. Wi-Fi initially showed as disabled despite functioning correctly, likely due to driver support still catching up with the latest Intel chipset. Future updates should improve Linux compatibility.

Overall, the Asus VivoBook S offers a strong value proposition, particularly with its combination of an OLED display, a powerful Intel processor, and ample RAM. The Best Buy variant with a Core Ultra 5 processor and 16GB of RAM remains a cost-effective alternative for users with lighter workloads. While not perfect, this laptop stands out as a compelling choice for those seeking a balance of performance, portability, and price.

Disclosure: Asus provided the laptop free of charge to the channel for a future giveaway. The company did not review or approve this content before it was uploaded and no other compensation was received. All opinions are my own.