Broadcasters Ask the FCC for a 2028 ATSC 3.0 / NextGenTV Transition Date

The nation’s broadcasters are making a push for the Federal Communications Commission (FCC) to lock in a firm February 2028 date for the transition to the NextGen TV standard, ATSC 3.0.

I take a look at their filing in my latest video.

The broadcaster proposal includes setting the February 2028 date for the top 55 television markets to fully switch over, with smaller markets following by February 2030. Along with that date request, they’re asking the FCC to make a number of policy changes to accelerate the transition.

One ask is for the FCC to lift the simulcasting requirement in the current ATSC 3.0 rules. Right now, stations are required to offer “substantially similar” broadcasts in both the current ATSC 1.0 and newer ATSC 3.0 formats. Broadcasters want to move their higher-value programming to ATSC 3.0 to push more viewers to upgrade their televisions or tuners.

Last month’s long-awaited “Future of Television” report indicated significant adoption issues centered around ATSC 3.0 tuner availability. At the moment only higher-end TV sets have the new tuners built in, and standalone tuners are expensive and lousy. Broadcasters are asking the FCC to mandate the inclusion of ATSC 3.0 tuners on all new televisions as soon as possible to get more of them out to consumers.

One hurdle to this request is an ongoing legal dispute over patents related to the ATSC 3.0 tuning technology. A company has already won a lawsuit against LG, requiring the manufacturer to pay excessive licensing fees on every television sold with an ATSC 3.0 tuner. The case is currently before an appeals court, but the unresolved dispute will no doubt make a mandate difficult to put in place right now.

Another contentious issue is digital rights management (DRM) encryption that broadcasters are building into the new standard. Broadcasters acknowledge the concerns raised by consumers, but tell the FCC that their existing “encoding rules” allow unlimited recording and storage of TV broadcasts. They fail to mention that these rules only apply to simulcasts of ATSC 1.0 content, not dedicated ATSC 3.0 broadcasts. If simulcasting is phased out, broadcasters would have more control over how content is recorded and accessed. And on top of that there are significant compatibility issues that limit how consumers can access the broadcasts and record them.

Currently, the only tuners capable of decrypting these broadcasts rely on Google’s Android TV operating system and Google’s DRM technology. This means broadcasters, who argue they need regulatory relief to compete with Big Tech, are indirectly reliant on Google’s ecosystem to distribute their content. Additionally, consumers have expressed a strong preference for networked tuner solutions—such as gateway devices that connect to a home network—yet broadcasters have struggled to deliver on their promise to support them.

Cable providers are likely to push back against this transition timeline due to the costs involved in upgrading their infrastructure to support ATSC 3.0’s DRM along with its new video and audio codecs. Broadcasters argue that setting firm deadlines will give cable companies enough time to prepare and budget, but they make no offer to help cover cable providers’ transition expenses.

Alongside this requested transition, broadcasters are also asking for policy changes that could impact local station ownership rules and streaming services like YouTube TV.

They asked the FCC to lift restrictions on station ownership, claiming they need the ability to scale up their businesses in order to compete for advertising revenue. Unlike digital platforms that can expand without regulatory barriers, broadcasters face limitations on how many TV and radio stations they can own, both nationally and within local markets.

Another significant request involves treating streaming services that carry local stations — such as YouTube TV and Hulu — the same as cable providers when it comes to retransmission negotiations. Currently, national networks negotiate these deals for streaming platforms on behalf of their locally owned affiliates, whereas cable companies must negotiate with each individual station. If the rule changes, it could drive up the cost of streaming services as local broadcasters gain leverage to negotiate their own carriage fees.

The broadcast industry’s current business model defies basic economic principles: they continually raise prices even as demand for their product declines, while simultaneously making it more difficult for cord-cutters to tune in over the air due to the industry’s insistence on broadcast DRM. This FCC chair has already indicated that there are better uses for TV spectrum, so I predict he will approve broadcasters’ request just to hasten their demise.

Plex Adds HEVC Transcoding (sponsored post)

I spent some time experimenting with a new feature in Plex’s hardware transcoder that allows for HEVC transcoding of media. This means that high-quality 1080p streams can be sent remotely at the same bit rate (or less) as a 720p H.264 stream. You can see it in action in my latest monthly sponsored Plex video.
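To put that efficiency claim in rough numbers: a 1080p frame has 2.25 times the pixels of a 720p frame, so matching a 720p H.264 stream’s bitrate means HEVC is getting by on well under half the bits per pixel. A quick back-of-the-envelope sketch (the 4 Mbps bitrate is an illustrative assumption, not a figure from my testing):

```python
# Rough bits-per-pixel comparison: 1080p HEVC vs. 720p H.264 at the same
# total bitrate. The bitrate value is purely illustrative.
def bits_per_pixel(bitrate_bps, width, height, fps=24):
    """Average bits spent per pixel per frame."""
    return bitrate_bps / (width * height * fps)

bitrate = 4_000_000  # assume a 4 Mbps stream for both codecs

bpp_h264_720p = bits_per_pixel(bitrate, 1280, 720)
bpp_hevc_1080p = bits_per_pixel(bitrate, 1920, 1080)

pixel_ratio = (1920 * 1080) / (1280 * 720)
print(f"1080p has {pixel_ratio:.2f}x the pixels of 720p")
print(f"HEVC gets {bpp_hevc_1080p / bpp_h264_720p:.0%} of the bits per pixel")
```

The takeaway is that HEVC’s roughly 50% better compression efficiency is what makes the 1080p-at-720p-bitrate result plausible.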

The goal was to see how well this feature performs in terms of efficiency and quality and how easy it is to set up on a Plex server. My test system was a low-cost GMKTec G3 Plus mini PC running Linux, equipped with an Intel N150 processor.

Setting up the feature was straightforward. In the Plex web interface, under the server settings, I enabled the experimental HEVC video encoding option. It was also necessary to ensure that hardware acceleration was turned on. Additionally, Plex provides an option for HEVC optimization, which pre-encodes videos for better playback on low-powered servers.

To test performance, I loaded a 4K HDR Blu-ray movie onto the Plex server and played it back on my laptop. Initially, the video was streamed in full 4K resolution, but I then switched to a lower bitrate of 720p at 2 Mbps to force a transcode. The server responded quickly, and the video quality remained impressive. Due to copyright restrictions, I couldn’t share a direct visual comparison, but the results were noticeably better than the standard H.264 encoding.

Checking the Plex dashboard, I confirmed that both decoding and encoding were being handled in hardware, with the output using HEVC. The CPU usage remained relatively low, hovering between 25% and 36%, which was similar to what I had observed with H.264 encoding. This suggests that enabling HEVC does not significantly increase the processing load, at least on a modern Intel processor like the one in my test setup. With this level of efficiency, I estimate that the system could handle three or four simultaneous transcodes without much issue.

For those considering enabling this feature, you’ll need at least a 7th-generation Intel Core i3, i5, or i7 processor. Lower-end hardware needs to have Jasper Lake or a newer architecture to be fully supported. Even if a system supports hardware transcoding, that doesn’t necessarily mean it will support HEVC encoding, as some older Intel chips lack the necessary features.

Playback device compatibility also plays a role in whether a client can receive an HEVC stream. On Apple and Android devices, including Apple TV and Android TV-based systems, the automatic quality adjustment feature defaults to H.264. To ensure HEVC transcoding is used, the resolution and bitrate must be manually selected. Additionally, HEVC playback requires a Chromium-based browser on Windows, macOS or Linux, or Safari on macOS. Other browsers like Firefox and Opera won’t work. The Xbox One S likewise doesn’t support HEVC playback, but it will automatically revert to H.264 when necessary.

The improved efficiency and quality of HEVC make it a useful addition to Plex’s transcoding capabilities. It’s worth experimenting with if you have the right hardware.

Disclosure: This was a paid sponsorship by Plex, however they did not review or approve this content before it was uploaded.

Survey: Half of Americans Still Use Physical Media

Physical media is still going strong according to a recent survey from Consumer Reports. Despite the shift toward digital downloads and streaming services, a significant number of consumers continue to hold on to tangible media, whether out of nostalgia, preference, or practicality. While we typically look at sales data to determine format preferences, this survey reveals what consumers are actually using on a regular basis.

In my latest video, we dive into the survey results and also interview the Consumer Reports journalist who initiated the survey.

The survey, which included over 2,000 respondents weighted to reflect the American population, found that 45% of Americans still listen to CDs. That figure surpasses vinyl records, which have outsold CDs in recent years but evidently see less actual use. Even cassette tapes have a notable presence, with 15% of respondents saying they still use them. Surprisingly, 5% of Americans still listen to eight-track tapes, a format that largely disappeared decades ago.

On the video side, DVDs and Blu-rays remain in use by almost half of Americans. Even as streaming services dominate entertainment consumption, many consumers still rely on physical copies, whether for better quality, affordability, or simply because they own large collections. VHS tapes, once considered obsolete, are still watched by 15% of respondents. Even laser discs, a niche format from the 1990s, still have a small but dedicated following, with 3% of Americans reporting they still watch them.

But consumer-generated media has also seen a more dramatic shift away from older formats. Only 9% of respondents say they use a dedicated camcorder, a sharp decline from past decades when handheld video cameras were common in households. The rise of smartphones with high-quality video capabilities has made camcorders largely redundant. DVR usage has also declined, with only 4% of Americans still relying on devices like TiVo.

Classic video game systems remain popular, however, with 14% of Americans still using older consoles. While this number may seem lower than expected given the strong online retro gaming community, it reflects the difference between casual users and dedicated collectors. Many small businesses and conventions continue to thrive around vintage gaming, and many enthusiasts like myself have even returned to using CRT televisions for a more authentic experience. I think we may see this number actually increase over time.

Legacy home office equipment also persists in some households. About a quarter of Americans still use landline telephones, though many of these are now VoIP-based rather than traditional copper-line connections. Fax machines continue to be used by 11% of respondents, and even Rolodexes and floppy disks still have their niche users, with 5% and 4% respectively.

The journalist behind the Consumer Reports article, Jim Willcox, joined me in the video to discuss how he personally added the questions about legacy technology to the survey out of curiosity. He noted that the longevity of physical media often defies industry expectations. While new formats tend to be predicted as the downfall of older ones, the transition is rarely immediate. Communities continue to form around niche formats, and the appeal of tangible media has proven resilient.

Willcox also highlighted the changing landscape of content ownership. With the rise of streaming, consumers have become increasingly aware of the drawbacks—such as the unpredictability of content availability and the necessity of multiple subscriptions to access favorite shows or movies. In contrast, physical media ensures long-term ownership without concerns over shifting licensing agreements or digital rights management.

While digital convenience is undeniable, the enduring appeal of physical media suggests that many consumers still value having something they can hold, play, and collect. Whether it’s a preference for higher-quality audio and video, a sense of nostalgia, or simply wanting control over their media, this survey shows us that physical formats are far from extinct.

GMKTec AD-GP1 External GPU (eGPU) Review

The GMKTec AD-GP1 is a compact external GPU that houses an AMD RX 7600M XT graphics card with 8GB of video memory. Designed for portability, it connects via USB 4, Thunderbolt, or OCuLink. This device is a good external graphics option for those looking to boost the graphical capabilities of an ultrabook while maintaining the flexibility of a lightweight laptop. You can check it out in my latest review.

It is important to note that while the GPU supports Thunderbolt-enabled devices, it does not function with Apple Silicon Macs, limiting its compatibility to certain Intel-based Macs and Windows ultrabooks with Thunderbolt, USB 4, or OCuLink connections.

The price point is approximately $469 on GMKTec’s website. Depending on sales you might find a lower cost option on Amazon (compensated affiliate links).

The AD-GP1 features two HDMI 2.1 outputs and two DisplayPort 2.0 outputs, allowing for up to four external displays with resolutions up to 8K at 60Hz. However, despite its compact form factor, the GPU requires an external 240W power supply, which is roughly the same size as the unit itself. This power supply not only supports the GPU but also provides up to 100W of power back to the host device.

In testing, the GPU demonstrated solid performance when paired with an Asus Vivobook S 14 ultrabook with an Intel Core Ultra 7 258V. Running No Man’s Sky at 1080p on high settings, the system maintained a consistent 60 frames per second (fps). At ultra settings, performance fluctuated between 45 and 60 fps. However, in Red Dead Redemption 2, performance gains were negligible due to CPU bottlenecks, highlighting the fact that the GPU’s benefits will depend on how graphically demanding a game is relative to the processor’s capabilities.

Benchmark testing using 3DMark Time Spy revealed a significant increase in graphical performance with the external GPU attached. The laptop’s base score of 4,385 jumped to 9,421 when the AD-GP1 was connected, though the improvement was primarily in GPU-intensive tasks, with CPU performance remaining unchanged.
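For context, that score jump works out to a bit more than double the baseline:

```python
# Relative improvement in the 3DMark Time Spy score (figures from the review).
base_score = 4385   # laptop on its own
egpu_score = 9421   # with the AD-GP1 attached

ratio = egpu_score / base_score
print(f"{ratio:.2f}x the baseline score")   # ~2.15x
print(f"{ratio - 1:.0%} improvement")       # ~115%
```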

Additional testing was conducted using a GMKTec Evo X1 mini PC (compensated affiliate link) equipped with a Ryzen AI 9 HX-370 processor. When connected via OCuLink, the external GPU delivered a performance score of 10,026, which was nearly identical to its performance over USB 4, suggesting that the GPU was not pushing beyond the bandwidth limitations of the connection.

Beyond gaming, the external GPU proved beneficial for tasks like local AI processing. Running a distilled version of DeepSeek 8B using the GPU significantly outperformed CPU-only processing.

Fan noise is minimal even when running at full blast for extended periods of time. The 3DMark Stress Test came in at 99.2%, indicating that there won’t be much thermal throttling under sustained loads.

While external GPUs like this remain a niche product, they offer a good solution for users who need enhanced graphical power with a lightweight laptop. For those with compatible hardware, it’s an option worth considering for boosting graphics performance at home or in the office.

Kodak Slide N Scan Review – Rapid photo negative scanner

Scanning and digitizing old film negatives and slides is often a daunting task, requiring expensive equipment and meticulous effort. The Kodak Slide N Scan simplifies this process, providing a rapid and accessible way to convert old photo negatives and slides into digital images. I took a close look at this device in my latest review to see how well it performs and whether it’s a viable solution for casual users looking to preserve their film-based photos.

The Slide N Scan is found at many retailers including Amazon and Best Buy (compensated affiliate links) so shop around for the best price.

Unlike traditional scanners that require software and complex settings, the Slide N Scan operates without a PC. Negatives or slides are inserted into the film tray, the device automatically converts them into positives, and a simple push of a button saves the image onto an SD card (not included). It supports various film formats, including 35mm negatives and slides along with 110 and 126.

Quality, however, is where expectations need to be tempered. The Slide N Scan is not an archival-quality scanner. The 13-megapixel sensor interpolates images up to 22 megapixels, but the results lack fine detail and sharpness. Color reproduction is also inconsistent, as the device attempts to automatically determine the correct balance. While there are some limited color and exposure adjustments on the device, enthusiasts will be looking for a lot more. This means additional editing is often necessary after scanning to achieve accurate colors and exposure. But for snapshots, the automatic settings will usually be good enough.

In practical use, scanning is incredibly fast. The device writes images directly to an SD card, which can then be transferred to a computer, phone or tablet. The process is similar to using a digital camera—plug in the SD card, and the images are readily accessible. The Slide N Scan also features an HDMI output, allowing users to mirror its 5″ screen on a larger television.

Examining the scanned images, it’s clear this is not a professional-grade solution. Compression artifacts are visible, and while the device outputs large JPEGs, there is no option for saving uncompressed formats like TIFF. The upscaling process to 22 megapixels does little to enhance detail. I found that black-and-white negatives tend to look better when using the lower 14-megapixel resolution setting, especially since popular films like Tri-X are quite grainy and can interfere with the upscaling and image compression process at the 22-megapixel option.

Despite these shortcomings, the Kodak Slide N Scan serves a purpose. For casual users looking to quickly digitize old photos for sharing on social media or archiving in a non-professional capacity, it provides a convenient solution. The speed and ease of use make it appealing, especially for those with a large number of negatives or slides to process. However, users seeking high-quality digital preservation of film-based images will need to explore more advanced scanning options.

A device in the $500–$600 range with a better sensor and uncompressed file-saving capabilities would fill a gap in the market for those who want high-quality results without the time investment of professional scanning solutions. While the Slide N Scan doesn’t meet that standard, it represents progress in the space, providing an affordable and efficient way to convert old film into digital format.

See more products like this here.

Pay up: Paramount Threatening to Pull Channels off of YouTube TV

Last week, negotiations between YouTube TV and Paramount broke down, with both sides announcing that if an agreement isn’t reached, Paramount’s networks—including CBS, Nickelodeon, Comedy Central, and local CBS affiliates—could disappear from YouTube TV. Paramount is now running an aggressive social media campaign to put pressure on the streaming service.

Learn more in my latest video!

YouTube TV has already announced that if these channels are removed, subscribers will receive an $8 discount on their monthly bill. That happens to be the same price as the base tier of Paramount Plus, which might not be a coincidence. The offer seems like a calculated move to pressure Paramount, which likely earns more from its YouTube TV carriage deal than from direct streaming subscriptions. Meanwhile, Paramount has been running ads urging viewers to petition YouTube TV to keep their channels—essentially advocating for a rate increase, since any deal in Paramount’s favor will likely result in higher subscription costs for users.

Beyond the channels disappearing, all user DVR recordings from those channels made with the YouTube TV service would also get deleted when the deal expires. In this digital age we truly control and own nothing.

The trajectory of YouTube TV’s pricing tells a familiar story. When it launched in 2017, it was a competitively priced alternative to traditional cable. Now, at about $83 per month, it’s in the ballpark of what a cable subscription costs. Much of this increase comes from rising content costs, as networks demand higher rates during contract renewals. And the issue isn’t limited to YouTube TV—cable, satellite, and streaming providers all face similar struggles as content owners seek to maximize their revenue, even as traditional TV viewership declines.

Compounding the issue, local broadcast affiliates are currently lobbying the FCC and Congress to negotiate directly with streaming services instead of being bundled into larger deals by the national networks. If that push is successful, it could lead to even higher costs, as each local broadcaster would have the ability to demand separate fees. This mirrors the problem that led to the cord cutting movement in the first place.

The changes aren’t limited to streaming. Over-the-air television, which has long been a free alternative, is also undergoing a transformation. The new NextGenTV standard introduces encryption, meaning that even recordings from an antenna will require authentication to watch and retain. While NextGenTV promises better picture quality and features, it also represents a shift toward restricting user control, pushing more viewers toward paid services.

As these disputes play out, the power ultimately lies with consumers. Cord-cutters have more options than ever, from free streaming platforms to on-demand purchases, and shifting away from expensive, restrictive services sends a clear message. While networks and providers continue their negotiations, viewers can choose where their money goes—and that choice may be the strongest leverage available.

Retro Fighters Game Controller Haul! Quick reviews of the D6 Dreamcast, Hunter and BattlerGC Pro

When it comes to modern takes on classic gaming, Retro Fighters has been making a name for itself by designing controllers that blend nostalgia with contemporary features. Their latest lineup includes controllers inspired by the Dreamcast, GameCube, and original Xbox, each offering wireless connectivity to their respective consoles while also working with modern platforms like the Nintendo Switch and PC.

You can see them all in my latest video.

The Dreamcast-inspired D6 is a six-button controller designed with fighting games and shooters in mind. It features mechanical switches for responsive actuation and comes with dongles that allow it to function wirelessly with the Dreamcast as well as on modern systems. While the button feel is satisfying thanks to its mechanical Kailh switches, rolling the D-pad felt bumpier than expected. This could be a pre-release issue, and Retro Fighters acknowledged that it shouldn’t feel this way. Otherwise, the controller offers solid performance with minimal input lag.

An original VMU attached to a Dreamcast console using the Retro Fighters D6 Dreamcast wireless dongle

The GameCube-style BattlerGC Pro brings modern enhancements to the classic design. Its octagonal-gated sticks use Hall effect sensors, meaning no drift issues, and the triggers replicate the analog and digital functionality of the original, including a digital button push when the trigger is fully engaged. Like the D6, it includes a dongle for GameCube compatibility and also works on the Nintendo Switch, PC, and other platforms with a second dongle. Interestingly, Bluetooth connectivity is an option, which actually provides lower latency on the Switch compared to the USB connection. The controller performs well across platforms, offering a familiar feel for Super Smash Bros. players.

The Hunter controller is a modernized take on the original Xbox gamepad. While visually similar to an Xbox One controller, it’s designed exclusively for the original Xbox, along with PC and emulation compatibility. The controller maintains pressure-sensitive face buttons, a feature used by certain original Xbox titles, while integrating Hall effect sensors for the sticks and triggers. A slight drawback is the D-pad, which feels restricted by a raised plastic lip, though this is a minor issue given the Xbox’s limited reliance on the D-pad. Unlike the other controllers, the Hunter doesn’t support wired connectivity, functioning solely through its included dongles.

Across all three controllers, latency performance was excellent using my method of filming a display at 240 fps. Wired connections on each came in at seven frames, and wireless at around ten to eleven frames, for a button push to register on screen. Given the industry’s improvements in reducing input lag, these controllers are competitive with the fastest options available.
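Since each frame of 240 fps video spans about 4.2 ms, those frame counts convert directly into latency figures (this is just the conversion arithmetic, not the measurement method itself):

```python
# Convert button-to-screen latency from 240 fps camera frames to milliseconds.
CAMERA_FPS = 240
frame_ms = 1000 / CAMERA_FPS  # ~4.17 ms captured per frame

for label, frames in [("wired", 7), ("wireless", 10), ("wireless (worst)", 11)]:
    print(f"{label}: {frames} frames ≈ {frames * frame_ms:.1f} ms")
```

So seven frames wired is roughly 29 ms, and ten to eleven frames wireless lands in the low-to-mid 40s.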

All in all, the Retro Fighters controllers bring a welcome update for those looking to get the right ‘feel’ for their emulated console favorites on modern platforms, along with the ability to plug these same controllers into the original consoles that inspired them.

You can find the controllers on Amazon or via the Retro Fighters website (compensated affiliate links). If you purchase direct from Retro Fighters you can get 10% off your order if you use the code lontv.

Disclosure: Retro Fighters provided the controllers free of charge. However, they did not review or approve this content before it was uploaded and no other compensation was received. All opinions are my own.

The Decade Old Nvidia Shield TV Still Gets Updates!

The Nvidia Shield has been a fixture in my home since 2015, and it remains one of the longest-supported devices I’ve ever owned. Even after a decade, Nvidia continues to provide updates—not just security patches but meaningful improvements. The latest update addresses an issue that’s been a long-standing frustration for me: 24p frame rate switching. Check it out in my latest video.

For those unfamiliar, frame rate switching is important because nearly every movie and most modern television shows are presented at 24 frames per second. Without proper frame rate switching, a 24p movie playing on a system locked to 60Hz can create a jittery effect that’s noticeable, especially to those sensitive to motion inconsistencies.
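The jitter comes from the cadence a 60Hz display is forced into: 60 / 24 = 2.5 refreshes per film frame, so frames can’t each be shown for a whole number of refreshes, and playback falls back to 3:2 pulldown, holding alternate frames for three and then two refreshes. A small sketch of why that looks uneven:

```python
# 24p content on a 60Hz display uses 3:2 pulldown: alternate film frames are
# held for 3 and 2 display refreshes, so on-screen frame times are uneven.
REFRESH_MS = 1000 / 60          # ~16.7 ms per 60Hz refresh
cadence = [3, 2]                # refreshes held per successive film frame

durations = [n * REFRESH_MS for n in cadence]
print([round(d, 1) for d in durations])  # alternating long/short frame times

uniform = 1000 / 24             # frame time if the display matched 24p exactly
print(round(uniform, 1))
```

Frame rate switching eliminates the problem by dropping the display to a multiple of 24Hz, restoring a uniform frame time.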

Apple TV has long handled this seamlessly for streaming apps, but the Shield has struggled with it even though apps like Plex and Kodi handle it properly. This new update doesn’t completely fix the issue, but it brings the Shield much closer to where it needs to be after all this time.

The process for enabling this feature is relatively simple but requires some setup. If you’re using the 2019 Shield remote, a button can bring up the settings menu where you can toggle the frame rate match feature. For those using older remotes, Nvidia’s mobile app offers an alternative, allowing users to access the menu without purchasing a newer remote. It’s a small but useful workaround. The feature has to be summoned while the content is playing.

I tested this update across various streaming services—Netflix, Disney+, Amazon Prime Video, and Apple TV+. Across the board, the Shield successfully switched to 24p when prompted, something it failed to do consistently in the past. However, the feature still requires manual activation every time content starts, which is less convenient than Apple TV’s fully automatic implementation. Still, seeing it work across multiple platforms is an encouraging sign of progress.

Beyond frame rate improvements, the update brings support for Auro 3D, an immersive audio format. Nvidia also included updated security patches, ensuring the device stays protected even though it remains on Android 11.

What stands out most is Nvidia’s continued support for this hardware. The Shield has gone through hardware revisions in 2017 and 2019, yet the original model still receives updates. This level of longevity is rare in consumer electronics, where most companies push users toward upgrading every few years.

Check out my Shield TV appreciation video for more on the Shield’s history and potential future!

Orico 10 Gigabit Thunderbolt / USB 4 Ethernet Adapter Review

As many of you know, I have this crazy 10 gigabit Internet connection from Comcast. It started out as a 2 gigabit connection which they ramped up to 10 over the last couple of years to keep up with local competitors. As cool as it is to have a connection this fast, you do need specialized ethernet gear to hit those speeds. I’m therefore always on the lookout for higher-speed gear, especially USB-based solutions.

Orico reached out to me recently to check out their new 10 gigabit adapter. This is the smallest 10-gig adapter I’ve tested, and it works with Thunderbolt 3, 4, and 5, as well as USB 4 devices. However, it won’t function with USB 3, which is something to keep in mind before purchasing.

You can check out my full video review here.

For this review, I used a USB 4-based mini PC from Beelink, an AMD-powered machine that has a 40 gigabit per second USB 4 port on the back. I also tested it on my Mac with Thunderbolt.

At $159, the Orico adapter is priced a bit lower than other 10-gig adapters I’ve seen over the years, though it’s significantly more expensive than the more common 2.5-gig models. The hardware itself is minimalistic, with a USB-C/Thunderbolt port on one end and a standard 10GBase-T Ethernet port on the other.

It runs off bus power, but like most 10-gig adapters, it generates a lot of heat over time. Instead of a large heatsink, this one uses an internal fan, which stays on and becomes quite loud after extended use—louder, in fact, than the mini PC I was testing it with!

The chipset inside is a Marvell AQtion, which I hadn’t worked with before. It worked immediately on macOS, but on Windows, I had to check for driver updates before it functioned properly. If it doesn’t work right away on a Windows machine, running Windows Update and rebooting should solve the issue.

Performance-wise, I conducted two different tests. The first was a basic Speedtest.net run, but as expected, the results varied based on internet traffic conditions and the limitations of the test servers. While I have a 10-gig symmetrical internet connection, getting full speeds on a public server is rare. A more reliable measure came from a local iPerf test between two 10-gig connected devices. Here, the Orico adapter delivered consistent speeds of around 9.48 Gbps in both directions, which is in line with expectations when accounting for network overhead.
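That 9.48 Gbps result is almost exactly what the protocol overhead math predicts for a clean 10GbE link. A rough estimate, assuming a standard 1500-byte MTU and plain IPv4/TCP headers with no options:

```python
# Estimate usable TCP throughput on 10GbE with a standard 1500-byte MTU.
# Header sizes are the usual ones; preamble and inter-frame gap are
# per-frame overhead that consumes line rate without carrying data.
LINK_GBPS = 10
MTU = 1500                       # IP packet size
eth_overhead = 14 + 4 + 8 + 12   # Ethernet header + FCS + preamble + IFG
ip_tcp_headers = 20 + 20         # IPv4 + TCP, no options

payload = MTU - ip_tcp_headers   # bytes of application data per packet
wire_bytes = MTU + eth_overhead  # bytes consumed on the wire per packet
goodput = LINK_GBPS * payload / wire_bytes
print(f"theoretical TCP goodput ≈ {goodput:.2f} Gbps")
```

The theoretical ceiling works out to about 9.49 Gbps, so the adapter is effectively saturating the link.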

Functionally, the adapter does exactly what it promises, delivering full 10-gig speeds over USB 4 or Thunderbolt. The trade-off comes with the fan noise. While it’s compact and portable, the fan’s constant hum is hard to ignore. Those who prioritize silence might prefer a larger device with passive cooling, such as the slightly more expensive OWC 10G adapter, which is bulkier but completely quiet.

That said, if portability is a priority and the noise isn’t a deal-breaker, the Orico adapter is a solid choice. Just keep in mind that while 10-gig speeds sound great on paper, actual internet usage rarely maxes out that capacity, meaning a 2.5-gig adapter might be a more practical and cost-effective alternative for most users.

Disclosure: Orico sent this device to my channel for review, but they did not review or approve the content before uploading. No other compensation was received and all opinions are my own.

Revisiting my Top Tech Products of 2015

It’s funny how some devices fade into obscurity while others remain surprisingly relevant. To some degree, technology (at least on the hardware side) has been more incremental than revolutionary with each product cycle, dramatically extending the lifecycle of some devices.

In my latest video, we take a look at my top picks of 2015 and see which devices stood the test of time. You can also find the original reviews here.

One of the more unique products from that year was the Kangaroo Mini PC. At $99, it was a fully functional Windows 10 machine, compact enough to fit in a pocket. Manufactured by InFocus, better known for projectors, it was an interesting concept that ultimately didn’t last. The company even experimented with a laptop dock that allowed users to swap out computing modules, but this modular approach never really took off. While small form-factor PCs are still around, modern equivalents offer significantly more power at roughly the same price.

Some products from 2015 still hold value today. The Sony AX33 camcorder was one of the first consumer 4K camcorders, featuring optical image stabilization that made it stand out. Even now, similar models are available, and used units fetch respectable prices on eBay.

Retro gaming handhelds have come a long way, but back in 2015, the GPD XD was an early standout. Running Android, it could emulate up to Dreamcast-level consoles with decent performance. The clamshell design protected its screen, and it even had HDMI output for TV gaming. While GPD has since pivoted to making Windows-based handheld gaming PCs, this device was an early indication of the growing market for portable emulation.

Accessories like the Logitech K830 keyboard have also proven their longevity. With a built-in trackpad, backlit keys, and dual Bluetooth and dongle connectivity, it was an excellent all-in-one input device. Despite being discontinued, it remains in demand, partly due to its compatibility with Meta Quest’s VR desktop experience and Harmony remote controls.

Apple made waves in 2015 with several key product launches. The first-generation Apple Watch debuted, and while it was slow and somewhat limited, it set the stage for the dominant wearable platform it is today. Personally, I didn’t expect to keep using one, but the health tracking features became useful enough to keep one on my wrist for the last ten years.

The 12-inch MacBook was another notable Apple release, an ultra-light laptop with a single USB-C port that sparked discussion about the versatility of the then-new USB-C connector. Performance-wise, it was pretty slow – even for 2015 – but it survived for a number of years as a convenient and super lightweight portable. It’s a shame Apple hasn’t brought it back given how efficient their processors have become.

Apple’s iPhone 6S was another highlight, demonstrating significant performance gains over its predecessor. Unlike many phones that become obsolete quickly, Apple supported the 6S with security updates and hardware support until 2024, making it one of the longest-supported smartphones ever.

On the Android side, the Moto X Pure offered an impressive unlocked phone experience at a time when such options were rare. With a great camera and solid performance that was on par with most of the flagships at the time, it challenged the idea that flagship performance had to come with a carrier lock-in.

2015 was also the year portable SSDs became mainstream. Drives from Samsung and SanDisk delivered near-SATA speeds at a fraction of the size, transforming workflows for videographers and professionals. Today, these drives have become much faster with USB 4 and Thunderbolt versions, but the fundamental utility remains unchanged.

Perhaps the most enduring product from 2015 is the Nvidia Shield Android TV box. Still sold today, it retains much of the original hardware and continues to be a top choice for streaming and gaming. In an industry where most streaming devices become outdated quickly, its longevity is remarkable.

2015 might have been the last year where we had so many gadgets to be excited about: wearables debuting, powerful new TV boxes, big developments in 4K video shooting, and more. Today’s tech feels a bit “blah” by comparison. Even though we have seen some major leaps in performance over the last decade, the devices themselves feel largely the same.

DIY AI: Running Models on a Gaming Laptop for Beginners!

When DeepSeek AI burst onto the scene a week or two ago, it shook up the industry by proving that large language models can be made more efficient – in fact it’s possible to get the full DeepSeek model running on hardware that a mere mortal could acquire for a few thousand bucks. This shift raises an interesting question—can useful AI models now run locally on consumer-grade computers without relying on cloud-based data centers?

In my latest video, we take a look at running some “distilled” open source versions of DeepSeek and Meta’s Llama large language models. I’m surprised how far the quality of locally run models has come in such a short period of time.

To find out, I tested a distilled version of the DeepSeek model on a Lenovo Legion 5 laptop, which is equipped with an Nvidia 3070 GPU and 8GB of VRAM. The goal was to see if local AI could generate useful results at a reasonable speed.

The setup process was straightforward. I first downloaded and installed Nvidia’s CUDA toolkit to enable GPU acceleration, then installed Ollama, a command line interface for many of the available models. From there, it was just a matter of selecting and downloading an appropriate AI model. Since the full DeepSeek model requires an impractical 404GB of memory, I opted for the distilled 8B version, which uses 4.9GB of video memory.
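Beyond the command line, Ollama also exposes a local REST API, which makes it easy to script prompts instead of typing them interactively. Here's a minimal sketch, assuming Ollama's default port of 11434 and that the distilled model was pulled under the name `deepseek-r1:8b` (check `ollama list` for your exact tag):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def ask(prompt: str, model: str = "deepseek-r1:8b") -> str:
    """Send a single non-streaming prompt to a locally running Ollama server."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires the Ollama server to be running (`ollama serve`) with the model pulled:
# print(ask("Explain model distillation in one sentence."))
```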

With everything in place, I launched the model and checked that it was using the GPU correctly. The first test was a basic interaction in the command line. The DeepSeek model responded quickly and even displayed its thought process before generating a reply, which is a unique feature compared to traditional locally hosted chatbots. Performance-wise, it was surprisingly snappy for a locally run AI.

To gauge the model’s practical utility, I compared it to Meta’s open-source Llama model, selecting a similarly sized 8B variant. Performance between the two was comparable in terms of speed, but the responses varied. While DeepSeek’s output was structured and fairly coherent, Llama’s responses felt more refined in certain cases.

To take things further, I integrated Open WebUI, which provides a ChatGPT-style interface for easier interaction. This required installing Docker, but once set up, it significantly improved usability.

Next, I tested both models with a programming task—creating a simple Space Invaders game in a single HTML file. DeepSeek struggled, generating a mix of JavaScript and Python code that didn’t function correctly. Even when prompted differently, the results were inconsistent. The larger 14B version of DeepSeek running on my more powerful gaming PC did slightly better but still failed to produce a playable game. The Llama model performed marginally better, generating a somewhat functional version, but it was still far from the quality produced by cloud-based AI models like ChatGPT, which created a polished and working game on the first attempt.

For a different type of challenge, I had the models generate a blog post based on a video transcript. Initially, DeepSeek only provided an outline instead of a full narrative. After refining the prompt, it did produce something usable, though still less polished than ChatGPT’s output. Llama performed slightly better in this task, generating a clearer and more structured narrative after a nudge to get it out of its outlining mindset.

While local AI models aren’t yet on par with their cloud-based counterparts, the rapid improvements in efficiency suggest that practical, high-quality AI could soon run on everyday devices. Now that DeepSeek is pushing the industry to focus on optimization, it’s likely that smaller, more specialized models will become increasingly viable for local use.

For now, running AI on consumer hardware remains a work in progress. It’s come a long way from where it was just a year ago, so it’ll be exciting to see what happens next.

ATSC 3 / NextGenTV Interactive Features Broke my ADTH Tuner

My ATSC 3 / NextGen TV woes continue. My CBS affiliate, along with my local NBC station, have enabled interactive content, offering options like on-demand news segments, weather updates, and even the ability to restart live broadcasts. While the potential of this technology is promising, my experience with it has been far from seamless. You can see it in action in my latest video.

For this test, I used the ADTH box, currently the least expensive ATSC 3.0 tuner on the market. This is one of the devices that the broadcast industry is touting as an acceptable device to help people transition to the new standard on a budget.

The interactive features themselves are designed to provide a more dynamic viewing experience. When the prompt appears on a supported channel, selecting the interactive option opens a menu where viewers can choose from various content categories. This might include local news updates, weather reports, emergency alerts, and special event coverage. For instance, NBC’s interface included information about the Paris Olympics, although that content was outdated. These features require an internet connection, as they pull in real-time updates from online sources rather than relying solely on the broadcast signal.

However, on my ADTH box, the interactive pop-up became a persistent annoyance. It pops up and lingers on screen for a long time on supported channels. And in the case of my NBC affiliate, the interactive prompt prevented me from navigating back to the channel guide without switching to another channel first.

A particularly strange problem emerged when watching unencrypted ATSC 3.0 channels. A persistent large grey play button overlay appeared on these channels, blocking a significant portion of the screen. Oddly enough, this issue did not occur on encrypted channels. The play button glitch is not intentional, but it underscores the broader problem with the current implementation of ATSC 3.0’s encryption system. Broadcasters are misleading the FCC and the public by claiming these cheap boxes are ready for a major broadcast TV transition.

To troubleshoot, I updated the firmware, even trying a beta version. I performed a factory reset, plugged the box directly into my television to rule out HDCP issues, and tried multiple setups. Nothing fixed these problems. The ADTH box, which should be an accessible entry point for consumers into ATSC 3.0, instead became an example of the complications that DRM and unfinished software introduce to the experience.

Despite these issues, I do see value in the interactive features themselves. On-demand access to local news and alerts could be useful, and the ability to restart live broadcasts is a welcome addition. However, the current execution—at least on this hardware—is deeply flawed. The performance lags, interface issues, and DRM restrictions hinder what could be a major advancement in over-the-air television.

Beyond the interface frustrations, there were issues with HDR implementation. My NBC affiliate broadcasts in HDR, but the ADTH box doesn’t seem to tone map correctly, resulting in an overly dark picture. Switching between channels was also sluggish, and once interactive features were engaged, performance slowed down significantly. The delay in accessing menus and content made navigation frustrating, even when just trying to check the weather or local news updates.

With ATSC 3.0’s continued rollout, broadcasters and hardware manufacturers need to ensure that these features work as intended across a variety of devices. If the most affordable tuner on the market struggles this much, it’s hard to see widespread consumer adoption happening smoothly. For now, I’ll keep testing and see if future updates bring any improvements.

Asus Vivobook S 14 (S5406SA) Review: a Great Value at $799

For those looking for a well-rounded laptop at a competitive price, the Asus VivoBook S presents an appealing option. Currently selling for $799 at Walmart (compensated affiliate link), this machine features a Core Ultra 7 258V processor, 32GB of DDR5 RAM, a 1TB SSD, and Wi-Fi 7 support. It also comes with an OLED display, a rarity in this price range.

There’s also a more affordable version with a Core Ultra 5 chip, 16GB of RAM, and 512GB of storage for $649 at Best Buy (compensated affiliate link).

You can see it in action in my latest laptop review.

The 14-inch OLED screen runs at a 1920×1200 resolution with a 16:10 aspect ratio. It delivers 600 nits of brightness and supports 100% of the sRGB color space, making it suitable for light creative work. However, it is not a touchscreen, and the glossy finish means reflections can be noticeable. Despite that, the display quality is higher than what is typically found in this segment, with vibrant colors and deep contrast.

The keyboard and trackpad are well-designed, featuring a backlit layout with comfortable key travel. The trackpad is responsive, though slightly springier than ideal. Weighing just under 3 lbs (1.3 kg), the aluminum chassis is lightweight and well-balanced, allowing the display to be opened with one finger. The 1080p webcam includes a privacy shutter and supports Windows Hello for facial recognition login.

In terms of ports, the VivoBook S provides a solid selection. On the left side, there is a full-size HDMI output, two Thunderbolt 4 ports, a microSD card slot, and a headphone/microphone jack. While Thunderbolt 5 would have been preferable, Thunderbolt 4 remains capable for most users and provides the option of using an external GPU to boost graphics capabilities. The right side houses two full-sized USB-A ports, each running at 5 Gbps.

For everyday tasks, the laptop performs smoothly. Web browsing, streaming, and basic productivity tasks run without issue. The OLED display enhances video playback, though some minor frame drops were noted with 4K 60fps content.

Battery life is respectable, with 10 to 12 hours achievable under moderate use when keeping brightness at around 80%.

The integrated graphics on the new Intel processor provide enough power for light video editing and quick exports. The laptop handled 4K 60fps clips in DaVinci Resolve with smooth playback and efficient rendering.

Casual gaming is another strong point of this Intel hardware. Running Red Dead Redemption 2 at 1920×1200 on the lowest settings yielded 45-55 FPS, showing that it can handle even some AAA titles at reasonable framerates. However, more graphics-intensive titles like Starfield may struggle. Despite its slim profile, fan noise remains relatively subdued, avoiding the loud operation typical of gaming laptops.

Linux users may find the VivoBook S a viable option, though some minor quirks were observed when testing Ubuntu 24.10. Wi-Fi initially showed as disabled despite functioning correctly, likely due to driver support still catching up with the latest Intel chipset. Future updates should improve Linux compatibility.

Overall, the Asus VivoBook S offers a strong value proposition, particularly with its combination of an OLED display, a powerful Intel processor, and ample RAM. The Best Buy variant with a Core Ultra 5 processor and 16GB of RAM remains a cost-effective alternative for users with lighter workloads. While not perfect, this laptop stands out as a compelling choice for those seeking a balance of performance, portability, and price.

Disclosure: Asus provided the laptop free of charge to the channel for a future giveaway. The company did not review or approve this content before it was uploaded and no other compensation was received. All opinions are my own.

Is ATSC 3.0 NextGen TV Stuck?

A long-awaited report on the transition to ATSC 3.0, the new over-the-air television technology, was released last week. The report represents the work of a broad coalition of stakeholders, including broadcasters, cable and satellite companies, consumer groups, and manufacturers, alongside the FCC. We talk about the report in my latest video.

What’s clear in the report is that the transition to this new over the air television technology is stuck – largely hindered by new DRM requirements that make it difficult for manufacturers to build affordable devices. Many are opting not to make any at all.

The FCC had initially targeted 2027 for turning off ATSC 1.0 and transitioning fully to ATSC 3.0. However, no stakeholder in the report supports setting a transition date yet. Consumer adoption of ATSC 3.0 capable televisions and tuners remains slow due to expensive devices. Most of the TVs that include ATSC 3.0 tuners are higher end sets, and while some lower-cost models are starting to include them, the technology has yet to reach the broader market. Similarly, cable and satellite providers face costly upgrades to their infrastructure and set-top boxes to handle the new standard, adding another layer of complexity.

Interestingly, the FCC chairman has suggested that TV spectrum could be repurposed for broadband data delivery, especially in underserved areas. Broadcasters are exploring this possibility by looking at how ATSC 3.0 might serve as a wireless data delivery system. However, this shift could force the industry to accelerate the transition or risk losing valuable spectrum to broadband use.

Retransmission fees—a major revenue source for broadcasters—complicate the situation further. Cable and satellite providers already pass significant costs to customers to cover these fees. Adding the expense of transitioning to ATSC 3.0 only intensifies the pressure cable companies face being stuck in the middle of broadcasters and customers. Moreover, legal requirements to maintain signal quality without material degradation present additional technical and financial challenges.

DRM is another contentious issue. Broadcasters continue to push for encryption of over-the-air signals, arguing it aligns with how the internet secures content. But unlike platforms like Netflix, which offer seamless access across devices even with DRM, ATSC 3.0 encryption has created significant consumer inconvenience. Currently, only devices running Android or Samsung’s Tizen TV OS can decrypt ATSC 3.0 signals, severely limiting accessibility.

Allowing gateway devices, like the HDHomeRun and the ZapperBox’s gateway functionality, could ease the transition by letting consumers watch ATSC 3.0 signals on the smart TVs and streaming boxes they already own. But the promised specifications from the broadcast standards body have yet to materialize.

I was very disappointed to see that the thousands of consumers who have spoken out against DRM on the FCC docket were not represented in this report.

For now, the ATSC 3.0 transition seems to be at a crossroads. With no clear path forward, the technology risks stalling altogether. Broadcasters, policymakers, and other stakeholders will need to address the existing challenges—from cost and DRM to consumer convenience—if they want to see widespread adoption.

LocalSend is a Great Open Source Simple File Transfer App for Android, iOS, Linux, Mac and PC

I came across a free, open source utility called LocalSend that has added a touch of convenience to my daily life. The app works as a cross-platform tool for transferring files between devices, offering functionality similar to Apple’s AirDrop but without being limited to a single ecosystem. It works across just about every platform out there including Android, iOS, iPadOS, Windows, Mac and Linux.

You can see it in action in my latest video.

To send a photo from my Android to my iPhone, I simply selected the LocalSend destination from the sharing options on my Android. The app assigned a random name to the device for identification purposes, which can be customized in the settings. After accepting the transfer on the iPhone, the photo appeared directly in the Photos app.

The app isn’t just for phones; it’s compatible with tablets and computers too. During testing, I used the Mac client to send a folder containing various file types—a PDF, a text file, an image, and an Excel document—to both my iPhone and Android phone simultaneously. The app preserved the folder structure on both devices.

Another useful feature is the ability to share files via a web link or QR code, eliminating the need for the app on the receiving device.

While the app works efficiently, there are a few caveats. Both devices need to be on the same local network, and the app must be open and active to receive files. It also didn’t work over my Tailscale VPN due to how it handles broadcast packets. However, these are minor inconveniences considering the app’s utility.
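Under the hood, LocalSend discovers nearby devices with UDP multicast announcements, which explains the Tailscale limitation: most VPN overlays don't carry multicast traffic. Here's a simplified sketch of that discovery announcement. The group and port come from the published LocalSend protocol, and the message fields here are pared down for illustration, not the full spec:

```python
import json
import socket

# LocalSend peers announce themselves over UDP multicast -- this is why
# discovery fails across VPNs (like Tailscale) that don't forward multicast.
MULTICAST_GROUP = "224.0.0.167"  # from the LocalSend protocol documentation
PORT = 53317

def announce(alias: str = "my-laptop") -> None:
    """Broadcast a (simplified) discovery announcement on the local network."""
    msg = json.dumps({"alias": alias, "announce": True}).encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay on the LAN
    sock.sendto(msg, (MULTICAST_GROUP, PORT))
    sock.close()

# announce()  # requires a network interface with multicast enabled
```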

For comparison, I’ve used browser-based solutions like Snapdrop, which also allow for quick file transfers across devices. However, Snapdrop relies on a browser interface, whereas LocalSend integrates directly into the share button on mobile platforms. This integration streamlines the process and makes it feel more natural, especially for users accustomed to native sharing features.

I discovered LocalSend while helping my daughter with her YouTube channel. She edits her videos on an iPad but uses an Android phone to upload YouTube Shorts because the iPad’s YouTube app lacks this functionality. With LocalSend, she can quickly transfer videos from the iPad to the Android phone, making the entire process much simpler. It’s become a practical solution for us both.

LocalSend is free, open-source, and available on major app stores, including those for Mac and iOS, along with Android. Downloads for other platforms can be found on the LocalSend website.

See more videos like this here.

Plex Rolls Out Beta of New TV Interface (sponsored post)

Plex recently announced a preview beta of their new interface for televisions. The beta is currently available on Apple TV but will be rolled out to additional platforms in the coming months. You can see it in action in my latest monthly sponsored video from Plex.

The home screen retains familiar elements, but with some notable refinements. For those running a Plex Pro server or with access to one, you’ll see a new row of servers running horizontally across the top of the screen. You can favorite specific libraries, which pins them to the top of the navigation, minimizing unnecessary scrolling. Browsing options are now slightly rearranged, but everything you’d expect—filters, collections, and categories—remains accessible, albeit in new locations.

Next to the library section you’ll find Live TV, which now integrates both live streaming channels and over the air TV if you’re making use of Plex’s DVR features. While integration between different servers and content sources isn’t yet as seamless as it is in the current UI, the effort to accommodate various setups shows promise.

The next section over is “On Demand” content, which includes the thousands of free ad-supported TV shows and movies from Plex’s servers along with their recently announced rental section.

The discovery section, watchlist functionality, universal search, and settings haven’t changed drastically but have been given a cleaner, more consistent layout. From account settings to profile configurations, everything feels intuitive and easy to navigate.

One detail worth noting is that user reviews now take precedence over professional ones when browsing movies and shows. I think this is a great opportunity for budding movie reviewers, as you’ll be able to link back to your other social media platforms from your profile. The feature is optional: users can revert to having professional reviews first, followed by user-generated ones, and each set of reviews can be turned off individually or entirely.

I am receiving a lot of feedback from users in my YouTube comments, with many expressing mixed or negative reactions to the new changes. This is an early beta, and a lot will likely change over the coming months. Plex will be listening to user feedback on their forum page here, so I definitely suggest popping in there and making your voice heard.

Disclosure: This post was a paid sponsorship from Plex, however they did not review or approve this before it was uploaded.

Mayflash F700 Arcade Stick Review

I recently spent some time with the Mayflash F700 arcade stick, a controller that is both hefty and versatile. Weighing in at around six pounds (2.69 kg), it’s a device built with enthusiasts of fighting games and retro arcade shoot-’em-ups in mind. The F700 is priced at $149, putting it firmly in the premium category, but it offers a wide range of features to justify that investment. You can check it out in my latest review.

Right out of the box, the F700 is equipped with Mayflash-branded controls. The joystick has a satisfying click with every movement, offering a tactile, mechanical feel. The buttons, while quieter, are responsive, featuring shallow actuation and a quick spring-back that lends itself well to fast-paced gameplay. If customization is your thing, both the joystick and buttons can be swapped out for Sanwa components. Mayflash also sells a more premium version with those Sanwa controls already installed. Additionally, the controller’s top acrylic panel, held in place by magnets, allows you to personalize its appearance by replacing the background with your own design.

Another customizable feature is the gate for the joystick. The device comes with an octagonal gate as an alternative to the default rounded square one. While swapping gates requires disassembling the controller, it’s a useful option for those who want more precise control tailored to specific games.

The F700 also boasts a variety of connectivity options. The controller can be used via USB, 2.4GHz wireless with a dongle, or Bluetooth. Its built-in 1,000 mAh battery ensures tens of hours of wireless play. For wired connections, the cable is conveniently stored inside the controller, accessible through a small door. A notch on the door allows you to keep the cable partially exposed for quicker access. One gotcha: the cable is not easily removed or replaced, as it’s hardwired inside.

Compatibility is another standout feature. The F700 works with an impressive array of platforms, including the Nintendo Switch, PlayStation 4 and 3, Android devices, Apple products, and retro emulation consoles like the Sega Genesis Mini and Neo Geo Mini. While it’s technically compatible with the PlayStation 5, it only supports games that allow legacy PS4 controllers, which limits its usability on that platform. During my tests, the controller performed seamlessly across several systems, including a PC, a PS4, the Nintendo Switch, and my MiSTer console.

While the controller lacks a customization tool, there’s still plenty of functionality on board. A switch lets you map the joystick to act as a left stick, right stick, or D-pad, depending on your needs, along with the ability to apply turbo to specific buttons. There’s even a headset jack for trash-talking during online matches, though it’s limited to certain connection methods, like USB or the 2.4GHz dongle on the PC and PS3/PS4.

In terms of gameplay, the F700 delivers a satisfying arcade experience. I tested it with arcade classics like Teenage Mutant Ninja Turtles on the MiSTer and Street Fighter 2 on the Switch, as well as shoot ’em up titles like the original arcade Zaxxon. The controls felt precise and responsive, with no noticeable input lag when hardwired to a PC or the MiSTer.

For latency testing, I recorded gameplay using an iPhone at 240 frames per second, capturing both the button press and the on-screen response. By analyzing the footage frame by frame, I counted the number of frames it took for the input to register on the display. This approach, while not as scientifically rigorous as connecting electronics directly to the controller for precise measurements, provides a reliable baseline for comparison across different controllers and configurations.

That testing revealed the F700’s strong performance. When wired to a gaming laptop, it registered input in just seven frames at 240 frames per second, a remarkably low number. The 2.4GHz dongle added a mere three to four frames, while Bluetooth added a few additional frames. On the Nintendo Switch, however, latency was higher, with input taking around 18 frames. This result aligns with the Switch’s hardware limitations rather than any fault of the controller.
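For readers who want those frame counts in real-world time units, the conversion is simple arithmetic (the dongle figure below assumes the wired result plus roughly three added frames):

```python
CAMERA_FPS = 240  # recording frame rate used for the latency test

def frames_to_ms(frames: float, fps: int = CAMERA_FPS) -> float:
    """Convert a latency measured in captured video frames to milliseconds."""
    return frames / fps * 1000

# Approximate frame counts from the tests above
for label, frames in [("Wired to PC", 7), ("2.4GHz dongle", 10), ("Nintendo Switch", 18)]:
    print(f"{label}: {frames} frames = {frames_to_ms(frames):.1f} ms")
```

So even the Switch's worst-case 18 frames works out to about 75 ms end to end, which includes the display's own processing time, not just the controller.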

Overall, the Mayflash F700 is a solid choice for casual and enthusiast arcade players alike. Its extensive compatibility, customizable features, and robust build quality make it a versatile option for a variety of gaming setups.

Disclosure: Mayflash provided the F700 to the channel free of charge. However no other compensation was received and they did not review or approve the video or this post before it was uploaded. All opinions are my own.

I Finally Cut the Cord..

After years of navigating the world of cable TV and rising fees, I finally took the plunge and cut the cord. It’s a decision I’d been putting off for a long time, but the ever-increasing costs and declining value made it clear that the time had come. This move also marked the end of an era for my trusty HDHomeRun Prime, which had been a central part of my setup since 2013.

Prior to 2013, my local cable company, Comcast, made it easy to access digital TV directly through the coaxial cable, but things changed when encryption became standard. Suddenly, renting expensive equipment was the only option, and that’s when I discovered the HDHomeRun Prime. With the help of a CableCARD (which was mandated by the FCC), it allowed me to decrypt cable signals (legally) and stream them across my devices. For a one-time investment, I was able to bypass ongoing rental fees, and it quickly paid for itself.

Over the years, my setup evolved. Initially, I paired the HDHomeRun with Windows Media Center on a repurposed laptop and used Xbox 360s as extenders. It was a creative solution, but times have changed. With modern apps and DVRs, the technology has moved forward significantly. Yet, as convenient as the HDHomeRun Prime was, the creeping costs of cable—especially the ballooning broadcast TV fees—became impossible to ignore. Spending nearly $500 a year for local channels I could get for free with an antenna no longer made sense.

Transitioning to over-the-air (OTA) TV has been smoother than I expected. I invested in a high-performance antenna recommended by a trusted expert and connected it to an ATSC 3.0-capable HDHomeRun device. This setup delivers crystal-clear broadcasts at no additional cost, and the savings have been substantial. For those channels that remain encrypted, my antenna also picks up older ATSC 1.0 signals as a workaround. You can see more about that journey here.

Cutting the cord hasn’t meant giving up on the news or big events. Local stations often share stories via their websites, RSS feeds, or YouTube channels, making it easy to stay informed without a cable subscription. And for marquee events like the Super Bowl, free streaming services like Tubi are stepping in, offering content in high quality without the added fees.

The broader industry trends are clear. As traditional broadcasters continue to raise prices and push encryption, they risk alienating even their most loyal customers. At the same time, free and flexible alternatives are gaining ground. The shift may not be immediate, but it feels inevitable.

Letting go of cable TV and the HDHomeRun Prime was bittersweet—it had been a reliable companion for over a decade. But the freedom and cost savings of a cable-free setup are worth it. It’s a change I’d recommend to anyone still on the fence. The options for high-quality, free entertainment are out there.

Walmart Onn. 11 Tablet Pro Review (New 2024 / 2025 Version)

In my latest video, I take a look at the Onn 11 Tablet Pro, Walmart’s top-of-the-line tablet, following up on my earlier review of their more affordable option. At $130, this tablet offers impressive value for its price (compensated affiliate link).

While it’s not as powerful as last year’s Pro model, it brings notable improvements over the lower-cost version.

The 11-inch display, with its 1840×1280 resolution, delivers a crisp 1080p experience, which is particularly noticeable when streaming content. Unlike the cheaper model, which is limited to 480p video playback on services like Netflix, this tablet supports full HD.

The build quality is solid, with a metal back and glass display, though it lacks a fingerprint-resistant coating. Weighing just over a pound, it’s lightweight and comfortable to use, but the touch responsiveness could be improved, as the screen occasionally misinterprets input if fingers rest on the edge.

Audio quality is decent, with four DTS:X-supported speakers providing better sound than expected at this price point. However, the absence of a headphone jack means you’ll need to rely on Bluetooth or the USB-C port for wired audio.

For storage, the tablet includes 64GB onboard with an option for expansion via an SD card. With 4GB of RAM and a Qualcomm Snapdragon 685 processor, performance is adequate for casual use but comparable to the lower-cost 10.1-inch model. Benchmarks reveal no significant speed advantage despite the extra memory, and gaming performance, while passable, doesn’t match higher-end devices or even last year’s Pro 11 model.

The tablet’s cameras are a pleasant surprise, with both front and rear cameras offering 5MP resolution and 1080p video at 30fps. While stabilization isn’t great, the image and video quality exceed expectations for the price. These cameras are functional for video calls, casual photography, and basic video recording.

Battery life is a strong point, with the tablet delivering up to 16 hours for basic tasks like web browsing or streaming. Gaming reduces this significantly to about 5-8 hours, but it’s sufficient for typical daily use or long flights. The device runs Android 14 with a clean interface largely free of bloatware, aside from Walmart’s own shopping app.

When it comes to gaming, casual titles like Minecraft and Roblox perform well, though demanding games may struggle. Streaming games via Xbox Cloud Gaming worked without major issues, even with the tablet’s limited Wi-Fi capability, which maxes out at around 250 Mbps due to its single-stream AC radio. However, the lack of Wi-Fi 6 and offline GPS support might be dealbreakers for some users.

This tablet stands out for offering a well-rounded media and casual gaming experience at an accessible price. Its sharper display, added RAM, and support for higher-resolution streaming make it a nice upgrade over Walmart’s lower-cost options. It might not be a powerhouse, but it’s an affordable, capable device for users who value functionality over cutting-edge performance.

Taildrop: Simple device-to-device file sharing via Tailscale

Last year, I shared a video about Tailscale, a personal VPN that has been transformative for how I manage my devices. Tailscale lets you connect all of your devices seamlessly, no matter where they are in the world, without opening ports on your router. It’s free to use, operates without requiring firewall adjustments, and adds a significant layer of security. Since that first video, I’ve discovered new ways to make the most of this tool, and one feature that stands out is Taildrop.
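
For anyone curious what joining a tailnet looks like from the command line, here’s a minimal sketch using Tailscale’s own CLI (the device name is a placeholder; the same commands are available on Linux, macOS, and Windows):

```shell
# Bring this machine onto your tailnet (opens a browser login the first time)
tailscale up

# List every device on the tailnet with its Tailscale IP and online status
tailscale status

# Confirm connectivity to another device by its tailnet name
tailscale ping my-laptop
```

Once a device has authenticated with `tailscale up`, it stays on the tailnet across reboots with no further setup.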

You can check it out in this video review.

Taildrop makes transferring files across devices incredibly simple. Imagine an Android phone with a photo that needs to be sent to an iPhone. If you’re familiar with Apple’s AirDrop, you know how easy that process is, but of course it doesn’t work on Android phones. Taildrop brings that same ease to any device in your network, regardless of platform. Pull up the image, tap the share button on your platform of choice, select Tailscale, and the image is transferred to the other device.

During a trip to Las Vegas for CES, I relied on Taildrop to send files from Nevada back to Connecticut with minimal effort. The process worked just as smoothly across continents as it does within the same room.

Getting started with Taildrop is straightforward. After enabling the feature in Tailscale’s admin panel, it syncs across all devices in the network. Once activated, files sent via Taildrop will typically land in the download folder on Android, Windows, and Mac. For iPhones, files appear in the Tailscale section of the Files app. The process is intuitive—just a few taps or clicks, and the file is delivered. Even large files are supported, though transfer speed depends on the available bandwidth.
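
The same Taildrop workflow is also available from the command line via `tailscale file`, which can be handy on headless machines. Here’s a quick sketch (hostnames and filenames are illustrative):

```shell
# Send a file to another device on the tailnet (note the trailing colon)
tailscale file cp ./photo.jpg my-laptop:

# On the receiving device, collect any pending Taildrop files
# into a folder of your choice (may require sudo on Linux)
tailscale file get ~/Downloads
```

On desktop platforms with the GUI client, received files land in the default locations described above; `tailscale file get` is mainly for Linux and other CLI-only setups.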

One limitation I encountered is that the Mac and iOS versions don’t currently allow for file transfer resumption, so any interruption means starting over. Other platforms like Android and Windows are more forgiving, letting you pick up where you left off. This is an area where the feature will likely improve over time.

I’ve tested Taildrop in a variety of scenarios. For instance, on my Windows 11 ham radio computer, I transferred a capture of packet radio data to a MacBook Air in seconds. From the Mac, I sent a file to my Android phone, which appeared in its download folder as expected. With devices authenticated on the Tailscale network, there’s no need for additional approvals—files move securely and directly.

For users with NAS devices like Synology or QNAP, Taildrop requires some initial setup, but the process is well-documented on Tailscale’s support page. Once configured, it works just as seamlessly, depositing files into a specified folder for easy access.

This feature has become indispensable for me. Whether moving files between devices or remotely managing uploads, Taildrop handles it all effortlessly. Combined with the security and flexibility of Tailscale, it’s a tool I now rely on daily.

If you haven’t yet explored Tailscale, it’s worth a look. Beyond Taildrop, it has transformed how I manage my network, locking down outside access while maintaining full connectivity to all my devices and Docker containers. It’s a practical and powerful solution that continues to impress me.