I begin the video with an overview of the great Atari 50: The Anniversary Celebration compilation (affiliate link). The compilation is a virtual museum of all things Atari including their arcade games, computer systems and all of their consoles (including the Lynx & Jaguar!). There are dozens of playable games on the compilation but many of my favorites didn’t make the cut, primarily due to licensing issues.
There are lots of great videos on the Atari 2600 on YouTube so I focused on a few favorites from my childhood collection in the second part of my video. Most of the games featured are my original 2600 cartridges! Surprisingly they all booted right up.
For the games I couldn’t find cartridges for, I was able to play them using a flash cart called the Harmony Cartridge. The Harmony cart can play just about every game ever released for the 2600, including some titles that make use of special chips like Pitfall II. One of the things that I love about living in the future is that we have great new hardware for legacy systems!
The Atari 2600 era was a time of great experimentation where every idea was made into a game. Many of these experiments fell flat but many others became timeless classics that influence modern game mechanics.
In a comment, viewer Yuan Chang best summed up the 2600: “Gaming distilled down to its purest elemental form and even in that form, it provided countless hours of fun.”
The Nvidia Shield TV is one of the longest supported Android devices ever made. The original 2015 Shield TV is still getting updates and support from Nvidia. But it will be losing its Gamestream feature in February of 2023.
Gamestream allows for streaming PC games to the Shield from a PC running with a Nvidia GPU. You can learn more about it in my latest video.
Gamestream is a free feature that works in conjunction with GeForce Experience on Nvidia GPU-equipped PCs. GeForce Experience scans hard drives for compatible games regardless of where they were purchased and automatically optimizes each game’s settings for streaming. The game reverts to its prior local settings after it quits.
The only officially supported Gamestream clients are the Nvidia Shield TV and their now-defunct handheld and tablet Shield devices. But the open source Moonlight app has for many years used Gamestream to bring PC game streaming to just about any device. To their credit, Nvidia did not do anything to restrict Moonlight from doing this even though the feature was designed to sell more Shield hardware.
The Moonlight team is now putting some resources in to help the Sunshine project get its open source server up and running. Once that happens Moonlight won’t be dependent on Nvidia’s software any longer and non-Nvidia GPUs will also work.
For alternatives Nvidia suggests subscribing to their GeForce Now subscription streaming service. The service connects to a subscriber’s Steam, Epic and Ubisoft accounts and streams some (but not all) of their game library from Nvidia’s cloud data centers to the Shield and many other devices. Not every game works with GeForce Now as many publishers restrict streaming of their titles – even for purchased games.
For a free alternative Nvidia suggests installing the Steam Link app, which allows for streaming games from a PC’s Steam library. But Steam Link has its limitations and requires additional work to load non-Steam games from the Shield.
My gut on this is that Gamestream was not a heavily used feature and the bulk of those who were using it did so with the Moonlight app rather than the Shield TV. Hopefully progress on the Sunshine server will be swift over the next couple of months and we’ll have a better alternative than before. Stay tuned!
I updated yesterday’s blog post on LastPass to indicate that some users got an email about their catastrophic breach. But I did not. I got one email in August when this first broke and that was it. And yes, I checked spam and trash. Nothing since August.
The email that reached some but not all users made no mention of their vaults being in the hands of the threat actor. They expected users to click through to their blog post and read a few paragraphs in to get the bad news.
So ultimately the only people notified were those paying attention. Their crisis PR team is running the show now, which is not good. LastPass’s corporate interests will now take priority over their customers’ security.
Each customer’s risk level depends on their master password’s length and complexity. LastPass is passing the buck by essentially telling customers it’s their fault if their stolen vault is compromised. Not a good look from a company whose one job was to protect password data and failed to do it.
I wonder if work-from-home practices led to this breach. It sounds like the threat actor’s possession of source code was enough to convince the social engineering target to turn over the keys to the kingdom. Clearly LastPass lacked human-to-human authentication protocols and learned nothing from their prior breaches.
I have deleted my LastPass account and switched to Bitwarden for now. Ultimately all of these services are a juicy target for hackers given the value of the stored data. So next I’ll be experimenting with ways I can create something under my direct control.
LastPass is a popular password manager that was heralded for their security model when it was a small startup. The allure was convenience coupled with security. Unique, random passwords were easily generated for every website a user visited and could seamlessly sync across devices. The password “vaults” that stored the passwords used an encryption model that made it impossible for LastPass or anyone else to access the contents.
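The core of that model is client-side key derivation: the master password is stretched into an encryption key on the user’s device, and only the encrypted vault ever reaches the server. Here’s a minimal sketch of the idea using PBKDF2; the salt choice and iteration count are illustrative, not LastPass’s actual parameters:

```python
import hashlib

def derive_vault_key(master_password: str, email: str, iterations: int = 100_100) -> bytes:
    """Stretch a master password into a 256-bit vault encryption key.

    Using the account email as the salt and 100,100 iterations is an
    illustrative assumption, not a real product's exact implementation.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        master_password.encode("utf-8"),
        email.encode("utf-8"),
        iterations,
        dklen=32,
    )

# The derived key stays on the client; the server only stores ciphertext.
key = derive_vault_key("correct horse battery staple", "user@example.com")
print(len(key))  # 32
```

The important property is that the server never sees the master password or the derived key, which is why the stolen vaults are only as safe as the passwords protecting them.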
Back in August LastPass notified users of a security breach that impacted their development environment but they did not believe any user data was compromised. But yesterday they quietly updated their blog post on the incident and indicated that this was a catastrophic breach. The bad news was buried several paragraphs into the narrative:
“The threat actor was also able to copy a backup of customer vault data from the encrypted storage container which is stored in a proprietary binary format that contains both unencrypted data, such as website URLs, as well as fully-encrypted sensitive fields such as website usernames and passwords, secure notes, and form-filled data.”
LastPass tried to soften the blow by touting how secure their model is, but having what appears to be the entirety of their customers’ data in the hands of a threat actor is not a good thing.
No doubt heavily resourced governments will want to get their hands on these vaults to look for vulnerabilities. It also appears from the notice that the email addresses and website URLs associated with these vaults were not encrypted, making it easy to identify each vault’s owner along with the services whose credentials are stored inside. And because the master password for most users is something easily remembered, it’s not out of the realm of possibility that many vaults could be unlocked with off-the-shelf GPUs in a matter of days.
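Some back-of-the-envelope math shows why master password strength matters so much here. The guess rate below is purely an assumption for illustration; real cracking throughput depends on the hardware and the vault’s key-derivation settings:

```python
def days_to_search(keyspace: int, guesses_per_second: float) -> float:
    """Days needed to exhaust a password keyspace at a given guess rate."""
    return keyspace / guesses_per_second / 86_400

# Assumed rate: ~100,000 key-derivation guesses per second on one GPU
# (an illustrative figure, not a measured benchmark).
rate = 100_000

weak = 26 ** 8     # 8 lowercase letters
strong = 62 ** 16  # 16 mixed-case letters and digits

print(f"{days_to_search(weak, rate):,.0f} days")   # about 24 days
print(f"{days_to_search(strong, rate):.2e} days")  # effectively forever
```

Under those assumptions a short, simple master password falls in weeks, while a long random one stays out of reach for any foreseeable hardware.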
But worse is that this data is now out there floating around. Even if the security model is secure today, that doesn’t mean it will be secure tomorrow. Data encrypted with 1990s-era algorithms and key lengths can now be cracked on a relatively cheap laptop. Patient threat actors will have a treasure trove of data to look at as the steady increase in computing power reduces the time it takes to crack open these vaults.
In the short term the biggest threat for most users will be phishing attacks. Because the website URLs stored in the vault were not encrypted, a threat actor will know, for example, that a user has an account at a particular financial institution. If you are or were a LastPass user you should be wary of clicking on links for services you visit frequently.
What infuriates me is that they haven’t pushed out any notice to their customers beyond this quiet update of their blog yesterday. Their press department is not responding to inquiries either.
UPDATE: A few folks who follow my Facebook page received an email from LastPass yesterday but I did not. Those who did get a communication didn’t receive much other than an ask to visit the blog. They should have disclosed a lot more here:
As for me I spent last night changing passwords and transitioning over to Bitwarden. LastPass has lost my trust – not only because of this hack but also because of how poorly they are communicating with users.
In my original review of the 3rd Generation Amazon Fire TV Cube I said that Amazon’s top of the line streamer is not something I can recommend for enthusiasts due to issues with lossless audio passthrough in Plex and similar apps.
Enthusiasts running Plex typically stream rips of Blu-ray movies with lossless audio tracks containing Dolby TrueHD Atmos audio or one of the many flavors of DTS. The only name-brand box that does it perfectly is the aging Nvidia Shield, so many enthusiasts were hoping that Amazon would offer something to meet that need as well.
And then I got a DM from my friend Elias Saba at AFTVNews.com who passed along this story about those issues being addressed in a firmware update. So, I bought another box (I sold my original one to a viewer) and posted this followup video to see if they got it fixed.
The good news is that Dolby TrueHD Atmos audio is passing through correctly now. The bad news is that no flavor of DTS audio is passing through, and it looks like Dolby Vision support for enthusiast media that was working before is no longer working. All of my titles defaulted to HDR10 even with an embedded Dolby Vision track. Dolby Vision continues to work fine in streaming apps, which is probably 99% of this product’s audience.
I am going to hold onto my Cube though as it appears Amazon is trying to address this enthusiast need. As new firmwares come down I’ll continually test things to see if anything changes. Stay tuned!
I stumbled across a game I used to play in 1992 as a teenager called “Night Raid” the other day. You can find it on the Internet Archive and play it right in your browser!
Night Raid was a take on an old Apple II game called “Sabotage.” The premise of both games is that you’re a lone anti-aircraft gunner fighting off wave after wave of paratroopers trying to take you out. You earn points with each aircraft and paratrooper hit and lose a point every time the gun is fired.
I hadn’t given the game much thought over the years (I even forgot its name) but the other day something about it popped into my head and sent me down a Google rabbit hole. A few minutes of searching brought me to the Internet Archive and I was immediately back in the 90’s playing a cool shareware game in my browser! I’ve since added it to the DOSBox-X instance I run on my MacBook Air.
One of the fun parts about Night Raid was the heavy promotion of the Software Creations Bulletin Board System (BBS) that hosted its download files. During the Intermission scenes a little airplane flies overhead with the phone number for the BBS and the end screen of the game also encourages people to dial in and experience the board’s 50 lines and 6 gigabytes of storage space!
Software Creations was located in Massachusetts and was one of the larger BBS systems at the time. When a hot new shareware game came out you’d hear about it on FidoNet message boards on your local BBS but you’d have to dial out long distance to pick up the files at Software Creations. This was also where I picked up Doom in 1993 right when it was released to the public.
As you can imagine I racked up some major phone bills dialing into that BBS. Night Raid’s zip file came in at around half a megabyte which took roughly 33 minutes to download on the 2400 baud modem I used at the time.
When Doom came out the following year it was a whopping 2 megabytes and took over two hours to download. Unfortunately for me there wasn’t an active shareware gaming user base in my local calling area beyond my buddies and me so long distance was the only way to get at the latest goods.
When I finally had the cash to buy a 14,400 baud modem that same Doom download could be done in 20 minutes. With long distance rates running about 10-15 cents per minute that faster modem offered a huge return on investment!
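The arithmetic behind those download times is straightforward: a dial-up modem moved roughly one byte for every ten bits of line rate once start and stop bits are counted, ignoring compression and protocol overhead. A quick sketch:

```python
def download_minutes(size_bytes: int, baud: int) -> float:
    """Approximate dial-up transfer time, assuming ~10 line bits per byte."""
    bytes_per_second = baud / 10
    return size_bytes / bytes_per_second / 60

MB = 1024 * 1024
print(f"Night Raid (0.5 MB) at 2400 baud: {download_minutes(MB // 2, 2400):.0f} min")
print(f"Doom (2 MB) at 2400 baud:         {download_minutes(2 * MB, 2400):.0f} min")
print(f"Doom (2 MB) at 14,400 baud:       {download_minutes(2 * MB, 14400):.0f} min")
```

The estimates land close to what I remember: the half-megabyte Night Raid in the half-hour range at 2400 baud, Doom well over two hours at 2400 baud, and the 14,400 modem cutting that Doom download to well under half an hour.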
BBS systems are still out there but are mostly available on the Internet these days via telnet. I did a video on the topic a few years ago if you’d like to get a feel for what it was like on one of those systems.
The whole scene died out pretty quickly once dialup Internet service became available in the mid 90’s. But it’s great to see so many people working to keep not only BBS’ing alive but also some of the networks that connected them together like FidoNet.
The Software Creations BBS was acquired by the “Total Entertainment Network” in 1995, right as BBS’ing gave way to web surfing. Apparently TEN cut their losses as BBS’ing collapsed and shut the system down only a year or so after the acquisition, according to Apogee Software’s Joe Siegler in a 2002 message board post:
It was even more of a surprise to us – as we had our files there. They essentially closed down overnight – we had no warning that it was going to happen.
It would be super cool if one of the Sysops (short for System Operators) had a backup of the BBS somewhere. How awesome would it be to have a time capsule like this accessible via Telnet to experience what the PC gaming scene was like back then?
So Amazon pushed out an update for the new Fire TV Cube that apparently fixes some of its issues with lossless audio and Plex. This of course was after I sold the old one to a viewer! I bought a new one and we’ll see how it does now. Stay tuned!
A few months ago I started looking at ways to follow Indieweb principles in how I produce and consume content. On the consumption side I spent some time freshening up my RSS reader with a blob of feeds that I have been tracking for almost twenty years now. As for creation, I set up this blog and looked at ways to syndicate content from the blog out to other places.
In my latest video in this series we take a look at how it’s all working six months later. I also look at some ways to decentralize other parts of my work, including video using a federated platform called Peertube.
It’s been fun exploring how open source developers are engineering ways to replicate the experience and reach potential of centralized platforms but in a way that’s completely decentralized. Join a server if you want or spin up your own – either way you’re in control of your content and data. And the best part is that there’s no owner who can pull the plug on it.
The past few weeks have shown the perils of centralization with Twitter’s ongoing drama and the collapse of centralized crypto exchanges. In many ways centralizing things on the Internet runs counter to its design doesn’t it? With the proliferation of much faster upstream broadband there’s a lot of opportunity in the decentralized “fediverse.” I think this will likely be as much of a focus in the 2020s as centralized networks were in the 2010s.
This makes use of a premium Zapier feature so you’ll need to have a paid subscription in order for this to work. Zapier is something that saves me a ton of time so I’m happy to pay for it but other solutions might work better for those with a little more coding prowess.
If you’re interested in how this stuff works check out my video called “owning your content” that shows how I have adopted Indieweb principles for my written content.
It’s crazy how fast the end of the year is approaching! I’ll be taking a little time off just after Christmas to recharge ahead of CES but will hopefully get a bunch of stuff queued up! I lost a day and a half this week due to a minor bout of what was likely COVID (my second or third go-around) but I’m feeling much better today. Definitely not as bad as my experience with it last year!
I’m focusing a lot these days on adopting “indieweb” principles for my content. I’ve talked a little bit about this before but I’ve added some new things to the mix that I’ll likely talk about later this week or early next.
Additionally I’m going to do a short piece providing some context on the Twitter controversy surrounding aircraft ADS-B data and how it all works. I figure if people are going to argue about Twitter bans they should at least know what the argument is about :). This won’t be another hot-take about Twitter (there’s plenty of those) but rather just looking at the basis of the dispute, how private aircraft data is collected, and how anyone can do it for about 30 bucks, no radar system required.
The biggest challenge any of these slide-in controllers has is finding a way to make things fit properly given that every phone is a different size. Phone cases complicate this problem further. Gamevice attempts to solve this problem by including dozens of slide-in adapters to ensure a snug fit. They also have a compatibility guide on their website to provide further peace of mind.
I tried a couple of phones, some with cases, some without. I was able to get all of them to fit snugly, unlike the Kishi, which always felt a little loose. It’s not all that difficult to slide out the spacers and put new ones in, but you’ll definitely want to hang onto the original packaging so you don’t lose them. Gamevice says the Flex can fit up to the Samsung Galaxy S22 Ultra but it’s not big enough to accommodate larger devices like tablets. So the iPad Mini is a no-go here.
On the Android side you’ll need a phone that has a USB-C port that supports OTG data devices (most meet that requirement these days). The iPhone version uses a Lightning connector and it will fit everything from a small iPhone 6s all the way up to the iPhone 14 Pro Max. Both versions offer a passthrough charging port, with the Android version supporting USB-C and the iPhone version using a Lightning connector. You’ll also get an actual 3.5mm headphone jack on the left-hand side of the controller!
The controller connects to the phone through its USB-C or Lightning connector, meaning it’s not using Bluetooth. It therefore doesn’t need to be charged and shouldn’t draw all that much power from the host device. The wired connection reduces input lag a bit, but performance will vary based on the phone and the USB controller in use. I have found even some of the best phones are not great when it comes to input latency, however.
From a gameplay perspective the Flex solves a lot of the problems I had with the Kishi. Gone are the analog deadzones and oversized thumbsticks. Controls are very sensitive and begin responding with just a slight movement on the stick. The d-pad is better too but still not perfect. I found that it would sometimes register errant diagonals when playing 8-bit NES games.
All in, I found the Flex to be very competitive with my favorite mobile controller, the Backbone One for iPhone. The d-pad is better on the Backbone, but the Backbone won’t work with phones in a case. The Flex appears to be a nice improvement over the original Razer Kishi design.
I’ve long been a critic of my local utility monopolies because they’ve put their profits ahead of serving customers and taxpayers. Over the last couple of decades as regulations eased, utilities reduced staff and deferred maintenance resulting in several week-long outages and reliability issues across Connecticut.
Meanwhile, electricity monopoly Eversource enjoys billion-dollar net profit margins, money that comes from consumers who have little choice but to pay whatever rates they decide to charge irrespective of the quality of service being offered. Their corporate culture was on full display at a 2020 hearing when former Eversource CEO Jim Judge ranked shareholders as his top priority above customers and ratepayers.
Progress is being made, however. Connecticut’s Public Utilities Regulatory Authority (PURA), which allowed these reductions in service quality for many years, is finally starting to turn the page and hold utilities accountable. Additionally the recently enacted “Take Back our Grid Act” is another set of accountability measures that can begin to roll back decades of customer neglect.
But PURA’s staffing levels haven’t increased despite these new tools, making it difficult for the authority to document examples of local deficiencies that could lead to enforcement action. But there is an existing mechanism that can help provide PURA those “eyes and ears” on the ground: local Cable Television Advisory Councils.
Decades ago when legislators grew tired of dealing with constituent complaints about cable television service, they created this local council structure that meets regularly with cable television company officials to work out problems before issues rise to regulatory action. Members are appointed by local municipalities and boards of education.
The councils still exist today but their regulatory authority is limited only to cable television and not any of the other utilities that benefit and profit from poles and wiring conduits running on public and private right-of-ways.
It’s time to modernize this advisory council structure and extend its advisory authority to any utility that makes use of poles and conduits in a community. Irrespective of whether the service is regulated, the use of public rights of way to deliver services is a regulated activity.
Extending this oversight authority will cost the state nothing beyond what it’s already doing to support these councils and will help PURA exercise the new regulatory powers granted to them in recent legislation. But, more importantly, this will provide an opportunity for utilities to build productive relationships with local customers and hopefully prevent regulatory action from taking place at all.
While utilities could choose not to attend these local council meetings it would most certainly be in their interest to be there and listen to customers and officials about local concerns. And if a resolution is not possible locally, the council could, through PURA, initiate regulatory dockets for enforcement such as they have with the cable companies for decades.
I have no doubt the utilities will oppose any additional oversight of their businesses. But they no longer have the trust of the public and it’s time to create a process to re-establish relationships and refocus their efforts on those who matter: us.
My latest video is a review of the Anker Powerhouse 90 – this is a portable powerbank that has an AC outlet! You can see my full review here.
If you’re shopping for one of these you’re going to find another one called the “Powerhouse 100.” I think the “90” is the new version of this product with a slightly smaller battery. This may have been required to keep it under the airline carry-on limit for lithium batteries, or they were looking to shave some cost. The difference between the two is about 10 watt hours.
The Powerhouse 90 advertises about 87 watt hours – meaning if you had a 1 watt draw it could run (theoretically) for 87 hours. The real-world longevity you’ll get out of the battery will vary but I think if you had a full load of 87 watts through the AC outlet you’ll likely see far less than an hour due to overhead of the inverter, etc.
The powerbank can deliver a maximum of about 160 watts simultaneously, budgeted as follows: 12 watts for the USB-A ports, 45 watts for the USB-C port, and 100 watts for the AC outlet. In the video we plugged a large studio LED light that draws 36 watts into the AC outlet, had a Steam Deck drawing about 33 watts from the USB-C port, and connected two smartphones to the USB-A ports. It was able to provide consistent power to each device.
The power bank charges via the USB-C port at a rate of 45 watts. A full charge from empty will take about 3 hours via a 45 watt USB-C power adapter.
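For a rough sense of how those watt-hour numbers translate into runtime and charge time, here’s a sketch. The 85% efficiency figure is my own assumption for inverter and conversion losses, not an Anker spec:

```python
def runtime_hours(capacity_wh: float, load_watts: float, efficiency: float = 0.85) -> float:
    """Estimated runtime: usable energy after conversion losses, divided by load."""
    return capacity_wh * efficiency / load_watts

def charge_hours(capacity_wh: float, input_watts: float, efficiency: float = 0.85) -> float:
    """Estimated time to refill from empty at a given input rate."""
    return capacity_wh / (input_watts * efficiency)

# A full 87 W AC load lands under an hour of runtime
print(f"{runtime_hours(87, 87):.2f} hours at a full 87 W load")
# 45 W USB-C input; real charging tapers off, so expect longer in practice
print(f"{charge_hours(87, 45):.2f} hours to recharge")
```

The simple division lines up with real-world behavior: a full AC load drains it in well under an hour, and the charge estimate comes in a bit optimistic because charging slows as the battery fills.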
So what’s the use case here? For me it’s about charging the devices I have in my production bag that can’t charge over USB. For example my Canon and Sony camcorders I use for remote productions need to be charged using their AC adapters. This pack will allow me to keep things topped off (or even fully powered) as we walk from one location to another when shooting a dispatch video.
This morning I was posting a link back to my Mastodon account on Twitter and got this message: “We can’t complete this request because this link has been identified by Twitter or our partners as being potentially harmful.”
It looks as though this new ban applies to most if not all of the major Mastodon instances out there. This means that links to Mastodon accounts or posts on those instances can’t even be posted on Twitter.
This comes in the wake of Elon Musk banning the “Elon Jet” account that was keeping track of the whereabouts of his private jet using publicly available ADS-B data. ADS-B data comes from a transponder required to be installed on most aircraft that transmits the airplane’s tail number, position, altitude, etc. These transmissions can be picked up on the ground with cheap hardware and free software as I demonstrated in this video (tune in at the 8:53 mark).
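Once decoded, those position reports are just latitude/longitude pairs, so figuring out how far a plane is from your receiver is a textbook great-circle calculation. A sketch with made-up coordinates (neither point corresponds to a real receiver or flight):

```python
from math import asin, cos, radians, sin, sqrt

def distance_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle (haversine) distance between two points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

receiver = (41.30, -72.92)  # hypothetical antenna location
aircraft = (41.77, -72.65)  # hypothetical decoded position report
print(f"{distance_km(*receiver, *aircraft):.1f} km away")
```

Jet-tracking accounts like the one Musk banned are essentially automating this: collecting position reports from volunteer receivers and posting them as they arrive.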
Musk says that realtime doxxing (publicly posting private information about a person’s whereabouts) is not allowed and that any account doing it will be removed. He also extended the ban to accounts that link to that information elsewhere. Twitter took this action after Musk says a car carrying his young child was followed home from the airport and potentially blocked by a stalker. Musk posted a video of the alleged stalker along with the alleged stalker’s license plate but did not file a police report as of the time of this writing.
The question to be asked here is whether or not those owning and traveling in private jets have a reasonable expectation of privacy – especially as the positions of those aircraft are broadcast unencrypted to other aircraft and stations on the ground.
This move runs counter to the “free speech” direction Musk says he wants Twitter to take. Additionally it appears to be counter to the free market principles that Musk purports to believe in. They could have blocked individual Mastodon links to the Elon Jet account as opposed to restricting any links to the entire fediverse – a competing network that is attracting many Twitter users.
My latest video is the third in a sponsored series exploring how to use a Synology NAS device as a backup solution. This new video focuses on how to back the NAS up offsite once you have data on it through their Hyper Backup solution and via Snapshot Replication. You can see the video here.
Hyper Backup is something we’ve covered in the past. It takes data stored on the NAS and backs it up to external media or cloud destinations. It can be configured to store multiple versions of files, so if somebody messes something up it’s relatively easy to “roll back” to a prior version, provided it was included in the backup job.
I’ve been using Hyper Backup for quite a while now and have multiple jobs running on my personal NAS. One job maintains a full backup to a USB hard drive at night, while another sends my important work offsite to Amazon S3. The NAS settings get included in that as well, so it’s pretty easy to do a full restore should one be needed on existing or new hardware. The data is encrypted before it gets sent to the cloud provider for added security.
Snapshot replication works in a different way. It will keep everything on one Synology NAS mirrored on another every time a snapshot job fires off. Should there be a hardware failure a relatively quick switchover can occur without the need to run a full restoration process first. You can even switch back to the original hardware once everything gets resolved. And because it works on an incremental basis you can get the initial data load done on your local network and then relocate the destination NAS offsite for smaller incremental updates.
It’s really crazy how many features Synology’s developers have packed into these boxes. I could devote an entire channel to this topic and never run out of things to cover. This series was great because I learned some new tricks that I didn’t know my NAS could do.
You can see my growing collection of Synology videos on this playlist. I want to thank them for their ongoing support of the channel!
Each year I take a look back at all of the products I reviewed and pick a few that I think stand above the rest. Some of these are not terribly exciting but are now a useful part of my workflow! Check out the video here!
From a tech perspective this was the year for handheld devices. The FPGA-based Analogue Pocket and Valve’s Steam Deck are the most innovative devices I looked at this year and are now a regular fixture in my gaming world.
While not as revolutionary, Apple’s M2 MacBook Air was easily my favorite laptop of the year. It checks all of the major boxes reviewers and consumers look for in a laptop: excellent performance, seemingly limitless battery life, and a thin and lightweight package that comes in under 3 pounds. While the performance gains are marginal vs. the M1 Air, the hardware improvements make it a great upgrade from the prior model.
There are a few other items of note in the video including some neat smart home devices, a great screen cleaner, and more.
Let me start by saying I’m a huge proponent of electric vehicles (EVs). I’ve been driving electric for the last twelve years, starting with a Chevy Volt and now in a Tesla.
Range anxiety is still a big issue for electric vehicles. Despite massive developments, EVs charge relatively slowly compared to a gas-powered fill-up, and charging stations for non-Tesla vehicles are few and far between. The bulk of the publicly available EV chargers are “feel good” installations that charge quite slowly. My local grocery store’s charger, for example, will get me maybe 5 or 6 miles back in the tank after 30 minutes in the store.
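To put that grocery store charger in perspective, you can back out a charger’s power from the range it adds. The 3 miles per kWh efficiency figure below is a rough assumption for a typical EV:

```python
def charger_kw(miles_added: float, minutes: float, miles_per_kwh: float = 3.0) -> float:
    """Implied charger power from range added during a charging session."""
    kwh_delivered = miles_added / miles_per_kwh
    return kwh_delivered / (minutes / 60)

# ~5 miles in 30 minutes implies roughly a 3.3 kW trickle
print(f"{charger_kw(5, 30):.1f} kW")
# Adding 100 miles in the same 30 minutes would need a ~67 kW DC fast charger
print(f"{charger_kw(100, 30):.1f} kW")
```

The gap between those two numbers is the whole problem: most of the “feel good” installations are in the low single digits of kilowatts, an order of magnitude short of what road-trip charging requires.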
For the last decade most EVs have been cars or crossover SUVs built on car platforms. Over the last year manufacturers have introduced electric pickup trucks and SUVs to the market, with Ford, Rivian and GM shipping their vehicles right now and Tesla’s Cybertruck right around the corner. These vehicles are much larger and heavier than the typical electric car, which means they need larger batteries to get the same range as a comparable electric car would. And those big batteries take longer to charge – if you can find a charger at all.
The buried lede in Rich’s video was how hard it was for him to actually get down there. Check out his adventure here; like all of Rich’s videos, it’s very entertaining:
Because Rivian doesn’t have its own charging network, Rich had to rely on publicly available chargers. Some were very slow. Others were not where they said they would be. At one point he had to ask the owner of a bed and breakfast if he could plug in for a little while to get to his next destination. And when he did find a faster charger the cost to use it was often the equivalent to a tank of gas in a traditional vehicle.
His range anxiety was exacerbated by the cold weather in his home state of Kansas that reduced range even further. It was so bad that Hoover and a friend who co-owned the vehicle decided to sell it and try something else.
They ended up choosing an electric Hummer that Hoover says addresses many of the range issues by using a much larger battery pack with twice the capacity of the Ford. We’ll have to see how it fares after Hoover has had more time in the Hummer.
These range and charging issues indicate that just building and marketing an EV is not enough. Without a charging network that makes the vehicles practical it’s really just half a car.
Tesla addressed this issue a decade ago when they started building out their Supercharger network. After 8+ years of Tesla ownership I’ve never come close to running out of juice, mainly because there’s always a Supercharger station nearby wherever I may be.
Tesla has experienced growing pains with the network (especially in areas like Silicon Valley where there’s a lot of Tesla ownership) but in my experience I’ve always been able to get a charge when I needed one here in the Northeast US. Charging is still a bit slower than filling up a gas tank but much faster than even some of the fastest chargers available for other vehicles.
Tesla typically charges drivers market rate for the electricity but they occasionally use the supercharger network as an incentive to clear out vehicle inventory. When I purchased my car, a prior model-year leftover, they gave me “free gas for life” in an effort to get me to sign on the dotted line. Not a bad deal!
At this point I don’t believe the national goal of EVs representing 50% of vehicle sales by 2030 is realistic unless some major efforts are made to improve both charging speed and availability – especially for those who don’t have the convenience of being able to plug in at home.
I have been looking for an affordable 4K gaming monitor that can go north of 60Hz while also supporting Nvidia G-Sync. The 28″ Samsung G70A fits the bill for me and you can see more about it in my most recent review here.
It can run at up to 144Hz and supports both Nvidia G-Sync and AMD FreeSync. In the video we tested it with both a gaming PC with an Nvidia graphics card and an Xbox Series S in its variable refresh rate mode. Since the display supports HDMI 2.1 the PS5 should also work, but I do not own a PS5 to test.
I was impressed with its raw performance, both in its ability to deliver high frame rate 4K video and its very fast 1ms response rate. Even 8-bit NES games ran with barely a hint of image blur, with the lowest input lag I’ve tested so far on a display.
But it’s otherwise a barebones display – something I would expect at this price point. It’s not color accurate for content creation, covering only 90% of the DCI-P3 gamut. At 400 nits it’s not incredibly bright either, but fine for late night gaming sessions. The display does support HDR10, but because its maximum brightness is only 400 nits it gets super dim when HDR modes are enabled. So I’m not going to recommend this for 4K media consumption either.
It does not have speakers on board but does have an audio output for connecting speakers. That’s how I have it configured on my gaming PC right now. But it does have RGB lights where speakers would otherwise be located if that’s your sort of thing.
One thing I learned the hard way during a livestream the other day is that cable choice is super important when pushing 4K video beyond 60Hz. For HDMI connections an HDMI 2.1 rated cable is a necessity, while DisplayPort users should look for a DisplayPort 1.4 rated cable. You may get an image initially out of lesser-rated cables, but once a game boots up you’ll likely see the video drop out.
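The bandwidth math explains why. Uncompressed video needs roughly pixels × refresh rate × bits per pixel, before blanking intervals and link-encoding overhead are added on top. A rough sketch assuming 8 bits per color channel:

```python
def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Approximate uncompressed video bandwidth in gigabits per second."""
    return width * height * hz * bits_per_pixel / 1e9

# 4K at 60Hz fits within an older HDMI 2.0-class link...
print(f"4K60:  {video_gbps(3840, 2160, 60):.1f} Gbit/s")
# ...but 4K at 144Hz more than doubles the payload, hence HDMI 2.1 / DP 1.4 cables
print(f"4K144: {video_gbps(3840, 2160, 144):.1f} Gbit/s")
```

Even before overhead, 4K at 144Hz pushes well past what older cables were certified to carry, which is why a marginal cable can light up at the desktop and then drop out under load.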