The Nixeus EDG27 was one of the best early 1440p FreeSync monitors you could buy, and at CES 2020, the company’s building on that foundation. The newly revealed Nixeus EDG274K is all that and more: a 4K, 144Hz AHVA display (read: IPS-like) with a bright screen blazing at 350 nits (or 400 in HDR mode), complemented by a solid 1000:1 native contrast ratio. Better yet, the monitor comes with AMD’s gameplay-smoothing, tear- and stutter-eradicating FreeSync built in. If you own a Radeon GPU, it should kick in seamlessly, though you can also manually activate the adaptive sync technology if you own one of Nvidia’s GeForce graphics cards.
Nvidia’s energy-efficient Max-Q technology helps gaming laptops slim down, and at CES 2020 today, the company reached a new milestone. The Asus ROG Zephyrus G14 is the first 14-inch laptop ever equipped with GeForce RTX 20-series graphics.
Due partly to thermal concerns, most gaming laptops span 15.6 or even 17 inches, with tiny 14-inch form factors reserved for the likes of netbooks and thin-and-lights. No more, though—the ROG Zephyrus G14 is impressively svelte for a game-ready notebook, measuring just 0.7 inch thick and weighing 3.5 pounds. That’s only a smidge more than the MacBook Air in both thickness and weight, but Apple’s laptop doesn’t do ray tracing or 1440p gaming.
When it comes to graphics cards, AMD’s owned the sub-$200 price point for years. Nvidia’s GeForce GTX 1060 and 1660 series graphics cards couldn’t hold a candle to the value proposition of the Radeon RX 570 and 580, nor their predecessors, especially once AMD started bundling them with games galore. Those Polaris-powered GPUs are downright ancient, however, and suck down obscene amounts of power compared to modern graphics cards.
Enter the new $169 Radeon RX 5500 XT, launching today after being teased in October.
The Radeon RX 5500 XT brings AMD’s “Navi” GPU, built using the company’s next-gen RDNA graphics architecture, to the masses. Navi debuted in the Radeon RX 5700 and 5700 XT, which immediately became our go-to recommendations for 1440p gaming. Navi’s loaded with cutting-edge tech: The Radeon RX 5500 XT is one of the first consumer GPUs built on a 7nm process, and one of the first to support the bleeding-edge PCIe 4.0 interface. It’s upgraded to ultra-fast GDDR6 memory. The card packs AMD’s latest and greatest media encoders, plus fresh display technologies that enable 4K, 144Hz monitors without the need for messy chroma subsampling (though you won’t game at anywhere near those levels with this humble graphics card). It’s tremendously more power-efficient, too.
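The chroma subsampling point comes down to simple bandwidth arithmetic. Here’s a rough back-of-the-envelope sketch (idealized numbers that ignore blanking overhead, which makes the real requirement even higher) of why a 4K, 144Hz signal couldn’t fit over DisplayPort 1.4 at full RGB without compression:

```python
# Back-of-the-envelope bandwidth check (illustrative; ignores blanking
# overhead, which pushes the real requirement even higher).

def video_data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 144Hz, full RGB with 8 bits per channel (24 bits per pixel)
full_rgb = video_data_rate_gbps(3840, 2160, 144, 24)

# 4:2:2 chroma subsampling halves the two color channels: ~16 bpp
subsampled = video_data_rate_gbps(3840, 2160, 144, 16)

DP14_EFFECTIVE_GBPS = 25.92  # DisplayPort 1.4 HBR3 after 8b/10b encoding

print(f"Full RGB: {full_rgb:.1f} Gb/s")   # ~28.7 -- doesn't fit
print(f"4:2:2:    {subsampled:.1f} Gb/s") # ~19.1 -- fits
```

The fresh display tech mentioned above sidesteps that squeeze with Display Stream Compression, preserving full-resolution color at high refresh rates.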
The scary headlines started popping up almost immediately. “Thousands of hacked Disney+ accounts are already for sale on hacking forums,” ZDNet proclaimed last week. This morning, the BBC followed up with “Disney+ fans without answers after thousands hacked.” Well, I’ve got an answer for you, and it’s one you’ve probably heard before. Stop reusing passwords across multiple services and websites.
Both sites found “thousands” of Disney+ logins available on the Dark Web—just over 4,000, in the BBC’s case. When the publications reached out to affected customers, some admitted that they reused existing passwords for Disney+, while some denied it. But given the low number of so-called “hacks”—a few thousand confirmed for a service that already boasts over 10 million users is a drop in a pool, as Rock Paper Shotgun cofounder John Walker points out—it seems likely that bad password behavior is more to blame than a breach on Disney’s part.
Nvidia’s G-Sync monitors cost a steep premium over AMD’s rival FreeSync displays. But are they worth it, especially now that Nvidia lets you use FreeSync panels with GeForce graphics cards? It’s a complicated issue, but PCWorld graphics guru Brad Chacos unravels what you need to know in the video below.
Both G-Sync and FreeSync panels revolve around the same core variable refresh rate technology, also known as adaptive sync. Standard monitors refresh their image at a constant speed, such as 60Hz or 144Hz. Adaptive sync monitors synchronize your monitor’s refresh rate to your graphics card’s output—hence their name. Doing so prevents ugly screen tearing and stuttering, giving you a delicious, buttery-smooth gaming experience.
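To make the stutter mechanics concrete, here’s a toy scheduling model—a sketch of the concept only, not any vendor’s implementation. With a fixed-refresh display and vsync, frames can only appear on the display’s clock; with adaptive sync, the display follows the GPU:

```python
# Toy model of fixed vs. adaptive refresh. Not any vendor's actual
# implementation -- just the core scheduling idea behind FreeSync and
# G-Sync: the display waits for the GPU instead of the other way around.
import math

def present_times_vsync(frame_times_ms, refresh_hz):
    """Double-buffered vsync: each frame appears at the next refresh
    tick, and the GPU stalls until that swap before starting the next."""
    tick = 1000.0 / refresh_hz
    shown, t = [], 0.0
    for ft in frame_times_ms:
        present = math.ceil((t + ft) / tick) * tick
        shown.append(present)
        t = present  # GPU is blocked until the buffer swap
    return shown

def present_times_adaptive(frame_times_ms):
    """Adaptive sync: the display refreshes the moment a frame is ready."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        shown.append(t)
    return shown

frames = [18.0, 18.0, 18.0]  # a GPU rendering at ~55 fps
print(present_times_vsync(frames, 60))  # frames land ~33ms apart (~30 fps)
print(present_times_adaptive(frames))   # even 18ms cadence (~55 fps)
```

In this toy model, a GPU averaging 55fps on a fixed 60Hz panel gets quantized down to an effective 30fps cadence, while the adaptive display simply tracks the GPU’s even 18ms rhythm.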
Legions of people use awful, easily guessable passwords. Much worse, many people reuse passwords across multiple sites and services. That means that if one site leaks your info, the hackers can potentially gain access to any of your other accounts that use the same login information. Google wants to stop that from happening. Today, the company launched Password Checkup, a new service that checks all of the passwords you’ve saved in Chrome against lists of known leaks to see if your data’s been compromised.
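Google’s actual protocol is a privacy-preserving exchange that never reveals your raw credentials to its servers. The general leaked-password check can be sketched with the k-anonymity scheme popularized by Have I Been Pwned; everything below (the corpus, the function name) is a hypothetical local mock-up of that idea:

```python
# Sketch of a leaked-password check. Google's real Password Checkup uses
# a different, privacy-preserving protocol; this mimics the k-anonymity
# idea from Have I Been Pwned, where only a short hash *prefix* would
# ever leave your machine -- never the password or its full hash.
import hashlib

# Hypothetical leaked-password corpus, stored as SHA-1 digests
LEAKED = {hashlib.sha1(p.encode()).hexdigest().upper()
          for p in ["password", "123456", "hunter2"]}

def is_compromised(password: str) -> bool:
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix = digest[:5]  # in the real API, only this prefix is sent
    # The server returns every leaked digest sharing the prefix; the
    # exact match is then checked locally, on your machine.
    candidates = {d for d in LEAKED if d.startswith(prefix)}
    return digest in candidates

print(is_compromised("hunter2"))        # True
print(is_compromised("correct horse"))  # False
```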
The feature builds on top of the password manager Google built into Chrome late last year.
AMD released a driver update on Monday that added Radeon Image Sharpening support to its Radeon Vega-based graphics cards, including the Radeon VII. Doing so effectively ended a minor kerfuffle surrounding the new and awesome technology.
Radeon Image Sharpening debuted in AMD’s excellent Radeon RX 5700 series graphics cards in July. Here’s how we described it then: “Radeon Image Sharpening uses algorithms to intelligently sharpen only the areas that need it, reducing the blurriness that can pop up when you activate various anti-aliasing methods or run games at a lower resolution than your display’s maximum. Better yet, it does so with next to no performance impact.”
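AMD’s production algorithm is its Contrast Adaptive Sharpening (CAS); the simplified single-pixel sketch below captures only the core idea, with made-up weights: apply an unsharp mask, but scale it down where local contrast is already high, so hard edges don’t get over-sharpened.

```python
# Simplified contrast-adaptive sharpening for one grayscale pixel.
# Not AMD's actual CAS math -- just the adaptive-strength concept.

def sharpen_pixel(center, neighbors, strength=0.5):
    """center/neighbors are luminance values in [0, 1];
    neighbors are the 4-connected surrounding pixels."""
    local_min = min(neighbors + [center])
    local_max = max(neighbors + [center])
    contrast = local_max - local_min
    # Adaptive weight: strong in flat regions, gentle on hard edges
    weight = strength * (1.0 - contrast)
    blur = sum(neighbors) / len(neighbors)
    sharpened = center + weight * (center - blur)  # unsharp mask
    return min(1.0, max(0.0, sharpened))           # clamp to valid range

flat_detail = sharpen_pixel(0.55, [0.5, 0.5, 0.5, 0.5])  # boosted a lot
hard_edge   = sharpen_pixel(0.90, [0.1, 0.1, 0.9, 0.9])  # barely touched
```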
Nvidia wants to show off its cutting-edge real-time ray tracing technology with Modern Warfare. On Tuesday, the company announced that it will bundle the upcoming Call of Duty: Modern Warfare reboot with all GeForce RTX 20-series graphics cards from now through November 18. Any desktops or laptops equipped with Nvidia’s RTX hardware are also eligible for the deal.
Nvidia’s been pushing ray tracing forward through strategic use of bundles. Earlier this summer, its GeForce RTX graphics cards came with free copies of Wolfenstein: Youngblood and Control, the latter of which delivers the best implementation of ray tracing yet, with utterly delicious reflections and shadows. (It’s one of the year’s best games too.) Wolfenstein is still waiting for a patch to add ray tracing to the game, but Modern Warfare will include the futuristic lighting technology when it launches on October 25, Nvidia says. The game will also support Nvidia’s Adaptive Shading technology, which can increase performance by dynamically reducing the rendering resolution of less strenuous parts of the image.
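Adaptive shading rides on the variable-rate shading hardware in Nvidia’s Turing GPUs. The per-tile decision can be sketched roughly like this (function names and thresholds are invented for illustration, not Nvidia’s heuristics):

```python
# Conceptual sketch of content-adaptive shading: flat, low-detail tiles
# (sky, smoke, motion-blurred regions) get a coarser shading rate, so
# fewer pixel-shader invocations produce a nearly identical image.

def tile_shading_rate(tile, threshold_fine=0.02, threshold_med=0.005):
    """Pick a shading rate from the variance of a tile's luminance."""
    n = len(tile)
    mean = sum(tile) / n
    variance = sum((v - mean) ** 2 for v in tile) / n
    if variance > threshold_fine:
        return "1x1"   # full rate: lots of detail here
    elif variance > threshold_med:
        return "2x2"   # one shading result per 2x2 pixel block
    return "4x4"       # very flat region: cheapest rate

detailed = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85]     # high-contrast tile
flat     = [0.50, 0.51, 0.50, 0.49, 0.50, 0.50]
print(tile_shading_rate(detailed))  # 1x1
print(tile_shading_rate(flat))      # 4x4
```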
Those massive, fully loaded Nvidia G-Sync HDR “BFG” displays might be mostly missing in action—HP’s $5,000 Omen X Emperium aside—but that doesn’t mean the dream of silky-smooth living room gaming is dead. On Tuesday, LG announced that it’s adding Nvidia’s G-Sync Compatible technology to five of its 2019 OLED televisions.
LG’s 55- and 65-inch E9 televisions will get the capability, as well as the 55-, 65-, and 77-inch C9 series. The exact rollout date is unclear; LG says only that the capability “will become available in select markets via a firmware upgrade in the weeks to follow.”
AMD promised that custom Radeon RX 5700 graphics cards would arrive in mid-August, and as of Monday, they’re here. The wait was worth it.
Sapphire’s kicking things off with its budget-friendly Pulse lineup. Like its predecessors, it offers some nice features and solid cooler designs at prices that won’t break the bank. But one of Sapphire’s most intriguing additions has nothing to do with hardware. We’re highly impressed by the new performance-enhancing, resolution-tweaking “Trixx Boost” feature in the company’s Trixx software suite, which leverages AMD’s Radeon Image Sharpening technology to great effect for everyday gamers.
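The arithmetic behind a resolution-scaling feature like Trixx Boost is straightforward (the 85 percent factor below is just an example scale, not necessarily the software’s setting): rendering cost tracks total pixel count, so a modest per-axis cut yields an outsized saving, and a Radeon Image Sharpening pass hides most of the resulting softness.

```python
# Rough pixel-count math for render-scale tricks (illustrative values).

def scaled_resolution(width, height, scale):
    """Per-axis render scale, rounded to whole pixels."""
    return round(width * scale), round(height * scale)

def pixel_savings(width, height, scale):
    """Fraction of pixels the GPU no longer has to shade."""
    w, h = scaled_resolution(width, height, scale)
    return 1.0 - (w * h) / (width * height)

w, h = scaled_resolution(2560, 1440, 0.85)
print(f"{w}x{h}")  # 2176x1224
print(f"{pixel_savings(2560, 1440, 0.85):.0%} fewer pixels to shade")  # 28%
```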